CN105718192A - Mobile terminal and touch input method therefor - Google Patents

Mobile terminal and touch input method therefor

Info

Publication number
CN105718192A
Authority
CN
China
Prior art keywords
described
area
touch
gesture
input
Prior art date
Application number
CN201610025558.4A
Other languages
Chinese (zh)
Inventor
甘大勇
Original Assignee
Lenovo (Beijing) Ltd. (联想(北京)有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo (Beijing) Ltd.
Priority to CN201110150810.1A priority Critical patent/CN102819331B/en
Priority to CN201610025558.4A priority patent/CN105718192A/en
Publication of CN105718192A publication Critical patent/CN105718192A/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for entering handwritten data, e.g. gestures, text
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048Indexing scheme relating to G06F3/048
    • G06F2203/04803Split screen, i.e. subdividing the display area or the window area into separate subareas

Abstract

The present invention provides a mobile terminal and a touch input method therefor. The touch input method is applied to the mobile terminal. The mobile terminal comprises a display unit and a touch-sensing unit. The touch-sensing unit is arranged above the display unit; a touch region of the touch-sensing unit overlaps with a display region of the display unit; the display unit is used for displaying an object in the mobile terminal on the display region; and the touch region is divided into a first region and a second region, which do not overlap with each other. The touch input method comprises: detecting a gesture input; determining whether a start point of the gesture input is located within the first region or the second region; when it is determined that the start point of the gesture input is located within the second region, generating a system management command corresponding to the gesture input; when it is determined that the start point of the gesture input is located within the first region, generating an object operating command used for operating the object and corresponding to the gesture input; and executing the system management command or the object operating command.

Description

Mobile terminal and touch input method therefor

This application is a divisional of the following invention patent application:

Application number: 201110150810.1

Filing date: June 7, 2011

Title of invention: Mobile terminal and touch input method therefor

Technical field

The present invention relates to the field of mobile terminals, and more particularly to a mobile terminal and a touch input method therefor.

Background art

In recent years, mobile terminals with touch screens have gained wide adoption and developed rapidly. In such a mobile terminal, a touch-sensing unit is generally stacked on top of a display unit to form a touch display screen. By performing gesture inputs such as taps or slides on the touch display screen, the user causes the mobile terminal to perform corresponding operations.

Generally, when a user performs a slide operation, the position of the starting point of the slide gesture on the touch display screen is arbitrary. In an existing mobile terminal, whether the starting point of the slide gesture lies in the middle region of the touch display screen or at its edge, the mobile terminal treats the gesture identically. In other words, the mobile terminal does not perform different operations according to the difference in the starting point of the slide gesture.

Summary of the invention

In view of the above, the present invention provides a mobile terminal and a touch input method therefor, which can perform different operations according to the position of the starting point of a slide gesture (more specifically, distinguishing an edge slide operation from a middle slide operation), thereby allowing the user to issue various operation commands with simple gestures and improving the user experience.

According to an embodiment of the present invention, there is provided a touch input method applied to a mobile terminal. The mobile terminal includes a display unit and a touch-sensing unit; the touch-sensing unit is disposed above the display unit; the touch region of the touch-sensing unit overlaps the display region of the display unit; the display unit is configured to display an object of the mobile terminal in the display region; and the touch region is divided into a first region and a second region that do not overlap each other. The touch input method includes: detecting a gesture input; judging whether the starting point of the gesture input is located in the first region or the second region, to produce a judgment result; when the judgment result indicates that the starting point is located in the second region, generating a system management command corresponding to the gesture input; when the judgment result indicates that the starting point is located in the first region, generating an object operation command corresponding to the gesture input, the object operation command being used to operate the object; and executing the system management command or the object operation command. The second region is the edge of the first region; the touch-sensing unit is composed of a plurality of touch sensors arranged in an array, and the touch sensor array of the first region and the touch sensor array of the second region share no touch sensor; the second region corresponds to the peripherally located sensors in the touch sensor array, and the first region corresponds to the centrally located sensors; and the second region is a region or a line.

According to another embodiment of the present invention, there is provided a mobile terminal including a display unit for displaying an object of the mobile terminal in a display region. The mobile terminal further includes: a touch-sensing unit that detects a gesture input, wherein the touch-sensing unit is disposed above the display unit, the touch region of the touch-sensing unit overlaps the display region of the display unit, and the touch region is divided into a first region and a second region that do not overlap each other; a judging unit that judges whether the starting point of the gesture input is located in the first region or the second region, to produce a judgment result; a command generation unit that generates a system management command corresponding to the gesture input when the judgment result indicates that the starting point is located in the second region, and generates an object operation command corresponding to the gesture input when the judgment result indicates that the starting point is located in the first region, the object operation command being used to operate the object; and a command execution unit that executes the system management command or the object operation command. The second region is the edge of the first region; the touch-sensing unit is composed of a plurality of touch sensors arranged in an array, and the touch sensor arrays of the first region and the second region share no touch sensor; the second region corresponds to the peripherally located sensors in the array, and the first region corresponds to the centrally located sensors; and the second region is a region or a line.

According to another embodiment of the present invention, there is provided a mobile terminal including: a display unit for displaying an object of the mobile terminal in a display region; a touch-sensing unit disposed above the display unit for detecting a gesture input, the touch region of the touch-sensing unit overlapping the display region of the display unit, and the touch region being divided into a first region and a second region that do not overlap each other; and a processor. The processor is configured to: judge whether the starting point of the gesture input is located in the first region or the second region, to produce a judgment result; when the judgment result indicates that the starting point is located in the second region, generate a system management command corresponding to the gesture input; when the judgment result indicates that the starting point is located in the first region, generate an object operation command corresponding to the gesture input, the object operation command being used to operate the object; and execute the system management command or the object operation command. The second region is the edge of the first region; the touch-sensing unit is composed of a plurality of touch sensors arranged in an array, and the touch sensor arrays of the first region and the second region share no touch sensor; the second region corresponds to the peripherally located sensors in the array, and the first region corresponds to the centrally located sensors; and the second region is a region or a line.

According to another embodiment of the present invention, there is provided a touch input method applied to a touch-sensing unit. The touch-sensing unit has an input region divided into a first region and a second region that do not overlap each other, and a first edge of the input region coincides with a second edge of the second region. The second region can recognize an input operation in which at least part of an operating body contacts the second edge, and the first region can recognize an input operation in which the operating body does not contact the second edge. The touch input method includes: detecting a gesture input; judging whether the starting point of the gesture input is located in the second region, to produce a judgment result; when the judgment result indicates that the starting point is located in the first region, generating a first command; when the judgment result indicates that the starting point is located in the second region, generating a second command different from the first command; and executing the first command or the second command. The second region is the edge of the first region; the touch-sensing unit is composed of a plurality of touch sensors arranged in an array, and the touch sensor arrays of the first region and the second region share no touch sensor; the second region corresponds to the peripherally located sensors in the array, and the first region corresponds to the centrally located sensors; and the second region is a region or a line.

According to an embodiment of the present invention, there is provided a touch input method applied to a mobile terminal. The mobile terminal includes a display unit and a touch-sensing unit; the touch-sensing unit is disposed above the display unit; the touch region of the touch-sensing unit overlaps the display region of the display unit; the display unit is configured to display an object of the mobile terminal in the display region; and the touch region is divided into a first region and a second region that do not overlap each other. The touch input method includes: detecting a gesture input; judging whether the starting point of the gesture input is located in the first region or the second region, to produce a judgment result; when the judgment result indicates that the starting point is located in the second region, generating a system management command corresponding to the gesture input; when the judgment result indicates that the starting point is located in the first region, generating an object operation command corresponding to the gesture input, the object operation command being used to operate the object; and executing the system management command or the object operation command.
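
The claimed flow can be sketched in a few lines. This is a hypothetical illustration only: the pixel-based edge margin, the coordinate convention, and the command names are assumptions for the sketch, not taken from the patent.

```python
# Hypothetical sketch of the claimed flow: detect a gesture, judge whether
# its starting point lies in the first (middle) or second (edge) region,
# then generate a system management command or an object operation command.

EDGE = 40  # assumed width in pixels of the second (edge) region

def start_region(track_points, width, height):
    """Classify the region of the gesture's starting point."""
    x, y = track_points[0]  # the first track point is the starting point
    on_edge = x < EDGE or y < EDGE or x >= width - EDGE or y >= height - EDGE
    return "second" if on_edge else "first"

def dispatch(track_points, width, height):
    """Choose the command category from the starting point's region."""
    if start_region(track_points, width, height) == "second":
        return "system_management_command"
    return "object_operation_command"
```

For instance, on a hypothetical 480x800 screen, a slide whose first track point is (5, 300) starts in the edge region and maps to a system management command, while one starting at (240, 400) maps to an object operation command.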

The end point of the gesture input may be located in the first region.

The end point of the gesture input may be located in the second region.

The second region may be the edge of the first region.

Generating the system management command corresponding to the gesture input may include: identifying the type of the gesture input; and, when the gesture input is identified as a leftward slide operation whose starting point is located in the second region, generating a back command.

According to another embodiment of the present invention, there is provided a mobile terminal including a display unit and a touch-sensing unit, the touch-sensing unit being disposed above the display unit, the touch region of the touch-sensing unit overlapping the display region of the display unit, the display unit displaying an object of the mobile terminal in the display region, and the touch region being divided into a first region and a second region that do not overlap each other. The mobile terminal includes: a detection unit that detects a gesture input; a judging unit that judges whether the starting point of the gesture input is located in the first region or the second region, to produce a judgment result; a command generation unit that generates a system management command corresponding to the gesture input when the judgment result indicates that the starting point is located in the second region, and generates an object operation command corresponding to the gesture input when the judgment result indicates that the starting point is located in the first region, the object operation command being used to operate the object; and a command execution unit that executes the system management command or the object operation command.

The command generation unit may include: a recognition unit that identifies the type of the gesture input; and a back command generation unit that generates a back command when the gesture input is identified as a leftward slide operation whose starting point is located in the second region.

According to another embodiment of the present invention, there is provided a mobile terminal including: a display unit for displaying an object of the mobile terminal in a display region; a touch-sensing unit disposed above the display unit for detecting a gesture input, the touch region of the touch-sensing unit overlapping the display region of the display unit, and the touch region being divided into a first region and a second region that do not overlap each other; and a processor. The processor is configured to: judge whether the starting point of the gesture input is located in the first region or the second region, to produce a judgment result; when the judgment result indicates that the starting point is located in the second region, generate a system management command corresponding to the gesture input; when the judgment result indicates that the starting point is located in the first region, generate an object operation command corresponding to the gesture input, the object operation command being used to operate the object; and execute the system management command or the object operation command.

According to another embodiment of the present invention, there is provided a mobile terminal including a touch input region that includes a plurality of edges. The mobile terminal includes: a detection unit that detects a gesture input; a judging unit that judges whether the starting point of the gesture input is located on one of the plurality of edges, to produce a judgment result; a command generation unit that generates a system management command corresponding to the gesture input when the judgment result indicates that the starting point is located on one of the plurality of edges, and generates an object operation command corresponding to the gesture input when the judgment result indicates that the starting point is not located on any of the plurality of edges, the object operation command being used to operate an object; and a command execution unit that executes the system management command or the object operation command.

According to another embodiment of the present invention, there is provided a touch input method applied to a touch-sensing unit. The touch-sensing unit has an input region divided into a first region and a second region that do not overlap each other, and a first edge of the input region coincides with a second edge of the second region. The second region can recognize an input operation in which at least part of an operating body contacts the second edge, and the first region can recognize an input operation in which the operating body does not contact the second edge. The touch input method includes: detecting a gesture input; judging whether the starting point of the gesture input is located in the second region, to produce a judgment result; when the judgment result indicates that the starting point is located in the first region, generating a first command; when the judgment result indicates that the starting point is located in the second region, generating a second command different from the first command; and executing the first command or the second command.

In the mobile terminal and the touch input method therefor according to embodiments of the present invention, the position of the starting point of the user's slide gesture is detected, and different commands are executed according to that position, enabling the user to operate the mobile terminal more conveniently and execute various commands, thereby improving the user experience.

Brief description of the drawings

Fig. 1 is a flowchart illustrating a touch input method according to an embodiment of the present invention;

Fig. 2 is a flowchart illustrating a touch input method according to another embodiment of the present invention;

Fig. 3 is a block diagram illustrating the main configuration of a mobile terminal according to an embodiment of the present invention;

Fig. 4 is a block diagram illustrating the main configuration of a mobile terminal according to another embodiment of the present invention;

Fig. 5 is a block diagram illustrating the main configuration of a mobile terminal according to another embodiment of the present invention; and

Fig. 6A to Fig. 6C are schematic diagrams illustrating operations of an operating body on a touch display unit.

Detailed description of the invention

Embodiments of the present invention are described in detail below with reference to the accompanying drawings.

First, a touch input method according to an embodiment of the present invention is described with reference to Fig. 1.

The touch input method according to this embodiment is applied to a mobile terminal. The mobile terminal includes a display unit and a touch-sensing unit stacked together to form a touch display unit. For example, the touch-sensing unit may be disposed on top of the display unit. The touch-sensing unit is composed of a plurality of touch sensors arranged in an array. The touch region of the touch-sensing unit overlaps the display region of the display unit; in other words, the area of the touch region equals that of the display region. The display unit displays an object of the mobile terminal in the display region; the object is, for example, a picture, a web page, audio, or the icon of an application.

In addition, the touch region is divided into a first region and a second region that do not overlap each other. The second region is, for example, the edge region of the touch region, and the first region is, for example, the central region of the touch region excluding that edge region. For example, when the touch display unit is rectangular, the second region may be the four edges of the touch display unit, and the first region the region of the touch display unit other than those four edges. More specifically, as described above, the touch-sensing unit is composed of a plurality of touch sensors arranged in an array. The first region and the second region have no intersection; that is, the touch sensor array of the first region and the touch sensor array of the second region share no touch sensor. The second region corresponds, for example, to the peripherally located sensors in the touch sensor array, and the first region to the centrally located sensors. The second region may be a region or a line. Alternatively, for example, the second region may be the region occupied by the outermost row and/or column of sensors in the touch sensor array, and the first region the region occupied by the remaining sensors.
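
As a hedged illustration of this partition, the outermost ring of an R x C sensor array can be taken as the second region and the interior as the first region. The array dimensions and helper names below are assumptions for the sketch, not from the patent.

```python
# Hypothetical sketch: partition an R x C touch sensor array into a second
# region (the outermost ring of sensors) and a first region (all interior
# sensors), so that the two regions share no sensor.

def second_region(rows, cols):
    """Indices of the peripherally located sensors (the edge ring)."""
    return {(r, c) for r in range(rows) for c in range(cols)
            if r in (0, rows - 1) or c in (0, cols - 1)}

def first_region(rows, cols):
    """Indices of the centrally located sensors (everything else)."""
    all_cells = {(r, c) for r in range(rows) for c in range(cols)}
    return all_cells - second_region(rows, cols)
```

For a 4x5 array this yields 14 peripheral sensors and 6 central ones, and the two sets are disjoint, matching the no-shared-sensor requirement stated above.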

As shown in Fig. 1, in the touch input method of this embodiment, first, in step S101, a gesture input is detected through the touch-sensing unit.

Thereafter, in step S102, the method judges whether the starting point of the gesture input is located in the first region or the second region, to produce a judgment result. Specifically, for example, the method may sense, through the touch-sensing unit, a series of track points of the gesture input. The first track point of the series is then taken as the starting point of the gesture input, and whether that starting point is located in the first region or the second region is judged from its position, to obtain the judgment result.

When the judgment result indicates that the starting point of the gesture input is located in the second region, the method proceeds to step S103, in which a system management command corresponding to the gesture input is generated. A system management command manages a system-level operation; for example, it may be a home screen command, a task manager command, a back command, a menu command, or the like.

More specifically, the method can identify the type of the gesture input from the track points of the gesture input. The processing involved is known to those skilled in the art and is not described in detail here.
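
The type-identification step can be sketched, under stated assumptions, as inferring the dominant slide direction from the first and last track points; a real implementation would also apply distance and timing thresholds, which are omitted here.

```python
# Illustrative sketch: infer the slide direction of a gesture from its
# first and last track points. Assumes screen coordinates where y grows
# downward; tie-breaking and thresholds are simplifications.

def slide_direction(track_points):
    (x0, y0), (x1, y1) = track_points[0], track_points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):           # horizontal displacement dominates
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"  # vertical displacement dominates
```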

For example, when the second region is the four edges of the touch display unit and the gesture input is identified as a slide operation starting from the right edge of the touch display unit and sliding leftward, a back command is generated.

As another example, when the gesture input is identified as a slide operation starting from the left edge of the touch display unit and sliding rightward, a task manager command is generated.

As another example, when the gesture input is identified as a slide operation starting from the bottom edge of the touch display unit and sliding upward, a menu command is generated.

As another example, when the gesture input is identified as an operation of sliding inward twice within a predetermined time from any edge of the touch display unit, a home screen command is generated.

As another example, when the gesture input is identified as one whose track points all lie within the second region, a reserved system management command may be generated.

It should be pointed out that the gesture input types above, and the correspondences between gesture inputs and system management commands, are given only as examples; those skilled in the art may make appropriate changes on this basis as required.
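
Collecting the example correspondences above into one table gives a sketch like the following; the edge names and command strings are illustrative stand-ins, not normative values from the patent.

```python
# Hypothetical lookup table for the example edge-gesture correspondences:
# (edge where the slide starts, slide direction) -> system management command.

EDGE_COMMANDS = {
    ("right", "left"):  "back",          # start at right edge, slide left
    ("left", "right"):  "task_manager",  # start at left edge, slide right
    ("bottom", "up"):   "menu",          # start at bottom edge, slide up
}

def system_command(start_edge, direction):
    """Look up the command; unmapped pairs fall back to a reserved command."""
    return EDGE_COMMANDS.get((start_edge, direction), "reserved")
```

As the surrounding text notes, the mapping is freely changeable; swapping or extending the dictionary entries changes the behavior without touching the dispatch logic.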

It should also be pointed out that the above description concerns the case where the second region is the four edges of the touch display unit. Those skilled in the art will understand that the second region is not limited thereto and may be any appropriately arranged region; for example, it may be a strip-shaped region extending inward a certain distance from each edge of the touch display unit.

After the system management command is generated in step S103, the method proceeds to step S105.

When the judgment result indicates that the starting point of the gesture input is located in the first region, the method proceeds to step S104, in which an object operation command corresponding to the gesture input is generated. The object operation command operates an object displayed on the display unit, such as a web page, an image, or a control (such as a notification or an icon) in the Android system. For example, the object operation command may be an object move command, an object zoom command, an object display command, or the like.

More specifically, the method can identify the type of the gesture input from the track points of the gesture input. The processing involved is known to those skilled in the art and is not described in detail here.

For example, when the display unit displays a picture and the gesture input is identified as a rightward slide operation within the first region, a command to display the next picture in sequence is generated.

As another example, when the display unit displays a web page and the gesture input is identified as a downward slide operation within the first region, a command to scroll the web page down is generated.

It should be pointed out that the gesture input types above, and the correspondences between gesture inputs and object operation commands, are given only as examples; those skilled in the art may make appropriate changes on this basis as required.

It should also be pointed out that the above description only concerns judging whether the starting point of the gesture input is located in the first region or the second region; the end point of the gesture input is not limited. That is, for example, when the starting point of the gesture input is located in the second region, its end point may be located in the first region or in the second region. For example, when the method identifies, from a series of track points including the starting point and the end point, that the gesture input is a slide operation that remains within the second region throughout, a corresponding system management command may be generated. As another example, when the method identifies from the track points that the gesture input is a slide operation that slides from the second region into the first region and then, within a predetermined interval, slides in the same direction back into the second region, a corresponding system management command may likewise be generated. In addition, when the method identifies from the track points that the gesture input slides from the second region into the first region and then reverses back into the second region within a predetermined interval, a corresponding system management command may be generated; alternatively, in this case, the method may also give no response.
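
A minimal sketch of these end-point variants follows, assuming a pixel-margin test for the second region; the margin value, screen size, and function names are hypothetical.

```python
# Sketch of the end-point variants described above: a gesture that starts
# in the second (edge) region is handled whether it stays on the edge
# throughout or crosses into the first region; the region test is assumed.

def in_second_region(p, width, height, edge=40):
    x, y = p
    return x < edge or y < edge or x >= width - edge or y >= height - edge

def classify_edge_gesture(track_points, width, height):
    if not in_second_region(track_points[0], width, height):
        return None  # not an edge-start gesture; handled elsewhere
    if all(in_second_region(p, width, height) for p in track_points):
        return "reserved_system_command"   # slid within the edge throughout
    return "system_management_command"     # crossed into the first region
```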

After the object operation command is generated in step S104, the touch input method proceeds to step S105.

In step S105, the touch input method executes the system management command or the object operation command.

The touch input method according to an embodiment of the present invention has been described above. A gesture input is detected, and different commands are generated according to whether its starting point is located in the first area or the second area. Thus, once the user learns to distinguish the first area from the second area (in particular, the middle area from the edge area), he or she can instruct the mobile terminal to execute different commands through simple operations, which facilitates the user's operation.

It should be noted that in the touch input method of the above embodiment, the display unit and the touch sensing unit are stacked and have equal areas. However, the display unit and the touch sensing unit may also be stacked with unequal areas. Below, the operation of a touch input method according to another embodiment of the present invention is described with reference to Fig. 2.

In this embodiment, the mobile terminal includes a touch sensing unit composed of a plurality of touch sensors arranged in a matrix. The touch sensing unit has a touch input region, and the touch input region includes a plurality of edges. For example, when the touch input region is rectangular, it includes four edges, each corresponding to one row or column of touch sensors.

As shown in Fig. 2, in step S201, similarly to step S101, the touch input method detects a gesture input through the touch sensing unit.

In step S202, the touch input method judges whether the starting point of the gesture input is located on one of the plurality of edges, so as to produce a judgment result. Specifically, the touch input method performs this judgment through the touch sensor array. When an outermost row or column of sensors in the touch sensor array senses the gesture input and no sensor other than that row or column senses it, the touch input method judges that the starting point of the gesture input is located on one of the edges. When neither the outermost rows nor the outermost columns of sensors sense the gesture input but some inner sensor does, the touch input method judges that the starting point of the gesture input is not located on any of the edges.
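The sensor-array test of step S202 can be sketched as a simple predicate. This is an illustrative sketch under assumed data shapes (a set of `(row, col)` indices for the sensors activated by the first trace point), not the patent's circuitry.

```python
def starts_on_edge(active, rows, cols):
    """active: set of (row, col) sensors that sensed the first trace point.
    Returns True when every active sensor lies in an outermost row or
    column of the rows x cols sensor matrix, i.e. the gesture starts on
    an edge; returns False for an interior start (or an empty reading)."""
    if not active:
        return False
    return all(r == 0 or r == rows - 1 or c == 0 or c == cols - 1
               for r, c in active)

# 10x10 sensor matrix: a touch confined to column 0 starts on an edge;
# a touch sensed only at interior sensor (4, 5) does not.
print(starts_on_edge({(3, 0), (4, 0)}, 10, 10))
print(starts_on_edge({(4, 5)}, 10, 10))
```

The predicate mirrors the two cases in the text: edge start when only outermost sensors fire, interior start when an inner sensor fires instead.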

When the judgment result indicates that the starting point of the gesture input is located on one of the edges, the touch input method proceeds to step S203. In step S203, similarly to step S103, the touch input method generates the system management command corresponding to the gesture input. Thereafter, the touch input method proceeds to step S205.

When the judgment result indicates that the starting point of the gesture input is not located on any of the edges, the touch input method proceeds to step S204. In step S204, similarly to step S104, the touch input method generates the object operation command corresponding to the gesture input. Thereafter, the touch input method proceeds to step S205.

In step S205, similarly to step S105, the touch input method executes the system management command or the object operation command.

With the touch input method of this embodiment of the present invention, the user can instruct the mobile terminal to execute different commands through two different operations, an edge slide operation and a middle slide operation, which facilitates the user's operation. In addition, it should be noted that in the touch input method of this embodiment, the display unit and the touch sensing unit need not be stacked, and their areas need not be equal; indeed, the mobile terminal itself need not include a display unit at all.

The touch input methods according to embodiments of the present invention have been described above. Below, mobile terminals according to embodiments of the present invention are described with reference to Figs. 3 to 5.

As shown in Fig. 3, a mobile terminal 300 according to an embodiment of the present invention includes a display unit 305 for displaying an object of the mobile terminal in its display area. The mobile terminal 300 further includes a touch sensing unit 301, a judging unit 302, a command generation unit 303, and a command execution unit 304.

The touch sensing unit 301 detects a gesture input. For example, the touch sensing unit may detect a series of trace points and thereby recognize the detected input as a gesture input. In addition, it should be noted that the touch sensing unit 301 may be disposed above the display unit, with the touch area of the touch sensing unit coinciding with the display area of the display unit, the touch area being divided into a first area and a second area that do not overlap each other.

The judging unit 302 judges whether the starting point of the gesture input is located in the first area or the second area, so as to produce a judgment result. Specifically, for example, after the touch sensing unit 301 senses a series of trace points of the gesture input, the judging unit 302 takes the first of these trace points as the starting point of the gesture input and judges, according to the position of that starting point, whether it is located in the first area or the second area, thereby obtaining a judgment result.

When the judgment result indicates that the starting point of the gesture input is located in the second area, the command generation unit 303 generates a system management command corresponding to the gesture input. When the judgment result indicates that the starting point of the gesture input is located in the first area, the command generation unit 303 generates an object operation command corresponding to the gesture input, where the object operation command is used to operate on the object.

The system management command is used to manage system-level operations; for instance, it may be a main interface command, a task manager command, a back command, a menu command, or the like.

More specifically, the command generation unit 303 may include a recognition unit for identifying the type of the gesture input according to its trace points. The processing involved is well known to those skilled in the art and is not described in detail here. In addition, the command generation unit 303 may further include a plurality of units such as a main interface command generation unit, a task manager command generation unit, a back command generation unit, and a menu command generation unit.

For example, when the second area is the four edges of the touch-sensitive display unit and the recognition unit identifies the gesture input as a slide operation starting from the right edge of the touch-sensitive display unit and sliding leftward, the back command generation unit generates a back command.

As another example, when the recognition unit identifies the gesture input as a slide operation starting from the left edge of the touch-sensitive display unit and sliding rightward, the task manager command generation unit generates a task manager command.

As another example, when the recognition unit identifies the gesture input as a slide operation starting from the bottom edge of the touch-sensitive display unit and sliding upward, the menu command generation unit generates a menu command.

As another example, when the recognition unit identifies the gesture input as two inward slide operations, each starting from any edge of the touch-sensitive display unit, performed within a predetermined time, the main interface command generation unit generates a main interface command.
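The four edge-gesture examples above amount to a small dispatch table. The sketch below is illustrative only: the edge names and command strings are assumptions, and a real recognition unit would also verify slide direction and timing.

```python
# Maps the starting edge of a single inward swipe to a system management
# command, per the examples in the text.
EDGE_COMMANDS = {
    "right": "back",          # right edge, sliding leftward -> back command
    "left": "task-manager",   # left edge, sliding rightward -> task manager
    "bottom": "menu",         # bottom edge, sliding upward  -> menu command
}

def system_command(start_edge, double_swipe=False):
    """Return the system management command for an edge gesture.
    double_swipe models two inward swipes within a predetermined time,
    which produces the main interface command regardless of edge."""
    if double_swipe:
        return "main-interface"
    return EDGE_COMMANDS.get(start_edge)  # None for an unmapped edge

print(system_command("right"))                   # back
print(system_command("left"))                    # task-manager
print(system_command("bottom"))                  # menu
print(system_command("top", double_swipe=True))  # main-interface
```

As the text notes, this correspondence is only an example; swapping entries in the table changes the mapping without touching the dispatch logic.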

It should be noted that the types of gesture inputs and system management commands described above, as well as the correspondence between them, are given merely as examples. Those skilled in the art may modify them as needed.

In addition, it should be noted that the above description covers the case where the second area is the four edges of the touch-sensitive display unit. Those skilled in the art will understand that the second area is not limited to this, but may be any suitably arranged region. For example, the second area may be a strip-shaped region extending inward a certain distance from each edge of the touch-sensitive display unit.

On the other hand, when the judgment result indicates that the starting point of the gesture input is located in the first area, the command generation unit 303 generates an object operation command corresponding to the gesture input. The object operation command is used to operate on an object displayed on the display unit, such as a web page, an image, or a widget (e.g., a notification or an icon) in the Android system. For example, the object operation command may be an object move command, an object zoom command, an object display command, or the like. Accordingly, the command generation unit 303 may include a plurality of units such as an object move command generation unit, an object zoom command generation unit, and an object display command generation unit.

For example, when the display unit displays a picture and the recognition unit identifies the gesture input as a rightward slide operation within the first area, the object move command generation unit generates a command for displaying the next picture in a sequence.

As another example, when the display unit displays a web page and the recognition unit identifies the gesture input as a downward slide operation within the first area, the object move command generation unit generates a command for scrolling the web page down.
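The two first-area examples above can likewise be sketched as a small mapping. This is an assumed illustration: the object types, direction names, and command strings are not from the patent text.

```python
def object_command(displayed_object, direction):
    """Map an in-area slide on the currently displayed object to an
    object operation command, per the two examples in the text."""
    if displayed_object == "picture" and direction == "right":
        return "show-next-picture"     # browse to the next picture in order
    if displayed_object == "webpage" and direction == "down":
        return "scroll-webpage-down"   # scroll the page content downward
    return None  # other combinations are left unmapped in this sketch

print(object_command("picture", "right"))  # show-next-picture
print(object_command("webpage", "down"))   # scroll-webpage-down
```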

It should be noted that the types of gesture inputs and object operation commands described above, as well as the correspondence between them, are given merely as examples. Those skilled in the art may modify them as needed.

In addition, it should be noted that the above description only covers judging whether the starting point of the gesture input is located in the first area or the second area; the end point of the gesture input is not limited. That is, for example, when the starting point of the gesture input is located in the second area, the end point may be located in the first area or in the second area.

The command execution unit 304 executes the system management command or the object operation command. The execution result of the system management command or the object operation command may be displayed on the display unit 305.

The mobile terminal according to an embodiment of the present invention has been described above. With this mobile terminal, the user can instruct it to execute different commands through otherwise identical operations whose starting points differ (for example, located in the second area versus the first area), which facilitates the user's operation.

Below, a mobile terminal according to another embodiment of the present invention is described with reference to Fig. 4. As shown in Fig. 4, a mobile terminal 400 includes a display unit 401, a touch sensing unit 402, and a processor 403.

The display unit 401 is used for displaying an object of the mobile terminal in its display area.

The touch sensing unit 402 is disposed above the display unit and is used for detecting a gesture input; the touch area of the touch sensing unit coincides with the display area of the display unit, the touch area being divided into a first area and a second area that do not overlap each other.

The processor 403 is coupled to the touch sensing unit 402 and the display unit 401 and is configured to perform the following operations: judging, based on the detection result of the touch sensing unit 402, whether the starting point of the gesture input is located in the first area or the second area, so as to produce a judgment result; when the judgment result indicates that the starting point of the gesture input is located in the second area, generating a system management command corresponding to the gesture input; when the judgment result indicates that the starting point of the gesture input is located in the first area, generating an object operation command corresponding to the gesture input, where the object operation command is used to operate on the object; and executing the system management command or the object operation command. The execution result of the system management command or the object operation command may be displayed on the display unit 401.

The mobile terminal according to this embodiment of the present invention has been described above. With this mobile terminal, the user can instruct it to execute different commands through otherwise identical operations whose starting points differ (for example, located in the second area versus the first area), which facilitates the user's operation.

It should be noted that in the mobile terminals of the above embodiments, the display unit and the touch sensing unit are stacked and have equal areas. However, the display unit and the touch sensing unit may also be stacked with unequal areas; indeed, a display unit need not be included at all. Below, a mobile terminal according to yet another embodiment of the present invention is described with reference to Fig. 5. In this embodiment, the mobile terminal includes a touch sensing unit composed of a plurality of touch sensors arranged in a matrix. The touch sensing unit has a touch input region, and the touch input region includes a plurality of edges. For example, when the touch input region is rectangular, it includes four edges, each corresponding to one row or column of touch sensors.

As shown in Fig. 5, a mobile terminal 500 includes a detection unit 501, a judging unit 502, a command generation unit 503, and a command execution unit 504.

The detection unit 501 is the touch sensing unit described above; it may be composed of a plurality of touch sensors arranged in a matrix. The detection unit 501 detects a gesture input through these touch sensors.

The judging unit 502 judges whether the starting point of the gesture input is located on one of the plurality of edges, so as to produce a judgment result. Specifically, when an outermost row or column of sensors in the touch sensor array of the detection unit 501 senses the gesture input and no sensor other than that row or column senses it, the judging unit 502 judges that the starting point of the gesture input is located on one of the edges. When neither the outermost rows nor the outermost columns of sensors sense the gesture input but some inner sensor does, the judging unit 502 judges that the starting point of the gesture input is not located on any of the edges.

When the judgment result indicates that the starting point of the gesture input is located on one of the edges, the command generation unit 503 generates the system management command corresponding to the gesture input; when the judgment result indicates that the starting point of the gesture input is not located on any of the edges, the command generation unit 503 generates the object operation command corresponding to the gesture input, where the object operation command is used to operate on the object. The configuration and operation of the command generation unit 503 are similar to those of the command generation unit 303 and are not described in detail here.

The command execution unit 504 executes the system management command or the object operation command. The configuration and operation of the command execution unit 504 are similar to those of the command execution unit 304 and are not described in detail here.

With the mobile terminal of this embodiment of the present invention, the user can instruct it to execute different commands through two different operations, an edge slide operation and a middle slide operation, which facilitates the user's operation. In addition, it should be noted that in the mobile terminal of this embodiment, the display unit and the touch sensing unit need not be stacked, and their areas need not be equal; indeed, the mobile terminal itself need not include a display unit at all.

Below, a touch input method according to yet another embodiment of the present invention is described. The touch input method is applied to a touch sensing unit having an input area. The input area is divided into a first area and a second area that do not overlap each other, and a second edge of the second area coincides with a first edge of the input area.

In addition, the second area can recognize an input operation in which at least a part of the operating body contacts the second edge, while the first area can recognize an input operation in which the operating body does not contact the second edge.

The operations recognizable by the second area and the first area are described below with reference to Figs. 6A to 6C. Figs. 6A to 6C are schematic diagrams illustrating, with a finger as the operating body, the operation of the operating body in three cases. The elliptical region represents the user's finger; the rectangular region enclosed by the solid line is the input area of the touch sensing unit, which a dotted line divides into two regions: a first area S1 enclosed by the dotted line, and a second area S2 sandwiched between the dotted line and the solid line. In addition, the shaded portion is the contact region between the finger and the touch sensing unit, and P is the touch point of the finger as identified by the touch sensing unit.

Among Figs. 6A to 6C, Figs. 6A and 6B illustrate operations recognizable by the second area, and Fig. 6C illustrates an operation recognizable by the first area. In the case of Fig. 6A, the finger contacts the edge of the touch sensing unit from outside the touch sensing unit and then slides inward (not shown). Here the contact region between the finger and the touch sensing unit is only a single point, which the touch sensing unit identifies as the touch point of the finger, namely the point P. The point P is located on the edge of the touch sensing unit, and this edge is included in the second area. In the case of Fig. 6B, the finger contacts the touch sensing unit at its edge. Here the contact region between the finger and the touch sensing unit is the shaded region as shown, and the touch point P of the finger identified by the touch sensing unit likewise lies in the second area. In the case of Fig. 6C, the finger contacts the touch sensing unit without intersecting its edge. Here the contact region is the shaded region as shown, and the touch point P of the finger identified by the touch sensing unit lies in the first area.
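The three cases of Figs. 6A to 6C can be approximated in a simplified model where the touch point P is taken as the centroid of the contact region and S2 is a border strip along the edges. All geometry here (screen size, strip width, centroid-as-touch-point) is an assumption for illustration, not the patent's recognition method.

```python
BORDER = 10  # assumed width of the strip S2 between the dotted and solid lines

def touch_point(contact):
    """Reduce a contact region (list of (x, y) points) to one touch point P,
    modeled here as the centroid of the region."""
    xs = [p[0] for p in contact]
    ys = [p[1] for p in contact]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def in_s2(point, width=480, height=800):
    """True when the touch point lies in the border strip S2."""
    x, y = point
    return (x < BORDER or y < BORDER or
            x >= width - BORDER or y >= height - BORDER)

# Fig. 6A: approaching from outside, the contact region reduces to a
# single point on the edge, so P falls in S2.
print(in_s2(touch_point([(0, 300)])))
# Fig. 6C: an interior contact patch yields a P inside S1.
print(in_s2(touch_point([(200, 300), (210, 310), (205, 305)])))
```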

In this touch input method, a gesture input is first detected. Thereafter, it is judged whether the starting point of the gesture input is located in the second area, so as to produce a judgment result. When the judgment result indicates that the starting point of the gesture input is located in the first area, a first command is generated; when the judgment result indicates that the starting point of the gesture input is located in the second area, a second command different from the first command is generated. Thereafter, the touch input method executes the first command or the second command. The operation of each step is similar to that of the above embodiments and is not described in detail here.
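The detect-judge-generate-execute flow of this embodiment can be sketched end to end. The region predicate, trace-point format, and command names below are illustrative assumptions.

```python
def handle_gesture(trace_points, in_second_area):
    """in_second_area: predicate telling whether a point lies in the
    second area (the strip along the second edge). Returns the command
    that a real terminal would then execute."""
    start = trace_points[0]          # judge only the starting point
    if in_second_area(start):
        return "second-command"      # produced for edge-starting gestures
    return "first-command"           # produced for interior-starting gestures

# Assumed one-edge strip of width 15 along the left side.
edge_strip = lambda p: p[0] < 15

print(handle_gesture([(5, 100), (80, 100)], edge_strip))    # second-command
print(handle_gesture([(100, 100), (180, 100)], edge_strip)) # first-command
```

Passing the region test in as a predicate keeps the flow identical across the embodiments, which differ only in how the second area is defined (coordinate strip versus outermost sensors).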

Mobile terminals according to embodiments of the present invention and their touch input methods have been described above with reference to Figs. 1 to 6.

It should be noted that, in this specification, the terms "include", "comprise", and any variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or device that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or device. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or device that includes that element.

Finally, it should also be noted that the above series of processes includes not only processes performed in the temporal order described here, but also processes performed in parallel or separately rather than in chronological order.

Through the above description of the embodiments, those skilled in the art will clearly understand that the present invention may be implemented by software plus a necessary hardware platform, or, of course, entirely in hardware. Based on this understanding, the technical solution of the present invention, or the part of it that contributes over the background art, may be embodied wholly or partly in the form of a software product. This computer software product may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, or an optical disc, and includes a number of instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present invention or in certain parts thereof.

In the embodiments of the present invention, units/modules may be implemented in software so as to be executed by various types of processors. For example, an identified module of executable code may include one or more physical or logical blocks of computer instructions and may, for example, be built as an object, a procedure, or a function. Nevertheless, the executable code of an identified module need not be physically located together; it may include different instructions stored in different locations which, when logically combined, constitute the unit/module and achieve its stated purpose.

Where a unit/module can be implemented in software, then, considering the level of existing hardware technology, those skilled in the art could, cost aside, build corresponding hardware circuitry to achieve the corresponding function of that unit/module. Such hardware circuitry includes conventional very-large-scale integration (VLSI) circuits or gate arrays, as well as existing semiconductors such as logic chips and transistors, or other discrete components. A module may also be implemented with programmable hardware devices, such as field-programmable gate arrays, programmable logic arrays, or programmable logic devices.

The present invention has been described in detail above. Specific examples have been used herein to set forth its principles and embodiments, and the descriptions of the above embodiments are intended only to help in understanding the method of the present invention and its core idea. Meanwhile, for those of ordinary skill in the art, the specific embodiments and scope of application may vary according to the idea of the present invention. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (13)

1. A touch input method applied in a mobile terminal, the mobile terminal including a display unit and a touch sensing unit, the touch sensing unit being disposed above the display unit, a touch area of the touch sensing unit coinciding with a display area of the display unit, the display unit being used for displaying an object of the mobile terminal in the display area, the touch area being divided into a first area and a second area that do not overlap each other, the touch input method comprising:
detecting a gesture input;
judging whether a starting point of the gesture input is located in the first area or the second area, so as to produce a judgment result;
when the judgment result indicates that the starting point of the gesture input is located in the second area, generating a system management command corresponding to the gesture input;
when the judgment result indicates that the starting point of the gesture input is located in the first area, generating an object operation command corresponding to the gesture input, wherein the object operation command is used to operate on the object; and
executing the system management command or the object operation command;
wherein the second area is an edge of the first area;
the touch sensing unit is composed of a plurality of touch sensors arranged in an array, and the touch sensor array of the first area and the touch sensor array of the second area share no touch sensor; the second area corresponds to peripherally located sensors in the touch sensor array, and the first area corresponds to centrally located sensors in the touch sensor array; and the second area is a region or a line.
2. The touch input method of claim 1, wherein the second area is the region occupied by an outermost row and/or column of sensors in the touch sensor array, and the first area is the region occupied by the sensors other than that outermost row and/or column.
3. The touch input method of claim 1, wherein, when the judgment result indicates that the starting point of the gesture input is located in the second area, indicating that the gesture input is a gesture that contacts the touch sensing unit from outside and slides inward, the system management command corresponding to the gesture input is generated.
4. The touch input method of claim 1, wherein
the end point of the gesture input is located in the first area.
5. The touch input method of claim 1, wherein
the end point of the gesture input is located in the second area.
6. The touch input method of claim 1, wherein generating the system management command corresponding to the gesture input further includes:
identifying a type of the gesture input; and
when the gesture input is identified as a leftward slide operation whose starting point is located in the second area, generating a back command.
7. A mobile terminal, including a display unit for displaying an object of the mobile terminal in its display area, the mobile terminal further including:
a touch sensing unit that detects a gesture input, wherein the touch sensing unit is disposed above the display unit, a touch area of the touch sensing unit coincides with the display area of the display unit, and the touch area is divided into a first area and a second area that do not overlap each other;
a judging unit that judges whether a starting point of the gesture input is located in the first area or the second area, so as to produce a judgment result;
a command generation unit that, when the judgment result indicates that the starting point of the gesture input is located in the second area, generates a system management command corresponding to the gesture input, and, when the judgment result indicates that the starting point of the gesture input is located in the first area, generates an object operation command corresponding to the gesture input, wherein the object operation command is used to operate on the object; and
a command execution unit that executes the system management command or the object operation command;
wherein the second area is an edge of the first area;
the touch sensing unit is composed of a plurality of touch sensors arranged in an array, and the touch sensor array of the first area and the touch sensor array of the second area share no touch sensor; the second area corresponds to peripherally located sensors in the touch sensor array, and the first area corresponds to centrally located sensors in the touch sensor array; and the second area is a region or a line.
8. The mobile terminal of claim 7, wherein the second area is the region occupied by an outermost row and/or column of sensors in the touch sensor array, and the first area is the region occupied by the sensors other than that outermost row and/or column.
9. The mobile terminal of claim 7, wherein, when the judgment result indicates that the starting point of the gesture input is located in the second area, indicating that the gesture input is a gesture that contacts the touch sensing unit from outside and slides inward, the command generation unit generates the system management command corresponding to the gesture input.
10. The mobile terminal of claim 7, wherein the command generation unit includes:
a recognition unit that identifies a type of the gesture input; and
a back command generation unit that, when the gesture input is identified as a leftward slide operation whose starting point is located in the second area, generates a back command.
11. A mobile terminal, including:
a display unit for displaying an object of the mobile terminal in a display area;
a touch sensing unit, disposed above the display unit, for detecting a gesture input, wherein the touch area of the touch sensing unit coincides with the display area of the display unit, and the touch area is divided into a first area and a second area that do not overlap; and
a processor;
wherein the processor is configured to:
judge whether the starting point of the gesture input is located in the first area or the second area, to produce a judgment result;
when the judgment result indicates that the starting point of the gesture input is located in the second area, generate a system management command corresponding to the gesture input;
when the judgment result indicates that the starting point of the gesture input is located in the first area, generate an object operation command corresponding to the gesture input, wherein the object operation command is used to operate the object; and
execute the system management command or the object operation command;
wherein the second area is the edge of the first area;
the touch sensing unit is composed of a plurality of touch sensors arranged in an array, and the touch sensor array of the first area and the touch sensor array of the second area share no touch sensor; the second area corresponds to the peripherally located sensors in the touch sensor array, and the first area corresponds to the centrally located sensors in the touch sensor array; and the second area is a region or a line.
12. A mobile terminal, including a touch input region that includes a plurality of edges, the mobile terminal including:
a detection unit configured to detect a gesture input;
a judging unit configured to judge whether the starting point of the gesture input is located at one of the plurality of edges, to produce a judgment result;
a command generation unit configured to generate, when the judgment result indicates that the starting point of the gesture input is located at one of the plurality of edges, a system management command corresponding to the gesture input, and to generate, when the judgment result indicates that the starting point of the gesture input is not located at any of the plurality of edges, an object operation command corresponding to the gesture input, wherein the object operation command is used to operate an object in the mobile terminal; and
a command execution unit configured to execute the system management command or the object operation command.
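The judging unit of claim 12 can be sketched for a rectangular touch input region whose four edges are thin strips. The strip width `margin` and the command names are illustrative parameters, not part of the claim:

```python
def classify(start, width, height, margin=5):
    """Judge whether a gesture's starting point lies on any of the four
    edges of a width x height touch input region (each edge modeled as a
    strip `margin` units wide). An edge start yields a system management
    command; any other start yields an object operation command."""
    x, y = start
    on_edge = (x < margin or y < margin
               or x >= width - margin or y >= height - margin)
    return "SYSTEM_MANAGEMENT" if on_edge else "OBJECT_OPERATION"
```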
13. A touch input method, applied to a touch sensing unit having an input area, the input area being divided into a first area and a second area that do not overlap, and a first edge of the input area coinciding with a second edge of the second area, wherein the second area is capable of recognizing an input operation in which at least part of an operating body contacts the second edge, and the first area is capable of recognizing an input operation in which the operating body does not contact the second edge, the touch input method including:
detecting a gesture input;
judging whether the starting point of the gesture input is located in the second area, to produce a judgment result;
when the judgment result indicates that the starting point of the gesture input is located in the first area, generating a first command;
when the judgment result indicates that the starting point of the gesture input is located in the second area, generating a second command different from the first command; and
executing the first command or the second command;
wherein the second area is the edge of the first area;
the touch sensing unit is composed of a plurality of touch sensors arranged in an array, and the touch sensor array of the first area and the touch sensor array of the second area share no touch sensor; the second area corresponds to the peripherally located sensors in the touch sensor array, and the first area corresponds to the centrally located sensors in the touch sensor array; and the second area is a region or a line.
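Putting the steps of claim 13 together — detect, judge, generate, execute — a minimal handler might look like this. The command names and the ring-shaped second area are assumptions made for the sketch:

```python
class TouchInputHandler:
    """Sketch of the method of claim 13 over a sensor grid whose
    outermost ring is the second area and whose interior is the
    first area (the two areas share no sensor)."""

    def __init__(self, n_rows, n_cols):
        self.n_rows, self.n_cols = n_rows, n_cols

    def in_second_area(self, row, col):
        return row in (0, self.n_rows - 1) or col in (0, self.n_cols - 1)

    def handle(self, gesture):
        """`gesture` is a sequence of (row, col) samples; the judging
        step looks only at its starting point."""
        start_row, start_col = gesture[0]
        if self.in_second_area(start_row, start_col):
            command = "SECOND_COMMAND"   # e.g. a system management command
        else:
            command = "FIRST_COMMAND"    # e.g. an object operation command
        return self.execute(command)

    def execute(self, command):
        # A real terminal would dispatch to the OS or application here.
        return command
```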
CN201610025558.4A 2011-06-07 2011-06-07 Mobile terminal and touch input method therefor CN105718192A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201110150810.1A CN102819331B (en) 2011-06-07 2011-06-07 Mobile terminal and touch inputting method thereof
CN201610025558.4A CN105718192A (en) 2011-06-07 2011-06-07 Mobile terminal and touch input method therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610025558.4A CN105718192A (en) 2011-06-07 2011-06-07 Mobile terminal and touch input method therefor

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201110150810.1A Division CN102819331B (en) 2011-06-07 2011-06-07 Mobile terminal and touch inputting method thereof

Publications (1)

Publication Number Publication Date
CN105718192A true CN105718192A (en) 2016-06-29

Family

ID=47303469

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201110150810.1A CN102819331B (en) 2011-06-07 2011-06-07 Mobile terminal and touch inputting method thereof
CN201610025558.4A CN105718192A (en) 2011-06-07 2011-06-07 Mobile terminal and touch input method therefor

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201110150810.1A CN102819331B (en) 2011-06-07 2011-06-07 Mobile terminal and touch inputting method thereof

Country Status (1)

Country Link
CN (2) CN102819331B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2939088A4 (en) * 2012-12-28 2016-09-07 Nokia Technologies Oy Responding to user input gestures
CA2949088A1 (en) * 2014-05-15 2015-11-19 Federal Express Corporation Wearable devices for courier processing and methods of use thereof
CN105718183A (en) * 2014-12-03 2016-06-29 天津富纳源创科技有限公司 Operation method of touch device
CN104657073A (en) * 2015-01-22 2015-05-27 上海华豚科技有限公司 Half-screen operating method of mobile phone interface
CN104702795B (en) * 2015-03-27 2017-03-01 努比亚技术有限公司 Mobile terminal and its shortcut operation method
CN104935990B (en) * 2015-06-01 2018-04-10 天脉聚源(北京)传媒科技有限公司 A kind of control method and device of switching channels
CN105487805A (en) * 2015-12-01 2016-04-13 小米科技有限责任公司 Object operating method and device
CN105681594B (en) * 2016-03-29 2019-03-01 努比亚技术有限公司 A kind of the edge interactive system and method for terminal

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101308416A (en) * 2007-05-15 2008-11-19 宏达国际电子股份有限公司 User interface operation method and its recording medium
CN101414229A (en) * 2007-10-19 2009-04-22 集嘉通讯股份有限公司;技嘉科技股份有限公司 Method and apparatus for controlling switch of handhold electronic device touch control screen
US20100027854A1 (en) * 2008-07-31 2010-02-04 Manjirnath Chatterjee Multi-purpose detector-based input feature for a computing device
CN101943962A (en) * 2009-07-03 2011-01-12 深圳富泰宏精密工业有限公司;奇美通讯股份有限公司 Portable electronic device with touch key

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101304461B1 (en) * 2006-12-04 2013-09-04 삼성전자주식회사 Method and apparatus of gesture-based user interface
US20090174679A1 (en) * 2008-01-04 2009-07-09 Wayne Carl Westerman Selective Rejection of Touch Contacts in an Edge Region of a Touch Surface
CN102023735B (en) * 2009-09-21 2016-03-30 联想(北京)有限公司 A kind of touch input device, electronic equipment and mobile phone


Also Published As

Publication number Publication date
CN102819331A (en) 2012-12-12
CN102819331B (en) 2016-03-02

Similar Documents

Publication Publication Date Title
JP5955861B2 (en) Touch event prediction in computer devices
CN101727179B (en) Object execution method and apparatus
US20130155018A1 (en) Device and method for emulating a touch screen using force information
US9075095B2 (en) Device and method for localized force sensing
US20120102437A1 (en) Notification Group Touch Gesture Dismissal Techniques
KR20140138147A (en) Sensing user input at display area edge
US9870109B2 (en) Device and method for localized force and proximity sensing
US20080309630A1 (en) Techniques for reducing jitter for taps
US9122947B2 (en) Gesture recognition
RU2635285C1 (en) Method and device for movement control on touch screen
KR20150014083A (en) Method For Sensing Inputs of Electrical Device And Electrical Device Thereof
US20130154999A1 (en) Multi-Surface Touch Sensor Device With User Action Detection
US20130154955A1 (en) Multi-Surface Touch Sensor Device With Mode of Operation Selection
US8212782B2 (en) Apparatus, method, and medium of sensing movement of multi-touch point and mobile apparatus using the same
US9916051B2 (en) Device and method for proximity sensing with force imaging
US20090289902A1 (en) Proximity sensor device and method with subregion based swipethrough data entry
TWI569171B (en) Gesture recognition
WO2014004642A1 (en) Enrollment using synthetic fingerprint image and fingerprint sensing systems
US8378989B2 (en) Interpreting ambiguous inputs on a touch-screen
US20120249475A1 (en) 3d user interface control
CN108845719A (en) It is sensed using the power of bottom side power map
US20110157040A1 (en) Touchpanel device, and control method and program for the device
US9442650B2 (en) Systems and methods for dynamically modulating a user interface parameter using an input device
WO2015084665A1 (en) User interface adaptation from an input source identifier change
CN103927082A (en) Gesture-based user interface method and apparatus

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination