CN105335007B - Touch control method, user equipment, input processing method and mobile terminal - Google Patents
Touch control method, user equipment, input processing method and mobile terminal
- Publication number
- CN105335007B (application CN201510819757.8A)
- Authority
- CN
- China
- Prior art keywords
- incoming event
- touch
- edge
- event
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Telephone Function (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a touch control method, a user equipment, an input processing method, and a mobile terminal. The touch control method includes: detecting a touch signal generated on a touch panel; identifying a touch point according to the touch signal; detecting the rotation angle of the touch panel; judging, according to the identified touch point and the rotation angle, whether the touch point is located in an edge touch area or a normal touch area; and executing a corresponding instruction based on the judgment result. The beneficial effects of implementing the invention are: the edge touch area can be transformed correspondingly as the touch screen rotates, better adapting to the user's operation and improving the user experience; operations in the A area and the C area are distinguished only at the application framework layer, where the virtual device is created, avoiding any dependence on hardware for distinguishing the A area from the C area at the driving layer; and by assigning numbers to touch points, fingers can be distinguished, making the scheme compatible with both protocol A and protocol B.
Description
Technical field
The present invention relates to the field of communications, and more specifically to a touch control method, a user equipment, an input processing method, and a mobile terminal.
Background technique
With the development of mobile terminal technology, terminal bezels are becoming narrower. To improve the user's input experience, edge input technology (for example, edge touch control) has emerged.
In prior-art edge input, after the touch point information (touch info) is detected, the driving layer judges according to the touch point information whether the touch occurred in the edge input region.
In practice, however, because input chips are diverse, the methods by which the driving layer obtains touch point information are all highly chip-specific. As a result, when the event type is judged (i.e., whether an event is an edge input event), differentiated modifications and porting must be made for each input chip, which involves a large workload and is error-prone.
On the other hand, when reporting events the driving layer may choose either of two implementations, protocol A or protocol B, of which only protocol B distinguishes finger IDs. The realization of edge input, however, needs to rely on finger IDs, for example to compare the data of two successive clicks of the same finger when recognizing an input. Therefore, the prior-art input scheme can only support protocol B, and drivers that use protocol A cannot be supported.
Furthermore, the edge input area of an existing mobile terminal is fixed and cannot be transformed correspondingly as the mobile terminal rotates, resulting in a poor user experience.
In summary, the prior-art input scheme has strong hardware dependence, cannot support protocol A and protocol B at the same time, and gives a poor user experience; it therefore needs to be improved.
Summary of the invention
The technical problem to be solved by the present invention is that the above-mentioned edge input mode of a prior-art mobile terminal cannot be transformed correspondingly as the mobile terminal rotates. To address this defect, a touch control method, a user equipment, an input processing method, and a mobile terminal are provided.
The technical solution adopted by the present invention to solve the technical problem is as follows:
In a first aspect, a touch control method is provided, comprising:
detecting a touch signal generated on a touch panel;
identifying a touch point according to the touch signal;
detecting the rotation angle of the touch panel;
judging, according to the identified touch point and the rotation angle, whether the touch point is located in an edge touch area or a normal touch area;
executing a corresponding instruction based on the judgment result.
In one embodiment, the rotation angle includes: 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise, and 270 degrees counterclockwise.
In one embodiment, judging, according to the identified touch point and the rotation angle, whether the touch point is located in the edge touch area or the normal touch area includes:
if the rotation angle is 0 degrees, then when Wc < x < (W - Wc) the touch point is located in the normal touch area; otherwise, the touch point is located in the edge touch area;
if the rotation angle is 90 degrees clockwise, then when Wc < y < (H - Wc) the touch point is located in the normal touch area; otherwise, the touch point is located in the edge touch area;
if the rotation angle is 180 degrees clockwise, then when Wc < x < (W - Wc) the touch point is located in the normal touch area; otherwise, the touch point is located in the edge touch area;
if the rotation angle is 270 degrees clockwise, then when Wc < y < (H - Wc) the touch point is located in the normal touch area; otherwise, the touch point is located in the edge touch area;
where x is the horizontal axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch area.
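The four rotation cases above can be sketched as a small classification routine. This is an illustrative sketch only (the function name and parameter names are not from the patent); it assumes, as the embodiments later describe, that the panel reports coordinates in its fixed, unrotated coordinate system:

```python
def classify_touch(x, y, rotation, W, H, Wc):
    """Classify a touch point as 'normal' or 'edge'.

    x, y     -- touch coordinates in the panel's fixed coordinate system
    rotation -- clockwise rotation of the terminal: 0, 90, 180 or 270
    W, H     -- panel width and height in the same units as x, y
    Wc       -- width of the edge touch area (C area)
    """
    if rotation in (0, 180):
        # Edge strips run along the left and right sides: test x.
        return "normal" if Wc < x < W - Wc else "edge"
    if rotation in (90, 270):
        # The former left/right strips now sit along the y axis: test y.
        return "normal" if Wc < y < H - Wc else "edge"
    raise ValueError("rotation must be 0, 90, 180 or 270")
```

For example, on a 720 x 1280 panel with a 40-unit edge strip, a touch at (20, 600) falls in the edge touch area at 0 degrees but in the normal touch area after a 90-degree clockwise rotation.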
In a second aspect, a user equipment is provided, comprising: a touch screen, a motion sensor, and a processor;
the touch screen comprises a touch panel and a touch controller, wherein:
the touch panel is configured to detect a touch signal generated on the touch panel;
the touch controller is configured to identify a touch point according to the touch signal;
the motion sensor is configured to detect the rotation angle of the user equipment;
the processor comprises a driving module, an application framework module, and an application module, wherein:
the driving module is configured to obtain an input event according to the touch signal and report it to the application framework module;
the application framework module is configured to judge, according to the rotation angle and the touch point position in the reported input event, whether the touch point is located in the edge touch area or the normal touch area;
the application module is configured to execute a corresponding instruction based on the judgment result.
In a third aspect, an input processing method is provided, comprising:
a driving layer obtains an input event generated by a user through an input device and reports it to an application framework layer;
the application framework layer judges, according to the current state of the mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, the normal input event is processed and recognized and the recognition result is reported to an application layer; if it is an edge input event, the edge input event is processed and recognized and the recognition result is reported to the application layer;
the application layer executes a corresponding instruction according to the reported recognition result.
In one embodiment, the method further comprises:
creating, for each input event, an input device object with a device identification.
In one embodiment, creating, for each input event, an input device object with a device identification includes:
associating normal input events with the touch screen having a first device identification;
setting, at the application framework layer, a second input device object with a second device identification, corresponding to edge input events.
In one embodiment, the driving layer obtaining the input event generated by the user through the input device and reporting it to the application framework layer includes:
the driving layer assigning to each touch point a number for distinguishing fingers, and reporting the input event using protocol A.
In one embodiment, the driving layer obtaining the input event generated by the user through the input device and reporting it to the application framework layer includes:
the driving layer reporting the input event using protocol B;
the method further comprises:
the application framework layer assigning to each touch point in the input event a number for distinguishing fingers.
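The finger-numbering idea above can be sketched as follows. This is a hedged illustration, not the patent's implementation: with protocol-A-style reporting (no tracking IDs in the event stream, resembling the Linux multi-touch Type A protocol), a number must be assigned per touch point, for example by matching positions between frames; with protocol-B-style reporting, the tracking ID already carried in the event can simply be adopted as the finger number. The frame format and the nearest-match heuristic are assumptions:

```python
def assign_finger_numbers(prev, frame, protocol):
    """Assign a finger-distinguishing number to each touch point.

    prev     -- dict mapping finger number -> (x, y) from the previous frame
    frame    -- for protocol 'B': list of (tracking_id, x, y) tuples;
                for protocol 'A': list of anonymous (x, y) tuples
    returns  -- dict mapping finger number -> (x, y)
    """
    if protocol == "B":
        # Protocol B already distinguishes fingers: reuse its tracking IDs.
        return {tid: (x, y) for tid, x, y in frame}
    # Protocol A: match each anonymous point to the nearest previous point.
    result, free = {}, set(prev)
    next_id = max(prev, default=-1) + 1
    for x, y in frame:
        match = min(free, default=None,
                    key=lambda f: (prev[f][0] - x) ** 2 + (prev[f][1] - y) ** 2)
        if match is not None:
            free.discard(match)
            result[match] = (x, y)      # same finger, moved slightly
        else:
            result[next_id] = (x, y)    # a newly arrived finger
            next_id += 1
    return result
```

With such numbers available regardless of the reporting protocol, later stages (for example, comparing two successive clicks of the same finger) no longer care which protocol the driver used.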
In one embodiment, the current state of the mobile terminal includes: 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise, and 270 degrees counterclockwise.
In one embodiment, if the rotation angle is 0 degrees, then when Wc < x < (W - Wc) the application framework layer judges the input event to be a normal input event; otherwise, an edge input event;
if the rotation angle is 90 degrees clockwise, then when Wc < y < (H - Wc) the application framework layer judges the input event to be a normal input event; otherwise, an edge input event;
if the rotation angle is 180 degrees clockwise, then when Wc < x < (W - Wc) the application framework layer judges the input event to be a normal input event; otherwise, an edge input event;
if the rotation angle is 270 degrees clockwise, then when Wc < y < (H - Wc) the application framework layer judges the input event to be a normal input event; otherwise, an edge input event;
where x is the horizontal axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch area.
In a fourth aspect, a mobile terminal is provided, comprising:
an input device;
a motion sensor, configured to detect the current state of the mobile terminal;
a driving layer, configured to obtain an input event generated by a user through the input device and report it to an application framework layer;
the application framework layer, configured to judge, according to the current state of the mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, to process and recognize the normal input event and report the recognition result to an application layer; if it is an edge input event, to process and recognize the edge input event and report the recognition result to the application layer;
the application layer, configured to execute a corresponding instruction according to the reported recognition result.
In one embodiment, the normal input event corresponds to a first input device object with a first device identification;
the application framework layer is further configured to set a second input device object with a second device identification, corresponding to the edge input event.
In one embodiment, the driving layer reports input events using protocol A or protocol B; if input events are reported according to protocol A, the event obtaining module is further configured to assign to each touch point a number for distinguishing fingers;
if input events are reported according to protocol B, the application framework layer is further configured to assign to each touch point a number for distinguishing fingers.
In one embodiment, the driving layer includes an event obtaining module, configured to obtain the input event generated by the user through the input device.
In one embodiment, the application framework layer includes an input reader;
the mobile terminal further includes a device node arranged between the driving layer and the input reader, configured to notify the input reader to obtain input events;
the input reader is configured to traverse the device nodes, obtain input events, and report them.
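A minimal sketch of the input-reader arrangement above, with device nodes modeled as in-memory event queues rather than real kernel device nodes (the class and method names are illustrative assumptions, not the patent's API):

```python
class InputReader:
    """Traverses registered device nodes and drains their pending events."""

    def __init__(self):
        self.nodes = {}  # device identification -> list of pending events

    def register_node(self, device_id):
        self.nodes[device_id] = []

    def notify(self, device_id, event):
        # Called from the driving-layer side when a new event is available,
        # playing the role of the device node's notification.
        self.nodes[device_id].append(event)

    def poll(self):
        # Traverse every device node, collect and clear pending events,
        # and report them upward tagged with their device identification.
        reported = []
        for device_id, queue in self.nodes.items():
            for event in queue:
                reported.append((device_id, event))
            queue.clear()
        return reported
```

Tagging each reported event with its device identification is what later allows edge and normal events to be told apart without consulting the hardware again.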
In one embodiment, the current state of the mobile terminal includes: 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise, and 270 degrees counterclockwise.
In one embodiment, the application framework layer further includes:
a first event processing module, configured to perform coordinate calculation on the input event reported by the input reader and then report it;
a first judgment module, configured to judge, according to the current state of the mobile terminal and the coordinate values reported by the first event processing module, whether the input event is an edge input event, and if not, to report the input event.
In one embodiment, the application framework layer further includes:
a second event processing module, configured to perform coordinate calculation on the input event reported by the input reader and then report it;
a second judgment module, configured to judge, according to the current state of the mobile terminal and the coordinate values reported by the second event processing module, whether the input event is an edge input event, and if so, to report the input event.
In one embodiment, if the rotation angle is 0 degrees, then when Wc < x < (W - Wc) the judgment result is that the input event is a normal input event; otherwise, an edge input event;
if the rotation angle is 90 degrees clockwise, then when Wc < y < (H - Wc) the judgment result is that the input event is a normal input event; otherwise, an edge input event;
if the rotation angle is 180 degrees clockwise, then when Wc < x < (W - Wc) the judgment result is that the input event is a normal input event; otherwise, an edge input event;
if the rotation angle is 270 degrees clockwise, then when Wc < y < (H - Wc) the judgment result is that the input event is a normal input event; otherwise, an edge input event;
where x is the horizontal axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch area.
In one embodiment, the application framework layer further includes:
an event dispatching module, configured to report the events reported by the second judgment module and the first judgment module.
In one embodiment, the application framework layer further includes:
a first application module;
a second application module;
a third judgment module, configured to judge, according to the device identification included in the event reported by the event dispatching module, whether the event is an edge input event; if so, to report it to the second application module, and otherwise to report it to the first application module;
the first application module is configured to recognize a normal input event according to the relevant parameters of the normal input event and report the recognition result to the application layer;
the second application module is configured to recognize an edge input event according to the relevant parameters of the edge input event and report the recognition result to the application layer.
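The device-identification dispatch described above can be sketched as follows; events carrying the identification of the virtual edge device are routed to the module that recognizes edge input events, and the rest to the module that recognizes normal input events. The two identification values and the return format are illustrative assumptions:

```python
NORMAL_DEVICE_ID = 1   # first device identification: the physical touch screen
EDGE_DEVICE_ID = 2     # second device identification: the virtual edge device

def dispatch(event):
    """Route an event, tagged (device_id, payload), by its device id."""
    device_id, payload = event
    if device_id == EDGE_DEVICE_ID:
        return ("second_application_module", payload)  # edge input events
    return ("first_application_module", payload)       # normal input events
```

Because the distinction is encoded entirely in the device identification attached at the framework layer, this check involves no knowledge of the underlying input chip.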
In one embodiment, the input device is the touch screen of the mobile terminal;
the touch screen includes at least one edge input area and at least one normal input area.
In one embodiment, the input device is the touch screen of the mobile terminal;
the touch screen includes at least one edge input area, at least one normal input area, and at least one transition area.
Implementing the touch control method, user equipment, input processing method, and mobile terminal of the invention makes it possible to transform the edge touch area correspondingly as the touch screen rotates, better adapting to the user's operation and improving the user experience. On the other hand, because operations in the A area and the C area are distinguished only at the application framework layer, where the virtual device is created, any dependence on hardware for distinguishing the A area from the C area at the driving layer is avoided. By assigning numbers to touch points, fingers can be distinguished, making the scheme compatible with both protocol A and protocol B. Moreover, the scheme can be integrated into the operating system of the mobile terminal, is applicable to different hardware and different types of mobile terminals, and is highly portable. All elements of a touch point (coordinates, number, etc.) are stored, which facilitates subsequent edge input judgments (for example, FIT).
Detailed description of the invention
The present invention will be further explained below with reference to the accompanying drawings and embodiments, in which:
Fig. 1 is a schematic diagram of the hardware structure of a mobile terminal of an embodiment of the invention;
Fig. 2 is a schematic diagram of the touch screen region division of the mobile terminal of a first embodiment of the invention;
Fig. 3 is a schematic diagram of the touch screen of the mobile terminal of an embodiment of the invention at a rotation angle of 0 degrees;
Fig. 4 is a schematic diagram of the touch screen of the mobile terminal of an embodiment of the invention at a rotation angle of 90 degrees clockwise;
Fig. 5 is a schematic diagram of the touch screen of the mobile terminal of an embodiment of the invention at a rotation angle of 180 degrees clockwise;
Fig. 6 is a schematic diagram of the touch screen of the mobile terminal of an embodiment of the invention at a rotation angle of 270 degrees clockwise;
Fig. 7 is a schematic flowchart of the touch control method of an embodiment of the invention;
Fig. 8 is a schematic diagram of the software architecture of a mobile terminal of an embodiment of the invention;
Fig. 9 is a schematic structural diagram of a mobile terminal of an embodiment of the invention;
Fig. 10 is a schematic flowchart of judging an edge input event in an embodiment of the invention;
Fig. 11 is a schematic flowchart of judging an input event according to the device identification in an embodiment of the invention;
Fig. 12 is a flowchart of the input processing method of an embodiment of the invention;
Fig. 13 is a schematic diagram of the effect of opening the camera application of a mobile terminal at a rotation angle of 0 degrees using the input processing method of an embodiment of the invention;
Fig. 14 is a schematic diagram of the effect of opening the camera application of a mobile terminal at a rotation angle of 90 degrees clockwise using the input processing method of an embodiment of the invention;
Fig. 15 is a schematic diagram of the touch screen region division of the mobile terminal of a second embodiment of the invention;
Fig. 16 is a schematic diagram of the hardware structure of a user equipment of an embodiment of the invention.
Specific embodiment
For a clearer understanding of the technical features, objects, and effects of the present invention, specific embodiments of the invention are now described in detail with reference to the accompanying drawings.
Referring to Fig. 1, the mobile terminal of an embodiment of the invention includes an input device, a processor 903, and a display screen 904. In one embodiment, the input device is a touch screen 2010. The touch screen 2010 includes a touch panel 901 and a touch controller 902. In addition, the input device may also be a non-touch input device (for example, an infrared input device), etc.
The touch controller 902 may be a single application-specific integrated circuit (ASIC), and may include one or more processor subsystems, which in turn may include one or more ARM processors or other processors with similar functions and performance.
The touch controller 902 is mainly used to receive the touch signal generated on the touch panel 901, process it, and transmit it to the processor 903 of the mobile terminal. This processing includes, for example, performing analog-to-digital conversion on the physical input signal, processing the touch point coordinates, and obtaining the touch duration.
The processor 903 receives the output of the touch controller 902, processes it, and executes an action based on the output. The actions include, but are not limited to: moving an object such as a table or an indicator; scrolling or panning; adjusting control settings; opening a file or document; viewing a menu; making a selection; executing an instruction; operating a peripheral device coupled to the host device; answering a telephone call; placing a telephone call; terminating a telephone call; changing volume or audio settings; storing information related to telephone communications (for example, addresses, frequently dialed numbers, received calls, missed calls); logging onto a computer or computer network; allowing an authorized individual to access a restricted area of a computer or computer network; recording a user profile associated with the user's preferred configuration of the computer desktop; allowing access to network content; launching a specific program; encrypting or decoding a message; and so on.
The processor 903 is also connected to the display screen 904. The display screen 904 is used to provide a UI to the user of the equipment. In some embodiments, the processor 903 and the touch controller 902 may be separate components. In other embodiments, the processor 903 and the touch controller 902 may be integrated into a single component.
In one embodiment, the touch panel 901 is provided with discrete capacitive sensors, resistive sensors, force sensors, optical sensors, or similar sensors.
The touch panel 901 includes horizontal and vertical electrode arrays made of a conductive material. For a single-touch screen with an M-row and N-column electrode array (which can only determine the coordinates of a single touch), the touch controller 902 uses self-capacitance scanning: after the M rows and N columns are scanned separately, the coordinates of the finger on the touch screen can be calculated from the signal of each row and each column. The number of scans is M + N.
For a multi-touch screen with an M-row and N-column electrode array (which can detect and resolve the coordinates of multiple points, i.e. multi-touch), the touch controller 902 uses mutual-capacitance scanning of the row-column intersections; the number of scans is therefore M x N.
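The scan-count arithmetic above can be made concrete with a tiny helper (illustrative only; real controllers may interleave or optimize these scans):

```python
def scan_counts(m_rows, n_cols):
    """Return (self-capacitance scans, mutual-capacitance scans)
    for an m_rows x n_cols electrode array."""
    self_cap = m_rows + n_cols    # each row and each column scanned once
    mutual_cap = m_rows * n_cols  # every row/column intersection scanned
    return self_cap, mutual_cap
```

For a 16 x 9 array this gives 25 self-capacitance scans but 144 mutual-capacitance scans, which is why multi-touch scanning is markedly more expensive per frame.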
When the user's finger touches the touch panel, the touch panel generates a touch signal (an electrical signal) and sends it to the touch controller 902. The touch controller 902 can obtain the coordinates of the touch point by scanning. In one embodiment, the touch panel 901 of the touch screen 2010 physically has its own independent set of positioning coordinates; after the touch point coordinates of each touch are reported to the processor 903, the processor 903 converts them into pixel coordinates adapted to the display screen 904 so that the input operation can be correctly identified.
Fig. 2 is a schematic diagram of the region division of the touch panel of the first embodiment of the invention. In this embodiment, in order to realize edge mistouch prevention and to provide a new interaction mode, the touch panel of the touch screen is divided into three regions, wherein the C areas 101 are the edge input areas and the A area 100 is the normal input area.
In an embodiment of the present invention, input operations in the A area are handled according to the existing normal processing mode; for example, clicking an application icon in the A area 100 opens the application. Input operations in the C area 101 may be defined to follow an edge input processing mode; for example, sliding along both edges in the C area 101 may be defined to accelerate the terminal.
In an embodiment of the present invention, the C area may be divided in a fixed manner or in a customized manner. Fixed division sets a region of fixed length and fixed width as the C area 101. The C area 101 may include a partial region on the left side of the touch panel and a partial region on the right side, fixed at both edges of the touch panel, as shown in Fig. 1. Of course, the C area 101 may also be divided at one edge only.
Customized division means that the number, position, and size of the regions of the C area 101 can be customized; for example, they can be set by the user, or the mobile terminal can adjust the quantity, position, and size of the regions of the C area 101 according to its own needs. In general, the basic shape of the C area 101 is designed as a rectangle, so that the position and size of the C area can be determined from the coordinates of two diagonal vertices of the rectangle.
To suit different users' habits with different applications, multiple C-area configuration schemes applicable to different application scenarios may also be set. For example, on the system desktop, because icons occupy much of the screen, the C areas on both sides are set relatively narrow; after the camera icon is clicked to enter the camera application, the quantity, position, and size of the C areas under that scenario can be set differently, and, provided focusing is not affected, the C areas can be set relatively wide.
The embodiments of the present invention place no restriction on the manner of dividing or setting the C area.
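The per-scenario schemes described above could be represented, for instance, as a lookup of rectangle lists keyed by scenario. All names and numbers here are illustrative assumptions, not values from the patent; each rectangle is given by two diagonal vertices, as the customized-division embodiment suggests:

```python
# Each C area is a rectangle given by two diagonal vertices
# ((x1, y1), (x2, y2)) in the panel coordinate system.
C_AREA_SCHEMES = {
    # Narrow strips on the desktop, so icons stay reachable.
    "desktop": [((0, 0), (30, 1280)), ((690, 0), (720, 1280))],
    # Wider strips inside the camera app, where edge gestures dominate.
    "camera":  [((0, 0), (80, 1280)), ((640, 0), (720, 1280))],
}

def in_c_area(x, y, scene):
    """True if (x, y) falls inside any C-area rectangle of the scene."""
    return any(x1 <= x <= x2 and y1 <= y <= y2
               for (x1, y1), (x2, y2) in C_AREA_SCHEMES[scene])
```

A touch at x = 50 would thus be a normal (A-area) touch on the desktop but an edge (C-area) touch inside the camera application.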
Referring to Fig. 3, the top-left corner T0 of the touch panel is set as the coordinate origin, with coordinate value (0, 0), and the coordinate value of the bottom-right corner is T7 (W, H), where W is the width of the touch panel and H is the height of the touch panel.
In an embodiment of the invention, the touch screen is divided into the A area and the C area as described above, and the A area and the C area belong to the same coordinate system. After the touch panel of the mobile terminal is divided into multiple regions, the coordinates are divided correspondingly. For example, if the width of the touch panel is W and the C-area width is Wc, then a touch point whose coordinates fall in the region defined by T0, T1, T4, and T5, and/or in the region defined by T2, T3, T6, and T7, is defined as an edge touch point; and a touch point whose coordinates fall in the region defined by T1, T2, T5, and T6 is defined as a normal touch point.
Referring to Fig. 4, taking the touch screen orientation described in Fig. 3 as the initial orientation, the touch screen is rotated 90 degrees clockwise; at this time, the coordinate system does not change. For ease of operation, the position of the C area changes: referring to Fig. 4, after the touch screen is rotated 90 degrees clockwise, a touch point whose coordinates fall in the region defined by T0, S2, S4, and T3 and/or in the region defined by T4, S1, T7, and S3 is defined as an edge touch point; and a touch point whose coordinates fall in the region defined by S1, S2, S3, and S4 is defined as a normal touch point.
Referring to Fig. 5, taking the touch screen orientation described in Fig. 3 as the initial orientation, the touch screen is rotated 180 degrees clockwise; at this time, neither the coordinate system nor the position of the C area changes.
Referring to Fig. 6, taking the touch screen orientation described in Fig. 3 as the initial orientation, the touch screen is rotated 270 degrees clockwise; at this time, the coordinate system does not change, and the position of the C area is the same as that shown in Fig. 4.
In the touch screen states shown in Figs. 3 to 6, the coordinate system of the touch screen does not change. That is, no matter whether the touch screen of the mobile terminal is in any of the states of Figs. 3 to 6 or at any other rotation angle (these rotation states can be obtained by detection by the motion sensor 906), when the touch panel 901 receives a touch signal, the coordinates of the touch point reported by the touch controller 902 are always reported according to the coordinate system shown in Fig. 3, without regard to the rotation state of the touch screen. Since the display screen 904 also rotates correspondingly after the touch screen 2010 rotates, the processor 903 adaptively converts the coordinates reported by the touch controller 902 to adapt them to the pixel coordinates of the display screen 904. The correspondence between rotation angles and conversion methods is stored in the memory 905; this conversion will be introduced later.
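A sketch of the kind of rotation-dependent coordinate conversion described above. The mapping below is the standard clockwise screen-rotation transform and is an assumption for illustration; the patent's own conversion method is introduced later in the description:

```python
def rotate_coords(x, y, rotation, W, H):
    """Map a touch point reported in the fixed panel coordinate system
    (origin at the top-left corner T0, panel width W, height H) into
    the display's coordinate system after the terminal has been rotated
    clockwise by the given angle."""
    if rotation == 0:
        return x, y
    if rotation == 90:
        # The panel's bottom-left corner becomes the display's origin.
        return H - y, x
    if rotation == 180:
        return W - x, H - y
    if rotation == 270:
        # The panel's top-right corner becomes the display's origin.
        return y, W - x
    raise ValueError("rotation must be 0, 90, 180 or 270")
```

For a 720 x 1280 panel, the panel's fixed origin (0, 0) maps to the far right edge of the display after a 90-degree clockwise rotation, consistent with the display content rotating along with the terminal.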
Referring to Fig. 7, based on the above mobile terminal, the touch control method of the embodiment of the invention includes the following steps:
S100: detecting a touch signal generated on the touch panel.
S101: identifying a touch point according to the touch signal.
Specifically, when a finger or another object touches the touch panel and makes a touch gesture, a touch signal is generated; the touch controller detects the signal and obtains the physical coordinates of the touch point by scanning or similar means. In the embodiment of the invention, the coordinate system shown in Figs. 3 to 6 is used.
As described above, the touch screen of the mobile terminal of the embodiment of the present invention is divided into an edge touch area and a normal touch area, and touch gestures are therefore defined separately for the two areas. In one embodiment, the touch gestures of the normal touch area include: click, double click, slide, and the like. The touch gestures of the edge touch area include: slide up on the left edge, slide down on the left edge, slide up on the right edge, slide down on the right edge, slide up on both edges, slide down on both edges, gripping the four corners of the handset, sliding back and forth on one edge, holding, one-handed holding, and the like.
It should be understood that "left" and "right" here are relative. For example, in the state shown in Fig. 3, the region where point M is located is the "left side" and the opposite side is the "right side". Likewise, in the state shown in Fig. 4, the region where point M is located is the "left side" and the opposite side is the "right side". That is, in the embodiment of the present invention, "left" and "right" change with the rotation of the touch screen.
S102: detecting the rotation angle of the touch panel, and judging, according to the identified touch point and the rotation angle, whether the touch point is located in the edge touch area or the normal touch area.
Specifically, the rotation angle of the touch panel can be obtained by detecting the rotation angle of the mobile terminal with the motion sensor.
The processor judges the region to which the touch point belongs according to the physical coordinates reported by the touch controller. In the embodiment of the present invention, the coordinate range of each region is stored in the memory.
Referring to Figs. 3 and 5, the coordinate range of the edge touch area is: coordinates located in the region defined by T0, T1, T4 and T5, and/or coordinates located in the region defined by T2, T3, T6 and T7. The coordinate range of the normal touch area is: coordinates located in the region defined by T1, T2, T5 and T6.
Referring to Figs. 4 and 6, when the touch screen is rotated clockwise by 90 degrees or 270 degrees, the coordinate range of the edge touch area is: coordinates located in the region defined by T0, S2, S4 and T3, and/or coordinates located in the region defined by T4, S1, T7 and S3. The coordinate range of the normal touch area is: coordinates located in the region defined by S1, S2, S3 and S4.
S103: executing a corresponding instruction based on the judgment result.
Specifically, since the coordinates of the touch panel and the coordinates of the display screen belong to two independent coordinate systems, the physical coordinates of the touch panel must be mapped to the pixel coordinates of the display screen in order to display the contact effect correctly and identify the touch gesture. Specifically, the conversion rules are as follows:
When the rotation angle is 0, i.e. in the state shown in Fig. 3, for a touch point M whose coordinates reported by the touch controller are (xc, yc), no conversion is needed; the coordinates on the display screen are likewise (xc, yc).
When the rotation angle is 90 degrees clockwise, i.e. in the state shown in Fig. 4, for a touch point M whose reported coordinates are (xc, yc), the converted coordinates are (yc, W − xc).
When the rotation angle is 180 degrees clockwise, i.e. in the state shown in Fig. 5, for a touch point M whose reported coordinates are (xc, yc), the converted coordinates are (W − xc, H − yc).
When the rotation angle is 270 degrees clockwise, i.e. in the state shown in Fig. 6, for a touch point M whose reported coordinates are (xc, yc), the converted coordinates are (H − yc, xc).
It should be understood that the above conversion rules assume that the display-screen coordinate system and the touch-panel coordinate system have the same size (for example, both 1080 × 1920 pixels). If the size of the display-screen coordinate system differs from that of the touch-panel coordinate system, the converted coordinates must additionally be adjusted to fit the display screen. Specifically, the touch-panel coordinates are multiplied by a corresponding conversion coefficient, i.e. the ratio of the display-screen size to the touch-panel size. For example, if the touch panel is 720 × 1280 and the display screen is 1080 × 1920, the ratio of display screen to touch panel is 1.5; the abscissa and ordinate of the reported physical coordinates are each multiplied by 1.5, so (xc, yc) becomes (1.5 × xc, 1.5 × yc) when converted to display-screen coordinates, or (1.5 × yc, 1.5 × (W − xc)) after a 90-degree rotation, and so on.
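The four conversion rules and the scaling adjustment above can be combined into a single mapping. The following is a minimal sketch, not code from the patent; the function name and parameters are illustrative assumptions, with w and h denoting the width and height of the touch panel's coordinate system and scale the display-to-panel size ratio.

```python
def panel_to_display(xc, yc, angle, w, h, scale=1.0):
    """Map touch-panel coordinates to display-screen coordinates.

    angle: clockwise rotation of the screen (0, 90, 180 or 270 degrees).
    w, h:  width and height of the touch panel's coordinate system.
    scale: ratio of display size to panel size (1.0 if identical).
    """
    if angle == 0:
        x, y = xc, yc                  # state of Fig. 3: no conversion
    elif angle == 90:
        x, y = yc, w - xc              # state of Fig. 4
    elif angle == 180:
        x, y = w - xc, h - yc          # state of Fig. 5
    elif angle == 270:
        x, y = h - yc, xc              # state of Fig. 6
    else:
        raise ValueError("angle must be 0, 90, 180 or 270")
    return (scale * x, scale * y)

# A 720x1280 panel shown on a 1080x1920 display (ratio 1.5):
assert panel_to_display(100, 200, 0, 720, 1280, 1.5) == (150.0, 300.0)
assert panel_to_display(100, 200, 90, 720, 1280, 1.5) == (300.0, 930.0)
```

The rotation is applied first and the conversion coefficient last, matching the order described in the text.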
After the coordinate conversion and adjustment, accurate display can be achieved and the correct touch gesture identified, so that the instruction corresponding to the touch gesture is executed. In the embodiment of the present invention, touch gestures and instructions are stored in the memory in one-to-one correspondence.
The touch control method of the embodiment of the present invention can convert the edge touch area correspondingly according to the rotation of the touch screen, so as to better adapt to the user's operation and improve the user experience.
Fig. 8 is a schematic diagram of the software architecture of the mobile terminal of an embodiment of the present invention. The software architecture of the mobile terminal of the embodiment of the present invention includes: an input device 201, a driver layer 202, an application framework layer 203 and an application layer 204. The functions of the driver layer 202, the application framework layer 203 and the application layer 204 are executed by the processor 903. In one embodiment, the input device 201 is a touch screen comprising a touch panel and a touch controller.
The input device 201 receives the user's input operation, converts the physical input into a touch signal, and transmits the touch signal to the driver layer 202. The driver layer 202 parses the position of the input and obtains parameters such as the specific coordinates and duration of the touch point, and uploads these parameters to the application framework layer 203; communication between the driver layer 202 and the application framework layer 203 can be realized through a corresponding interface. The application framework layer 203 receives the parameters reported by the driver layer 202, parses them, distinguishes edge input events from normal input events, and passes the valid input up to the specific application of the application layer 204, so that the application layer 204 executes different input operation instructions according to different input operations.
Fig. 9 is a schematic structural diagram of the mobile terminal of an embodiment of the present invention. In one embodiment of the present invention, the input device includes the touch screen 2010 described above. The driver layer 202 includes an event acquisition module 2020. A device node 2021 is provided between the driver layer 202 and the application framework layer 203. The application framework layer 203 includes an input reader 2030, a first event processing module 2031, a second event processing module 2032, a first judgment module 2033, a second judgment module 2034, an event dispatch module 2035, a third judgment module 2036, a first application module 2037, a second application module 2038, and the like.
The driver layer 202 includes the event acquisition module 2020, which is used to acquire the input events generated by the user through the input device 201, for example, input action events performed through the touch screen. In the embodiment of the present invention, input events include: normal input events (A-zone input events) and edge input events (C-zone input events). Normal input events include input operations such as click, double click and slide performed in the A zone. Edge input events include input operations performed in the C zone, such as slide up on the left edge, slide down on the left edge, slide up on the right edge, slide down on the right edge, slide up on both edges, slide down on both edges, gripping the four corners of the handset, sliding back and forth on one edge, holding, one-handed holding, and the like.
In addition, the event acquisition module 2020 is also used to acquire relevant parameters of the input operation, such as the coordinates and duration of the touch point. If input events are reported according to protocol A, the event acquisition module 2020 is also used to assign each touch point a number (ID) for distinguishing fingers. Thus, when input events are reported according to protocol A, the reported data include parameters such as the coordinates of the touch point, the duration, and the number of the touch point.
The device node 2021 provided between the driver layer 202 and the input reader 2030 is used to notify the input reader (InputReader) 2030 of the application framework layer 203 to acquire the input event.
The input reader 2030 is used to traverse the device node, acquire input events and report them. If the driver layer 202 reports input events using protocol B, the input reader 2030 is also used to assign each touch point a number (ID) for distinguishing fingers. In the embodiment of the present invention, the input reader 2030 also stores all element information of the touch point (coordinates, duration, number, etc.).
In the embodiment of the present invention, in order for the application layer 204 to distinguish different input events and respond to them, an input device object with a device identifier is created for each input event. In one embodiment, a first input device object with a first identifier can be created for normal input events. The first input device object corresponds to the actual hardware touch screen.
In addition, the application framework layer 203 further includes a second input device object (for example, an edge input device, or FIT device). The second input device object is a virtual device, i.e. a null device; it has a second identifier and corresponds to edge input events. It should be understood that, alternatively, edge input events can correspond to the first input device object with the first identifier, and normal input events to the second input device object with the second identifier.
The first event processing module 2031 processes the input events reported by the input reader 2030, for example, calculating the coordinates of the touch point.
The second event processing module 2032 likewise processes the input events reported by the input reader 2030, for example, calculating the coordinates of the touch point.
The first judgment module 2033 judges, according to the coordinate value (X value), whether an event is an edge input event; if not, the event is uploaded to the event dispatch module 2035.
The second judgment module 2034 judges, according to the coordinate value (X value), whether an event is an edge input event; if so, the event is uploaded to the event dispatch module 2035.
Referring to Fig. 10, when judging whether an event is an edge input event, the first judgment module 2033 obtains the horizontal-axis coordinate (i.e. the X-axis coordinate, x) of the touch point and compares it with the C-zone width (Wc) and the touch screen width (W). Specifically, if Wc < x < (W − Wc), the touch point is located in the A zone and the event is a normal input event; otherwise, the event is an edge input event. If the event is not an edge input event (i.e. it is a normal input event), it is reported to the event dispatch module 2035.
Likewise, the second judgment module 2034 judges in the manner shown in Fig. 10; if the judgment result is that the event is an edge input event, the event is reported to the event dispatch module 2035.
It should be understood that the judgment process shown in Fig. 10 is based on the touch screen of the mobile terminal shown in Fig. 2, i.e. a mobile terminal with C zones 101 located at the left and right edges and an A zone 100 located in the middle. Therefore, when coordinates are set along the coordinate system shown in Fig. 3, it can be determined that the touch point is located in the A zone if Wc < x < (W − Wc). In other embodiments, the judgment formula (Wc < x < (W − Wc)) can be adjusted according to how the mobile terminal's areas are divided. For example, if the mobile terminal includes only one C zone 101 located at the left edge, with width Wc, then when Wc < x < W the touch point is located in the A zone; otherwise, the touch point is located in the C zone. If the mobile terminal includes only a C zone 101 located at the right edge, with width Wc, then when x < (W − Wc) the touch point is located in the A zone; otherwise, the touch point is located in the C zone.
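The zone judgment and its layout variants can be sketched as a simple coordinate comparison. This is an illustrative sketch under the portrait orientation of Fig. 3; the function name, the layout parameter and the example widths are assumptions, not from the patent.

```python
def classify_zone(x, w, wc, layout="both"):
    """Judge whether a touch point lies in the A zone or a C zone.

    x:      X-axis coordinate of the touch point.
    w:      touch screen width; wc: C-zone width.
    layout: "both"  - C zones at both edges, as in Fig. 2;
            "left"  - a single C zone at the left edge;
            "right" - a single C zone at the right edge.
    """
    if layout == "both":
        in_a = wc < x < w - wc
    elif layout == "left":
        in_a = wc < x < w
    elif layout == "right":
        in_a = x < w - wc
    else:
        raise ValueError("unknown layout")
    return "A" if in_a else "C"

# Example: a 1080-wide screen with 60-pixel edge strips.
assert classify_zone(30, 1080, 60) == "C"            # left edge strip
assert classify_zone(540, 1080, 60) == "A"           # centre
assert classify_zone(30, 1080, 60, "right") == "A"   # no left C zone
```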
It should be understood that when the mobile terminal rotates, the motion sensor can detect the rotation and pass the rotation information to the processor. In the embodiment of the present invention, the processor judges the region of an input event in combination with the detection result of the motion sensor. Specifically, if the rotation angle is 90 degrees clockwise, i.e. the terminal is rotated to the state shown in Fig. 4, the judgment basis of the first judgment module and the second judgment module becomes: if Wc < y < (H − Wc), the touch point is located in the A zone; otherwise, the touch point is located in the C zone, where y is the Y-axis coordinate of the touch point.
If the rotation angle is 180 degrees clockwise, i.e. the terminal is rotated to the state shown in Fig. 5, the judgment basis of the first and second judgment modules is: if Wc < x < (W − Wc), the touch point is located in the A zone; otherwise, the touch point is located in the C zone.
If the rotation angle is 270 degrees clockwise, i.e. the terminal is rotated to the state shown in Fig. 6, the judgment basis becomes: if Wc < y < (H − Wc), the touch point is located in the A zone; otherwise, the touch point is located in the C zone, where y is the Y-axis coordinate of the touch point.
It should be understood that if the C zone is divided only on one side, or only in a certain region of one side, of the touch screen, the judgment of the input event's region is adjusted accordingly. The overall judgment idea is: regardless of whether the touch screen rotates, determine the length and width of the C zone, determine its coordinate range, and use this coordinate range when judging, so as to determine the region where the input event is located.
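Since the touch coordinates are always reported in the fixed coordinate system of Fig. 3, rotation changes only which axis is compared against the C-zone width. A minimal sketch of this rotation-aware judgment, with illustrative names:

```python
def zone_with_rotation(x, y, angle, w, h, wc):
    """Judge A zone vs C zone, taking the rotation angle into account.

    x, y:  touch coordinates in the fixed coordinate system of Fig. 3.
    angle: clockwise rotation angle (0, 90, 180 or 270 degrees).
    w, h:  touch screen width and height; wc: C-zone width.
    """
    if angle in (0, 180):        # states of Fig. 3 and Fig. 5: compare X
        in_a = wc < x < w - wc
    elif angle in (90, 270):     # states of Fig. 4 and Fig. 6: compare Y
        in_a = wc < y < h - wc
    else:
        raise ValueError("angle must be 0, 90, 180 or 270")
    return "A" if in_a else "C"

# 1080x1920 screen with 60-pixel edge strips:
assert zone_with_rotation(30, 960, 0, 1080, 1920, 60) == "C"
assert zone_with_rotation(30, 960, 90, 1080, 1920, 60) == "A"
```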
The event dispatch module 2035 is used to report edge input events and/or A-zone input events to the third judgment module 2036. In one embodiment, edge input events and A-zone input events are reported through different channels; edge input events are reported through a dedicated channel.
In addition, the event dispatch module 2035 is also used to obtain the current state of the mobile terminal, and to convert and adjust the reported coordinates according to the current state before reporting them.
In the embodiment of the present invention, the current state of the mobile terminal is obtained according to the detection result of the motion sensor. The current state includes: a rotation angle of 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, and the like. It should be understood that, for counterclockwise rotation, 90 degrees counterclockwise is identical to 270 degrees clockwise, 180 degrees counterclockwise is identical to 180 degrees clockwise, and 270 degrees counterclockwise is identical to 90 degrees clockwise.
For the specific implementation of converting and adjusting the coordinates, refer to the description of step S103 above; details are not repeated here.
In one embodiment, the event dispatch module 2035 is implemented by InputDispatcher::dispatchMotion().
The third judgment module 2036 is used to judge, according to the device identifier (ID), whether an event is an edge input event; if so, the event is reported to the second application module 2038, and otherwise to the first application module 2037.
Specifically, referring to Fig. 11, when judging, the third judgment module 2036 first obtains the device identifier and judges, according to the device identifier, whether the device is a touch-screen-type device. If so, it further judges whether the device identifier is the C-zone device identifier, i.e. the identifier of the above second input device object; if so, the event is judged to be an edge input event, and if not, a normal input event. It should be understood that, after the device is judged to be a touch-screen-type device, it is also possible instead to judge whether the device identifier is the A-zone device identifier, i.e. the identifier corresponding to the above first input device object; if so, the event is judged to be a normal input event, and if not, an edge input event.
In the embodiment of the present invention, the first application module 2037 is used to process input events related to A-zone input. Specifically, this processing includes: performing processing and identification according to the touch point coordinates, duration, number, etc. of the input operation, and reporting the identification result to the application layer. The second application module 2038 is used to process input events related to C-zone input; this processing likewise includes performing processing and identification according to the touch point coordinates, duration and number, and reporting the identification result to the application layer. For example, according to the coordinates, duration and number of the touch point, it can be identified whether the input operation is a click in the A zone, a slide in the A zone, a back-and-forth slide on one edge in the C zone, and so on.
The application layer 204 includes applications such as camera, gallery and screen lock (application 1, application 2, ...). In the embodiment of the present invention, input operations include application-level and system-level operations, and system-level gesture processing is also classified into the application layer. Application-level operations are manipulations of an application program, for example, opening, closing, volume control, etc. System-level operations are manipulations of the mobile terminal, for example, power-on, acceleration, application switching, global return, etc. An application can obtain and process C-zone input events by registering a listener for C-zone events, and can likewise obtain and process A-zone input events by registering a listener for A-zone events.
In one embodiment, the mobile terminal sets and stores instructions corresponding to different input operations, including instructions corresponding to edge input operations and instructions corresponding to normal input operations. When the application layer receives the identification result of a reported edge input event, it calls the corresponding instruction according to the edge input operation so as to respond to that operation. When the application layer receives the identification result of a reported normal input event, it calls the corresponding instruction according to the normal input operation so as to respond to that operation.
It should be understood that the input events of the embodiment of the present invention include input operations performed only in the A zone, input operations performed only in the C zone, and input operations produced in the A zone and the C zone simultaneously. Accordingly, the stored instructions also include instructions corresponding to these three types of input events. The embodiment of the present invention can thus control the mobile terminal through combined A-zone and C-zone input operations. For example, if the input operation is clicking corresponding positions in the A zone and the C zone simultaneously, and the corresponding instruction is closing a certain application, then that application can be closed by the input operation of simultaneously clicking the corresponding positions in the A zone and the C zone.
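The one-to-one correspondence between operations and instructions for the three event types might be modelled as a simple lookup table. The close-application binding is the example from the text; the other keys, names and structure are illustrative assumptions.

```python
# Instruction table keyed by (zone, gesture); bindings are illustrative.
INSTRUCTIONS = {
    ("A", "click"): "activate item",
    ("C", "slide_back_and_forth"): "global return",
    # Combined A-zone + C-zone operation from the example in the text:
    ("A+C", "simultaneous_click"): "close application",
}

def lookup_instruction(zone, gesture):
    """Look up the stored instruction for an identified input operation.
    Returns None if no instruction is bound to the operation."""
    return INSTRUCTIONS.get((zone, gesture))

assert lookup_instruction("A+C", "simultaneous_click") == "close application"
assert lookup_instruction("C", "unknown_gesture") is None
```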
The mobile terminal of the embodiment of the present invention can convert the edge touch area correspondingly according to the rotation of the touch screen, so as to better adapt to the user's operation and improve the user experience. On the other hand, since A-zone and C-zone operations are distinguished only at the application framework layer, and the virtual device is established at the application framework layer, dependence on hardware for distinguishing the A zone from the C zone at the driver layer is avoided. By assigning numbers to touch points, fingers can be distinguished, and both protocol A and protocol B are supported. Furthermore, since the functions of the input reader 2030, the first event processing module 2031, the second event processing module 2032, the first judgment module 2033, the second judgment module 2034, the event dispatch module 2035, the third judgment module 2036, the first application module 2037, the second application module 2038, etc. can be integrated into the operating system of the mobile terminal, the solution is applicable to different hardware and different types of mobile terminals and is highly portable. The input reader (InputReader) can automatically save all elements of a touch point (coordinates, number, etc.), which facilitates subsequent judgment of edge input (for example, FIT).
Fig. 12 is a flowchart of the input processing method of the embodiment of the present invention, which includes the following steps:
S1: the driver layer acquires the input event generated by the user through the input device, and reports it to the application framework layer.
Specifically, the input device receives the user's input operation (i.e. the input event), converts the physical input into an electrical signal, and transmits the electrical signal to the driver layer. In the embodiment of the present invention, input events include A-zone input events and C-zone input events. A-zone input events include input operations such as click, double click and slide performed in the A zone. C-zone input events include input operations performed in the C zone, such as slide up on the left edge, slide down on the left edge, slide up on the right edge, slide down on the right edge, slide up on both edges, slide down on both edges, sliding back and forth on one edge, holding, one-handed holding, and the like.
The driver layer parses the input position based on the received electrical signal and obtains relevant parameters such as the specific coordinates and duration of the touch point. These relevant parameters are reported to the application framework layer.
In addition, if the driver layer reports the input event using protocol A, step S1 further includes:
assigning each touch point a number (ID) for distinguishing fingers.
Thus, if the driver layer reports the input event using protocol A, the reported data include the above relevant parameters and the number of the touch point.
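Under protocol A the reported touch points carry no tracking IDs of their own, so an ID per finger must be assigned upstream. One common approach, sketched below purely as an assumption (the patent does not specify the assignment algorithm), is to match each point of the current frame to the nearest still-unmatched point of the previous frame:

```python
def assign_ids(prev, points, next_id=0):
    """Assign finger IDs to untracked touch points (protocol A sketch).

    prev:    dict mapping id -> (x, y) from the previous frame.
    points:  list of (x, y) samples for the current frame.
    next_id: first unused id.
    Returns (dict mapping id -> (x, y), next unused id).
    """
    assigned = {}
    free = dict(prev)                     # previous points not yet matched
    for p in points:
        if free:
            # Match to the nearest unmatched point of the previous frame.
            best = min(free, key=lambda i: (free[i][0] - p[0]) ** 2
                                           + (free[i][1] - p[1]) ** 2)
            assigned[best] = p
            del free[best]
        else:
            assigned[next_id] = p         # a newly landed finger
            next_id += 1
    return assigned, next_id

frame1, nid = assign_ids({}, [(100, 100), (500, 500)])
frame2, nid = assign_ids(frame1, [(110, 105), (495, 510)], nid)
assert set(frame2) == set(frame1)   # both fingers keep their IDs
```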
S2: the application framework layer judges whether the input event is an edge input event or a normal input event; if a normal input event, step S3 is executed; if an edge input event, step S4 is executed.
Specifically, the application framework layer can judge whether the event is an edge input event or a normal input event according to the coordinates in the relevant parameters of the input event. Referring to Fig. 10 above, the horizontal-axis coordinate (i.e. the X-axis coordinate, x) of the touch point is obtained first and then compared with the C-zone width (Wc) and the touch screen width (W). If Wc < x < (W − Wc), the touch point is located in the A zone and the event is a normal input event; otherwise, the event is an edge input event. If the driver layer reports the input event using protocol B, step S2 further includes: assigning each touch point a number (ID) for distinguishing fingers, and storing all element information of the touch point (coordinates, duration, number, etc.).
It should be understood that when the touch screen rotates, the corresponding judgment is made with reference to the foregoing description; details are not repeated here.
Thus, by assigning numbers to touch points, the embodiment of the present invention can distinguish fingers and is compatible with both protocol A and protocol B; and storing all elements of the touch point (coordinates, number, etc.) facilitates subsequent judgment of edge input (for example, FIT).
In one embodiment, edge input events and normal input events are reported through different channels; edge input events use a dedicated channel.
S3: the application framework layer performs processing and identification on the normal input event, and reports the identification result to the application layer.
S4: the application framework layer performs processing and identification on the edge input event, and reports the identification result to the application layer.
Specifically, the processing and identification include: performing processing and identification according to the touch point coordinates, duration, number, etc. of the input operation, so as to determine the input operation. For example, according to the coordinates, duration and number of the touch point, input operations such as a click or slide in the A zone, or a back-and-forth slide on one edge in the C zone, can be identified.
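As a simple illustration of this processing and identification, a single-finger track could be classified by its displacement and duration. The thresholds and names below are illustrative assumptions; the patent does not specify the recognition algorithm.

```python
def identify_gesture(track, duration_ms, tap_radius=20, tap_time=300):
    """Classify a single-finger track (illustrative thresholds).

    track:       list of (x, y) samples for one touch point.
    duration_ms: time from touch-down to lift-off in milliseconds.
    """
    x0, y0 = track[0]
    x1, y1 = track[-1]
    moved = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if moved <= tap_radius and duration_ms <= tap_time:
        return "click"         # small displacement, short duration
    if moved > tap_radius:
        return "slide"         # significant displacement
    return "long_press"        # small displacement, long duration

assert identify_gesture([(100, 100), (102, 101)], 120) == "click"
assert identify_gesture([(100, 100), (100, 400)], 250) == "slide"
```

Combined with the zone judgment described earlier, this yields results such as "click in the A zone" or "slide in the C zone" for reporting to the application layer.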
S5: the application layer executes the corresponding instruction according to the reported identification result.
Specifically, the application layer includes applications such as camera, gallery and screen lock. Input operations in the embodiment of the present invention include application-level and system-level operations, and system-level gesture processing is also classified into the application layer. Application-level operations are manipulations of an application program, for example, opening, closing, volume control, etc. System-level operations are manipulations of the mobile terminal, for example, power-on, acceleration, application switching, global return, etc.
In one embodiment, the mobile terminal sets and stores instructions corresponding to different input operations, including instructions corresponding to edge input operations and instructions corresponding to normal input operations. When the application layer receives the identification result of a reported edge input event, it calls the corresponding instruction according to the edge input operation so as to respond to that operation; when the application layer receives the identification result of a reported normal input event, it calls the corresponding instruction according to the normal input operation so as to respond to that operation.
It should be understood that the input events of the embodiment of the present invention include input operations performed only in the A zone, input operations performed only in the C zone, and input operations produced in the A zone and the C zone simultaneously. Accordingly, the stored instructions also include instructions corresponding to these three types of input events. The embodiment of the present invention can thus control the mobile terminal through combined A-zone and C-zone input operations. For example, if the input operation is clicking corresponding positions in the A zone and the C zone simultaneously, and the corresponding instruction is closing a certain application, then that application can be closed by the input operation of simultaneously clicking the corresponding positions in the A zone and the C zone.
In one embodiment, the input processing method of the embodiment of the present invention further includes:
S11: creating an input device object with a device identifier for each input event.
Specifically, in one embodiment, a first input device object with a first identifier can be created for normal input events. The first input device object corresponds to the input device, i.e. the touch screen. A second input device object is provided at the application framework layer. The second input device object (for example, a FIT device) is a virtual device, i.e. a null device; it has a second identifier and corresponds to edge input events. It should be understood that, alternatively, edge input events can correspond to the first input device object with the first identifier, and normal input events to the second input device object with the second identifier.
In one embodiment, the input processing method of the embodiment of the present invention further includes:
S21: the application framework layer obtains the current state of the mobile terminal, and converts and adjusts the reported coordinates according to the current state before reporting them.
Specifically, the current state of the mobile terminal includes: a rotation angle of 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, and the like.
It should be understood that, in the embodiment of the present invention, for counterclockwise rotation, 90 degrees counterclockwise is identical to 270 degrees clockwise, 180 degrees counterclockwise is identical to 180 degrees clockwise, and 270 degrees counterclockwise is identical to 90 degrees clockwise.
For the specific implementation of converting and adjusting the coordinates, refer to the descriptions of step S103 and the application framework layer above; details are not repeated here.
In one embodiment, step S21 can be implemented by InputDispatcher::dispatchMotion().
S22: judging, according to the device identifier, whether the input event is an edge input event; if so, step S4 is executed, and if not, step S3 is executed.
Specifically, referring to Fig. 11 above, when judging according to the device identifier whether the input event is an edge input event, the device identifier is obtained first, and whether the device is a touch-screen-type device is judged according to the device identifier. If so, it is further judged whether the device identifier is the C-zone device identifier, i.e. the identifier of the above second input device object; if so, the event is judged to be an edge input event, and if not, a normal input event. It should be understood that, after the device is judged to be a touch-screen-type device, it is also possible instead to judge whether the device identifier is the A-zone device identifier, i.e. the identifier corresponding to the above first input device object; if so, the event is judged to be a normal input event, and if not, an edge input event.
The input processing method of the embodiment of the present invention can transform the edge touch area according to the rotation of the touch screen, so as to better adapt to the user's operation and improve the user experience. On the other hand, since the A area and the C area are distinguished only at the application framework layer, and the virtual device is created at the application framework layer, the dependence on hardware for distinguishing the A area and the C area at the driver layer is avoided. By numbering the touch points, fingers can be distinguished and both the A protocol and the B protocol are supported. The method can be integrated into the operating system of a mobile terminal, is applicable to different hardware and different types of mobile terminal, and is highly portable. All elements of a touch point (coordinates, number, etc.) are stored, which facilitates subsequent edge-input judgment (for example, FIT).
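As a rough illustration of the touch-point numbering that makes the A protocol and the B protocol compatible: under the Linux multi-touch A protocol the kernel reports anonymous contacts, so the receiving layer must assign numbers itself, while under the B protocol each contact already carries a kernel-assigned tracking ID. The sketch below shows one way to normalize both into numbered touch points; the dict layout of a contact is an illustrative assumption.

```python
def number_touch_points(protocol, contacts):
    """Return a list of (number, x, y) touch points.

    'contacts' is a list of dicts; under the B protocol each dict is
    assumed to carry a kernel-assigned 'tracking_id', while under the
    A protocol contacts arrive anonymously and are numbered in order.
    """
    points = []
    for i, c in enumerate(contacts):
        if protocol == "B":
            number = c["tracking_id"]   # kernel already distinguishes fingers
        else:                            # A protocol: assign our own numbers
            number = i
        points.append((number, c["x"], c["y"]))
    return points
```

Downstream modules then see a uniform numbered representation regardless of which protocol the driver used.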
Referring to Figure 13, which is a schematic diagram of the effect of opening the camera application of a mobile terminal using the input processing method of the embodiment of the present invention. The figure on the left of Figure 13 is a schematic diagram of the main interface of the mobile terminal, in which region 1010 is a touch point preset in the edge input area (C area 101) through which an input operation for opening the camera can be performed. Specifically, clicking region 1010 opens the camera. Accordingly, the mobile terminal stores the instruction "open camera" in correspondence with the input operation of clicking region 1010.
When the camera is needed, the user clicks region 1010 of the touch screen. The driver layer obtains the input event and reports it to the application framework layer. The application framework layer determines from the touch-point coordinates that the input event is an edge input event, processes and identifies the edge input event according to the touch-point coordinates, duration and number, and recognizes the input operation as a click on region 1010. The application framework layer reports the recognition result to the application layer, and the application layer executes the instruction to open the camera.
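The stored correspondence between the edge touch point and the "open camera" instruction amounts to a lookup from recognized edge operations to instructions. A minimal sketch, where the region and instruction names are assumed for illustration:

```python
# Assumed mapping of recognized edge operations to stored instructions.
EDGE_GESTURE_TABLE = {("click", "region_1010"): "open_camera"}

def dispatch_edge_event(kind, region):
    """Return the instruction bound to a recognized edge operation, if any."""
    return EDGE_GESTURE_TABLE.get((kind, region))
```

The application layer would execute whatever instruction this lookup returns.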
Referring to Figure 14, after the mobile terminal is rotated 90 degrees clockwise, the C area 101 and the touch point that opens the camera change correspondingly. The process of opening the camera by clicking region 1010 in Figure 14 is similar to that of Figure 13 described above. It should be understood that after the camera is opened, the C area is not shown in Figures 13 and 14, but it still exists; alternatively, according to the above description of C-area division in embodiments of the present invention, the C area may be set wider after the camera is opened. This will be understood by those skilled in the art.
Referring to Figure 15, a schematic diagram of the touch-screen division of a mobile terminal according to a second embodiment of the present invention. In this embodiment, in order to prevent a decline in accuracy caused by the input deviating, during user input, from the region where it started, a transition region 103 (T area) is added at the edge of the touch panel of the mobile terminal.
In this embodiment, if an input event starts from the C area and drifts into the T area, the slide is still regarded as an edge gesture; if an input event starts from the C area and drifts into the A area, the edge gesture is regarded as ended and a normal input event begins; and if an input event starts from the T area or the A area, the slide is regarded as a normal input event no matter which region of the touch panel it later slides into.
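The three cases above can be modeled as a small decision over the start region and the current region. A sketch, under the assumption that regions are labeled "A", "C" and "T":

```python
def classify_slide(start_region, current_region):
    """Classify a slide per the three T-area rules above.

    Returns 'edge' while the gesture is still an edge gesture,
    'edge_ended' when a C-started gesture drifts into the A area,
    and 'normal' for anything that starts in the T or A area.
    """
    if start_region == "C":
        if current_region in ("C", "T"):
            return "edge"        # drifting into T keeps the edge gesture
        return "edge_ended"      # entering A ends it; normal input begins
    return "normal"              # started in T or A: always normal
```

Only the start region and the most recent touch-point region are needed, which matches the judgment the application framework layer performs on reported coordinates.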
The reporting flow of input events in this embodiment is the same as the interaction control method described in the above embodiments. The only difference is that when the application framework layer processes and identifies edge input events, it needs to judge according to the above three cases to determine the exact input event. For example, if the application framework layer judges from the reported touch points that an input event starts from the C area and drifts into the A area (i.e., the touch-point coordinates at the start of the input are in the C area, and the coordinates of a certain touch point during the input are in the A area), then the first judgment module and the second judgment module, judging by the coordinates, conclude that the input event is an edge input event and that this edge input event has ended and a normal input event has begun; the driver layer then starts reporting the next input event.
The mobile terminal of the embodiment of the present invention can be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, laptops, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers.
Correspondingly, the embodiment of the present invention also provides a user equipment; referring to Figure 16, a schematic diagram of its hardware structure. As shown in Figure 16, the user equipment 1000 includes a touch screen 2010, a controller 200, a storage device 310, a GPS chip 320, a communicator 330, a video processor 340, an audio processor 350, a button 360, a microphone 370, a camera 380, a speaker 390 and a motion sensor 906.
The touch screen 2010 can be divided into an A area and a C area, or an A area, a C area and a T area, as described above. The touch screen 2010 can be implemented as various types of displays, such as an LCD (liquid crystal display), an OLED (organic light-emitting diode) display or a PDP (plasma display panel). The touch screen 2010 may include a driving circuit, which can be implemented as, for example, an a-Si TFT, an LTPS (low-temperature polysilicon) TFT or an OTFT (organic TFT), and a backlight unit.
Meanwhile touch screen 2010 may include the touch sensor for sensing the touch gestures of user.Touch sensor
It can be implemented as various types of sensors, such as capacity type, resistance type or piezo type.Capacity type, which passes through to work as, to be used
On a part (for example, finger of user) touch-surface of family body coated with conductive material touch screen surface when sensing by
The micro-current of the body excitation of user calculates touch coordinate value.According to resistance type, touch screen includes two electrode plates, and is worked as
By sensing the electric current flowed when upper plate and the lower plate contact at touch point when user touches touch panel, sat to calculate to touch
Scale value.In addition, touch screen 2010 can be sensed for use in addition to user hand when the support input function of user equipment 1000
The user gesture of the input unit of such as pen etc except finger.When input unit is the writing pencil (stylus pen) for including coil
When, user equipment 1000 may include the magnetic sensor (not shown) for sensing magnetic field, and the magnetic field is according in writing pencil
Coil changes the degree of approach of magnetic sensor.As a result, other than sensing touch gestures, user equipment 1000 can also be felt
Close gesture is surveyed, i.e. writing pencil hovers over 1000 top of user equipment.
The storage device 310 can store various programs and data needed for the operation of the user equipment 1000. For example, the storage device 310 can store programs and data for constituting the various screens to be displayed in each area (for example, the A area and the C area). The controller 200 displays content in each area of the touch screen 2010 by using the programs and data stored in the storage device 310.
The controller 200 includes a RAM 210, a ROM 220, a CPU 230, a GPU (graphics processing unit) 240 and a bus 250. The RAM 210, ROM 220, CPU 230 and GPU 240 can be connected to each other through the bus 250. The CPU (processor) 230 accesses the storage device 310 and performs booting using the operating system (OS) stored in the storage device 310. Moreover, the CPU 230 performs various operations by using the various programs, content and data stored in the storage device 310.
The ROM 220 stores a command set for system booting. When a turn-on command is input and power is supplied, the CPU 230 copies the OS stored in the storage device 310 to the RAM 210 according to the command set stored in the ROM 220, and boots the system by running the OS. When booting is completed, the CPU 230 copies the various programs stored in the storage device 310 to the RAM 210, and performs various operations by running the copied programs in the RAM 210. Specifically, the GPU 240 can generate a screen including various objects such as icons, images and text by using a calculator (not shown) and a renderer (not shown). The calculator calculates characteristic values such as coordinate values, format, size and color with which each object is to be marked according to the layout of the screen.
The GPS chip 320 is a unit that receives GPS signals from GPS (global positioning system) satellites and calculates the current location of the user equipment 1000. When a navigation program is used or the user's current location is requested, the controller 200 can calculate the user's position by using the GPS chip 320.
The communicator 330 is a unit that communicates with various types of external equipment according to various types of communication methods. The communicator 330 includes a WiFi chip 331, a Bluetooth chip 332, a wireless communication chip 333 and an NFC chip 334. The controller 200 communicates with various external equipment through the communicator 330. The WiFi chip 331 and the Bluetooth chip 332 perform communication according to the WiFi method and the Bluetooth method, respectively. When the WiFi chip 331 or the Bluetooth chip 332 is used, various connection information such as a service set identifier (SSID) and a session key can be transmitted and received first, communication can then be established by using the connection information, and various information can subsequently be transmitted and received. The wireless communication chip 333 is a chip that performs communication according to various communication standards such as IEEE, ZigBee, 3G (third generation), 3GPP (third generation partnership project) and LTE (long term evolution). The NFC chip 334 is a chip that operates according to the NFC (near-field communication) method using the 13.56 MHz band among various RFID frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860~960 MHz and 2.45 GHz.
The video processor 340 is a unit that processes the video data included in content received through the communicator 330 or in content stored in the storage device 310. The video processor 340 can perform various image processing on the video data, such as decoding, scaling, noise filtering, frame-rate conversion and resolution conversion.
The audio processor 350 is a unit that processes the audio data included in content received through the communicator 330 or in content stored in the storage device 310. The audio processor 350 can perform various processing on the audio data, such as decoding, amplification and noise filtering.
When a playback program for multimedia content is run, the controller 200 can reproduce the corresponding content by driving the video processor 340 and the audio processor 350. The speaker 390 outputs the audio data generated by the audio processor 350.
The button 360 can be various types of buttons, such as a mechanical button, or a touch pad or touch wheel formed on some region of the exterior of the main body of the user equipment 1000, such as the front, side or back.
The microphone 370 is a unit that receives the user's voice or other sounds and converts them into audio data. The controller 200 can use the user's voice input through the microphone 370 during a call, or convert it into audio data and store it in the storage device 310.
The camera 380 is a unit that captures still images or video images under the control of the user. The camera 380 can be implemented as multiple units, such as a front camera and a rear camera. As described below, the camera 380 may be used as a means of obtaining images of the user in an exemplary embodiment that tracks the user's gaze.
When the camera 380 and the microphone 370 are provided, the controller 200 can perform control operations according to the user's voice input through the microphone 370 or the user's actions recognized by the camera 380. Accordingly, the user equipment 1000 can operate in a motion control mode or a voice control mode. When operating in the motion control mode, the controller 200 activates the camera 380 to photograph the user, tracks changes in the user's actions, and performs the corresponding operation. When operating in the voice control mode, the controller 200 can operate in a speech recognition mode to analyze the voice input through the microphone 370 and perform control operations according to the analyzed user voice.
In the user equipment 1000 supporting the motion control mode or the voice control mode, speech recognition technology or motion recognition technology is used in the various exemplary embodiments described above. For example, when the user performs an action such as selecting an object marked on the home screen, or says a voice command corresponding to an object, it can be determined that the corresponding object is selected, and the control operation matched with that object can be performed.
The motion sensor 906 is a unit that senses the movement of the main body of the user equipment 1000. The user equipment 1000 can be rotated or tilted in various directions. The motion sensor 906 can sense movement characteristics such as rotation direction, angle and slope by using one or more of various sensors, such as a geomagnetic sensor, a gyro sensor and an acceleration sensor. It should be understood that when the user equipment rotates, the touch screen rotates correspondingly, through the same angle as the user equipment.
Moreover, although not shown in Figure 16, according to an exemplary embodiment the user equipment 1000 can also include a USB port to which a USB connector can be connected, various input ports for connecting various external components such as earphones, a mouse and a LAN, a DMB (digital multimedia broadcasting) chip that receives and processes DMB signals, and various other sensors.
As described above, storage device 310 can store various programs.
Based on the user equipment shown in Figure 16, in an embodiment of the present invention, the touch screen is used to detect the touch signal generated on the touch panel and to identify the touch point according to the touch signal.
The motion sensor is used to detect the rotation angle of the user equipment.
The processor includes a driver module, an application framework module and an application module,
wherein the driver module is used to obtain an input event according to the touch signal and report it to the application framework module;
the application framework module is used to judge, according to the reported touch-point position of the input event and the rotation angle, whether the touch point is located in the edge touch area or the normal touch area, and in either case to report the recognition result to the application module after processing and identification;
and the application module is used to execute the corresponding instruction according to the reported recognition result.
It should be understood that the working principle and details of each module of the user equipment of this embodiment are the same as those described in the above embodiments, and are not repeated here.
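The rotation-dependent judgment the application framework module performs follows the inequalities stated in the claims (0 and 180 degrees: Wc < x < W-Wc is the normal area; 90 and 270 degrees: Wc < y < H-Wc is the normal area). A sketch of that check, treating the panel width W, height H and edge-area width Wc as given parameters:

```python
def region_for(x, y, angle_cw, W, H, Wc):
    """Return 'normal' or 'edge' for a touch point under a clockwise
    rotation angle of 0, 90, 180 or 270 degrees (W, H: panel width
    and height; Wc: width of the edge touch area)."""
    if angle_cw in (0, 180):
        return "normal" if Wc < x < (W - Wc) else "edge"
    if angle_cw in (90, 270):
        return "normal" if Wc < y < (H - Wc) else "edge"
    raise ValueError("unsupported rotation angle")
```

This is why rotating the terminal, as in Figure 14, moves the C area: the same inequality is simply applied along the other axis.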
The touch control method, user equipment, input processing method and mobile terminal of the embodiments of the present invention can transform the edge touch area according to the rotation of the touch screen, so as to better adapt to the user's operation and improve the user experience. On the other hand, since the A area and the C area are distinguished only at the application framework layer, and the virtual device is created at the application framework layer, the dependence on hardware for distinguishing the A area and the C area at the driver layer is avoided. By numbering the touch points, fingers can be distinguished and both the A protocol and the B protocol are supported. The method can be integrated into the operating system of a mobile terminal, is applicable to different hardware and different types of mobile terminal, and is highly portable. All elements of a touch point (coordinates, number, etc.) are stored, which facilitates subsequent edge-input judgment (for example, FIT).
Any process or method description in a flowchart or otherwise described herein can be understood as representing a module, segment or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the embodiments of the present invention includes other implementations, in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order according to the functions involved. This should be understood by those skilled in the technical field to which the embodiments of the present invention belong.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the invention is not limited to the above specific embodiments, which are only illustrative rather than restrictive. Under the inspiration of the present invention, those skilled in the art can devise many further forms without departing from the scope protected by the purpose of the present invention and the claims, all of which fall within the protection of the present invention.
Claims (21)
1. A touch control method, characterized by comprising:
detecting a touch signal generated on a touch panel;
identifying a touch point according to the touch signal;
detecting a rotation angle of the touch panel;
obtaining, by a driver layer, an input event according to the touch signal, and reporting it to an application framework layer;
judging, by the application framework layer according to the identified touch point and the rotation angle, whether the touch point is located in an edge touch area or a normal touch area; and
executing a corresponding instruction based on the judgment result,
wherein the driver layer reports input events using an A protocol or a B protocol; when input events are reported according to the A protocol, the driver layer is further used to assign each touch point a number for distinguishing fingers; when input events are reported according to the B protocol, the application framework layer is further used to assign each touch point a number for distinguishing fingers; by setting the numbers of the touch points, fingers are distinguished and both the A protocol and the B protocol are supported.
2. The touch control method according to claim 1, characterized in that the rotation angle includes: 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise and 270 degrees counterclockwise.
3. The touch control method according to claim 2, characterized in that judging, according to the identified touch point and the rotation angle, whether the touch point is located in the edge touch area or the normal touch area includes:
if the rotation angle is 0 degrees, then when Wc < x < (W-Wc), the touch point is located in the normal touch area, and otherwise in the edge touch area;
if the rotation angle is 90 degrees clockwise, then when Wc < y < (H-Wc), the touch point is located in the normal touch area, and otherwise in the edge touch area;
if the rotation angle is 180 degrees clockwise, then when Wc < x < (W-Wc), the touch point is located in the normal touch area, and otherwise in the edge touch area;
if the rotation angle is 270 degrees clockwise, then when Wc < y < (H-Wc), the touch point is located in the normal touch area, and otherwise in the edge touch area;
wherein x is the horizontal-axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical-axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch area.
4. A user equipment, characterized by comprising: a touch screen, a motion sensor and a processor;
the touch screen comprising a touch panel and a touch controller, wherein:
the touch panel is used to detect a touch signal generated on the touch panel;
the touch controller is used to identify a touch point according to the touch signal;
the motion sensor is used to detect a rotation angle of the user equipment;
the processor comprises a driver module, an application framework module and an application module, wherein:
the driver module is used to obtain an input event according to the touch signal and report it to the application framework module;
the application framework module is used to judge, according to the rotation angle and the touch-point position of the reported input event, whether the touch point is located in an edge touch area or a normal touch area;
the application module is used to execute a corresponding instruction based on the judgment result,
wherein the driver module reports input events using an A protocol or a B protocol; when input events are reported according to the A protocol, the driver module is further used to assign each touch point a number for distinguishing fingers; when input events are reported according to the B protocol, the application framework module is further used to assign each touch point a number for distinguishing fingers; by setting the numbers of the touch points, fingers are distinguished and both the A protocol and the B protocol are supported.
5. An input processing method, characterized by comprising:
obtaining, by a driver layer, an input event generated by a user through an input device, and reporting it to an application framework layer;
judging, by the application framework layer according to the current state of a mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, processing and identifying the normal input event and reporting the recognition result to an application layer; if it is an edge input event, processing and identifying the edge input event and reporting the recognition result to the application layer; and
executing, by the application layer, a corresponding instruction according to the reported recognition result,
wherein the driver layer reports input events using an A protocol or a B protocol; when input events are reported according to the A protocol, the driver layer is further used to assign each touch point a number for distinguishing fingers; when input events are reported according to the B protocol, the application framework layer is further used to assign each touch point a number for distinguishing fingers; by setting the numbers of the touch points, fingers are distinguished and both the A protocol and the B protocol are supported.
6. The input processing method according to claim 5, characterized in that the method further includes:
creating, for each input event, an input device object with a device identifier.
7. The input processing method according to claim 6, characterized in that creating, for each input event, an input device object with a device identifier includes:
associating normal input events with the touch screen having a first device identifier; and
setting, by the application framework layer, a second input device object with a second device identifier, associated with edge input events.
8. The input processing method according to any one of claims 5-7, characterized in that the current state of the mobile terminal includes: 0 degrees of rotation, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise and 270 degrees counterclockwise.
9. The input processing method according to claim 8, characterized in that:
if the rotation angle is 0 degrees, then when Wc < x < (W-Wc), the application framework layer judges the input event to be a normal input event, and otherwise an edge input event;
if the rotation angle is 90 degrees clockwise, then when Wc < y < (H-Wc), the application framework layer judges the input event to be a normal input event, and otherwise an edge input event;
if the rotation angle is 180 degrees clockwise, then when Wc < x < (W-Wc), the application framework layer judges the input event to be a normal input event, and otherwise an edge input event;
if the rotation angle is 270 degrees clockwise, then when Wc < y < (H-Wc), the application framework layer judges the input event to be a normal input event, and otherwise an edge input event;
wherein x is the horizontal-axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical-axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch area.
10. A mobile terminal, characterized by comprising:
an input device;
a motion sensor, for detecting the current state of the mobile terminal;
a driver layer, for obtaining input events generated by a user through the input device and reporting them to an application framework layer;
the application framework layer, for judging, according to the current state of the mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, processing and identifying the normal input event and reporting the recognition result to an application layer; if it is an edge input event, processing and identifying the edge input event and reporting the recognition result to the application layer; and
the application layer, for executing a corresponding instruction according to the reported recognition result,
wherein the driver layer reports input events using an A protocol or a B protocol; when input events are reported according to the A protocol, the event acquisition module is further used to assign each touch point a number for distinguishing fingers; when input events are reported according to the B protocol, the application framework layer is further used to assign each touch point a number for distinguishing fingers; by setting the numbers of the touch points, fingers are distinguished and both the A protocol and the B protocol are supported.
11. The mobile terminal according to claim 10, characterized in that the normal input event corresponds to a first input device object with a first device identifier;
the application framework layer is further used to set a second input device object with a second device identifier, corresponding to the edge input event.
12. The mobile terminal according to claim 10, characterized in that the driver layer includes an event acquisition module, for obtaining input events generated by the user through the input device.
13. The mobile terminal according to claim 10, characterized in that the application framework layer includes an input reader;
the mobile terminal further includes a device node set between the driver layer and the input reader, for notifying the input reader to obtain input events;
the input reader is used to traverse the device node, obtain input events and report them.
14. The mobile terminal according to claim 10, characterized in that the current state of the mobile terminal includes: 0 degrees of rotation, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise and 270 degrees counterclockwise.
15. The mobile terminal according to claim 14, characterized in that the application framework layer further includes:
a first event processing module, for performing coordinate calculation on input events reported by the input reader and then reporting them; and
a first judgment module, for judging, according to the current state of the mobile terminal and the coordinate values reported by the first event processing module, whether an input event is an edge input event, and reporting the input event if not.
16. The mobile terminal according to claim 15, characterized in that the application framework layer further includes:
a second event processing module, for performing coordinate calculation on input events reported by the input reader and then reporting them; and
a second judgment module, for judging, according to the current state of the mobile terminal and the coordinate values reported by the second event processing module, whether an input event is an edge input event, and reporting the input event if so.
17. The mobile terminal according to claim 16, characterized in that:
if the rotation angle is 0 degrees, then when Wc < x < (W-Wc), the judgment result is that the input event is a normal input event, and otherwise an edge input event;
if the rotation angle is 90 degrees clockwise, then when Wc < y < (H-Wc), the judgment result is that the input event is a normal input event, and otherwise an edge input event;
if the rotation angle is 180 degrees clockwise, then when Wc < x < (W-Wc), the judgment result is that the input event is a normal input event, and otherwise an edge input event;
if the rotation angle is 270 degrees clockwise, then when Wc < y < (H-Wc), the judgment result is that the input event is a normal input event, and otherwise an edge input event;
wherein x is the horizontal-axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical-axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch area.
18. The mobile terminal according to claim 17, characterized in that the application framework layer further includes:
an event dispatch module, for reporting the events reported by the second judgment module and the first judgment module.
19. The mobile terminal according to claim 18, characterized in that the application framework layer further includes:
a first application module;
a second application module; and
a third judgment module, for judging, according to the device identifier included in an event reported by the event dispatch module, whether the event is an edge input event, and reporting it to the second application module if so, and otherwise to the first application module;
the first application module is used to identify normal input events according to their relevant parameters and report the recognition results to the application layer;
the second application module is used to identify edge input events according to their relevant parameters and report the recognition results to the application layer.
20. The mobile terminal according to claim 10, characterized in that the input device is the touch screen of the mobile terminal;
the touch screen includes at least one edge input area and at least one normal input area.
21. The mobile terminal according to claim 10, characterized in that the input device is the touch screen of the mobile terminal;
the touch screen includes at least one edge input area, at least one normal input area and at least one transition area.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510819757.8A CN105335007B (en) | 2015-11-20 | 2015-11-20 | Method of toch control, user equipment, input processing method and mobile terminal |
PCT/CN2016/102777 WO2017084469A1 (en) | 2015-11-20 | 2016-10-20 | Touch control method, user equipment, input processing method and mobile terminal |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201510819757.8A CN105335007B (en) | 2015-11-20 | 2015-11-20 | Method of toch control, user equipment, input processing method and mobile terminal |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105335007A CN105335007A (en) | 2016-02-17 |
CN105335007B true CN105335007B (en) | 2019-10-08 |
Family
ID=55285599
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201510819757.8A Active CN105335007B (en) | 2015-11-20 | 2015-11-20 | Method of toch control, user equipment, input processing method and mobile terminal |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN105335007B (en) |
WO (1) | WO2017084469A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105335007B (en) * | 2015-11-20 | 2019-10-08 | 努比亚技术有限公司 | Method of toch control, user equipment, input processing method and mobile terminal |
CN107479745B (en) * | 2017-07-31 | 2020-07-21 | 北京雷石天地电子技术有限公司 | Method and module for configuring touch screen and operating system |
CN107844220B (en) | 2017-11-29 | 2020-02-11 | 广州视源电子科技股份有限公司 | Touch signal processing method, system and device and electronic equipment |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101676843A (en) * | 2008-09-18 | 2010-03-24 | 联想(北京)有限公司 | Touch inputting method and touch inputting device |
CN102236468A (en) * | 2010-04-26 | 2011-11-09 | 宏达国际电子股份有限公司 | Sensing method, computer program product and portable device |
CN104583903A (en) * | 2013-11-26 | 2015-04-29 | 华为技术有限公司 | Method, system and terminal for preventing faulty touch operation |
CN104735256A (en) * | 2015-03-27 | 2015-06-24 | 努比亚技术有限公司 | Method and device for judging holding mode of mobile terminal |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105511675B (en) * | 2015-11-20 | 2020-07-24 | 重庆桔子科技发展有限公司 | Touch control method, user equipment, input processing method, mobile terminal and intelligent terminal |
CN105335007B (en) * | 2015-11-20 | 2019-10-08 | 努比亚技术有限公司 | Method of toch control, user equipment, input processing method and mobile terminal |
- 2015-11-20 CN CN201510819757.8A patent/CN105335007B/en active Active
- 2016-10-20 WO PCT/CN2016/102777 patent/WO2017084469A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2017084469A1 (en) | 2017-05-26 |
CN105335007A (en) | 2016-02-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105511675B (en) | Touch control method, user equipment, input processing method, mobile terminal and intelligent terminal | |
US11054986B2 (en) | Apparatus including a touch screen under a multi-application environment and controlling method thereof | |
CN105487705B (en) | Mobile terminal, input processing method and user equipment | |
KR101515620B1 (en) | User termincal device and methods for controlling the user termincal device thereof | |
KR101881925B1 (en) | Method and apparatus for implementing multi-vision system using multiple portable terminals | |
US10185456B2 (en) | Display device and control method thereof | |
US10088991B2 (en) | Display device for executing multiple applications and method for controlling the same | |
US11604580B2 (en) | Configuration of application execution spaces and sub-spaces for sharing data on a mobile touch screen device | |
US20130300684A1 (en) | Apparatus and method for executing multi applications | |
US10067666B2 (en) | User terminal device and method for controlling the same | |
KR102102157B1 (en) | Display apparatus for executing plurality of applications and method for controlling thereof | |
EP3133483A1 (en) | Touchscreen apparatus user interface processing method and touchscreen apparatus | |
KR20130054074A (en) | Apparatus displaying event view on splited screen and method for controlling thereof | |
CN103677711A (en) | Method for connecting mobile terminal and external display and apparatus implementing the same | |
AU2014312481A1 (en) | Display apparatus, portable device and screen display methods thereof | |
AU2013356799A1 (en) | Display device and method of controlling the same | |
CN108920069A (en) | A kind of touch operation method, device, mobile terminal and storage medium | |
CN105335007B (en) | Method of toch control, user equipment, input processing method and mobile terminal | |
US9794396B2 (en) | Portable terminal and method for controlling multilateral conversation | |
WO2017088694A1 (en) | Gesture calibration method and apparatus, gesture input processing method and computer storage medium | |
US11455071B2 (en) | Layout method, device and equipment for window control bars | |
CN102890606A (en) | Information processing device, information processing method, and program | |
CN103729130A (en) | Unlocking method of touch sensitive equipment and touch sensitive equipment | |
KR20140084966A (en) | Display apparatus and method for controlling thereof | |
CN113296647A (en) | Interface display method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||