CN105335007A - Touch control method, user equipment, input processing method and mobile terminal - Google Patents

Touch control method, user equipment, input processing method and mobile terminal

Info

Publication number
CN105335007A
CN105335007A (application CN201510819757.8A); granted publication CN105335007B
Authority
CN
China
Prior art keywords
incoming event
touch
edge
event
degree
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510819757.8A
Other languages
Chinese (zh)
Other versions
CN105335007B (en)
Inventor
李鑫
迟建华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nubia Technology Co Ltd
Original Assignee
Nubia Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nubia Technology Co Ltd filed Critical Nubia Technology Co Ltd
Priority to CN201510819757.8A priority Critical patent/CN105335007B/en
Publication of CN105335007A publication Critical patent/CN105335007A/en
Priority to PCT/CN2016/102777 priority patent/WO2017084469A1/en
Application granted
Publication of CN105335007B publication Critical patent/CN105335007B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a touch control method, user equipment, an input processing method and a mobile terminal. The touch control method comprises the following steps: detecting a touch signal generated on a touch panel; identifying a touch point according to the touch signal; detecting the rotation angle of the touch panel; judging, according to the identified touch point and the rotation angle, whether the touch point is in an edge touch area or a normal touch area; and executing a corresponding instruction according to the judgment result. The touch control method has the following beneficial effects: the edge touch areas change correspondingly as the touch screen rotates, so the method adapts better to the user's operation and improves the user experience; operations in area A and area C are distinguished in the application framework layer, and a virtual device is established in the application framework layer, which avoids the hardware dependence of distinguishing area A from area C in the driver layer; and by numbering the touch points, fingers can be distinguished, making both protocol A and protocol B compatible.

Description

Touch control method, user equipment, input processing method and mobile terminal
Technical field
The present invention relates to the field of communications, and more particularly to a touch control method, user equipment, an input processing method and a mobile terminal.
Background art
With the development of mobile terminal technology, terminal bezels have become narrower and narrower. To improve the user's input experience, edge input technology (for example, edge touch) has emerged.
In prior-art edge input, once touch point information (touch info) is detected, the driver layer judges, according to that information, whether the touch occurred in the edge input region.
In practice, however, because input chips vary widely and the methods by which the driver layer obtains touch point information are highly chip-specific, determining the event type (that is, whether an event is an edge input event) requires separate modification and porting work for each input chip, which is laborious and error-prone.
On the other hand, when reporting events the driver layer may use either of two implementations, protocol A or protocol B, of which only protocol B distinguishes finger IDs. The realization of edge input, however, relies on finger IDs, for example to compare, at reporting time, the data of two successive taps by the same finger. The prior-art input scheme can therefore only support protocol B; drivers that adopt protocol A cannot be supported.
Moreover, the edge input area of an existing mobile terminal is fixed and cannot change correspondingly as the mobile terminal rotates, which makes for a poor user experience.
The prior-art input scheme therefore suffers from strong hardware dependence, cannot support protocol A and protocol B simultaneously, and gives a poor user experience; it needs improvement.
Summary of the invention
The technical problem to be solved by the present invention is that the edge input mode of the above prior-art mobile terminal cannot change correspondingly with the rotation of the mobile terminal. To remedy this defect, a touch control method, user equipment, an input processing method and a mobile terminal are provided.
The technical solution adopted by the present invention to solve this technical problem is as follows:
In a first aspect, a touch control method is provided, comprising:
detecting a touch signal generated on a touch panel;
identifying a touch point according to the touch signal;
detecting the rotation angle of the touch panel;
judging, according to the identified touch point and the rotation angle, whether the touch point is located in an edge touch region or a normal touch region;
executing a corresponding instruction based on the judgment result.
In one embodiment, the rotation angle comprises: 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise, and 270 degrees counterclockwise.
In one embodiment, judging, according to the identified touch point and the rotation angle, whether the touch point is located in an edge touch region or a normal touch region comprises:
if the rotation angle is 0 degrees, the touch point is located in the normal touch region when Wc < x < (W - Wc), and otherwise in the edge touch region;
if the rotation angle is 90 degrees clockwise, the touch point is located in the normal touch region when Wc < y < (H - Wc), and otherwise in the edge touch region;
if the rotation angle is 180 degrees clockwise, the touch point is located in the normal touch region when Wc < x < (W - Wc), and otherwise in the edge touch region;
if the rotation angle is 270 degrees clockwise, the touch point is located in the normal touch region when Wc < y < (H - Wc), and otherwise in the edge touch region;
where x is the abscissa and y is the ordinate of the touch point in the coordinate system of the touch panel, W is the width of the touch panel, H is its height, and Wc is the width of the edge touch region.
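The four rotation cases reduce to a single interval test on x or y. A minimal sketch in Python, with illustrative names and panel sizes that are not taken from the patent:

```python
def is_edge_touch(x, y, angle, w, h, wc):
    """Return True if the touch point (x, y) falls in the edge touch region.

    angle -- clockwise rotation of the panel in degrees (0, 90, 180 or 270)
    w, h  -- width and height of the touch panel
    wc    -- width of the edge touch region
    """
    if angle in (0, 180):
        # portrait orientations: the edge strips run along the vertical edges
        return not (wc < x < w - wc)
    if angle in (90, 270):
        # landscape orientations: the strips follow the other pair of edges
        return not (wc < y < h - wc)
    raise ValueError("unsupported rotation angle: %r" % angle)
```

For example, on a hypothetical 1080 x 1920 panel with a 60-pixel edge strip, a touch at x = 5 is an edge touch at 0 degrees, but after a 90-degree rotation the same test is applied to the y coordinate instead.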
In a second aspect, user equipment is provided, comprising a touch screen, a motion sensor and a processor;
the touch screen comprises a touch panel and a touch controller, wherein:
the touch panel is configured to detect a touch signal generated on the touch panel;
the touch controller is configured to identify a touch point according to the touch signal;
the motion sensor is configured to detect the rotation angle of the user equipment;
the processor comprises a driver module, an application framework module and an application module, wherein:
the driver module is configured to obtain an input event according to the touch signal and report it to the application framework module;
the application framework module is configured to judge, according to the rotation angle and the touch point position of the reported input event, whether the touch point is located in the edge touch region or the normal touch region;
the application module is configured to execute a corresponding instruction based on the judgment result.
In a third aspect, an input processing method is provided, comprising:
a driver layer obtains an input event produced by a user through an input device and reports it to an application framework layer;
the application framework layer judges, according to the current state of the mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, the normal input event is processed and recognized and the recognition result is reported to an application layer; if it is an edge input event, the edge input event is processed and recognized and the recognition result is reported to the application layer;
the application layer executes a corresponding instruction according to the reported recognition result.
In one embodiment, the method further comprises:
creating, for each input event, an input device object having a device identifier.
In one embodiment, creating, for each input event, an input device object having a device identifier comprises:
associating normal input events with the touch screen, which has a first device identifier;
the application framework layer setting up a second input device object, having a second device identifier, to correspond to edge input events.
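The device identifier lets later stages route an event without re-testing its coordinates. A toy sketch of the idea, with invented names and identifier values (the patent does not prescribe concrete IDs):

```python
TOUCHSCREEN_ID = 1   # first device identifier: the physical touch screen
EDGE_DEVICE_ID = 2   # second identifier: a virtual device created in the framework layer

def tag_event(event, is_edge):
    """Attach the device identifier that a dispatcher can route on later."""
    event["device_id"] = EDGE_DEVICE_ID if is_edge else TOUCHSCREEN_ID
    return event

def dispatch(event):
    """Route purely by device identifier, never by coordinates."""
    if event["device_id"] == EDGE_DEVICE_ID:
        return "edge_application_module"
    return "normal_application_module"
```

Because the tagging happens in the framework layer, the driver layer never needs to know about the edge/normal split, which is the hardware-independence point the abstract makes.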
In one embodiment, the driver layer obtaining the input event produced by the user through the input device and reporting it to the application framework layer comprises:
the driver layer assigning each touch point a number used to distinguish fingers, and reporting the input event using protocol A.
In one embodiment, the driver layer obtaining the input event produced by the user through the input device and reporting it to the application framework layer comprises:
the driver layer reporting the input event using protocol B;
the method further comprises:
the application framework layer assigning each touch point in the input event a number used to distinguish fingers.
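When the reporting path carries no per-finger IDs, one common way to assign the distinguishing numbers is to match each reported point against the previous frame by distance, issuing a fresh number for unmatched points. The patent does not specify the matching rule, so the sketch below is an assumption; all names are illustrative:

```python
import math

class FingerTracker:
    """Assign stable finger numbers to anonymous touch points."""

    def __init__(self, max_dist=80.0):
        self.max_dist = max_dist   # farther than this is treated as a new finger
        self.prev = {}             # finger number -> last known (x, y)
        self._next = 0

    def update(self, points):
        """points: list of (x, y) for one frame; returns list of (number, x, y)."""
        out, used = [], set()
        for (x, y) in points:
            best, best_d = None, self.max_dist
            for num, (px, py) in self.prev.items():
                d = math.hypot(x - px, y - py)
                if num not in used and d < best_d:
                    best, best_d = num, d
            if best is None:       # no close predecessor: a new finger touched down
                best = self._next
                self._next += 1
            used.add(best)
            out.append((best, x, y))
        self.prev = {n: (x, y) for (n, x, y) in out}
        return out
```

A point that drifts a few pixels between frames keeps its number, which is exactly what comparing "two successive taps by the same finger" requires.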
In one embodiment, the current state of the mobile terminal comprises: rotated 0 degrees, rotated 90 degrees clockwise, rotated 180 degrees clockwise, rotated 270 degrees clockwise, rotated 90 degrees counterclockwise, rotated 180 degrees counterclockwise, and rotated 270 degrees counterclockwise.
In one embodiment, if the rotation angle is 0 degrees, the application framework layer judges the input event to be a normal input event when Wc < x < (W - Wc), and otherwise an edge input event;
if the rotation angle is 90 degrees clockwise, the application framework layer judges the input event to be a normal input event when Wc < y < (H - Wc), and otherwise an edge input event;
if the rotation angle is 180 degrees clockwise, the application framework layer judges the input event to be a normal input event when Wc < x < (W - Wc), and otherwise an edge input event;
if the rotation angle is 270 degrees clockwise, the application framework layer judges the input event to be a normal input event when Wc < y < (H - Wc), and otherwise an edge input event;
where x is the abscissa and y is the ordinate of the touch point in the coordinate system of the touch panel, W is the width of the touch panel, H is its height, and Wc is the width of the edge touch region.
In a fourth aspect, a mobile terminal is provided, comprising:
an input device;
a motion sensor, configured to detect the current state of the mobile terminal;
a driver layer, configured to obtain an input event produced by a user through the input device and report it to an application framework layer;
the application framework layer, configured to judge, according to the current state of the mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, to process and recognize the normal input event and report the recognition result to an application layer; if it is an edge input event, to process and recognize the edge input event and report the recognition result to the application layer;
the application layer, configured to execute a corresponding instruction according to the reported recognition result.
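The three layers just listed form a simple pipeline. A toy end-to-end sketch, in which all function and field names are invented for illustration and the edge test assumes the 0/90/180/270-degree rule given earlier:

```python
def driver_layer(raw_signal):
    """Obtain an input event from a raw touch signal and report it upward."""
    x, y = raw_signal
    return {"x": x, "y": y}

def framework_layer(event, rotation, w, h, wc):
    """Classify the reported event using the terminal's current state."""
    if rotation in (0, 180):
        inside = wc < event["x"] < w - wc
    else:  # 90 or 270 degrees
        inside = wc < event["y"] < h - wc
    event["kind"] = "normal" if inside else "edge"
    return event

def application_layer(event):
    """Execute a corresponding instruction based on the recognition result."""
    if event["kind"] == "edge":
        return "run edge gesture handler"
    return "run normal tap handler"
```

The point of the split is that only the framework layer knows the rotation state; the driver layer stays hardware-facing and the application layer only sees recognized results.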
In one embodiment, the normal input event corresponds to a first input device object having a first device identifier;
the application framework layer is further configured to set up a second input device object having a second device identifier, to correspond to the edge input event.
In one embodiment, the driver layer reports input events using protocol A or protocol B; if input events are reported according to protocol A, the event acquisition module is further configured to assign each touch point a number used to distinguish fingers;
if input events are reported according to protocol B, the application framework layer is further configured to assign each touch point a number used to distinguish fingers.
In one embodiment, the driver layer comprises an event acquisition module, configured to obtain the input event produced by the user through the input device.
In one embodiment, the application framework layer comprises an input reader;
the mobile terminal further comprises a device node arranged between the driver layer and the input reader, configured to notify the input reader to obtain input events;
the input reader is configured to traverse device nodes, obtain input events and report them.
In one embodiment, the current state of the mobile terminal comprises: rotated 0 degrees, rotated 90 degrees clockwise, rotated 180 degrees clockwise, rotated 270 degrees clockwise, rotated 90 degrees counterclockwise, rotated 180 degrees counterclockwise, and rotated 270 degrees counterclockwise.
In one embodiment, the application framework layer further comprises: a first event processing module, configured to perform coordinate calculation on the input event reported by the input reader and then report it;
and a first judge module, configured to judge, according to the current state of the mobile terminal and the coordinate values reported by the first event processing module, whether the input event is an edge input event, and to report the input event if it is not.
In one embodiment, the application framework layer further comprises:
a second event processing module, configured to perform coordinate calculation on the input event reported by the input reader and then report it;
and a second judge module, configured to judge, according to the current state of the mobile terminal and the coordinate values reported by the second event processing module, whether the input event is an edge input event, and to report the input event if it is.
In one embodiment, if the rotation angle is 0 degrees, the input event is judged to be a normal input event when Wc < x < (W - Wc), and otherwise an edge input event;
if the rotation angle is 90 degrees clockwise, the input event is judged to be a normal input event when Wc < y < (H - Wc), and otherwise an edge input event;
if the rotation angle is 180 degrees clockwise, the input event is judged to be a normal input event when Wc < x < (W - Wc), and otherwise an edge input event;
if the rotation angle is 270 degrees clockwise, the input event is judged to be a normal input event when Wc < y < (H - Wc), and otherwise an edge input event;
where x is the abscissa and y is the ordinate of the touch point in the coordinate system of the touch panel, W is the width of the touch panel, H is its height, and Wc is the width of the edge touch region.
In one embodiment, the application framework layer further comprises:
an event dispatch module, configured to report the events reported by the second judge module and the first judge module.
In one embodiment, the application framework layer further comprises:
a first application module;
a second application module;
and a third judge module, configured to judge, according to the device identifier contained in an event reported by the event dispatch module, whether the event is an edge input event, and to report it to the second application module if so, and to the first application module otherwise;
the first application module is configured to recognize a normal input event according to the relevant parameters of the normal input event and report the recognition result to the application layer;
the second application module is configured to recognize an edge input event according to the relevant parameters of the edge input event and report the recognition result to the application layer.
In one embodiment, the input device is the touch screen of the mobile terminal;
the touch screen comprises at least one edge input region and at least one normal input region.
In one embodiment, the input device is the touch screen of the mobile terminal;
the touch screen comprises at least one edge input region, at least one normal input region and at least one transition region.
Implementing the touch control method, user equipment, input processing method and mobile terminal of the present invention makes it possible to change the edge touch region correspondingly as the touch screen rotates, so the terminal adapts better to the user's operation and the user experience is improved. Further, because region A and region C are distinguished in the application framework layer, and the virtual device is created in the application framework layer, the hardware dependence of distinguishing region A from region C in the driver layer is avoided. By numbering the touch points, fingers can be distinguished, and both protocol A and protocol B are supported. The scheme can be integrated into the operating system of the mobile terminal, suits different hardware and different models of mobile terminal, and is highly portable. All key elements of a touch point (its coordinates, number, etc.) are stored, which is convenient for subsequent edge-input judgments (for example, FIT).
Brief description of the drawings
The invention is further described below in conjunction with the drawings and embodiments, in which:
Fig. 1 is a schematic diagram of the hardware structure of the mobile terminal of an embodiment of the invention;
Fig. 2 is a schematic diagram of the touch screen region division of the mobile terminal of the first embodiment of the invention;
Fig. 3 is a schematic diagram of the touch screen of the mobile terminal of an embodiment of the invention at a rotation angle of 0 degrees;
Fig. 4 is a schematic diagram of the touch screen of the mobile terminal of an embodiment of the invention at a rotation angle of 90 degrees clockwise;
Fig. 5 is a schematic diagram of the touch screen of the mobile terminal of an embodiment of the invention at a rotation angle of 180 degrees clockwise;
Fig. 6 is a schematic diagram of the touch screen of the mobile terminal of an embodiment of the invention at a rotation angle of 270 degrees clockwise;
Fig. 7 is a schematic flow diagram of the touch control method of an embodiment of the invention;
Fig. 8 is a schematic diagram of the software architecture of the mobile terminal of an embodiment of the invention;
Fig. 9 is a schematic diagram of the structure of the mobile terminal of an embodiment of the invention;
Fig. 10 is a schematic flow diagram of judging an edge input event in an embodiment of the invention;
Fig. 11 is a schematic flow diagram of judging an input event according to the device identifier in an embodiment of the invention;
Fig. 12 is a flow chart of the input processing method of an embodiment of the invention;
Fig. 13 is a schematic diagram of the effect of opening the camera application of the mobile terminal at a rotation angle of 0 degrees using the input processing method of an embodiment of the invention;
Fig. 14 is a schematic diagram of the effect of opening the camera application of the mobile terminal at a rotation angle of 90 degrees clockwise using the input processing method of an embodiment of the invention;
Fig. 15 is a schematic diagram of the touch screen region division of the mobile terminal of the second embodiment of the invention;
Fig. 16 is a schematic diagram of the hardware structure of the user equipment of an embodiment of the invention.
Detailed description
For a clear understanding of the technical features, objects and effects of the present invention, specific embodiments of the invention are now described in detail with reference to the accompanying drawings.
Referring to Fig. 1, the mobile terminal of one embodiment of the invention comprises an input device, a processor 903 and a display screen 904. In one embodiment, the input device is a touch screen 2010, which comprises a touch panel 901 and a touch controller 902. The input device may also be a non-touch input device (for example, an infrared input device) or the like.
The touch controller 902 may be a single application-specific integrated circuit (ASIC) and may comprise one or more processor subsystems, each of which may comprise one or more ARM processors or other processors of similar function and performance.
The touch controller 902 is mainly used to receive the touch signal generated on the touch panel 901, process it, and transmit it to the processor 903 of the mobile terminal. The processing includes, for example, analog-to-digital conversion of the physical input signal, computing the touch point coordinates, computing the touch duration, and so on.
The processor 903 receives the output of the touch controller 902 and, after processing, performs an action based on that output. Such actions include, but are not limited to: moving an object such as an icon or a pointer; scrolling or panning; adjusting control settings; opening a file or document; viewing a menu; making a selection; executing an instruction; operating a peripheral coupled to the host device; answering, placing or terminating a telephone call; changing the volume or audio settings; storing information related to telephone communications (for example, addresses, frequently dialed numbers, received calls, missed calls); logging into a computer or computer network; allowing an authorized individual to access a restricted area of a computer or computer network; recording the user profile associated with a user's preferred arrangement of the computer desktop; allowing access to network content; launching a particular program; encrypting or decoding a message; and so on.
The processor 903 is also connected to the display screen 904, which provides the UI to the user of the device.
In some embodiments, the processor 903 may be a component separate from the touch controller 902; in other embodiments, the two may be combined into a single component.
In one embodiment, the touch panel 901 is provided with discrete capacitive sensors, resistive sensors, force sensors, optical sensors or similar sensors.
The touch panel 901 contains horizontal and vertical electrode arrays made of conductive material. For a single-touch screen (one that can determine the coordinates of only a single touch) with an electrode array of N columns and M rows, the touch controller 902 uses self-capacitance scanning: the M rows and N columns are scanned separately, and the finger's coordinates on the touch screen are then computed from the per-row and per-column signals. The number of scans is M + N.
For a multi-touch screen (one that can detect and resolve the coordinates of multiple points, i.e., multi-touch) with an electrode array of N columns and M rows, the touch controller 902 uses mutual-capacitance scanning, scanning every intersection of a row and a column; the number of scans is therefore M × N.
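The difference in scan cost is easy to see with concrete array sizes (the sizes below are illustrative, not specified by the patent):

```python
# Scan counts for a hypothetical electrode array of 16 rows by 28 columns.
rows, cols = 16, 28

self_cap_scans = rows + cols    # self-capacitance: each row, then each column
mutual_cap_scans = rows * cols  # mutual capacitance: every row/column intersection

print(self_cap_scans)
print(mutual_cap_scans)
```

Self-capacitance needs only 44 scans but cannot disambiguate multiple simultaneous touches; mutual capacitance needs 448 scans but resolves each intersection independently, which is why it supports multi-touch.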
When a user's finger touches the panel, the touch panel produces a touch signal (an electrical signal) and sends it to the touch controller 902, which obtains the coordinates of the touch point by scanning. In one embodiment, the touch panel 901 of the touch screen 2010 is physically an independent coordinate-positioning device: after the touch point coordinates of each touch are reported to the processor 903, the processor 903 converts them into pixel coordinates adapted to the display screen 904, so that the input operation is correctly identified.
Fig. 2 is a schematic diagram of the region division of the touch panel of the first embodiment of the invention. In this embodiment, to achieve edge false-touch prevention and to provide new interaction modes, the touch panel of the touch screen is divided into three regions, of which the C regions 101 are edge input regions and the A region 100 is the normal input region.
In an embodiment of the present invention, input operations in region A are processed in the existing normal way; for example, tapping an application icon in region A 100 opens that application. Input operations in region C 101 may be defined as edge input operations; for example, a slide along both edges in region C 101 may be defined to trigger terminal acceleration.
In an embodiment of the present invention, region C can be divided in a fixed or a customized manner. Fixed division means setting a region of fixed length and fixed width as region C 101. Region C 101 may comprise a sub-region on the left side and a sub-region on the right side of the touch panel, with positions fixed at the two side edges of the touch panel, as shown in Fig. 2. Of course, region C 101 may also be divided along one side edge only.
Customized division means that the number, positions and sizes of the sub-regions of region C 101 can be set as desired: they may be set by the user, or the mobile terminal may adjust them according to its own needs. Usually the basic shape of region C 101 is designed as a rectangle, so the coordinates of two diagonal vertices suffice to determine the position and size of a region C.
To suit the habits of different users in different applications, multiple region-C layout schemes for different application scenarios may also be provided. For example, on the system desktop, where icons occupy much of the screen, the C regions on both sides may be made relatively narrow; after the camera icon is tapped and the camera application is entered, the number, positions and sizes of the C regions in that scenario can be set, and the C regions can be made relatively wide as long as focusing is not affected.
The embodiment of the present invention places no restriction on how region C is divided or set.
Referring to Fig. 3, the upper-left corner T0 of the touch panel is set as the coordinate origin with coordinates (0, 0), and the lower-right corner of the touch panel is T7 with coordinates (W, H), where W is the width and H is the height of the touch panel.
In one embodiment of the invention, the touch screen is divided into region A and region C as described above, and the two regions share the same coordinate system. After the touch panel of the mobile terminal is divided into multiple regions, the coordinates are divided correspondingly. For example, if the width of the touch panel is W and the width of region C is Wc, then a touch point whose coordinates lie in the region bounded by T0, T1, T4 and T5, and/or in the region bounded by T2, T3, T6 and T7, is defined as an edge touch point, while a touch point whose coordinates lie in the region bounded by T1, T2, T5 and T6 is defined as a normal touch point.
Referring to Fig. 4, with the touch screen orientation of Fig. 3 as the initial orientation, the touch screen is rotated 90 degrees clockwise; the coordinate system does not change. For ease of operation, however, the position of region C changes: as shown in Fig. 4, after the 90-degree clockwise rotation, a touch point whose coordinates lie in the region bounded by T0, S2, S4 and T3, and/or in the region bounded by T4, S1, T7 and S3, is defined as an edge touch point, while a touch point whose coordinates lie in the region bounded by S1, S2, S3 and S4 is defined as a normal touch point.
Referring to Fig. 5, with the orientation of Fig. 3 as the initial orientation, the touch screen is rotated 180 degrees clockwise; neither the coordinate system nor the position of region C changes.
Referring to Fig. 6, with the orientation of Fig. 3 as the initial orientation, the touch screen is rotated 270 degrees clockwise; the coordinate system does not change, and the position of region C is the same as in Fig. 4.
In all of the touch screen states shown in Figs. 3 to 6, the coordinate system of the touch screen does not change. That is, no matter whether the touch screen of the mobile terminal is in one of the states of Figs. 3-6 or at some other rotation angle (these rotation states can be detected by the motion sensor 906), when the touch panel 901 receives a touch signal, the touch point coordinates reported by the touch controller 902 are always given in the coordinate system of Fig. 3, without regard to the rotation state of the touch screen. Because the display screen 904 rotates together with the touch screen 2010, the processor 903 adaptively converts the coordinates reported by the touch controller 902 to match the pixel coordinates of the display screen 904. The correspondence between rotation angles and conversion methods is stored in the memory 905; the conversion is introduced later.
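The patent does not spell the conversion out at this point; the following is a sketch of the standard screen-rotation transform that such a conversion typically uses. The function name and the exact formulas are assumptions, not quoted from the patent:

```python
def panel_to_display(x, y, angle, w, h):
    """Map a point reported in the fixed panel coordinate system (Fig. 3)
    into the pixel space of a display rotated `angle` degrees clockwise.

    w, h -- width and height of the panel in its initial orientation
    """
    if angle == 0:
        return (x, y)
    if angle == 90:
        return (y, w - x)       # new x axis runs along the old y axis
    if angle == 180:
        return (w - x, h - y)   # both axes simply flip
    if angle == 270:
        return (h - y, x)
    raise ValueError("unsupported rotation angle: %r" % angle)
```

A lookup from rotation angle to one of these four mappings is the kind of correspondence the text says is stored in the memory 905.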
See Fig. 7. Based on the mobile terminal described above, the touch control method of the embodiment of the present invention comprises the following steps:
S100: detect a touch signal originating from the touch panel.
S101: identify the touch point according to the touch signal.
Specifically, when a finger or another object touches the panel and produces a touch gesture, a touch signal is generated; the touch controller detects this signal and obtains the physical coordinates of the touch point, for example by scanning. The embodiment of the present invention adopts the coordinate system shown in Figs. 3-6.
As described above, the touch screen of the mobile terminal of the embodiment of the present invention is divided into an edge touch zone and a normal touch zone, and touch gestures are defined separately for each zone. In one embodiment, the touch gestures of the normal touch zone include: click, double-click, slide, and the like. The touch gestures of the edge touch zone include: left-edge slide up, left-edge slide down, right-edge slide up, right-edge slide down, both-edge slide up, both-edge slide down, gripping a corner of the handset, sliding back and forth along one edge, holding, and one-handed holding.
It should be understood that "left" and "right" here are relative terms. For example, in Fig. 3 the region containing point M is the "left" side and the opposite side is the "right" side; likewise, in Fig. 4 the region containing point M is the "left" side and the opposite side is the "right" side. That is, in the embodiment of the present invention, "left" and "right" change as the touch screen rotates.
S102: detect the rotation angle of the touch panel, and judge, according to the identified touch point and the rotation angle, whether the touch point lies in the edge touch region or the normal touch region.
Specifically, the rotation angle of the touch panel can be derived from the rotation angle of the mobile terminal detected by the motion sensor.
The processor judges the region to which the touch point belongs according to the physical coordinates reported by the touch controller. In an embodiment of the present invention, the coordinate range of each region is stored in the memory.
See Figs. 3 and 5: the coordinate range of the edge touch region is the region bounded by T0, T1, T4 and T5, and/or the region bounded by T2, T3, T6 and T7; the coordinate range of the normal touch region is the region bounded by T1, T2, T5 and T6.
See Figs. 4 and 6: when the touch screen has been rotated 90 or 270 degrees clockwise, the coordinate range of the edge touch region is the region bounded by T0, S2, S4 and T3, and/or the region bounded by T4, S1, T7 and S3; the coordinate range of the normal touch region is the region bounded by S1, S2, S3 and S4.
S103: execute a corresponding instruction based on the judgment result.
Specifically, because the coordinate system of the touch panel and that of the display screen are independent, the physical coordinates of the touch panel must be mapped to the pixel coordinates of the display screen so that the contact effect is displayed correctly and the touch gesture is identified correctly. The conversion rules are as follows:
When the rotation angle is 0, i.e. in the state shown in Fig. 3, for a touch point M whose reported coordinates are (xc, yc), no conversion is needed: the display-screen coordinates are likewise (xc, yc).
When the rotation angle is 90 degrees clockwise, i.e. in the state shown in Fig. 4, for a touch point M whose reported coordinates are (xc, yc), the converted coordinates are (yc, W-xc).
When the rotation angle is 180 degrees clockwise, i.e. in the state shown in Fig. 5, for a touch point M whose reported coordinates are (xc, yc), the converted coordinates are (W-xc, H-yc).
When the rotation angle is 270 degrees clockwise, i.e. in the state shown in Fig. 6, for a touch point M whose reported coordinates are (xc, yc), the converted coordinates are (H-yc, xc).
It should be understood that the above conversion rules assume that the display-screen coordinate system and the touch-panel coordinate system have the same size (for example, both 1080 × 1920 pixels). If they differ in size, the coordinates must additionally be scaled to fit the display screen after the above conversion; specifically, the touch-panel coordinates are multiplied by a conversion coefficient equal to the ratio of the display-screen size to the touch-panel size. For example, if the touch panel is 720 × 1280 and the display screen is 1080 × 1920, the ratio is 1.5, so the abscissa and ordinate of the reported physical coordinates are each multiplied by 1.5: a point originally at (xc, yc) becomes (1.5 × xc, 1.5 × yc) in display-screen coordinates, or (1.5 × yc, 1.5 × (W-xc)) after a 90-degree clockwise rotation, and so on.
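The rotation and scaling rules above can be sketched in a few lines. This is an illustrative sketch only, not the patented implementation; the function name and parameter names are assumptions, and W and H denote the touch panel's width and height in its own coordinate system.

```python
def panel_to_display(xc, yc, angle, W, H, scale=1.0):
    """Map a reported touch-panel coordinate (xc, yc) to a display-screen
    coordinate for the four clockwise rotation states, then apply the
    display/panel size ratio (scale = 1.0 when the sizes are equal)."""
    if angle == 0:          # state of Fig. 3: no rotation
        x, y = xc, yc
    elif angle == 90:       # state of Fig. 4: 90 degrees clockwise
        x, y = yc, W - xc
    elif angle == 180:      # state of Fig. 5: 180 degrees
        x, y = W - xc, H - yc
    elif angle == 270:      # state of Fig. 6: 270 degrees clockwise
        x, y = H - yc, xc
    else:
        raise ValueError("unsupported rotation angle")
    # When the display differs in size from the panel (e.g. 1080x1920 vs
    # 720x1280), multiply by the ratio (1080/720 = 1.5 in that example).
    return x * scale, y * scale
```

For example, a 720 × 1280 panel rotated 90 degrees clockwise maps a reported point (100, 200) to (200, 620), and with no rotation but a 1.5 size ratio, (100, 200) maps to (150, 300), matching the worked example above.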
After coordinate conversion and scaling, accurate display and correct gesture recognition can be achieved, and the instruction corresponding to the touch gesture is executed. In an embodiment of the present invention, touch gestures and instructions are stored in the memory in one-to-one correspondence.
The touch control method of the embodiment of the present invention converts the edge touch region in accordance with the rotation of the touch screen, thereby better adapting to the user's manner of operation and improving the user experience.
See Fig. 8, a schematic diagram of the software architecture of the mobile terminal of one embodiment of the invention. The software architecture comprises: an input device 201, a driver layer 202, an application framework layer 203 and an application layer 204, wherein the functions of the driver layer 202, the application framework layer 203 and the application layer 204 are executed by the processor 903. In one embodiment, the input device 201 is a touch screen comprising a touch panel and a touch controller.
The input device 201 receives the user's input operation, converts the physical input into a touch signal, and passes the touch signal to the driver layer 202. The driver layer 202 parses the position of the input to obtain parameters of the touch point such as its coordinates and duration, and uploads these parameters to the application framework layer 203, which communicates with the driver layer 202 through corresponding interfaces. The application framework layer 203 receives and parses the parameters reported by the driver layer 202, distinguishes edge input events from normal input events, and passes valid input upward to the specific application concerned in the application layer 204, so that the application layer 204 can execute different operating instructions in response to different input operations.
See Fig. 9, a schematic structural diagram of the mobile terminal of one embodiment of the invention. In this embodiment the input device comprises the touch screen 2010 described above. The driver layer 202 comprises an event acquisition module 2020. A device node 2021 is provided between the driver layer 202 and the application framework layer 203. The application framework layer 203 comprises an input reader 2030, a first event processing module 2031, a second event processing module 2032, a first judgment module 2033, a second judgment module 2034, an event dispatch module 2035, a third judgment module 2036, a first application module 2037, a second application module 2038, and so on.
The event acquisition module of the driver layer 202 obtains the input events that the user produces through the input device 201, for example input operation events performed through the touch screen. In an embodiment of the present invention, input events comprise normal input events (A-zone input events) and edge input events (C-zone input events). Normal input events include input operations performed in the A zone such as clicking, double-clicking and sliding. Edge input events include input operations performed in the C zone such as left-edge slide up, left-edge slide down, right-edge slide up, right-edge slide down, both-edge slide up, both-edge slide down, gripping a corner of the handset, sliding back and forth along one edge, holding, and one-handed holding.
In addition, the event acquisition module also obtains related parameters of the touch point of the input operation, such as its coordinates and duration. If input events are reported according to protocol A, the event acquisition module also assigns each touch point a number (ID) for distinguishing fingers; the reported data then comprise the parameters of the touch point, such as coordinates and duration, together with the touch-point number.
The device node provided between the driver layer 202 and the input reader 2030 notifies the input reader (InputReader) 2030 of the application framework layer 203 to obtain the input event.
The input reader 2030 traverses the device node, obtains input events and reports them. If the driver layer 202 reports input events using protocol B, the input reader 2030 assigns each touch point the number (ID) for distinguishing fingers. In an embodiment of the present invention, the input reader 2030 also stores all element information of the touch point (coordinates, duration, number, and so on).
In an embodiment of the present invention, to make it easy for the application layer 204 to distinguish and respond to different input events, an input device object with a device identifier is created for each kind of input event. In one embodiment, a first input device object with a first identifier can be created for normal input events; the first input device object corresponds to the actual hardware touch screen.
In addition, the application framework layer 203 also comprises a second input device object. This second input device object (for example, an edge input device, FIT device) is a virtual, empty device with a second identifier, corresponding to edge input events. It should be understood that, alternatively, edge input events may correspond to the first input device object with the first identifier and normal input events to the second input device object with the second identifier.
The first event processing module 2031 processes the input events reported by the input reader 2030, for example computing the coordinates of the touch point. The second event processing module 2032 likewise processes the input events reported by the input reader 2030, for example computing the coordinates of the touch point.
The first judgment module 2033 judges from the coordinate value (X value) whether an event is an edge input event and, if it is not, uploads the event to the event dispatch module 2035. The second judgment module 2034 judges from the coordinate value (X value) whether an event is an edge input event and, if it is, uploads the event to the event dispatch module 2035.
See Figure 10. When judging whether an event is an edge input event, the first judgment module 2033 obtains the abscissa of the touch point and compares the abscissa (i.e. the X-axis coordinate) x with the C-zone width Wc and the touch-screen width W. Specifically, if Wc < x < (W - Wc), the touch point lies in the A zone and the event is a normal input event; otherwise the event is an edge input event. If the event is not an edge input event (i.e. it is a normal input event), it is reported to the event dispatch module 2035. Likewise, when judging whether an event is an edge input event, the second judgment module 2034 judges in the manner shown in Fig. 4; if the judgment result is that the event is an edge input event, the event is reported to the event dispatch module 2035.
It should be understood that the judgment flow shown in Figure 10 is based on the touch screen of the mobile terminal shown in Fig. 2, i.e. a mobile terminal comprising C zones 101 at the left and right edges and an A zone 100 in the middle. Therefore, with coordinates set along the coordinate system shown in Fig. 3, if Wc < x < (W - Wc) the touch point can be determined to lie in the A zone. In other embodiments, the judgment formula (Wc < x < (W - Wc)) can be adjusted according to how the mobile terminal's areas are divided. For example, if the mobile terminal comprises only one C zone 101 at the left edge with width Wc, then when Wc < x < W the touch point lies in the A zone, and otherwise in the C zone. If the mobile terminal comprises only one C zone 101 at the right edge with width Wc, then when x < (W - Wc) the touch point lies in the A zone, and otherwise in the C zone.
It should be understood that when the mobile terminal rotates, the motion sensor detects the rotation and passes the rotation information to the processor. In the embodiment of the present invention, the processor judges the region of an input event in combination with the detection result of the motion sensor. Specifically, if the rotation angle is 90 degrees clockwise, i.e. the state shown in Fig. 4, the judgment basis of the first and second judgment modules becomes: if Wc < y < (H - Wc), the touch point lies in the A zone, otherwise in the C zone, where y is the Y-axis coordinate of the touch point.
If the rotation angle is 180 degrees clockwise, i.e. the state shown in Fig. 5, the judgment basis of the first and second judgment modules is: if Wc < x < (W - Wc), the touch point lies in the A zone, otherwise in the C zone.
If the rotation angle is 270 degrees clockwise, i.e. the state shown in Fig. 6, the judgment basis of the first and second judgment modules becomes: if Wc < y < (H - Wc), the touch point lies in the A zone, otherwise in the C zone, where y is the Y-axis coordinate of the touch point.
It should be understood that if a C zone is divided off on only one side or in only one partial region of the touch screen, the region judgment is adjusted accordingly. The overall idea is: whether or not the touch screen has rotated, determine the length and width of the C zone and hence its coordinate range, then judge by elimination against that coordinate range to determine the region in which the input event occurred.
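The region test for the two-sided C-zone layout, including the rotated states, can be summarized as follows. This is a hedged sketch under the assumptions above (C zones of width Wc along both long edges); the function name is not from the source.

```python
def classify_region(x, y, angle, W, H, Wc):
    """Return 'A' for the normal zone or 'C' for the edge zone, given the
    raw panel coordinate (x, y), the clockwise rotation angle, the panel
    width W and height H, and the C-zone width Wc."""
    if angle in (0, 180):
        # States of Figs. 3 and 5: test the X axis against the panel width.
        return 'A' if Wc < x < W - Wc else 'C'
    if angle in (90, 270):
        # States of Figs. 4 and 6: test the Y axis against the panel height.
        return 'A' if Wc < y < H - Wc else 'C'
    raise ValueError("unsupported rotation angle")
```

A one-sided C zone would replace the two-sided interval with the corresponding one-sided condition (e.g. Wc < x < W for a left-edge-only zone), as described above.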
The event dispatch module 2035 reports edge input events and/or A-zone input events to the third judgment module 2036. In one embodiment, edge input events and A-zone input events are reported through different channels; edge input events are reported through a dedicated channel.
In addition, the event dispatch module 2035 also obtains the current state of the mobile terminal and converts and adjusts the reported coordinates according to the current state before reporting them.
In the embodiment of the present invention, the current state of the mobile terminal is obtained from the detection result of the motion sensor. The current state includes rotation angles of 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, and so on. It should be understood that, for counterclockwise rotation, 90 degrees counterclockwise is identical to 270 degrees clockwise, 180 degrees counterclockwise to 180 degrees clockwise, and 270 degrees counterclockwise to 90 degrees clockwise.
For the specific implementation of converting and adjusting coordinates, see the description of step S103 above; it is not repeated here.
In one embodiment, the event dispatch module 2035 is implemented by InputDispatcher::dispatchMotion().
The third judgment module 2036 judges from the device identifier (ID) whether an event is an edge input event; if it is, the event is reported to the second application module 2038, otherwise to the first application module 2037.
Specifically, see Figure 11. When judging, the third judgment module 2036 first obtains the device identifier and determines from it whether the device is a touch-screen class device; if so, it further judges whether the device identifier is the identifier of the second input device object corresponding to the C zone — if so, the event is judged to be an edge input event, and if not, a normal input event. It should be understood that, alternatively, after the device is judged to be a touch-screen class device, it may be further judged whether the device identifier is the identifier of the first input device object corresponding to the A zone, in which case the event is judged to be a normal input event if so, and an edge input event if not.
In an embodiment of the present invention, the first application module 2037 processes input events related to A-zone input; specifically, this processing comprises processing and recognition according to the touch-point coordinates, duration, number and so on of the input operation, with the recognition result reported to the application layer. The second application module 2038 processes input events related to C-zone input; specifically, this processing likewise comprises processing and recognition according to the touch-point coordinates, duration and number, with the recognition result reported to the application layer. For example, an input operation can be identified from the coordinates, duration and number of the touch point as a click or slide in the A zone, or a back-and-forth slide along one edge in the C zone, and so on.
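The two-step routing decision of Figure 11 can be sketched as below. This is an illustrative sketch only; the constant values, names, and return strings are assumptions, not from the source, and the real decision is made inside the framework's dispatch path.

```python
TOUCHSCREEN_ID = 1   # assumed identifier of the first (real) input device object
EDGE_DEVICE_ID = 2   # assumed identifier of the second (virtual edge) device object

def route_event(device_class, device_id):
    """Decide which application module handles an event: first check the
    device class, then compare the identifier with that of the virtual
    edge-input device object."""
    if device_class != "touchscreen":
        return "ignore"                # not a touch-screen class device
    if device_id == EDGE_DEVICE_ID:
        return "edge_module"           # C-zone event -> second application module
    return "normal_module"             # A-zone event -> first application module
```

The alternative check described above (testing for the first identifier instead of the second) would simply invert the comparison.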
The application layer 204 comprises applications such as camera, gallery and screen lock (application 1, application 2, ...). The input operations of the embodiment of the present invention include application-level and system-level operations, with system-level gesture processing also classified under the application layer. Application-level operations manipulate application programs, for example opening, closing, volume control, and so on. System-level operations manipulate the mobile terminal itself, for example power-on, acceleration, switching between applications, and global back. The application layer can obtain and process C-zone input events by registering a listener for C-zone events, and can likewise obtain and process A-zone input events by registering a listener for A-zone events.
In one embodiment, the mobile terminal sets and stores the instructions corresponding to different input operations, including instructions corresponding to edge input operations and instructions corresponding to normal input operations. When the application layer receives the recognition result of a reported edge input event, it calls the corresponding instruction according to the edge input operation to respond to it; when it receives the recognition result of a reported normal input event, it calls the corresponding instruction according to the normal input operation to respond to it.
It should be understood that the input events of the embodiment of the present invention include input operations occurring only in the A zone, input operations occurring only in the C zone, and input operations occurring in the A zone and the C zone simultaneously; the instructions accordingly include instructions corresponding to all three classes of input event. The embodiment of the present invention can therefore control the mobile terminal through combined A-zone and C-zone input operations. For example, if the input operation is clicking corresponding positions in the A zone and the C zone simultaneously and the corresponding instruction is closing a certain application, the application can be closed by clicking the corresponding A-zone and C-zone positions at the same time.
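The stored correspondence between input operations and instructions can be pictured as a simple lookup table. This is a minimal sketch under assumed names — the gesture keys and instruction strings are illustrative, not taken from the source, which only specifies that the correspondences are stored on the terminal.

```python
# Assumed instruction table: keys are (zone, recognized operation) pairs
# covering A-only, C-only, and combined A+C input operations.
COMMANDS = {
    ("A", "click"): "select",
    ("C", "left_edge_slide_up"): "open_camera",
    ("A+C", "simultaneous_click"): "close_application",
}

def dispatch(zone, gesture):
    """Return the stored instruction bound to a recognized operation,
    or a no-op when no instruction is registered for it."""
    return COMMANDS.get((zone, gesture), "no_op")
```

With such a table, the combined-operation example above — simultaneously clicking corresponding A-zone and C-zone positions — resolves to the close-application instruction.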
The mobile terminal of the embodiment of the present invention converts the edge touch region in accordance with the rotation of the touch screen, thereby better adapting to the user's manner of operation and improving the user experience. Furthermore, because the operation of distinguishing the A zone from the C zone, and the creation of the virtual device, are carried out only at the application framework layer, dependence on hardware for distinguishing the A zone from the C zone at the driver layer is avoided. By assigning touch-point numbers, fingers can be distinguished and both protocol A and protocol B are supported. Moreover, because the functions of the input reader 2030, the first event processing module 2031, the second event processing module 2032, the first judgment module 2033, the second judgment module 2034, the event dispatch module 2035, the third judgment module 2036, the first application module 2037, the second application module 2038 and so on can be integrated into the operating system of the mobile terminal, the solution suits different hardware and different types of mobile terminal and is highly portable. All elements of a touch point (coordinates, number, and so on) are saved automatically by the input reader (InputReader), which facilitates subsequent edge-input judgment (for example, FIT).
See Figure 12, a flowchart of the input processing method of the embodiment of the present invention, comprising the following steps:
S1: the driver layer obtains the input event produced by the user through the input device and reports it to the application framework layer.
Specifically, the input device receives the user's input operation (i.e. the input event), converts the physical input into an electrical signal, and transfers the electrical signal to the driver layer. In the embodiment of the present invention, input events comprise A-zone input events and C-zone input events. A-zone input events include input operations performed in the A zone such as clicking, double-clicking and sliding. C-zone input events include input operations performed in the C zone such as left-edge slide up, left-edge slide down, right-edge slide up, right-edge slide down, both-edge slide up, both-edge slide down, sliding back and forth along one edge, holding, and one-handed holding.
The driver layer parses the input position according to the received electrical signal to obtain related parameters of the touch point, such as its coordinates and duration, and reports these related parameters to the application framework layer.
In addition, if the driver layer reports input events using protocol A, step S1 also comprises:
assigning each touch point a number (ID) for distinguishing fingers.
Thus, if the driver layer reports input events using protocol A, the reported data comprise the above related parameters together with the touch-point number.
S2: the application framework layer judges whether the input event is an edge input event or a normal input event; if it is a normal input event, step S3 is performed, and if it is an edge input event, step S4 is performed.
Specifically, the application framework layer can judge from the coordinates in the related parameters of the input event whether it is an edge input event or a normal input event. See Figure 10 above: first the abscissa of the touch point is obtained, then the abscissa (i.e. the X-axis coordinate) x is compared with the C-zone width Wc and the touch-screen width W. If Wc < x < (W - Wc), the touch point lies in the A zone and the event is a normal input event; otherwise it is an edge input event. If the driver layer reports input events using protocol B, step S2 also specifically comprises: assigning each touch point the number (ID) for distinguishing fingers, and storing all element information of the touch point (coordinates, duration, number, and so on).
It should be understood that when the touch screen rotates, the corresponding judgment is made as described above and is not repeated here.
Thus, by assigning touch-point numbers, the embodiment of the present invention can distinguish fingers and support both protocol A and protocol B; and by storing all elements of the touch point (coordinates, number, and so on), subsequent edge-input judgment (for example, FIT) is facilitated.
In one embodiment, edge input events and normal input events are reported through different channels; edge input events use a dedicated channel.
S3: the application framework layer processes and recognizes the normal input event and reports the recognition result to the application layer.
S4: the application framework layer processes and recognizes the edge input event and reports the recognition result to the application layer.
Specifically, processing and recognition comprise processing according to the touch-point coordinates, duration, number and so on of the input operation in order to determine the input operation. For example, an input operation can be identified from the coordinates, duration and number of the touch point as a click or slide in the A zone, or a back-and-forth slide along one edge in the C zone, and so on.
S5: the application layer executes the corresponding instruction according to the reported recognition result.
Specifically, the application layer comprises applications such as camera, gallery and screen lock. The input operations of the embodiment of the present invention include application-level and system-level operations, with system-level gesture processing also classified under the application layer. Application-level operations manipulate application programs, for example opening, closing, volume control, and so on. System-level operations manipulate the mobile terminal itself, for example power-on, acceleration, switching between applications, and global back.
In one embodiment, the mobile terminal sets and stores the instructions corresponding to different input operations, including instructions corresponding to edge input operations and instructions corresponding to normal input operations. When the application layer receives the recognition result of a reported edge input event, it calls the corresponding instruction according to the edge input operation to respond to it; when it receives the recognition result of a reported normal input event, it calls the corresponding instruction according to the normal input operation to respond to it.
It should be understood that the input events of the embodiment of the present invention include input operations occurring only in the A zone, input operations occurring only in the C zone, and input operations occurring in the A zone and the C zone simultaneously; the instructions accordingly include instructions corresponding to all three classes of input event. The embodiment of the present invention can therefore control the mobile terminal through combined A-zone and C-zone input operations. For example, if the input operation is clicking corresponding positions in the A zone and the C zone simultaneously and the corresponding instruction is closing a certain application, the application can be closed by clicking the corresponding A-zone and C-zone positions at the same time.
In one embodiment, the input processing method of the embodiment of the present invention further comprises:
S11: creating an input device object with a device identifier for each kind of input event.
Specifically, in one embodiment, a first input device object with a first identifier can be created for normal input events; the first input device object corresponds to the input device touch screen. The application framework layer sets a second input device object. This second input device object (for example, the FIT device) is a virtual, empty device with a second identifier, corresponding to edge input events. It should be understood that, alternatively, edge input events may correspond to the first input device object with the first identifier and normal input events to the second input device object with the second identifier.
In one embodiment, the input processing method of the embodiment of the present invention further comprises:
S21: the application framework layer converts and adjusts the reported coordinates according to the current state of the mobile terminal before reporting them.
Specifically, the current state of the mobile terminal includes rotation angles of 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, and so on.
It should be understood that, in an embodiment of the present invention, for counterclockwise rotation, 90 degrees counterclockwise is identical to 270 degrees clockwise, 180 degrees counterclockwise to 180 degrees clockwise, and 270 degrees counterclockwise to 90 degrees clockwise.
For the specific implementation of converting and adjusting coordinates, see the descriptions of step S103 and the application framework layer above; it is not repeated here.
In one embodiment, step S21 can be implemented by InputDispatcher::dispatchMotion().
S22: judging from the device identifier whether the input event is an edge input event; if it is, step S4 is performed, and if it is not, step S3 is performed.
Specifically, see Figure 11 above. When judging from the device identifier whether the input event is an edge input event, the device identifier is first obtained and it is determined from the device identifier whether the device is a touch-screen class device; if so, it is further judged whether the device identifier is the identifier of the second input device object corresponding to the C zone — if so, the event is judged to be an edge input event, and if not, a normal input event. It should be understood that, alternatively, after the device is judged to be a touch-screen class device, it may be further judged whether the device identifier is the identifier of the first input device object corresponding to the A zone, in which case the event is judged to be a normal input event if so, and an edge input event if not.
The input processing method of the embodiment of the present invention converts the edge touch region in accordance with the rotation of the touch screen, thereby better adapting to the user's manner of operation and improving the user experience. Furthermore, because the operation of distinguishing the A zone from the C zone, and the creation of the virtual device, are carried out only at the application framework layer, dependence on hardware for distinguishing the A zone from the C zone at the driver layer is avoided. By assigning touch-point numbers, fingers can be distinguished and both protocol A and protocol B are supported. The method can be integrated into the operating system of the mobile terminal, suits different hardware and different types of mobile terminal, and is highly portable. All elements of the touch point (coordinates, number, and so on) are stored, which facilitates subsequent edge-input judgment (for example, FIT).
See Figure 13, it is the effect schematic diagram utilizing the camera applications of the input processing method of the embodiment of the present invention to mobile terminal to open.Wherein, the figure on Figure 13 left side is the main interface schematic diagram of mobile terminal, and wherein, region 1010 is the touch point of the input operation of the realized unlatching camera function pre-set at edge input area (C region 101).Concrete, clicking on region 1010 can realize opening camera.Then in the terminal, storing instruction is: open camera, it is corresponding with the input operation of clicking on region 1010.
When the camera is needed, the user clicks region 1010 of the touch screen; the driver layer obtains this input event and reports it to the application framework layer. According to the coordinates of the touch point, the application framework layer judges that this input event is an edge input event. The application framework layer processes and recognizes this edge input event and, according to the touch point coordinates, duration and number, recognizes the input operation as a click on region 1010. The application framework layer reports the recognition result to the application layer, and the application layer executes the instruction to open the camera.
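The stored correspondence between an edge input operation and its instruction can be pictured as a lookup table, as in the following sketch (the key and instruction strings are hypothetical names, not values from the patent):

```python
# Illustrative mapping from a recognized edge operation to its instruction.
EDGE_GESTURE_TABLE = {
    ("region_1010", "click"): "open_camera",
}

def dispatch(region, gesture):
    """Return the instruction stored for a recognized edge operation, or None."""
    return EDGE_GESTURE_TABLE.get((region, gesture))
```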
Referring to Figure 14, after the mobile terminal rotates 90 degrees clockwise, the C region 101 and the touch region that opens the camera function change correspondingly. The flow of opening the camera after clicking region 1010 in Figure 14 is similar to that of Figure 13 above.
It should be understood that in Figure 13 and Figure 14 the C region is not shown after the camera function is opened, but it still exists; alternatively, according to the above description of the C region division in the embodiment of the present invention, the width of the C region may be set relatively wider after the camera is opened, and so on, as can be readily appreciated by those skilled in the art.
Referring to Figure 15, it is a schematic diagram of the touch screen division of a mobile terminal according to a second embodiment of the present invention. In this embodiment, in order to prevent a decline in accuracy caused by the user deviating, during input, from the region where the input began, a transition zone 103 (T region) is added at the edge of the touch panel of the mobile terminal.
In this embodiment, if an input event starts from the C region and drifts into the T region, the slide is still considered an edge gesture; if it starts from the C region and drifts into the A region, the edge gesture is considered ended and a normal input event begins; if it starts from the T region or the A region, the slide is considered a normal input event no matter which region of the touch panel it subsequently enters.
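The three cases can be summarized as a small classifier over the track of a touch. The sketch below is illustrative only; the zone labels follow the A/C/T naming above, while the function name and return values are assumptions:

```python
def classify_track(start_zone, later_zones):
    """Classify a slide per the three rules above.

    start_zone  -- 'A', 'C' or 'T': zone where the input event begins
    later_zones -- zones the touch point passes through afterwards
    """
    if start_zone in ("T", "A"):
        # Rule 3: starting from T or A is always a normal input event.
        return "normal"
    # start_zone == 'C'
    if "A" in later_zones:
        # Rule 2: entering A ends the edge gesture; a normal event begins.
        return "edge_then_normal"
    # Rule 1: staying in C, or drifting only into T, remains an edge gesture.
    return "edge"
```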
The reporting flow of input events in this embodiment is the same as in the interaction control method described in the above embodiment; the only difference is that when the application framework layer processes and recognizes an edge input event, it needs to judge according to the above three cases in order to determine the input event accurately. For example, if the application framework layer judges from the touch points reported for an input event that the event started in the C region and drifted into the A region (i.e., the touch point coordinates at the start of input lie in the C region, and the coordinates of some touch point during input lie in the A region), then the first judging module and the second judging module judge, according to the coordinates, that the input event is an edge input event, that this edge input event has ended, and that a normal input event begins; the driver layer then starts the reporting of the next input event.
The mobile terminal of the embodiment of the present invention can be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable media players) and navigation devices, as well as fixed terminals such as digital TVs and desktop computers.
Accordingly, the embodiment of the present invention also provides a user equipment; Figure 16 is a schematic diagram of its hardware configuration. Referring to Figure 16, the user equipment 1000 comprises a touch screen 2010, a controller 200, a storage device 310, a GPS chip 320, a communicator 330, a video processor 340, an audio processor 350, buttons 360, a microphone 370, a camera 380, a speaker 390 and a motion sensor 906.
The touch screen 2010 may be divided into an A region and a C region as described above, or into an A region, a C region and a T region. The touch screen 2010 may be implemented as various types of display, such as an LCD (liquid crystal display), an OLED (organic light-emitting diode) display or a PDP (plasma display panel). The touch screen 2010 may include a driving circuit, which may be implemented as, for example, an a-Si TFT, an LTPS (low-temperature polysilicon) TFT or an OTFT (organic TFT), and a backlight unit.
Meanwhile, the touch screen 2010 may include a touch sensor for sensing the user's touch gestures. The touch sensor may be implemented as various types of sensor, such as capacitive, resistive or piezoelectric. A capacitive sensor calculates the touch coordinate values by sensing the micro-current excited through the user's body when a part of the user's body (e.g., the user's finger) touches the surface of the touch screen, which is coated with a conductive material. In a resistive touch screen, two electrode plates are included, and when the user touches the screen, the touch coordinate values are calculated by sensing the current that flows when the upper plate at the touch point contacts the lower plate. In addition, when the user equipment 1000 supports pen input, the touch screen 2010 can sense user gestures made with an input device such as a pen, in addition to the user's finger. When the input device is a stylus pen containing a coil, the user equipment 1000 may include a magnetic sensor (not shown) for sensing a magnetic field that changes according to the proximity of the stylus pen's coil to the magnetic sensor. Thus, in addition to sensing touch gestures, the user equipment 1000 can also sense a proximity gesture, i.e., the stylus pen hovering above the user equipment 1000.
The storage device 310 may store the various programs and data needed for the operation of the user equipment 1000. For example, the storage device 310 may store programs and data for forming the various screens to be displayed in each region (e.g., the A region and the C region).
The controller 200 displays content in each region of the touch screen 2010 by using the programs and data stored in the storage device 310.
The controller 200 comprises a RAM 210, a ROM 220, a CPU 230, a GPU (graphics processing unit) 240 and a bus 250. The RAM 210, ROM 220, CPU 230 and GPU 240 may be connected to each other through the bus 250.
The CPU (processor) 230 accesses the storage device 310 and performs startup using the operating system (OS) stored in the storage device 310. The CPU 230 also performs various operations by using the various programs, content and data stored in the storage device 310.
The ROM 220 stores a command set for system startup. When a turn-on command is input and power is supplied, the CPU 230 copies the OS stored in the storage device 310 to the RAM 210 according to the command set stored in the ROM 220, and starts the system by running the OS. When startup is complete, the CPU 230 copies the various programs stored in the storage device 310 to the RAM 210 and performs various operations by running the copied programs in the RAM 210. Specifically, the GPU 240 can generate a screen including various objects such as icons, images and text by using a calculator (not shown) and a renderer (not shown). The calculator calculates characteristic values such as the coordinate values, shape, size and color with which each object is to be marked according to the layout of the screen.
The GPS chip 320 is a unit that receives GPS signals from GPS (global positioning system) satellites and calculates the current position of the user equipment 1000. When a navigation program is used or the user's current position is requested, the controller 200 can calculate the user's position by using the GPS chip 320.
The communicator 330 is a unit that performs communication with various types of external devices according to various types of communication methods. The communicator 330 comprises a WiFi chip 331, a Bluetooth chip 332, a wireless communication chip 333 and an NFC chip 334. The controller 200 performs communication with various external devices by using the communicator 330.
The WiFi chip 331 and the Bluetooth chip 332 perform communication according to the WiFi method and the Bluetooth method, respectively. When the WiFi chip 331 or the Bluetooth chip 332 is used, various connection information such as a service set identifier (SSID) and a session key may first be transmitted and received, a communication connection may be established using the connection information, and various information may then be transmitted and received. The wireless communication chip 333 is a chip that performs communication according to various communication standards such as IEEE, ZigBee, 3G (third generation), 3GPP (Third Generation Partnership Project) and LTE (Long Term Evolution). The NFC chip 334 is a chip that operates according to the NFC (near field communication) method using the 13.56 MHz band from among various RFID frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860~960 MHz and 2.45 GHz.
The video processor 340 is a unit that processes the video data included in content received through the communicator 330 or content stored in the storage device 310. The video processor 340 can perform various kinds of image processing on the video data, such as decoding, scaling, noise filtering, frame rate conversion and resolution conversion.
The audio processor 350 is a unit that processes the audio data included in content received through the communicator 330 or content stored in the storage device 310. The audio processor 350 can perform various kinds of processing on the audio data, such as decoding, amplification and noise filtering.
When a playback program for multimedia content is run, the controller 200 can reproduce the corresponding content by driving the video processor 340 and the audio processor 350.
The speaker 390 outputs the audio data generated in the audio processor 350.
The buttons 360 may be various types of buttons, such as mechanical buttons, or a touch pad or touch wheel formed on some region of the front, side or back of the exterior of the main body of the user equipment 1000.
The microphone 370 is a unit that receives the user's voice or other sounds and transforms them into audio data. The controller 200 can use the user's voice input through the microphone 370 during a call, or transform it into audio data and store it in the storage device 310.
The camera 380 is a unit that captures still images or video images under the user's control. The camera 380 can be implemented as multiple units, such as a front camera and a back camera. As described below, the camera 380 can be used as a device for obtaining images of the user in an exemplary embodiment that tracks the user's gaze.
When the camera 380 and the microphone 370 are provided, the controller 200 can perform control functions according to the user's voice input through the microphone 370 or the user's actions recognized by the camera 380. Accordingly, the user equipment 1000 can operate in a motion control mode or a voice control mode. When operating in the motion control mode, the controller 200 photographs the user by activating the camera 380, tracks changes in the user's actions, and performs the corresponding operations. When operating in the voice control mode, the controller 200 can operate in a speech recognition mode, analyzing the voice input through the microphone 370 and performing control functions according to the analyzed user voice.
In a user equipment 1000 supporting the motion control mode or the voice control mode, speech recognition technology or motion recognition technology is used in the various exemplary embodiments described above. For example, when the user performs an action such as selecting an object marked on the home screen, or speaks a voice command corresponding to an object, it can be determined that the corresponding object has been selected, and the control operation matching that object can be performed.
The motion sensor 906 is a unit that senses movement of the main body of the user equipment 1000. The user equipment 1000 can rotate or tilt in various directions. The motion sensor 906 can sense movement characteristics such as rotation direction, angle and slope by using one or more of various sensors such as a geomagnetic sensor, a gyro sensor and an acceleration sensor. It should be understood that, correspondingly, the touch screen also rotates when the user equipment rotates, and its rotation angle is the same as that of the user equipment.
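Since the framework only distinguishes the four orientations enumerated later (0, 90, 180 and 270 degrees), a raw angle reading from the motion sensor would typically be snapped to the nearest of these values. A possible sketch follows; the helper name and rounding policy are assumptions, not specified by the patent:

```python
def quantize_angle(raw_degrees):
    """Snap a raw clockwise rotation reading to the nearest of 0/90/180/270."""
    return int(round((raw_degrees % 360) / 90.0)) % 4 * 90
```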
Furthermore, although not shown in Figure 16, according to an exemplary embodiment the user equipment 1000 may also include a USB port connectable to a USB connector, various input ports for connecting various external components such as an earphone, a mouse and a LAN, a DMB chip that receives and processes DMB (digital multimedia broadcasting) signals, and various other sensors.
As mentioned above, the storage device 310 can store various programs.
Based on the user equipment shown in Figure 16, in an embodiment of the present invention, the touch screen is used to detect a touch signal generated on the touch panel and to identify the touch point according to the touch signal.
The motion sensor is used to detect the rotation angle of the user equipment.
The processor comprises: a driver module, an application framework module and an application module;
wherein the driver module is used to obtain an input event according to the touch signal and report it to the application framework module;
the application framework module is used to judge, according to the touch point position of the reported input event and the rotation angle, whether the touch point lies in the edge touch area or the normal touch area; if it lies in the edge touch area, to process and recognize the event and report the recognition result to the application module; and if it lies in the normal touch area, to process and recognize the event and report the recognition result to the application module;
and the application module is used to execute the corresponding instruction according to the reported recognition result.
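The rotation-aware judgment performed by the application framework module can be sketched as follows, consistent with the coordinate rules given later in the claims (W, H and Wc denote the panel width, panel height and edge-area width; the function name is an assumption made for illustration):

```python
def region_for(x, y, angle, W, H, Wc):
    """Edge/normal decision for a touch point under a clockwise rotation angle.

    At 0/180 degrees the edge band is tested along the x axis;
    at 90/270 degrees it is tested along the y axis.
    """
    if angle in (0, 180):
        inside = Wc < x < (W - Wc)
    elif angle in (90, 270):
        inside = Wc < y < (H - Wc)
    else:
        raise ValueError("angle must be one of 0, 90, 180, 270")
    return "normal" if inside else "edge"
```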
It should be understood that the working principle and details of each module of the user equipment of this embodiment are the same as described in the above embodiments, and are not repeated here.
The touch control method, user equipment, input processing method and mobile terminal of the embodiments of the present invention can transform the edge touch area correspondingly according to the rotation of the touch screen, so as to better adapt to the user's operation and improve the user experience. On the other hand, since the operation of distinguishing the A region from the C region is performed only at the application framework layer, and the virtual device is created at the application framework layer, the dependence on hardware that distinguishing the A region from the C region at the driver layer would entail is avoided. By assigning touch point numbers, fingers can be distinguished, and both the A protocol and the B protocol are supported. Moreover, the method can be integrated into the operating system of the mobile terminal, is applicable to different hardware and different types of mobile terminal, and is highly portable. All elements of each touch point (coordinates, number, etc.) are stored, which facilitates subsequent edge input judgments (e.g., FIT).
Any process or method description in a flowchart, or otherwise described in an embodiment of the present invention, can be understood to represent a module, fragment or portion comprising the code of one or more executable instructions for implementing the steps of a specific logical function or process; and the scope of the embodiments of the present invention includes other implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, according to the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention pertain.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above embodiments, which are merely illustrative rather than restrictive. Under the inspiration of the present invention, those of ordinary skill in the art can derive many further forms without departing from the concept of the present invention and the scope protected by the claims, all of which fall within the protection of the present invention.

Claims (24)

1. A touch control method, characterized in that it comprises:
detecting a touch signal generated on a touch panel;
identifying a touch point according to the touch signal;
detecting a rotation angle of the touch panel;
judging, according to the identified touch point and the rotation angle, whether the touch point lies in an edge touch area or a normal touch area;
and executing a corresponding instruction based on the judgment result.
2. The touch control method according to claim 1, characterized in that the rotation angle comprises: 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise and 270 degrees counterclockwise.
3. The touch control method according to claim 2, characterized in that judging, according to the identified touch point and the rotation angle, whether the touch point lies in the edge touch area or the normal touch area comprises:
if the rotation angle is 0 degrees, then when Wc < x < (W - Wc) the touch point lies in the normal touch area, otherwise it lies in the edge touch area;
if the rotation angle is 90 degrees clockwise, then when Wc < y < (H - Wc) the touch point lies in the normal touch area, otherwise it lies in the edge touch area;
if the rotation angle is 180 degrees clockwise, then when Wc < x < (W - Wc) the touch point lies in the normal touch area, otherwise it lies in the edge touch area;
if the rotation angle is 270 degrees clockwise, then when Wc < y < (H - Wc) the touch point lies in the normal touch area, otherwise it lies in the edge touch area;
wherein x is the horizontal axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch area.
4. A user equipment, characterized in that it comprises: a touch screen, a motion sensor and a processor;
the touch screen comprises a touch panel and a touch controller, wherein:
the touch panel is used to detect a touch signal generated on the touch panel;
the touch controller is used to identify a touch point according to the touch signal;
the motion sensor is used to detect a rotation angle of the user equipment;
the processor comprises: a driver module, an application framework module and an application module, wherein:
the driver module is used to obtain an input event according to the touch signal and report it to the application framework module;
the application framework module is used to judge, according to the rotation angle and the touch point position of the reported input event, whether the touch point lies in an edge touch area or a normal touch area;
and the application module is used to execute a corresponding instruction based on the judgment result.
5. An input processing method, characterized in that it comprises:
a driver layer obtaining an input event generated by a user through an input device, and reporting it to an application framework layer;
the application framework layer judging, according to the current state of the mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, processing and recognizing the normal input event and reporting the recognition result to an application layer; and if it is an edge input event, processing and recognizing the edge input event and reporting the recognition result to the application layer;
and the application layer executing a corresponding instruction according to the reported recognition result.
6. The input processing method according to claim 5, characterized in that the method further comprises:
creating, for each input event, an input device object having a device identifier.
7. The input processing method according to claim 6, characterized in that creating, for each input event, an input device object having a device identifier comprises:
making the normal input event correspond to the touch screen having a first device identifier;
and the application framework layer setting a second input device object having a second device identifier to correspond to the edge input event.
8. The input processing method according to claim 5, characterized in that the driver layer obtaining the input event generated by the user through the input device and reporting it to the application framework layer comprises:
the driver layer assigning each touch point a number for distinguishing fingers, and reporting the input event using the A protocol.
9. The input processing method according to claim 5, characterized in that the driver layer obtaining the input event generated by the user through the input device and reporting it to the application framework layer comprises:
the driver layer reporting the input event using the B protocol;
the method further comprising:
the application framework layer assigning each touch point in the input event a number for distinguishing fingers.
10. The input processing method according to any one of claims 5-9, characterized in that the current state of the mobile terminal comprises: rotated 0 degrees, rotated 90 degrees clockwise, rotated 180 degrees clockwise, rotated 270 degrees clockwise, rotated 90 degrees counterclockwise, rotated 180 degrees counterclockwise and rotated 270 degrees counterclockwise.
11. The input processing method according to claim 10, characterized in that if the rotation angle is 0 degrees, then when Wc < x < (W - Wc) the application framework layer judges the input event to be a normal input event, and otherwise an edge input event;
if the rotation angle is 90 degrees clockwise, then when Wc < y < (H - Wc) the application framework layer judges the input event to be a normal input event, and otherwise an edge input event;
if the rotation angle is 180 degrees clockwise, then when Wc < x < (W - Wc) the application framework layer judges the input event to be a normal input event, and otherwise an edge input event;
if the rotation angle is 270 degrees clockwise, then when Wc < y < (H - Wc) the application framework layer judges the input event to be a normal input event, and otherwise an edge input event;
wherein x is the horizontal axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch area.
12. A mobile terminal, characterized in that it comprises:
an input device;
a motion sensor, for detecting the current state of the mobile terminal;
a driver layer, for obtaining an input event generated by a user through the input device and reporting it to an application framework layer;
the application framework layer, for judging, according to the current state of the mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, processing and recognizing the normal input event and reporting the recognition result to an application layer; and if it is an edge input event, processing and recognizing the edge input event and reporting the recognition result to the application layer;
and the application layer, for executing a corresponding instruction according to the reported recognition result.
13. The mobile terminal according to claim 12, characterized in that the normal input event corresponds to a first input device object having a first device identifier;
and the application framework layer is also used to set a second input device object having a second device identifier to correspond to the edge input event.
14. The mobile terminal according to claim 12, characterized in that the driver layer reports input events using the A protocol or the B protocol; when input events are reported according to the A protocol, the event acquisition module is also used to assign each touch point a number for distinguishing fingers;
and when input events are reported according to the B protocol, the application framework layer is also used to assign each touch point a number for distinguishing fingers.
15. The mobile terminal according to claim 12, characterized in that the driver layer comprises an event acquisition module for obtaining the input event generated by the user through the input device.
16. The mobile terminal according to claim 12, characterized in that the application framework layer comprises an input reader;
the mobile terminal further comprises a device node arranged between the driver layer and the input reader, for notifying the input reader to obtain input events;
and the input reader is used to traverse the device node, obtain input events and report them.
17. The mobile terminal according to claim 12, characterized in that the current state of the mobile terminal comprises: rotated 0 degrees, rotated 90 degrees clockwise, rotated 180 degrees clockwise, rotated 270 degrees clockwise, rotated 90 degrees counterclockwise, rotated 180 degrees counterclockwise and rotated 270 degrees counterclockwise.
18. The mobile terminal according to claim 17, characterized in that the application framework layer further comprises: a first event processing module, for performing coordinate calculation on the input event reported by the input reader and then reporting it;
and a first judging module, for judging, according to the current state of the mobile terminal and the coordinate values reported by the first event processing module, whether the input event is an edge input event, and if not, reporting the input event.
19. The mobile terminal according to claim 18, characterized in that the application framework layer further comprises:
a second event processing module, for performing coordinate calculation on the input event reported by the input reader and then reporting it;
and a second judging module, for judging, according to the current state of the mobile terminal and the coordinate values reported by the second event processing module, whether the input event is an edge input event, and if so, reporting the input event.
20. The mobile terminal according to claim 18 or 19, characterized in that if the rotation angle is 0 degrees, then when Wc < x < (W - Wc) the input event is judged to be a normal input event, and otherwise an edge input event;
if the rotation angle is 90 degrees clockwise, then when Wc < y < (H - Wc) the input event is judged to be a normal input event, and otherwise an edge input event;
if the rotation angle is 180 degrees clockwise, then when Wc < x < (W - Wc) the input event is judged to be a normal input event, and otherwise an edge input event;
if the rotation angle is 270 degrees clockwise, then when Wc < y < (H - Wc) the input event is judged to be a normal input event, and otherwise an edge input event;
wherein x is the horizontal axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch area.
21. The mobile terminal according to claim 20, characterized in that the application framework layer further comprises:
an event dispatch module, for reporting the events reported by the second judging module and the first judging module.
22. The mobile terminal according to claim 21, characterized in that the application framework layer further comprises:
a first application module;
a second application module;
a third judging module, for judging, according to the device identifier contained in the event reported by the event dispatch module, whether the event is an edge input event; if so, reporting it to the second application module, and otherwise to the first application module;
the first application module, for recognizing the normal input event according to the relevant parameters of the normal input event and reporting the recognition result to the application layer;
and the second application module, for recognizing the edge input event according to the relevant parameters of the edge input event and reporting the recognition result to the application layer.
23. The mobile terminal according to claim 12, characterized in that the input device is the touch screen of the mobile terminal;
and the touch screen comprises at least one edge input area and at least one normal input area.
24. The mobile terminal according to claim 12, characterized in that the input device is the touch screen of the mobile terminal;
and the touch screen comprises at least one edge input area, at least one normal input area and at least one transition area.
CN201510819757.8A 2015-11-20 2015-11-20 Touch control method, user equipment, input processing method and mobile terminal Active CN105335007B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201510819757.8A CN105335007B (en) 2015-11-20 2015-11-20 Touch control method, user equipment, input processing method and mobile terminal
PCT/CN2016/102777 WO2017084469A1 (en) 2015-11-20 2016-10-20 Touch control method, user equipment, input processing method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510819757.8A CN105335007B (en) 2015-11-20 2015-11-20 Touch control method, user equipment, input processing method and mobile terminal

Publications (2)

Publication Number Publication Date
CN105335007A true CN105335007A (en) 2016-02-17
CN105335007B CN105335007B (en) 2019-10-08

Family

ID=55285599

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510819757.8A Active CN105335007B (en) 2015-11-20 2015-11-20 Method of toch control, user equipment, input processing method and mobile terminal

Country Status (2)

Country Link
CN (1) CN105335007B (en)
WO (1) WO2017084469A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017084469A1 (en) * 2015-11-20 2017-05-26 努比亚技术有限公司 Touch control method, user equipment, input processing method and mobile terminal
CN107479745A (en) * 2017-07-31 2017-12-15 北京雷石天地电子技术有限公司 A kind of method, module and operating system for configuring touch-screen
WO2019105188A1 (en) * 2017-11-29 2019-06-06 广州视源电子科技股份有限公司 Touch sensing signal processing method, system and device, and electronic device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101676843A (en) * 2008-09-18 2010-03-24 Lenovo (Beijing) Co., Ltd. Touch input method and touch input device
CN102236468A (en) * 2010-04-26 2011-11-09 HTC Corporation Sensing method, computer program product and portable device
CN104583903A (en) * 2013-11-26 2015-04-29 Huawei Technologies Co., Ltd. Method, system and terminal for preventing faulty touch operation
CN104735256A (en) * 2015-03-27 2015-06-24 Nubia Technology Co., Ltd. Method and device for judging holding mode of mobile terminal

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105511675B (en) * 2015-11-20 2020-07-24 Chongqing Juzi Technology Development Co., Ltd. Touch control method, user equipment, input processing method, mobile terminal and intelligent terminal
CN105335007B (en) * 2015-11-20 2019-10-08 Nubia Technology Co., Ltd. Method of touch control, user equipment, input processing method and mobile terminal


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017084469A1 (en) * 2015-11-20 2017-05-26 Nubia Technology Co., Ltd. Touch control method, user equipment, input processing method and mobile terminal
CN107479745A (en) * 2017-07-31 2017-12-15 Beijing Leishi Tiandi Electronic Technology Co., Ltd. Method, module and operating system for configuring a touch screen
CN107479745B (en) * 2017-07-31 2020-07-21 北京雷石天地电子技术有限公司 Method and module for configuring touch screen and operating system
WO2019105188A1 (en) * 2017-11-29 2019-06-06 Guangzhou Shiyuan Electronics Co., Ltd. Touch sensing signal processing method, system and device, and electronic device
US11036329B2 (en) 2017-11-29 2021-06-15 Guangzhou Shiyuan Electronics Co., Ltd Touch sensing signal processing method, system and device, and electronic device

Also Published As

Publication number Publication date
CN105335007B (en) 2019-10-08
WO2017084469A1 (en) 2017-05-26

Similar Documents

Publication Publication Date Title
CN105511675A (en) Touch control method, user equipment, input processing method, mobile terminal and intelligent terminal
US10996834B2 (en) Touchscreen apparatus user interface processing method and touchscreen apparatus
US20200050359A1 (en) Apparatus including a touch screen under a multi-application environment and controlling method thereof
CN105487705A (en) Mobile terminal, input processing method and user equipment
US10067666B2 (en) User terminal device and method for controlling the same
US20170322713A1 (en) Display apparatus and method for controlling the same and computer-readable recording medium
US11157127B2 (en) User terminal apparatus and controlling method thereof
US11853543B2 (en) Method and apparatus for controlling display of video call interface, storage medium and device
CN108733296B (en) Method, device and equipment for erasing handwriting
AU2014312481A1 (en) Display apparatus, portable device and screen display methods thereof
CN107153546B (en) Video playing method and mobile device
CN108920069A (en) A kind of touch operation method, device, mobile terminal and storage medium
CN109117241B (en) Display direction control method, system and mobile terminal
CN105573545A (en) Gesture correction method, apparatus and gesture input processing method
CN110647286A (en) Screen element control method, device, equipment and storage medium
US11455071B2 (en) Layout method, device and equipment for window control bars
CN105335007A (en) Touch control method, user equipment, input processing method and mobile terminal
KR102180404B1 (en) User terminal apparatus and control method thereof
KR102351634B1 (en) Terminal apparatus, audio system and method for controlling sound volume of external speaker thereof
CN107728898B (en) Information processing method and mobile terminal
CN102890606A (en) Information processing device, information processing method, and program
JP2014056519A (en) Portable terminal device, incorrect operation determination method, control program, and recording medium
WO2014056319A1 (en) Touch sensitive device and unlocking method thereof
CN104657042B (en) A kind of application program image target triggering implementation method and its mobile terminal
KR102492182B1 (en) User terminal apparatus and control method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant