WO2017084469A1 - Touch control method, user equipment, input processing method and mobile terminal - Google Patents

Touch control method, user equipment, input processing method and mobile terminal

Info

Publication number
WO2017084469A1
WO2017084469A1 (PCT/CN2016/102777)
Authority
WO
WIPO (PCT)
Prior art keywords
input
touch
event
input event
edge
Prior art date
Application number
PCT/CN2016/102777
Other languages
English (en)
French (fr)
Inventor
李鑫
迟建华
Original Assignee
努比亚技术有限公司
Priority date
Filing date
Publication date
Application filed by 努比亚技术有限公司
Publication of WO2017084469A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • the present invention relates to the field of communications, and more particularly to a touch control method, a user equipment, an input processing method, and a mobile terminal.
  • terminal bezels are becoming narrower and narrower.
  • edge input technology (for example, edge touch) has emerged accordingly.
  • the driver layer determines whether the touch occurs in the edge input region according to the touch point information.
  • the method by which the driver layer obtains the touch point information is also highly hardware-specific, so that when judging the event type (whether it is an edge input event), the driver must be modified and ported separately for each type of input chip; the workload is large and the process is error-prone.
  • the driver layer when the driver layer reports an event, it can select either the A protocol or the B protocol.
  • the B protocol distinguishes contacts by finger ID.
  • the implementation of edge input relies on the finger ID, which is used to match the data of two successive touches of the same finger during multi-point input. Therefore, the prior-art input scheme can only support the B protocol; drivers using the A protocol cannot be supported.
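For background, the "A protocol" and "B protocol" mentioned throughout are the Linux kernel's two multi-touch event-reporting protocols: type A reports anonymous contacts, while type B reports slot-addressed contacts with tracking IDs. The following driver-side sketch illustrates the difference using the standard Linux input API; it is an illustration of the protocols, not code from the patent:

```c
#include <linux/input.h>
#include <linux/input/mt.h>

/* Protocol A: contacts are anonymous. Each contact's coordinates are
 * terminated by input_mt_sync(), so finger identity must be
 * reconstructed by a higher layer (e.g. by assigning touch-point
 * numbers, as the scheme above does). */
static void report_contact_type_a(struct input_dev *dev, int x, int y)
{
	input_report_abs(dev, ABS_MT_POSITION_X, x);
	input_report_abs(dev, ABS_MT_POSITION_Y, y);
	input_mt_sync(dev);                 /* end of one anonymous contact */
}

/* Protocol B: each contact occupies a slot and carries a tracking ID,
 * so the same finger can be followed across successive event frames. */
static void report_contact_type_b(struct input_dev *dev, int slot,
                                  int tracking_id, int x, int y)
{
	input_mt_slot(dev, slot);
	input_report_abs(dev, ABS_MT_TRACKING_ID, tracking_id);
	input_report_abs(dev, ABS_MT_POSITION_X, x);
	input_report_abs(dev, ABS_MT_POSITION_Y, y);
}

/* In both protocols, a full frame of contacts ends with input_sync(dev). */
```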
  • in addition, the edge input area of the existing mobile terminal is fixed and cannot change correspondingly as the mobile terminal rotates, so the user experience is poor.
  • in short, the input scheme of the prior art is strongly hardware-dependent, cannot support the A protocol and the B protocol at the same time, and suffers from poor user experience; it needs to be improved.
  • the technical problem to be solved by the present invention is to provide a touch control method, a user equipment, an input processing method, and a mobile terminal, addressing the defect that the edge input method of the prior-art mobile terminal cannot change correspondingly as the mobile terminal rotates.
  • a touch control method including: recognizing a touch point from a touch signal detected on the touch panel; detecting the rotation angle of the device; determining, according to the recognized touch point and the rotation angle, whether the touch point is located in the edge touch area or the normal touch area; and executing the corresponding instruction based on the judgment result.
  • the angle of rotation comprises: 0 degrees of rotation, 90 degrees of clockwise rotation, 180 degrees of clockwise rotation, 270 degrees of clockwise rotation, 90 degrees of counterclockwise rotation, 180 degrees of counterclockwise rotation, and 270 degrees of counterclockwise rotation.
  • determining whether the touch point is located in the edge touch area or the normal touch area according to the recognized touch point and the rotation angle includes:
  • when the rotation angle is 0 degrees or 180 degrees, if Wc<x<W-Wc, the touch point is located in the normal touch area; otherwise, it is located in the edge touch area;
  • when the rotation angle is 90 degrees or 270 degrees, if Wc<y<H-Wc, the touch point is located in the normal touch area; otherwise, it is located in the edge touch area;
  • x and y are the horizontal-axis and vertical-axis coordinates of the touch point in the coordinate system of the touch panel;
  • W is the width and H is the height of the touch panel;
  • Wc is the width of the edge touch area.
  • a user equipment including: a touch screen, a motion sensor, and a processor;
  • the touch screen includes: a touch panel and a touch controller, wherein:
  • a touch panel configured to detect a touch signal generated on the touch panel
  • a touch controller configured to recognize a touch point according to the touch signal
  • a motion sensor configured to detect a rotation angle of the user equipment
  • the processor includes: a driver module, an application framework module, and an application module, wherein:
  • the driving module is configured to acquire an input event according to the touch signal, and report the event to the application framework module;
  • the application framework module is configured to determine, according to the rotation angle and the touch point position of the reported input event, whether the touch point is located in the edge touch area or the normal touch area;
  • the application module is configured to execute a corresponding instruction based on the judgment result.
  • an input processing method including:
  • the driver layer acquires an input event generated by the user through the input device, and reports it to the application framework layer;
  • the application framework layer determines whether the input event is an edge input event or a normal input event according to the current state of the mobile terminal and the reported input event; if it is a normal input event, the normal input event is processed and recognized and the recognition result is reported to the application layer; if it is an edge input event, the edge input event is processed and recognized and the recognition result is reported to the application layer;
  • the application layer executes the corresponding instruction according to the reported recognition result.
  • the method further includes:
  • the creating of an input device object having a device identifier for each type of input event includes:
  • the application framework layer sets a first input device object having a first device identification corresponding to a normal input event, and a second input device object having a second device identification corresponding to an edge input event.
  • the driving layer acquiring an input event generated by the user through the input device and reporting to the application framework layer includes:
  • the driving layer assigns a number for distinguishing the finger to each touch point, and reports the input event by using the A protocol.
  • the driving layer acquiring an input event generated by the user through the input device and reporting to the application framework layer includes:
  • the driving layer reports the input event by using the B protocol
  • the method further includes:
  • the application framework layer assigns a number for distinguishing a finger to each of the input events.
  • the current state of the mobile terminal includes: 0 degrees of rotation, 90 degrees of clockwise rotation, 180 degrees of clockwise rotation, 270 degrees of clockwise rotation, 90 degrees of counterclockwise rotation, 180 degrees of counterclockwise rotation, and 270 degrees of counterclockwise rotation.
  • when the rotation angle is 0 degrees or 180 degrees and Wc<x<W-Wc, the application framework layer judges that the input event is a normal input event; otherwise, it is an edge input event;
  • when the rotation angle is 90 degrees clockwise and Wc<y<H-Wc, the application framework layer judges that the input event is a normal input event; otherwise, it is an edge input event;
  • when the rotation angle is 270 degrees clockwise and Wc<y<H-Wc, the application framework layer judges that the input event is a normal input event; otherwise, it is an edge input event;
  • x is the horizontal-axis coordinate of the touch point in the coordinate system of the touch panel;
  • y is the vertical-axis coordinate of the touch point in the same coordinate system;
  • W is the width and H is the height of the touch panel;
  • Wc is the width of the edge touch area.
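The conditions above reduce to a single predicate whose checked axis swaps with rotation. A minimal sketch in C, assuming the coordinate conventions defined above (the function and type names are illustrative):

```c
#include <stdbool.h>

/* Rotation states from the claim; counterclockwise angles map onto
 * their clockwise equivalents (e.g. 90 degrees ccw == 270 degrees cw). */
enum rotation { ROT_0, ROT_CW_90, ROT_CW_180, ROT_CW_270 };

/* Returns true if touch point (x, y) lies in the normal touch area,
 * false if it lies in the edge touch area.  W and H are the touch
 * panel width and height, Wc the edge-area width; the panel's
 * coordinate system never rotates, only the region test changes. */
bool is_normal_touch(int x, int y, enum rotation rot, int W, int H, int Wc)
{
    if (rot == ROT_0 || rot == ROT_CW_180)
        return Wc < x && x < W - Wc;    /* edge strips checked on the X axis */
    else
        return Wc < y && y < H - Wc;    /* rotated: strips checked on the Y axis */
}
```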
  • a mobile terminal including:
  • a motion sensor configured to detect a current state of the mobile terminal
  • the driver layer is configured to obtain an input event generated by the user through the input device, and report the event to the application framework layer;
  • the application framework layer is configured to determine whether the input event is an edge input event or a normal input event according to the current state of the mobile terminal and the reported input event; if it is a normal input event, the normal input event is processed and recognized and the recognition result is reported to the application layer; if it is an edge input event, the edge input event is processed and recognized and the recognition result is reported to the application layer;
  • the application layer is configured to execute a corresponding instruction according to the reported recognition result.
  • the normal input event corresponds to a first input device object having a first device identification
  • the application framework layer is further configured to set a second input device object having a second device identifier for corresponding to the edge input event.
  • the driving layer reports an input event by using the A protocol or the B protocol; if the input event is reported by the A protocol, the event obtaining module is further configured to assign a number for distinguishing the finger to each touch point;
  • the application framework layer is further configured to assign a number for distinguishing the finger to each touch point.
  • the driver layer includes an event acquisition module configured to obtain an input event generated by a user through an input device.
  • the application framework layer includes an input reader
  • the mobile terminal further includes a device node disposed between the driving layer and the input reader, configured to notify the input reader to obtain an input event;
  • the input reader is configured to traverse the device node, obtain an input event, and report the event.
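On a Linux-based terminal the device node is typically an entry under /dev/input, and "traversing" it amounts to reading struct input_event records. A user-space sketch of what such an input reader does; the path is a placeholder, not one named in the patent:

```c
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <linux/input.h>

/* Read raw events from an input device node and hand each frame to
 * the framework layer.  "/dev/input/event0" is an assumed example. */
int read_input_events(void)
{
    struct input_event ev;
    int fd = open("/dev/input/event0", O_RDONLY);

    if (fd < 0)
        return -1;

    while (read(fd, &ev, sizeof(ev)) == (ssize_t)sizeof(ev)) {
        if (ev.type == EV_ABS &&
            (ev.code == ABS_MT_POSITION_X || ev.code == ABS_MT_POSITION_Y))
            printf("axis %u = %d\n", ev.code, ev.value);  /* touch coordinate */
        else if (ev.type == EV_SYN && ev.code == SYN_REPORT)
            ;   /* frame boundary: dispatch the accumulated touch points */
    }
    close(fd);
    return 0;
}
```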
  • the current state of the mobile terminal includes: 0 degrees of rotation, 90 degrees of clockwise rotation, 180 degrees of clockwise rotation, 270 degrees of clockwise rotation, 90 degrees of counterclockwise rotation, 180 degrees of counterclockwise rotation, and 270 degrees of counterclockwise rotation.
  • the application framework layer further includes: a first event processing module configured to perform coordinate calculation on the input event reported by the input reader and report the result;
  • the first judging module is configured to determine whether the input event is an edge input event according to the current state of the mobile terminal and the coordinate value reported by the first event processing module, and if not, the input event is reported.
  • the application framework layer further includes:
  • a second event processing module configured to perform coordinate calculation on the input event reported by the input reader and report the result
  • the second judging module is configured to determine whether the input event is an edge input event according to the current state of the mobile terminal and the coordinate value reported by the second event processing module, and if yes, report the input event.
  • if Wc<x<W-Wc, the judgment result is that the input event is a normal input event; otherwise, it is an edge input event;
  • x is the horizontal-axis coordinate of the touch point in the coordinate system of the touch panel;
  • y is the vertical-axis coordinate of the touch point in the same coordinate system;
  • W is the width of the touch panel;
  • Wc is the width of the edge touch area
  • the application framework layer further includes:
  • the event dispatching module is configured to report the event reported by the second determining module and the first determining module.
  • the application framework layer further includes:
  • the third judging module is configured to determine whether the event is an edge input event according to the device identifier included in the event reported by the event dispatching module; if yes, the event is reported to the second application module, otherwise to the first application module.
  • the first application module is configured to identify a normal input event according to a relevant parameter of the normal input event, and report the recognition result to the application layer;
  • the second application module is configured to identify an edge input event according to a related parameter of the edge input event and report the recognition result to the application layer.
  • the input device is a touch screen of the mobile terminal
  • the touch screen includes at least one edge input area and at least one normal input area.
  • the input device is a touch screen of the mobile terminal
  • the touch screen includes at least one edge input zone, at least one normal input zone, and at least one transition zone.
  • the touch control method, the user equipment, the input processing method, and the mobile terminal of the present invention allow the edge touch area to change correspondingly as the touch screen rotates, better adapting to the user's operation and improving the user experience;
  • on the other hand, the operations of distinguishing between the A zone and the C zone are performed in the application framework layer, and the virtual device is established in the application framework layer, thereby avoiding the driver layer's dependence on A-zone and C-zone hardware; by numbering the touch points, fingers can be distinguished, making the scheme compatible with both the A protocol and the B protocol; the scheme can be integrated into the operating system of the mobile terminal and applied to different hardware and different types of mobile terminals, so portability is good; and all elements of a touch point (coordinates, numbers, etc.) are stored, which facilitates subsequent judgment of edge input (e.g., FIT).
  • FIG. 1 is a schematic structural diagram of hardware of a mobile terminal according to an embodiment of the present invention.
  • FIG. 2 is a schematic diagram of a touch screen area division of a mobile terminal according to a first embodiment of the present invention
  • FIG. 3 is a schematic diagram of the touch screen of a mobile terminal according to an embodiment of the present invention at a rotation angle of 0 degrees;
  • FIG. 4 is a schematic diagram of the touch screen of a mobile terminal according to an embodiment of the present invention rotated 90 degrees clockwise;
  • FIG. 5 is a schematic diagram of the touch screen of a mobile terminal according to an embodiment of the present invention rotated 180 degrees clockwise;
  • FIG. 6 is a schematic diagram of the touch screen of a mobile terminal according to an embodiment of the present invention rotated 270 degrees clockwise;
  • FIG. 7 is a schematic flow chart of a touch control method according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of a software architecture of a mobile terminal according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
  • FIG. 10 is a schematic flowchart of determining an edge input event in an embodiment of the present invention.
  • FIG. 11 is a schematic flowchart of determining an input event according to a device identifier according to an embodiment of the present invention.
  • FIG. 12 is a schematic flowchart of an input processing method according to an embodiment of the present invention;
  • FIG. 13 is a schematic diagram of the effect of opening the camera application of a mobile terminal at a rotation angle of 0 degrees by using an input processing method according to an embodiment of the present invention;
  • FIG. 14 is a schematic diagram of the effect of opening the camera application of a mobile terminal rotated 90 degrees clockwise by using an input processing method according to an embodiment of the present invention;
  • FIG. 15 is a schematic diagram of a touch screen area division of a mobile terminal according to a second embodiment of the present invention.
  • FIG. 16 is a schematic diagram showing the hardware structure of a user equipment according to an embodiment of the present invention.
  • a mobile terminal includes an input device, a processor 903, and a display screen 904.
  • the input device is a touch screen 2010.
  • the touch screen 2010 includes a touch panel 901 and a touch controller 902.
  • the input device may also be a non-touch input device (eg, an infrared input device, etc.) or the like.
  • Touch controller 902 can be a single application specific integrated circuit (ASIC), which can include one or more processor subsystems, which can include one or more ARM processors or other processors with similar functions and capabilities.
  • the touch controller 902 is mainly used for receiving a touch signal generated by the touch panel 901, processing it, and transmitting the result to the processor 903 of the mobile terminal.
  • processing is, for example, analog-to-digital conversion of a physical input signal, processing to obtain touch point coordinates, processing to obtain a touch duration, and the like.
  • the processor 903 receives the output of the touch controller 902, performs processing, and performs an action based on the output.
  • the actions include, but are not limited to, moving an object such as a table or indicator, scrolling or panning, adjusting control settings, opening a file or document, viewing a menu, making a selection, executing an instruction, operating a peripheral device coupled to the host device, answering a phone call, making a call, terminating a phone call, changing volume or audio settings, storing information related to phone communications (e.g., address, frequently used number, received call, missed call), logging in to a computer or computer network, allowing an authorized individual to access a restricted area of a computer or computer network, loading a user profile associated with a user's preferred configuration of the computer desktop, allowing access to network content, launching a particular program, encrypting or decoding a message, and the like.
  • the processor 903 is also coupled to the display screen 904.
  • Display 904 is used to provide a UI to a user of the device.
  • in some embodiments, the processor 903 can be a component separate from the touch controller 902; in other embodiments, the processor 903 and the touch controller 902 can be a composite component.
  • the touch panel 901 is provided with a discrete capacitive sensor, a resistive sensor, a force sensor, an optical sensor, or the like.
  • the touch panel 901 includes an electrode array made of a conductive material in a lateral direction and a longitudinal direction.
  • when the touch controller 902 uses self-capacitance scanning, it scans the M rows and the N columns separately and calculates the coordinates of the finger on the touch screen from the signal of each row and each column; the number of scans is M+N.
  • when the touch controller 902 uses multi-contact mutual-capacitance scanning, it scans the intersections of the rows and columns; the number of scans is therefore M×N.
  • when the user's finger touches the panel, the touch panel generates a touch signal (an electrical signal) and passes it to the touch controller 902.
  • the touch controller 902 can obtain the coordinates of the touched point by scanning.
  • the touch panel 901 of the touch screen 2010 is physically an independent coordinate positioning system; after the touch point coordinates are reported to the processor 903, the processor 903 converts them into pixel coordinates of the display screen 904 to correctly identify the input operation.
  • FIG. 2 is a schematic diagram of area division of a touch panel according to a first embodiment of the present invention.
  • the touch panel of the touch screen is divided into three regions, wherein the C region 101 is an edge input region, and the A region 100 is a normal input region.
  • the input operation in the A area is processed according to the existing normal processing manner. For example, clicking an application icon in the A area 100 starts the application.
  • for the input operation in the C area 101, an edge input processing mode can be defined.
  • for example, bilateral sliding in the C area 101 can be defined to trigger terminal acceleration.
  • the C zone may be divided in a fixed manner or in a custom manner.
  • fixed division: a fixed-length, fixed-width area is set as the C area 101.
  • the C area 101 may include a partial area on the left side of the touch panel and a partial area on the right side, the positions of which are fixedly disposed on both side edges of the touch panel, as shown in FIG. 2.
  • the C zone 101 can also be divided only at one side edge.
  • custom division: the number, position, and size of the C area 101 can be customized.
  • the basic shape of the C area 101 is a rectangle, and the position and size of the C area can be determined by inputting the coordinates of two diagonal vertices of the rectangle.
  • the embodiment of the present invention does not limit the division and setting manner of the C area.
  • the point T0 at the upper left corner of the touch panel is set as the coordinate origin, with coordinate value (0, 0).
  • the coordinate value of the lower right corner of the touch panel is T7 (W, H), where W is the width of the touch panel and H is the height of the touch panel.
  • the touch screen is divided into an A zone and a C zone as described above, and the A zone and the C zone belong to the same coordinate system.
  • the coordinate ranges are divided correspondingly. For example, if the width of the touch panel is W and the width of the C area is Wc, touch points whose coordinates lie in the area defined by T0, T1, T4, and T5, and/or in the area defined by T2, T3, T6, and T7, are defined as edge touch points; touch points whose coordinates lie in the area defined by T1, T2, T5, and T6 are defined as normal touch points.
  • taking the touch screen orientation of FIG. 3 as the initial orientation, the touch screen is rotated 90 degrees clockwise. At this time, the coordinate system does not change, but to facilitate operation, the position of the C area changes.
  • touch points whose coordinates lie in the area defined by T0, S2, S4, and T3, and/or in the area defined by T4, S1, T7, and S3, are defined as edge touch points; touch points whose coordinates lie in the area defined by S1, S2, S3, and S4 are defined as normal touch points.
  • the orientation of the touch screen described above with reference to FIG. 3 is the initial orientation, and the touch screen is rotated 180 degrees clockwise. At this time, the coordinate system has not changed, and the position of the C region has not changed.
  • the orientation of the touch screen described above with reference to FIG. 3 is the initial orientation, and the touch screen is rotated 270 degrees clockwise. At this time, the coordinate system has not changed, and the position of the C region is the same as that shown in FIG. 4 above.
  • the coordinate system of the touch screen does not change; that is, regardless of whether the touch screen of the mobile terminal is in any of the states of FIG. 3 to FIG. 6 or at another rotation angle (these rotation states can be detected by the motion sensor 906), when the touch panel 901 receives a touch signal, the coordinates of the touch point reported by the touch controller 902 are all reported in the coordinate system shown in FIG. 3, without regard to the rotation state of the touch screen.
  • the display screen 904 also rotates correspondingly, and the processor 903 adaptively converts the coordinates reported by the touch controller 902 to adapt to the pixel coordinates of the display screen 904.
  • the correspondence between the rotation angle and the conversion method is stored in the memory 905, and such conversion will be described later.
  • the touch control method according to the embodiment of the present invention includes the following steps:
  • a touch controller detects the signal, and obtains physical coordinates of the touch point by scanning or the like.
  • a coordinate system as shown in FIGS. 3-6 is employed.
  • the touch screen of the mobile terminal of the embodiment of the present invention is divided into an edge touch area and a normal touch area; therefore, touch gestures are defined separately for the different areas.
  • the touch gestures of the normal touch zone include: click, double tap, slide, and the like.
  • the touch gestures of the edge touch area include: sliding up on the left edge, sliding down on the left edge, sliding up on the right edge, sliding down on the right edge, bilateral sliding up, bilateral sliding down, holding the four corners of the mobile phone, sliding back and forth on one side, a full grip, and a one-handed grip.
  • "left side" and "right side" herein are relative terms. For example, in FIG. 3, the area where point M is located is the "left side" and the opposite side is the "right side"; in FIG. 4, the area where point M is located is the "left side" and the opposite side is the "right side". That is, in the embodiment of the present invention, "left side" and "right side" change with the rotation of the touch screen.
  • the rotation angle of the touch panel can be obtained by detecting the rotation angle of the mobile terminal by the motion sensor.
  • the processor determines the area to which the touch point belongs according to the physical coordinates reported by the touch controller.
  • the coordinate range of each region is stored in the memory.
  • the coordinate range of the edge touch area is: the coordinates are located in the area defined by T0, T1, T4, and T5, and/or the coordinates are located in the area defined by T2, T3, T6, and T7.
  • the coordinate range of the normal touch area is: the coordinates are located in the area defined by T1, T2, T5, and T6.
  • the coordinate range of the edge touch area is: the coordinates are located in the area defined by T0, S2, S4, and T3, and/or the coordinates. Located in the area defined by T4, S1, T7 and S3.
  • the coordinate range of the normal touch area is: the coordinates are located in the area defined by S1, S2, S3, and S4.
  • the conversion rules depend on the rotation angle; for example, in the rotated state the reported touch coordinate (xc, yc) maps to (yc, W-xc) in display coordinates.
  • the above conversion rule assumes that the display coordinate system and the touch panel coordinate system have the same size (for example, both are 1080×1920 pixels); if their sizes differ, the coordinates are further adjusted after the above conversion to fit the display screen. Specifically, the touch panel coordinates are multiplied by corresponding conversion coefficients.
  • the conversion coefficient is the size ratio of the display to the touch panel. For example, if the touch panel is 720×1280 and the display is 1080×1920, the ratio of the display to the touch panel is 1.5; thus the abscissa and ordinate of the reported physical coordinate are each multiplied by 1.5, so a touch coordinate (xc, yc) becomes (1.5×xc, 1.5×yc), or (1.5×yc, 1.5×(W-xc)) in the rotated case, when converted to display coordinates.
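Putting the scaling and the rotated-case mapping together, a conversion step might look like the sketch below. The rotated mapping (xc, yc) → (yc, W-xc) is inferred from the example in the preceding paragraph and should be read as illustrative, not as the patent's complete rule set:

```c
struct point { float x; float y; };

/* Convert a touch-panel coordinate (xc, yc) to display pixel
 * coordinates.  scale is the display/panel size ratio from the
 * example above (1080/720 = 1.5); W is the panel width. */
static struct point panel_to_display(float xc, float yc, int rotated,
                                     float W, float scale)
{
    struct point p;

    if (!rotated) {            /* 0 degrees: scaling only */
        p.x = scale * xc;
        p.y = scale * yc;
    } else {                   /* rotated case from the example:
                                  (xc, yc) -> (yc, W - xc), then scaled */
        p.x = scale * yc;
        p.y = scale * (W - xc);
    }
    return p;
}
```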
  • an accurate display can be realized, and the correct touch gesture can be recognized, thereby executing an instruction corresponding to the touch gesture.
  • the touch gesture is in one-to-one correspondence with the instructions and stored in the memory.
  • the touch control method of the embodiment of the present invention allows the edge touch area to change correspondingly as the touch screen rotates, better adapting to the user's operation and improving the user experience.
  • the software architecture of the mobile terminal in the embodiment of the present invention includes: an input device 201, a driving layer 202, an application framework layer 203, and an application layer 204.
  • the functions of the driver layer 202, the application framework layer 203, and the application layer 204 are performed by the processor 903.
  • input device 201 is a touch screen that includes a touch panel and a touch controller.
  • the input device 201 receives the user's input operation, converts the physical input into a touch signal, and transmits the touch signal to the driving layer 202; the driving layer 202 parses the input position to obtain parameters such as the specific coordinates and duration of the touch point and uploads these parameters to the application framework layer 203; communication between the application framework layer 203 and the driver layer 202 can be implemented through a corresponding interface.
  • the application framework layer 203 receives the parameters reported by the driver layer 202, parses them, distinguishes edge input events from normal input events, and passes the valid input to the specific application of the application layer 204, so that the application layer 204 can execute different instructions according to different input operations.
  • FIG. 9 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
  • the input device includes the touch screen 2010 described above.
  • the driver layer 202 includes an event acquisition module 2020.
  • a device node 2021 is disposed between the drive layer 202 and the application framework layer 203.
  • the application framework layer 203 includes an input reader 2030, a first event processing module 2031, a second event processing module 2032, a first determining module 2033, a second determining module 2034, an event dispatching module 2035, a third determining module 2036, a first application module 2037, a second application module 2038, and the like.
  • the driving layer 202 includes an event obtaining module 2001 configured to acquire an input event generated by the user through the input device 201, for example, an input operation event through the touch screen.
  • the input events include a normal input event (A zone input event) and an edge input event (C zone input event).
  • Normal input events include input operations such as click, double click, and slide in Area A.
  • edge input events include input operations in the C zone such as sliding up on the left edge, sliding down on the left edge, sliding up on the right edge, sliding down on the right edge, bilateral sliding up, bilateral sliding down, holding the four corners of the phone, sliding back and forth on one side, a full grip, and a one-handed grip.
  • the event acquisition module 2001 is further configured to acquire related parameters of the touch point of the input operation, such as coordinates and duration. If the input event is reported by the A protocol, the event obtaining module 2001 is further configured to assign a number (ID) for distinguishing the finger to each touch point. Therefore, if the input event is reported by the A protocol, the reported data includes parameters such as the coordinates of the touch point, the duration, and the number of the touch point.
  • a device node 2011 is disposed between the driver layer 202 and the input reader 2030, and is configured to notify the input reader 2030 of the application framework layer 203 to acquire an input event.
  • the input reader 2030 is configured to traverse the device node, obtain an input event, and report it. If the driver layer 202 reports an input event using the B protocol, the input reader 2030 is further configured to assign a number (ID) for distinguishing the finger for each touch point. In an embodiment of the invention, the input reader 2030 is further configured to store all of the element information (coordinates, duration, number, etc.) of the touch point.
  • an input device object with a device identification is created for each type of input event.
  • a first input device object can be created for a normal input event with a first identity.
  • the first input device object corresponds to the actual hardware touch screen.
  • the application framework layer 203 further includes a second input device object (for example, an EDGE device), which is a virtual device, that is, an empty device, and has a second identifier corresponding to the edge input event.
  • alternatively, the edge input event may correspond to the first input device object having the first identity, and the normal input event may correspond to the second input device object having the second identity.
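One way to picture the two device objects is as plain records that differ only in identifier, with the edge-side object backed by no hardware. A sketch; the names and identifier values are assumptions, not taken from the patent:

```c
struct input_device_object {
    int         id;          /* device identifier */
    const char *name;
    int         is_virtual;  /* 1 = empty/virtual device */
};

/* First object: the real touch screen, tagging normal (A-zone) events. */
static const struct input_device_object first_dev  = { 1, "touchscreen", 0 };

/* Second object: the virtual "EDGE" device, whose identifier tags
 * edge (C-zone) events so later stages can route them. */
static const struct input_device_object second_dev = { 2, "EDGE", 1 };
```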
  • the first event processing module 2031 is configured to process an input event reported by the input reader 2030, for example, coordinate calculation of a touch point.
  • the second event processing module 2032 is configured to process an input event reported by the input reader 2030, for example, coordinate calculation of a touch point.
  • the first determining module 2033 is configured to determine whether the event is an edge input event based on the coordinate value (X value), and if not, upload the event to the event dispatching module 2035.
  • the second determining module 2034 is configured to determine, according to the coordinate value (X value), whether the event is an edge input event, and if so, upload the event to the event dispatching module 2035.
  • when determining whether an event is an edge input event, the first determining module 2033 acquires the horizontal-axis coordinate of the touch point and compares the horizontal-axis (X-axis) coordinate x of the touch point with the C-zone width Wc and the touch screen width W. Specifically, if Wc<x<W-Wc, the touch point is located in the A area and the event is a normal input event; otherwise, the event is an edge input event. If the event is not an edge input event (i.e., it is a normal input event), the event is reported to the event dispatching module 2035. Similarly, the second judging module 2034 judges whether the event is an edge input event in the same manner, and if the result of the judgment is that the event is an edge input event, the event is reported to the event dispatching module 2035.
  • the judgment flow shown in FIG. 10 is based on the touch screen of the mobile terminal as shown in FIG. 2, that is, the mobile terminal includes the C area 101 located at the left and right edges, and the A area 100 located in the middle. Therefore, when coordinate setting is performed along the coordinate system shown in FIG. 3, if Wc ⁇ x ⁇ (W - Wc), it can be determined that the touched point is located in the A area.
  • the judgment formula (Wc<x<W-Wc) may be adjusted according to the area division of the mobile terminal. For example, if the mobile terminal includes only one C area 101 located at the left edge, with width Wc, then when Wc<x≤W the touch point is located in the A area; otherwise, the touch point is located in the C area. If the mobile terminal includes only one C area 101 located at the right edge, with width Wc, then when x<W-Wc the touch point is located in the A area; otherwise, the touch point is located in the C area.
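Side by side, the three layouts give three variants of the A-zone membership test; a brief sketch under the same conventions (function names are illustrative):

```c
#include <stdbool.h>

/* A-zone membership for the three C-area layouts described above. */
bool in_a_zone_both_edges(int x, int W, int Wc) { return Wc < x && x < W - Wc; }
bool in_a_zone_left_only (int x, int W, int Wc) { return Wc < x && x <= W;     }
bool in_a_zone_right_only(int x, int W, int Wc) { return x < W - Wc;           }
```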
  • the motion sensor can detect such rotation and pass the rotation information to the processor.
  • the processor combines the detection result of the motion sensor to determine the area of the input event. Specifically, if the rotation angle is 90 degrees clockwise, that is, the state shown in FIG. 4, the judgment basis of the first judging module and the second judging module becomes: if Wc<y<H-Wc, the touch point is located in the A area; otherwise, the touch point is located in the C area, where y is the Y-axis coordinate of the touch point.
  • if the rotation angle is 180 degrees clockwise, the judgment basis of the first judging module and the second judging module is: if Wc<x<W-Wc, the touch point is located in the A area; otherwise, the touch point is located in the C area.
  • if the rotation angle is 270 degrees clockwise, the judgment basis of the first judging module and the second judging module becomes: if Wc<y<H-Wc, the touch point is located in the A area; otherwise, the touch point is located in the C area.
  • y is the Y-axis coordinate of the touch point.
  • the event dispatch module 2035 is configured to report the edge input event and/or the A zone input event to the third determination module 2036.
  • the channel used for reporting edge input events is not the same as that used for A-zone input events; edge input events are reported on a dedicated channel.
  • the event dispatching module 2035 is further configured to acquire the current state of the mobile terminal, and convert and adjust the reported coordinates according to the current state.
  • the current state of the mobile terminal is obtained according to the detection result of the motion sensor.
  • the current state includes: a rotation angle of 0 degrees, a clockwise 90 degrees, a clockwise 180 degrees, a clockwise 270 degrees, and the like. It should be understood that if it is rotated counterclockwise, the counterclockwise 90 degrees is the same as the clockwise 270 degrees, the counterclockwise 180 degrees is the same as the clockwise 180 degrees, and the counterclockwise 270 degrees is the same as the clockwise 90 degrees.
  • the event dispatching module 2035 can be implemented by InputDispatcher::dispatchMotion().
  • the third judging module 2036 is configured to determine whether the event is an edge input event according to the device identifier (ID); if it is, the event is reported to the second application module 2038, otherwise to the first application module 2037.
  • the third determining module 2036 first obtains the device identifier and determines, according to the device identifier, whether the device is a touch screen type device; if yes, it further determines whether the device identifier is the C-area device identifier, that is, the identifier of the second input device object; if yes, the event is determined to be an edge input event, and if not, a normal input event.
  • it should be understood that, after determining that the device is a touch screen type device, it may instead be determined whether the device identifier is the A-area device identifier, that is, the identifier corresponding to the first input device object; if yes, the event is determined to be a normal input event, and if not, an edge input event.
  • the first application module 2037 is configured to process input events related to A-zone input. Specifically, the processing includes performing recognition according to the touch point coordinates, duration, number, and the like of the input operation, and reporting the recognition result to the application layer.
  • the second application module 2038 is configured to process input events related to C-zone input. Specifically, the processing includes performing recognition according to the touch point coordinates, duration, and number of the input operation, and reporting the recognition result to the application layer. For example, according to the coordinates, duration, and number of the touch point, it can be recognized whether the input operation is a click or slide in the A area, or a one-sided back-and-forth slide in the C area.
  • the application layer 204 includes applications such as a camera, a gallery, and a lock screen (Application 1, Application 2, ...).
  • the input operations in the embodiments of the present invention include application-level and system-level operations; system-level gesture processing is also classified here under the application layer.
  • the application level is the manipulation of the application, for example, on, off, volume control, and the like.
  • the system level is the manipulation of the mobile terminal, for example, power on, acceleration, inter-application switching, global return, and the like.
  • the application layer can obtain the input event of the C area by registering the Listener of the C area event, or can obtain the input event of the A area by registering the Listener of the A area event.
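In C terms, registering such a Listener amounts to recording one callback per event class and routing each recognition result to it. A hypothetical sketch (the actual Listener API is not detailed in the text):

```c
#include <stddef.h>

typedef void (*event_listener)(int x, int y, int duration_ms);

static event_listener a_zone_listener;   /* normal-input callback */
static event_listener c_zone_listener;   /* edge-input callback   */

void register_a_zone_listener(event_listener fn) { a_zone_listener = fn; }
void register_c_zone_listener(event_listener fn) { c_zone_listener = fn; }

/* Route a recognition result to whichever application registered
 * interest in that event class. */
void deliver_result(int is_edge, int x, int y, int duration_ms)
{
    event_listener fn = is_edge ? c_zone_listener : a_zone_listener;

    if (fn != NULL)
        fn(x, y, duration_ms);
}
```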
  • the mobile terminal sets and stores instructions corresponding to different input operations, including instructions corresponding to edge input operations and instructions corresponding to normal input operations.
  • the application layer receives the recognition result of the reported edge input event, that is, invokes a corresponding instruction according to the edge input operation in response to the edge input operation.
  • the application layer receives the recognition result of the reported normal input event, that is, calls the corresponding instruction according to the normal input operation in response to the normal input operation.
  • the input events of the embodiments of the present invention include input operations only in the A zone, input operations only in the C zone, and input operations simultaneously generated in the A zone and the C zone.
  • the instructions also include instructions corresponding to these three types of input events.
  • the embodiment of the present invention can control the mobile terminal through a combination of input operations in the A zone and the C zone. For example, if the input operation of simultaneously clicking corresponding positions in the A zone and the C zone is mapped to the instruction to close an application, then simultaneously clicking those positions in the A zone and the C zone closes the application.
  • the mobile terminal of the embodiment of the present invention allows the edge touch area to change correspondingly as the touch screen rotates, better adapting to the user's operation and improving the user experience; on the other hand, the operations distinguishing the A area from the C area are performed in the application framework layer, and the virtual device is established in the application framework layer, which avoids the driver layer's dependence on A-area and C-area hardware; by numbering the touch points, fingers can be distinguished and both the A protocol and the B protocol are supported; the functions of the modules (the second application module 2038 and the like) can be integrated into the operating system of the mobile terminal and applied to different hardware and different kinds of mobile terminals, so portability is good; and the input reader (InputReader) stores all elements of the touch points (coordinates, durations, numbers, etc.), which facilitates subsequent judgment of edge input.
  • FIG. 12 is a flowchart of an input processing method according to an embodiment of the present invention, including the following steps:
  • S1: the driver layer acquires an input event generated by the user through the input device and reports it to the application framework layer.
  • the input device receives an input operation (ie, an input event) of the user, converts the physical input into an electrical signal, and transmits the electrical signal to the driving layer.
  • the input events include an A zone input event and a C zone input event.
  • Input events in Zone A include input operations such as click, double click, and slide in Zone A.
  • input events in Zone C include input operations such as sliding up on the left edge of Zone C, sliding down on the left edge, sliding up on the right edge, sliding down on the right edge, bilateral sliding up, bilateral sliding down, sliding back and forth on one side, a full grip, a one-handed grip, and so on.
  • the driving layer analyzes the input position according to the received electrical signal to obtain related parameters such as specific coordinates and duration of the touched point.
  • the relevant parameters are reported to the application framework layer.
  • if the driver layer reports the input event by using the A protocol, step S1 further includes: assigning a number (ID) for distinguishing the finger to each touch point; the reported data then includes the above related parameters and the number of the touch point.
  • S2: the application framework layer determines whether the input event is an edge input event or a normal input event; if it is a normal input event, step S3 is performed, and if it is an edge input event, step S4 is performed.
  • the application framework layer can determine whether it is an edge input event or a normal input event according to the coordinates in the relevant parameters of the input event.
  • specifically, the horizontal-axis coordinate of the touch point is first acquired, and then the horizontal-axis (X-axis) coordinate x of the touch point is compared with the C-zone width Wc and the touch screen width W. If Wc<x<W-Wc, the touch point is in the A area and the event is a normal input event; otherwise, the event is an edge input event.
  • if the driver layer reports the input event by using the B protocol, step S2 further includes: assigning a number (ID) for distinguishing the finger to each touch point, and storing all the element information of the touch point (coordinates, duration, number, etc.).
  • by numbering the touch points, fingers can be distinguished and both the A protocol and the B protocol are supported; storing all elements of the touch point (coordinates, numbers, and the like) facilitates subsequent judgment of edge input (for example, FIT).
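The stored "elements" of a touch point map naturally onto one record per contact; a sketch with assumed field names:

```c
/* All elements of one touch point, as stored for later edge-input
 * (e.g. FIT) judgment. */
struct touch_point {
    int id;           /* finger-distinguishing number (ID) */
    int x, y;         /* touch panel coordinates           */
    int duration_ms;  /* contact duration                  */
};

#define MAX_TOUCH_POINTS 10
static struct touch_point stored_points[MAX_TOUCH_POINTS];
static int stored_count;
```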
  • the channel used for reporting edge input events is not the same as that used for normal input events; edge input events use a dedicated channel.
  • S3: the application framework layer processes and recognizes the normal input event, and reports the recognition result to the application layer.
  • S4: the application framework layer processes and recognizes the edge input event, and reports the recognition result to the application layer.
  • the processing and recognition include: performing recognition according to the touch point coordinates, duration, number, and the like of the input operation to determine the input operation. For example, according to the coordinates, duration, and number of the touch point, it is possible to recognize whether the input operation is a click or slide in the A area, or a one-sided back-and-forth slide in the C area.
  • the application layer executes the corresponding instruction according to the reported recognition result.
  • the application layer includes applications such as a camera, a gallery, and a lock screen.
  • the input operations in the embodiments of the present invention include application-level and system-level operations; system-level gesture processing is also classified here under the application layer.
  • the application level is the manipulation of the application, for example, on, off, volume control, and the like.
  • the system level is the manipulation of the mobile terminal, for example, power on, acceleration, inter-application switching, global return, and the like.
  • the mobile terminal sets and stores instructions corresponding to different input operations, including instructions corresponding to edge input operations and instructions corresponding to normal input operations.
  • the application layer receives the recognition result of the reported edge input event, that is, it invokes the corresponding instruction according to the edge input operation in response to the edge input operation; the application layer receives the recognition result of the reported normal input event, that is, it invokes the corresponding instruction according to the normal input operation in response to the normal input operation.
  • the input events of the embodiments of the present invention include input operations only in the A zone, input operations only in the C zone, and input operations simultaneously generated in the A zone and the C zone.
  • the instructions also include instructions corresponding to these three types of input events.
  • the embodiment of the present invention can control the mobile terminal through a combination of input operations in the A zone and the C zone. For example, if the input operation of simultaneously clicking corresponding positions in the A zone and the C zone is mapped to the instruction to close an application, then simultaneously clicking those positions in the A zone and the C zone closes the application.
  • the input processing method of the embodiment of the present invention further includes:
  • a first input device object can be created for a normal input event, having a first identity; the first input device object corresponds to the input device (touch screen).
  • the application framework layer sets a second input device object.
  • the second input device object (for example, a FIT device) is a virtual device, that is, an empty device, and has a second identifier corresponding to an edge input event.
  • alternatively, the edge input event may correspond to the first input device object having the first identity, and the normal input event may correspond to the second input device object having the second identity.
  • the input processing method of the embodiment of the present invention further includes:
  • the application framework layer obtains the current state of the mobile terminal and converts and adjusts the reported coordinates according to that state.
  • the current state of the mobile terminal includes: a rotation angle of 0 degrees, a clockwise 90 degrees, a clockwise 180 degrees, a clockwise 270 degrees, and the like.
  • the counterclockwise 90 degrees is the same as the clockwise 270 degrees
  • the counterclockwise 180 degrees is the same as the clockwise 180 degrees
  • the counterclockwise 270 degrees is the same as the clockwise 90 degrees.
  • step S21 can be implemented by InputDispatcher::dispatchMotion().
  • step S22: determine, according to the device identifier, whether the input event is an edge input event; if yes, execute step S4, and if not, execute step S3.
  • when determining, according to the device identifier, whether the input event is an edge input event, the device identifier is first obtained, and whether the device is a touch screen type device is determined according to the device identifier; if yes, it is further determined whether the device identifier is the C-area device identifier, that is, the identifier of the second input device object; if yes, the event is determined to be an edge input event, and if not, a normal input event. It should be understood that, after determining that the device is a touch screen type device, it may instead be determined whether the device identifier is the A-area device identifier, that is, the identifier corresponding to the first input device object; if yes, the event is determined to be a normal input event; if not, an edge input event.
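The two-step check (touch-screen class first, then C-area versus A-area identifier) is a plain branch on the identifier; a sketch reusing the illustrative identifier values from the earlier device-object sketch:

```c
enum route { TO_NORMAL_MODULE, TO_EDGE_MODULE };

#define TOUCHSCREEN_DEV_ID 1   /* A-area / first input device object  */
#define EDGE_DEV_ID        2   /* C-area / second input device object */

/* Decide where to report an event from its device identifier. */
enum route route_by_device_id(int device_id)
{
    if (device_id == EDGE_DEV_ID)
        return TO_EDGE_MODULE;      /* edge input event -> step S4 */
    return TO_NORMAL_MODULE;        /* normal input event -> step S3 */
}
```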
  • the input processing method of the embodiment of the present invention allows the edge touch area to change correspondingly as the touch screen rotates, better adapting to the user's operation and improving the user experience; on the other hand, the operations distinguishing the A area from the C area are performed in the application framework layer, and the virtual device is established in the application framework layer, which avoids the driver layer's dependence on A-area and C-area hardware; by numbering the touch points, fingers can be distinguished and both the A protocol and the B protocol are supported; the method can be integrated into the operating system of the mobile terminal and applied to different hardware and different types of mobile terminals, so portability is good; and all elements of the touch points (coordinates, numbers, etc.) are stored, which facilitates subsequent judgment of edge input (for example, FIT).
  • FIG. 13 is a schematic diagram of the effect of turning on the camera application of a mobile terminal by using the input processing method according to an embodiment of the present invention.
  • the figure on the left side of FIG. 13 is a schematic diagram of the main interface of the mobile terminal, in which the area 1010 is a touch point preset in the edge input area (C area 101) for the input operation of turning on the camera. Specifically, clicking the area 1010 turns on the camera; accordingly, the mobile terminal stores an instruction (turn on the camera) corresponding to the input operation of clicking the area 1010.
  • the user clicks on the area 1010 of the touch screen, and the driver layer acquires the input event and reports it to the application framework layer.
  • the application framework layer can determine that the input event is an edge input event according to the coordinates of the touch point.
  • the application framework layer processes and recognizes the edge input event, recognizing the input operation as a click on the region 1010 according to the touch point coordinates, duration, and number.
  • the application framework layer reports the recognition result to the application layer, and the application layer executes an instruction to turn on the camera.
  • in another embodiment, a transition region (T region) is added at the edge of the touch panel of the mobile terminal.
  • if the input event starts from the C area and remains within the C area or the T area, the current sliding is considered to be an edge gesture; if the input event starts from the C area and deviates into the A area, the edge gesture is considered ended and a normal input event begins; if the input event starts from the T area or the A area, the current sliding event is considered a normal input event regardless of the area to which it slides.
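The rule reduces to a small decision keyed on where the stroke started and whether it has strayed into the A area; a sketch with assumed zone and state names:

```c
enum zone { ZONE_A, ZONE_C, ZONE_T };
enum stroke_class { EDGE_GESTURE, NORMAL_EVENT };

/* Classify the current stroke: it is an edge gesture only while it
 * started in C and has not yet entered A (the T area does not end
 * it); strokes starting in T or A are always normal events. */
enum stroke_class classify_stroke(enum zone start, enum zone current)
{
    if (start == ZONE_C)
        return (current == ZONE_A) ? NORMAL_EVENT : EDGE_GESTURE;
    return NORMAL_EVENT;
}
```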
  • the reporting process of the input event of this embodiment is the same as the interaction control method described in the foregoing embodiment.
  • when the application framework layer determines, according to the touch points reported for an input event, that the input event starts from the C area and deviates into the A area (that is, the touch point coordinates at the start of the input are located in the C area and a touch point coordinate during the input is located in the A area), the first judging module and the second judging module judge the event as an edge input event according to the coordinate judgment result; the edge input event then ends, a normal input event starts, and the driver layer begins reporting the next input event.
  • the mobile terminal of the embodiment of the present invention can be implemented in various forms.
  • the terminal described in the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (Personal Digital Assistant), a PAD (tablet computer), a PMP (Portable Multimedia Player), and a navigation device, as well as fixed terminals such as a digital TV and a desktop computer.
  • FIG. 16 is a schematic diagram of a hardware structure thereof.
  • Referring to FIG. 16, the user equipment 1000 includes a touch screen 2010, a controller 200, a storage device 310, a global positioning system (GPS) chip 320, a communicator 330, a video processor 340, an audio processor 350, a button 360, a microphone 370, a camera 380, a speaker 390, and a motion sensor 906.
  • the touch screen 2010 can be divided into an A zone and a C zone, or an A zone, a C zone, and a T zone as described above.
  • the touch screen 2010 can be implemented as various types of displays such as a liquid crystal display (LCD), an organic light emitting diode (OLED) display, and a plasma display panel (PDP).
  • The touch screen 2010 may include a driving circuit, which can be implemented, for example, as an a-Si TFT, a low-temperature polysilicon (LTPS) TFT, or an organic TFT (OTFT), and a backlight unit.
  • the touch screen 2010 may include a touch sensor for sensing a touch gesture of a user.
  • the touch sensor can be implemented as various types of sensors, such as a capacitor type, a resistance type, or a piezoelectric type.
  • The capacitive type calculates the touch coordinate value by sensing the micro-current excited by the user's body when a part of the user's body (for example, a finger) touches the surface of the touch screen, which is coated with a conductive material.
  • According to the resistive type, the touch screen includes two electrode plates, and the touch coordinate value is calculated by sensing the current that flows when the upper and lower plates at the touch point come into contact as the user touches the touch panel.
  • In addition, when the user device 1000 supports a pen input function, the touch screen 2010 may sense user gestures made with an input device other than the user's finger, such as a pen.
  • When the input device is a stylus pen including a coil, the user device 1000 may include a magnetic sensor (not shown) for sensing a magnetic field that changes according to the proximity of the coil inside the stylus pen to the magnetic sensor.
  • Thus, in addition to touch gestures, the user device 1000 can also sense a proximity gesture, i.e., the stylus pen hovering over the user device 1000.
  • the storage device 310 can store various programs and data required for the operation of the user device 1000.
  • the storage device 310 can store programs and data for constructing various screens to be displayed on the respective areas (for example, the A area, the C area).
  • the controller 200 displays content on each area of the touch screen 2010 by using programs and data stored in the storage device 310.
  • The controller 200 includes a RAM 210, a ROM 220, a CPU 230, a graphics processing unit (GPU) 240, and a bus 250.
  • The RAM 210, the ROM 220, the CPU 230, and the GPU 240 may be connected to one another via the bus 250.
  • The processor (CPU) 230 accesses the storage device 310 and performs booting using the operating system (OS) stored in the storage device 310. Moreover, the CPU 230 performs various operations by using the various programs, contents, and data stored in the storage device 310.
  • The ROM 220 stores a command set for system startup.
  • When a turn-on command is input and power is supplied, the CPU 230 copies the OS stored in the storage device 310 to the RAM 210 according to the command set stored in the ROM 220, and starts the system by running the OS.
  • When startup is complete, the CPU 230 copies the various programs stored in the storage device 310 to the RAM 210, and performs various operations by running the copied programs in the RAM 210.
  • The GPU 240 can generate a screen including various objects such as icons, images, and text by using a calculator (not shown) and a renderer (not shown).
  • The calculator calculates feature values such as coordinate values, format, size, and color, with which the objects are marked according to the layout of the screen.
  • the GPS chip 320 is a unit that receives GPS signals from GPS satellites and calculates the current location of the user equipment 1000. When the navigation program is used or when the current location of the user is requested, the controller 200 can calculate the location of the user by using the GPS chip 320.
  • the communicator 330 is a unit that performs communication with various types of external devices in accordance with various types of communication methods.
  • the communicator 330 includes a WiFi chip 331, a Bluetooth chip 332, a wireless communication chip 333, and an NFC chip 334.
  • the controller 200 performs communication with various external devices by using the communicator 330.
  • the WiFi chip 331 and the Bluetooth chip 332 perform communication according to the WiFi method and the Bluetooth method, respectively.
  • When the WiFi chip 331 or the Bluetooth chip 332 is used, various connection information such as a service set identifier (SSID) and a session key may first be transmitted and received, a communication connection may be established by using the connection information, and various information may then be transmitted and received.
  • the wireless communication chip 333 is a chip that performs communication in accordance with various communication standards such as IEEE, Zigbee, Third Generation (3G), Third Generation Partnership Project (3GPP), and Long Term Evolution (LTE).
  • The NFC chip 334 is a chip that operates according to the near field communication (NFC) method using the 13.56 MHz band among various RFID frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860-960 MHz, and 2.45 GHz.
  • the video processor 340 is a unit that processes video data included in content received through the communicator 330 or content stored in the storage device 310.
  • Video processor 340 can perform various image processing for video data, such as decoding, scaling, noise filtering, frame rate conversion, and resolution conversion.
  • the audio processor 350 is a unit that processes audio data included in content received through the communicator 330 or content stored in the storage device 310.
  • the audio processor 350 can perform various processing for audio data, such as decoding, amplification, and noise filtering.
  • the controller 200 can reproduce the corresponding content by driving the video processor 340 and the audio processor 350 when the reproduction program is run for the multimedia content.
  • the speaker 390 outputs the audio data generated in the audio processor 350.
  • the button 360 can be various types of buttons, such as mechanical buttons or touch pads or touch wheels formed on some areas such as the front, side or back of the main outer body of the user device 1000.
  • the microphone 370 is a unit that receives user voice or other sounds and converts them into audio data.
  • the controller 200 can use user voices input through the microphone 370 during the call process, or convert them into audio data and store them in the storage device 310.
  • the camera 380 is a unit that captures a still image or a video image according to a user's control.
  • Camera 380 can be implemented as a plurality of units, such as a front camera and a rear camera. As described below, camera 380 can be used as a means of obtaining a user image in an exemplary embodiment that tracks the user's gaze.
  • When the camera 380 and the microphone 370 are provided, the controller 200 can perform a control operation according to a user voice input through the microphone 370 or a user motion recognized by the camera 380. Thus, the user equipment 1000 can operate in a motion control mode or a voice control mode.
  • When operating in the motion control mode, the controller 200 photographs the user by activating the camera 380, tracks changes in the user's motion, and performs the corresponding operation.
  • When operating in the voice control mode, the controller 200 can operate in a voice recognition mode to analyze the voice input through the microphone 370 and perform a control operation according to the analyzed user voice.
  • In the user equipment 1000 supporting the motion control mode or the voice control mode, voice recognition technology or motion recognition technology is used in the various exemplary embodiments described above. For example, when the user performs a motion such as selecting an object marked on the home screen, or speaks a voice command corresponding to an object, it may be determined that the corresponding object is selected, and a control operation matching that object may be performed.
  • the motion sensor 906 is a unit that senses the movement of the body of the user device 1000.
  • User device 1000 can be rotated or tilted in various directions.
  • the motion sensor 906 can sense moving features such as rotational direction, angle, and slope by using one or more of various sensors such as a geomagnetic sensor, a gyro sensor, and an acceleration sensor. It should be understood that when the user device is rotated, correspondingly, the touch screen is also rotated and is rotated at the same angle as the user device.
  • Although not shown in FIG. 16, according to exemplary embodiments, the user device 1000 may further include a USB port to which a USB connector can be connected, various input ports for connecting various external components such as a headphone, a mouse, a LAN, and a DMB chip for receiving and processing digital multimedia broadcasting (DMB) signals, and various other sensors.
  • the storage device 310 can store various programs.
  • Based on the user equipment shown in FIG. 16, in the embodiments of the present invention, the touch screen is configured to detect a touch signal generated on the touch panel and to identify a touch point based on the touch signal.
  • The motion sensor is configured to detect the rotation angle of the user equipment.
  • The processor includes a driver module, an application framework module, and an application module.
  • The driver module is configured to acquire an input event according to the touch signal and report it to the application framework module.
  • The application framework module is configured to determine, according to the touch point position of the reported input event and the rotation angle, whether the touch point is located in the edge touch region or the normal touch region; if it is located in the edge touch region, the event is processed and recognized and the recognition result is reported to the application module; if it is located in the normal touch region, the event is processed and recognized and the recognition result is reported to the application module.
  • The application module is configured to execute the corresponding instruction according to the reported recognition result.
  • The touch control method, user equipment, input processing method, and mobile terminal of the embodiments of the present invention can change the edge touch region correspondingly according to the rotation of the touch screen, so as to better adapt to the user's operation and improve the user experience.
  • The operations of distinguishing the A area from the C area are performed at the application framework layer, and the virtual device is established in the application framework layer, which avoids the dependence of the driver layer on the hardware of the A area and the C area. By setting touch point numbers, fingers can be distinguished, and both the A protocol and the B protocol are supported. The scheme can be integrated into the operating system of the mobile terminal, applies to different hardware and different types of mobile terminals, and has good portability. All the elements of a touch point (the coordinates, number, and so on of the touch point) are stored, which facilitates subsequent judgment of edge input (for example, FIT).
  • Any process or method description in the flowcharts, or otherwise described in the embodiments of the present invention, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process; and the scope of the embodiments of the present invention includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functions involved, as should be understood by those skilled in the art of the embodiments of the present invention.

Abstract

A touch control method, a user equipment (1000), an input processing method, and a mobile terminal. The touch control method includes: detecting a touch signal generated on a touch panel (901) (S100); identifying a touch point according to the touch signal (S101); detecting a rotation angle of the touch panel (901); determining, according to the identified touch point and the rotation angle, whether the touch point is located in an edge touch region or a normal touch region (S102); and executing a corresponding instruction based on the determination result (S103). The method changes the edge touch region correspondingly according to the rotation of the touch screen (2010), so as to better adapt to the user's operation and improve the user experience. The operations of distinguishing the A region (100) from the C region (101) are performed at the application framework layer (203), and the virtual device is established at the application framework layer (203), which avoids the hardware dependence of distinguishing the A region (100) and the C region (101) at the driver layer (202). By setting touch point numbers, fingers can be distinguished, and both protocol A and protocol B are supported.

Description

Touch control method, user equipment, input processing method and mobile terminal

Technical Field
The present invention relates to the field of communications, and more particularly to a touch control method, a user equipment, an input processing method, and a mobile terminal.
Background
With the development of mobile terminal technology, terminal bezels have become increasingly narrow. To improve the user's input experience, edge input technology (for example, edge touch) has emerged.
In prior-art edge input, after touch point information (touch info) is detected, the driver layer determines, according to the touch point information, whether the touch occurs in the edge input region.
In practice, however, because of the diversity of input chips, the methods by which the driver layer obtains touch point information are highly chip-specific. Consequently, when determining the event type (whether an event is an edge input event), each input chip requires its own modifications and porting, which is labor-intensive and error-prone.
On the other hand, when reporting events, the driver layer may use either of two implementations, protocol A or protocol B, of which protocol B distinguishes finger IDs. Edge input relies on finger IDs, which are used during multi-point input to compare the data of two successive touches by the same finger. Prior-art input schemes therefore support only protocol B, and drivers using protocol A are not supported.
Furthermore, the edge input region of existing mobile terminals is fixed and cannot change correspondingly as the mobile terminal rotates, resulting in a poor user experience.
Prior-art input schemes thus suffer from strong hardware dependence, inability to support both protocol A and protocol B, and poor user experience, and need to be improved.
Summary of the Invention
The technical problem to be solved by the present invention is to provide a touch control method, a user equipment, an input processing method, and a mobile terminal, in view of the prior-art defect that the above edge input mode of a mobile terminal cannot change correspondingly according to the rotation of the mobile terminal.
The technical solutions adopted by the present invention to solve the technical problem are as follows:
In a first aspect, a touch control method is provided, including:
detecting a touch signal generated on a touch panel;
identifying a touch point according to the touch signal;
detecting a rotation angle of the touch panel;
determining, according to the identified touch point and the rotation angle, whether the touch point is located in an edge touch region or a normal touch region; and
executing a corresponding instruction based on the determination result.
In one embodiment, the rotation angle includes: rotation by 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise, and 270 degrees counterclockwise.
In one embodiment, determining, according to the identified touch point and the rotation angle, whether the touch point is located in the edge touch region or the normal touch region includes:
if the rotation angle is 0 degrees, the touch point is located in the normal touch region when Wc < x < (W - Wc); otherwise, the touch point is located in the edge touch region;
if the rotation angle is 90 degrees clockwise, the touch point is located in the normal touch region when Wc < y < (H - Wc); otherwise, the touch point is located in the edge touch region;
if the rotation angle is 180 degrees clockwise, the touch point is located in the normal touch region when Wc < x < (W - Wc); otherwise, the touch point is located in the edge touch region;
if the rotation angle is 270 degrees clockwise, the touch point is located in the normal touch region when Wc < y < (H - Wc); otherwise, the touch point is located in the edge touch region;
where x is the horizontal-axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical-axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch region.
In a second aspect, a user equipment is provided, including: a touch screen, a motion sensor, and a processor;
the touch screen including a touch panel and a touch controller, wherein:
the touch panel is configured to detect a touch signal generated on the touch panel;
the touch controller is configured to identify a touch point according to the touch signal;
the motion sensor is configured to detect a rotation angle of the user equipment;
the processor includes a driver module, an application framework module, and an application module, wherein:
the driver module is configured to acquire an input event according to the touch signal and report it to the application framework module;
the application framework module is configured to determine, according to the rotation angle and the touch point position of the reported input event, whether the touch point is located in an edge touch region or a normal touch region; and
the application module is configured to execute a corresponding instruction based on the determination result.
In a third aspect, an input processing method is provided, including:
a driver layer acquires an input event generated by a user through an input device and reports it to an application framework layer;
the application framework layer determines, according to the current state of the mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, the normal input event is processed and recognized and the recognition result is reported to an application layer; if it is an edge input event, the edge input event is processed and recognized and the recognition result is reported to the application layer;
the application layer executes a corresponding instruction according to the reported recognition result.
In one embodiment, the method further includes:
creating, for each input event, an input device object with a device identifier.
In one embodiment, creating, for each input event, an input device object with a device identifier includes:
associating normal input events with a touch screen having a first device identifier;
the application framework layer setting a second input device object having a second device identifier to correspond to edge input events.
In one embodiment, the driver layer acquiring an input event generated by the user through the input device and reporting it to the application framework layer includes:
the driver layer assigning each touch point a number for distinguishing fingers, and reporting the input event using protocol A.
In one embodiment, the driver layer acquiring an input event generated by the user through the input device and reporting it to the application framework layer includes:
the driver layer reporting the input event using protocol B;
the method further including:
the application framework layer assigning each touch point in the input event a number for distinguishing fingers.
In one embodiment, the current state of the mobile terminal includes: rotation by 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise, and 270 degrees counterclockwise.
In one embodiment, if the rotation angle is 0 degrees, the application framework layer judges the input event to be a normal input event when Wc < x < (W - Wc); otherwise, it is an edge input event;
if the rotation angle is 90 degrees clockwise, the application framework layer judges the input event to be a normal input event when Wc < y < (H - Wc); otherwise, it is an edge input event;
if the rotation angle is 180 degrees clockwise, the application framework layer judges the input event to be a normal input event when Wc < x < (W - Wc); otherwise, it is an edge input event;
if the rotation angle is 270 degrees clockwise, the application framework layer judges the input event to be a normal input event when Wc < y < (H - Wc); otherwise, it is an edge input event;
where x is the horizontal-axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical-axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch region.
In a fourth aspect, a mobile terminal is provided, including:
an input device;
a motion sensor configured to detect the current state of the mobile terminal;
a driver layer configured to acquire an input event generated by a user through the input device and report it to an application framework layer;
the application framework layer, configured to determine, according to the current state of the mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, to process and recognize the normal input event and report the recognition result to an application layer; if it is an edge input event, to process and recognize the edge input event and report the recognition result to the application layer; and
the application layer, configured to execute a corresponding instruction according to the reported recognition result.
In one embodiment, the normal input event corresponds to a first input device object having a first device identifier;
the application framework layer is further configured to set a second input device object having a second device identifier, for corresponding to the edge input event.
In one embodiment, the driver layer reports input events using protocol A or protocol B; if protocol A is used to report input events, the event acquisition module is further configured to assign each touch point a number for distinguishing fingers;
if protocol B is used to report input events, the application framework layer is further configured to assign each touch point a number for distinguishing fingers.
In one embodiment, the driver layer includes an event acquisition module configured to acquire input events generated by the user through the input device.
In one embodiment, the application framework layer includes an input reader;
the mobile terminal further includes a device node disposed between the driver layer and the input reader, configured to notify the input reader to acquire input events;
the input reader is configured to traverse the device node, acquire input events, and report them.
In one embodiment, the current state of the mobile terminal includes: rotation by 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise, and 270 degrees counterclockwise.
In one embodiment, the application framework layer further includes: a first event processing module configured to perform coordinate calculation on the input events reported by the input reader and then report them;
a first judgment module configured to determine, according to the current state of the mobile terminal and the coordinate values reported by the first event processing module, whether an input event is an edge input event, and if not, to report the input event.
In one embodiment, the application framework layer further includes:
a second event processing module configured to perform coordinate calculation on the input events reported by the input reader and then report them;
a second judgment module configured to determine, according to the current state of the mobile terminal and the coordinate values reported by the second event processing module, whether an input event is an edge input event, and if so, to report the input event.
In one embodiment, if the rotation angle is 0 degrees, the determination result is that the input event is a normal input event when Wc < x < (W - Wc); otherwise, it is an edge input event;
if the rotation angle is 90 degrees clockwise, the determination result is that the input event is a normal input event when Wc < y < (H - Wc); otherwise, it is an edge input event;
if the rotation angle is 180 degrees clockwise, the determination result is that the input event is a normal input event when Wc < x < (W - Wc); otherwise, it is an edge input event;
if the rotation angle is 270 degrees clockwise, the determination result is that the input event is a normal input event when Wc < y < (H - Wc); otherwise, it is an edge input event;
where x is the horizontal-axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical-axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch region.
In one embodiment, the application framework layer further includes:
an event dispatch module configured to report the events reported by the second judgment module and the first judgment module.
In one embodiment, the application framework layer further includes:
a first application module;
a second application module;
a third judgment module configured to determine, according to the device identifier contained in an event reported by the event dispatch module, whether the event is an edge input event; if so, the event is reported to the second application module, and otherwise to the first application module;
the first application module, configured to recognize normal input events according to the relevant parameters of the normal input events and report the recognition result to the application layer;
the second application module, configured to recognize edge input events according to the relevant parameters of the edge input events and report the recognition result to the application layer.
In one embodiment, the input device is a touch screen of the mobile terminal;
the touch screen includes at least one edge input region and at least one normal input region.
In one embodiment, the input device is a touch screen of the mobile terminal;
the touch screen includes at least one edge input region, at least one normal input region, and at least one transition region.
Implementing the touch control method, user equipment, input processing method, and mobile terminal of the present invention makes it possible to change the edge touch region correspondingly according to the rotation of the touch screen, so as to better adapt to the user's operation and improve the user experience. Moreover, since the operations of distinguishing the A region from the C region are performed only at the application framework layer, and the virtual device is established at the application framework layer, the hardware dependence of distinguishing the A and C regions at the driver layer is avoided. By setting touch point numbers, fingers can be distinguished, and both protocol A and protocol B are supported. The scheme can be integrated into the operating system of the mobile terminal, is applicable to different hardware and different types of mobile terminals, and has good portability. All the elements of a touch point (the coordinates, number, and so on of the touch point) are stored, which facilitates subsequent judgment of edge input (for example, FIT).
Brief Description of the Drawings
The present invention will be further described below with reference to the accompanying drawings and embodiments, in which:
FIG. 1 is a schematic diagram of the hardware structure of a mobile terminal according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of the touch screen region division of a mobile terminal according to the first embodiment of the present invention;
FIG. 3 is a schematic diagram of the touch screen of a mobile terminal according to an embodiment of the present invention at a rotation angle of 0 degrees;
FIG. 4 is a schematic diagram of the touch screen of a mobile terminal according to an embodiment of the present invention at a rotation angle of 90 degrees clockwise;
FIG. 5 is a schematic diagram of the touch screen of a mobile terminal according to an embodiment of the present invention at a rotation angle of 180 degrees clockwise;
FIG. 6 is a schematic diagram of the touch screen of a mobile terminal according to an embodiment of the present invention at a rotation angle of 270 degrees clockwise;
FIG. 7 is a schematic flowchart of the touch control method according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of the software architecture of a mobile terminal according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 10 is a schematic flowchart of judging an edge input event in an embodiment of the present invention;
FIG. 11 is a schematic flowchart of judging an input event according to a device identifier in an embodiment of the present invention;
FIG. 12 is a flowchart of the input processing method according to an embodiment of the present invention;
FIG. 13 is a schematic diagram of the effect of opening the camera application of a mobile terminal at a rotation angle of 0 degrees by using the input processing method of an embodiment of the present invention;
FIG. 14 is a schematic diagram of the effect of opening the camera application of a mobile terminal at a rotation angle of 90 degrees clockwise by using the input processing method of an embodiment of the present invention;
FIG. 15 is a schematic diagram of the touch screen region division of a mobile terminal according to the second embodiment of the present invention;
FIG. 16 is a schematic diagram of the hardware structure of a user equipment according to an embodiment of the present invention.
Detailed Description
For a clearer understanding of the technical features, objects, and effects of the present invention, specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Referring to FIG. 1, a mobile terminal according to an embodiment of the present invention includes an input device, a processor 903, and a display screen 904. In one embodiment, the input device is a touch screen 2010. The touch screen 2010 includes a touch panel 901 and a touch controller 902. The input device may also be a non-touch input device (for example, an infrared input device).
The touch controller 902 may be a single application-specific integrated circuit (ASIC), which may include one or more processor subsystems; a processor subsystem may include one or more ARM processors or other processors with similar functionality and performance.
The touch controller 902 mainly receives the touch signals generated on the touch panel 901, processes them, and transmits the result to the processor 903 of the mobile terminal. Such processing includes, for example, analog-to-digital conversion of the physical input signal, computing the touch point coordinates, and computing the touch duration.
The processor 903 receives the output of the touch controller 902, processes it, and performs actions based on the output. The actions include, but are not limited to: moving an object such as a cursor or indicator; scrolling or panning; adjusting control settings; opening a file or document; viewing a menu; making a selection; executing instructions; operating a peripheral device coupled to the host device; answering a telephone call; placing a telephone call; terminating a telephone call; changing the volume or audio settings; storing information related to telephone communications (for example, addresses, frequently dialed numbers, received calls, missed calls); logging onto a computer or a computer network; permitting authorized individuals access to restricted areas of the computer or computer network; loading a user profile associated with the user's preferred configuration of the computer desktop; permitting access to web content; launching a particular program; encrypting or decoding a message; and the like.
The processor 903 is also connected to the display screen 904. The display screen 904 provides a UI to the user of the device.
In some embodiments, the processor 903 may be a component separate from the touch controller 902. In other embodiments, the processor 903 and the touch controller 902 may be combined into a single component.
In one embodiment, the touch panel 901 is provided with discrete capacitive sensors, resistive sensors, force sensors, optical sensors, or similar sensors.
The touch panel 901 includes horizontal and vertical electrode arrays made of conductive material. For a single-point touch screen with an electrode array of M rows and N columns (which can determine the coordinates of only a single touch), the touch controller 902 uses self-capacitance scanning: after the M rows and N columns are scanned separately, the coordinates of the finger on the touch screen can be computed from the signal of each row and each column. The number of scans is M + N.
For a multi-touch screen with an electrode array of M rows and N columns (which can detect and resolve the coordinates of multiple points, i.e., multi-touch), the touch controller 902 uses mutual-capacitance scanning of the row-column intersections, so the number of scans is M × N.
When the user's finger touches the panel, the touch panel generates a touch signal (an electrical signal) and sends it to the touch controller 902. The touch controller 902 obtains the coordinates of the touch point by scanning. In one embodiment, the touch panel 901 of the touch screen 2010 is physically an independent coordinate system; after the touch point coordinates of each touch are reported to the processor 903, the processor 903 converts them into pixel coordinates suited to the display screen 904, so that the input operation is correctly recognized.
Referring to FIG. 2, a schematic diagram of the region division of the touch panel according to the first embodiment of the present invention. In this embodiment, to prevent accidental edge touches and to provide a new interaction mode, the touch panel of the touch screen is divided into three regions, where the C regions 101 are edge input regions and the A region 100 is the normal input region.
In the embodiments of the present invention, input operations in the A region are processed in the existing normal manner; for example, clicking an application icon in the A region 100 opens that application. Input operations in the C region 101 may be defined as edge input operations; for example, sliding along both edges in the C region 101 may be defined to accelerate the terminal.
In the embodiments of the present invention, the C region may be divided in a fixed manner or in a user-defined manner. Fixed division means setting regions of fixed length and fixed width as the C regions 101. The C regions 101 may include a partial region on the left side and a partial region on the right side of the touch panel, fixed at the two side edges of the touch panel, as shown in FIG. 2. Of course, a C region 101 may also be defined at only one edge.
User-defined division means that the number, positions, and sizes of the C regions 101 are configurable; for example, they may be set by the user, or the mobile terminal may adjust the number, positions, and sizes of the C regions 101 according to its own needs. Typically, the basic shape of a C region 101 is a rectangle, so entering the coordinates of two diagonal vertices of the shape suffices to determine the position and size of the C region.
To accommodate different users' habits with different applications, multiple sets of C-region configurations for different application scenarios may also be provided. For example, on the system desktop, because icons occupy much of the screen, the C regions on both sides are set relatively narrow; after the camera icon is clicked to enter the camera application, the number, positions, and sizes of the C regions in this scenario can be set such that, without affecting focusing, the C regions can be made relatively wide.
The embodiments of the present invention do not limit the manner of dividing or configuring the C region.
Referring to FIG. 3, the upper-left corner T0 of the touch panel is set as the coordinate origin, with coordinate value (0, 0). The lower-right corner of the touch panel is T7, with coordinate value (W, H), where W is the width of the touch panel and H is the height of the touch panel.
In one embodiment of the present invention, the touch screen is divided into the A region and the C region as described above, and the A and C regions belong to the same coordinate system. When the touch panel of the mobile terminal is divided into multiple regions, the coordinates are divided correspondingly. For example, if the width of the touch panel is W and the width of the C region is Wc, then touch points whose coordinates lie in the region bounded by T0, T1, T4, and T5, and/or in the region bounded by T2, T3, T6, and T7, are defined as edge touch points, while touch points whose coordinates lie in the region bounded by T1, T2, T5, and T6 are defined as normal touch points.
Referring to FIG. 4, with the touch screen orientation of FIG. 3 as the initial orientation, the touch screen is rotated 90 degrees clockwise; the coordinate system does not change. For ease of operation, the position of the C region changes: after the 90-degree clockwise rotation, touch points whose coordinates lie in the region bounded by T0, S2, S4, and T3, and/or in the region bounded by T4, S1, T7, and S3, are defined as edge touch points, while touch points whose coordinates lie in the region bounded by S1, S2, S3, and S4 are defined as normal touch points.
Referring to FIG. 5, with the orientation of FIG. 3 as the initial orientation, the touch screen is rotated 180 degrees clockwise; neither the coordinate system nor the position of the C region changes.
Referring to FIG. 6, with the orientation of FIG. 3 as the initial orientation, the touch screen is rotated 270 degrees clockwise; the coordinate system does not change, and the position of the C region is the same as shown in FIG. 4.
In the touch screen states shown in FIGS. 3-6, the coordinate system of the touch screen never changes. That is, regardless of which of the states of FIGS. 3-6, or any other rotation state, the touch screen of the mobile terminal is in (these rotation states can be detected by the motion sensor 906), when the touch panel 901 receives a touch signal, the touch point coordinates reported by the touch controller 902 are always reported in the coordinate system shown in FIG. 3, without regard to the rotation state of the touch screen. Since the display screen 904 rotates correspondingly when the touch screen 2010 rotates, the processor 903 adaptively converts the coordinates reported by the touch controller 902 to fit the pixel coordinates of the display screen 904. The memory 905 stores the correspondence between rotation angles and conversion methods; this conversion is described below.
Referring to FIG. 7, based on the above mobile terminal, the touch control method of an embodiment of the present invention includes the following steps:
S100: detecting a touch signal generated on the touch panel.
S101: identifying a touch point according to the touch signal.
Specifically, when a finger or another object touches the panel to produce a touch gesture, a touch signal is generated; the touch controller detects the signal and obtains the physical coordinates of the touch point by scanning or the like. The embodiments of the present invention use the coordinate systems shown in FIGS. 3-6.
As described above, the touch screen of the mobile terminal of the embodiments of the present invention is divided into an edge touch region and a normal touch region, so touch gestures are defined separately for the different regions. In one embodiment, touch gestures in the normal touch region include tap, double-tap, slide, and the like. Touch gestures in the edge touch region include: slide up along the left edge, slide down along the left edge, slide up along the right edge, slide down along the right edge, slide up along both edges, slide down along both edges, grip the four corners of the handset, slide back and forth along one edge, squeeze, one-handed grip, and the like.
It should be understood that "left" and "right" here are relative. For example, in FIG. 3, the region where point M is located is the "left" side, and the opposite side is the "right" side; in FIG. 4, the region where point M is located is the "left" side, and the opposite side is the "right" side. That is, in the embodiments of the present invention, "left" and "right" change as the touch screen rotates.
S102: detecting the rotation angle of the touch panel, and determining, according to the identified touch point and the rotation angle, whether the touch point is located in the edge touch region or the normal touch region.
Specifically, the rotation angle of the touch panel can be obtained from the motion sensor detecting the rotation angle of the mobile terminal.
The processor determines the region to which the touch point belongs according to the physical coordinates reported by the touch controller. In the embodiments of the present invention, the memory stores the coordinate ranges of the regions.
Referring to FIGS. 3 and 5, the coordinate range of the edge touch region is: coordinates within the region bounded by T0, T1, T4, and T5, and/or within the region bounded by T2, T3, T6, and T7. The coordinate range of the normal touch region is: coordinates within the region bounded by T1, T2, T5, and T6.
Referring to FIGS. 4 and 6, when the touch screen is rotated 90 degrees or 270 degrees clockwise, the coordinate range of the edge touch region is: coordinates within the region bounded by T0, S2, S4, and T3, and/or within the region bounded by T4, S1, T7, and S3. The coordinate range of the normal touch region is: coordinates within the region bounded by S1, S2, S3, and S4.
S103: executing a corresponding instruction based on the determination result.
Specifically, since the coordinates of the touch panel and the coordinates of the display screen belong to two independent coordinate systems, the physical coordinates of the touch panel must be mapped to pixel coordinates of the display screen in order to display the touch effect correctly and recognize the touch gesture. The conversion rules are as follows:
When the rotation angle is 0, i.e., in the state shown in FIG. 3, for a touch point M whose coordinates reported by the touch controller are (xc, yc), no conversion is needed: the display coordinates are likewise (xc, yc).
When the rotation angle is 90 degrees clockwise, i.e., in the state shown in FIG. 4, for a touch point M reported as (xc, yc), the converted coordinates are (yc, W - xc).
When the rotation angle is 180 degrees clockwise, i.e., in the state shown in FIG. 5, for a touch point M reported as (xc, yc), the converted coordinates are (W - xc, H - yc).
When the rotation angle is 270 degrees clockwise, i.e., in the state shown in FIG. 6, for a touch point M reported as (xc, yc), the converted coordinates are (H - yc, xc).
It should be understood that the above conversion rules assume that the display coordinate system and the touch panel coordinate system have the same size (for example, both are 1080 × 1920 pixels). If they differ in size, then after the above conversion the coordinates must also be adjusted to fit the display, specifically by multiplying the touch panel coordinates by a conversion factor, namely the ratio of the display size to the touch panel size. For example, if the touch panel is 720 × 1280 and the display is 1080 × 1920, the ratio of the display to the touch panel is 1.5, so the horizontal and vertical physical coordinates reported by the touch panel are each multiplied by 1.5: what was (xc, yc) becomes (1.5 × xc, 1.5 × yc) in display coordinates, or (1.5 × yc, 1.5 × (W - xc)), and so on.
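To make the conversion above concrete, here is a minimal C++ sketch, assuming the four clockwise rotation states and the single scaling factor of the example; the names Point, Rotation, and toDisplayCoords are placeholders for this sketch and are not taken from the embodiments.

    // Illustrative types; the names are assumptions, not part of the embodiments.
    struct Point { float x; float y; };
    enum class Rotation { Deg0, Deg90CW, Deg180CW, Deg270CW };

    // Maps a physical touch-panel coordinate (xc, yc) to display coordinates,
    // following the rules of step S103: identity at 0 degrees, (yc, W - xc) at
    // 90 degrees clockwise, (W - xc, H - yc) at 180 degrees, (H - yc, xc) at
    // 270 degrees, then scales by the ratio of display size to panel size.
    Point toDisplayCoords(Point p, Rotation r, float W, float H, float scale) {
        Point out{};
        switch (r) {
            case Rotation::Deg0:     out = { p.x,     p.y     }; break;
            case Rotation::Deg90CW:  out = { p.y,     W - p.x }; break;
            case Rotation::Deg180CW: out = { W - p.x, H - p.y }; break;
            case Rotation::Deg270CW: out = { H - p.y, p.x     }; break;
        }
        out.x *= scale;
        out.y *= scale;
        return out;
    }

For the 720 × 1280 panel and 1080 × 1920 display of the example, scale would be 1.5.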
After the coordinates are converted and adjusted, accurate display is achieved and the correct touch gesture is recognized, whereupon the instruction corresponding to the touch gesture is executed. In the embodiments of the present invention, touch gestures correspond one-to-one to instructions and are stored in the memory.
The touch control method of the embodiments of the present invention changes the edge touch region correspondingly according to the rotation of the touch screen, so as to better adapt to the user's operation and improve the user experience.
Referring to FIG. 8, a schematic diagram of the software architecture of a mobile terminal according to an embodiment of the present invention. The software architecture of the mobile terminal of the embodiments of the present invention includes: an input device 201, a driver layer 202, an application framework layer 203, and an application layer 204. The functions of the driver layer 202, the application framework layer 203, and the application layer 204 are executed by the processor 903. In one embodiment, the input device 201 is a touch screen including a touch panel and a touch controller.
The input device 201 receives the user's input operation, converts the physical input into a touch signal, and passes the touch signal to the driver layer 202. The driver layer 202 parses the input position to obtain parameters such as the specific coordinates and duration of the touch point, and uploads these parameters to the application framework layer 203; communication between the application framework layer 203 and the driver layer 202 may be implemented through a corresponding interface. The application framework layer 203 receives the parameters reported by the driver layer 202, parses them, distinguishes edge input events from normal input events, and passes valid input upward to the specific application in the application layer 204, so that the application layer 204 executes different input operation instructions according to different input operations.
Referring to FIG. 9, a schematic structural diagram of a mobile terminal according to an embodiment of the present invention. In one embodiment of the present invention, the input device includes the touch screen 2010 described above. The driver layer 202 includes an event acquisition module 2020. A device node 2021 is provided between the driver layer 202 and the application framework layer 203. The application framework layer 203 includes an input reader 2030, a first event processing module 2031, a second event processing module 2032, a first judgment module 2033, a second judgment module 2034, an event dispatch module 2035, a third judgment module 2036, a first application module 2037, a second application module 2038, and so on.
The event acquisition module 2020 of the driver layer 202 is configured to acquire input events generated by the user through the input device 201, for example, input operation events performed through the touch screen. In the embodiments of the present invention, input events include normal input events (A-region input events) and edge input events (C-region input events). Normal input events include input operations performed in the A region such as tap, double-tap, and slide. Edge input events include input operations performed in the C region such as slide up along the left edge, slide down along the left edge, slide up along the right edge, slide down along the right edge, slide up along both edges, slide down along both edges, gripping the four corners of the handset, sliding back and forth along one edge, squeezing, and one-handed grip.
In addition, the event acquisition module 2020 is further configured to acquire relevant parameters of the input operation such as the coordinates and duration of the touch point. If protocol A is used to report input events, the event acquisition module 2020 is further configured to assign each touch point a number (ID) for distinguishing fingers. Thus, if protocol A is used to report input events, the reported data include parameters such as the coordinates and duration of the touch point, as well as the number of the touch point.
The device node 2021 provided between the driver layer 202 and the input reader 2030 is configured to notify the input reader 2030 of the application framework layer 203 to acquire input events.
The input reader 2030 is configured to traverse the device nodes, acquire input events, and report them. If the driver layer 202 uses protocol B to report input events, the input reader 2030 is further configured to assign each touch point a number (ID) for distinguishing fingers. In the embodiments of the present invention, the input reader 2030 is further configured to store all the element information of each touch point (coordinates, duration, number, and so on), as sketched below.
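The storage of all the elements of a touch point can be sketched as a minimal C++ structure keyed by the finger number, so that two successive touches of the same finger can be compared during multi-point input, as edge input requires. The type and member names (TouchRecord, TouchStore) are assumptions for this illustration.

    #include <cstdint>
    #include <map>

    // Illustrative record of the stored elements of a touch point.
    struct TouchRecord {
        float   x = 0.0f, y = 0.0f;  // coordinates
        int64_t downTimeMs = 0;      // time of first contact
        int64_t lastTimeMs = 0;      // time of latest update
        int     fingerId = -1;       // number distinguishing fingers
    };

    // Keyed by finger number (the ID assigned by the driver layer under
    // protocol A, or by the input reader under protocol B).
    class TouchStore {
    public:
        void update(int fingerId, float x, float y, int64_t nowMs) {
            TouchRecord& r = points_[fingerId];
            if (r.fingerId < 0) { r.fingerId = fingerId; r.downTimeMs = nowMs; }
            r.x = x; r.y = y; r.lastTimeMs = nowMs;
        }
        // Duration is derived from the stored first and latest timestamps.
        int64_t durationMs(int fingerId) const {
            auto it = points_.find(fingerId);
            return it == points_.end() ? 0 : it->second.lastTimeMs - it->second.downTimeMs;
        }
    private:
        std::map<int, TouchRecord> points_;
    };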
In the embodiments of the present invention, to make it easy for the application layer 204 to distinguish different input events for response, an input device object with a device identifier is created for each input event. In one embodiment, a first input device object with a first identifier may be created for normal input events; the first input device object corresponds to the actual hardware touch screen.
In addition, the application framework layer 203 further includes a second input device object. The second input device object (for example, an edge input device, or FIT device) is a virtual device, i.e., an empty device, with a second identifier for corresponding to edge input events. It should be understood that, alternatively, edge input events may correspond to the first input device object with the first identifier, and normal input events to the second input device object with the second identifier.
The first event processing module 2031 is configured to process the input events reported by the input reader 2030, for example, to compute the coordinates of the touch points.
The second event processing module 2032 is configured to process the input events reported by the input reader 2030, for example, to compute the coordinates of the touch points.
The first judgment module 2033 is configured to determine from the coordinate value (X value) whether an event is an edge input event, and if not, to upload the event to the event dispatch module 2035.
The second judgment module 2034 is configured to determine from the coordinate value (X value) whether an event is an edge input event, and if so, to upload the event to the event dispatch module 2035.
Referring to FIG. 10, when determining whether an event is an edge input event, the first judgment module 2033 obtains the horizontal-axis coordinate of the touch point and compares the horizontal-axis (X-axis) coordinate x of the touch point with the C-region width Wc and the touch screen width W. Specifically, if Wc < x < (W - Wc), the touch point is in the A region and the event is a normal input event; otherwise, the event is an edge input event. If the event is not an edge input event (i.e., it is a normal input event), the event is reported to the event dispatch module 2035. Likewise, when determining whether an event is an edge input event, the second judgment module 2034 judges in the manner shown in FIG. 10, and if the result is that the event is an edge input event, it reports the event to the event dispatch module 2035.
It should be understood that the judgment flow shown in FIG. 10 is based on the touch screen of the mobile terminal shown in FIG. 2, i.e., a mobile terminal that includes C regions 101 at the left and right edges and an A region 100 in the middle. Therefore, when coordinates are assigned in the coordinate system shown in FIG. 3, the touch point can be determined to be in the A region if Wc < x < (W - Wc). In other embodiments, the judgment formula (Wc < x < (W - Wc)) may be adjusted according to the region division of the mobile terminal. For example, if the mobile terminal includes only one C region 101 at the left edge, with width Wc, then when Wc < x < W the touch point is in the A region; otherwise it is in the C region. If the mobile terminal includes only one C region 101 at the right edge, with width Wc, then when x < (W - Wc) the touch point is in the A region; otherwise it is in the C region.
It should be understood that when the mobile terminal rotates, the motion sensor detects the rotation and passes the rotation information to the processor. In the embodiments of the present invention, the processor determines the region of an input event in combination with the detection result of the motion sensor. Specifically, if the rotation angle is 90 degrees clockwise, i.e., the state shown in FIG. 4, the criterion of the first judgment module and the second judgment module becomes: if Wc < y < (H - Wc), the touch point is in the A region; otherwise, it is in the C region, where y is the Y-axis coordinate of the touch point.
If the rotation angle is 180 degrees clockwise, i.e., the state shown in FIG. 5, the criterion of the first judgment module and the second judgment module is: if Wc < x < (W - Wc), the touch point is in the A region; otherwise, it is in the C region.
If the rotation angle is 270 degrees clockwise, i.e., the state shown in FIG. 6, the criterion of the first judgment module and the second judgment module becomes: if Wc < y < (H - Wc), the touch point is in the A region; otherwise, it is in the C region, where y is the Y-axis coordinate of the touch point.
It should be understood that if the C region is defined only on one side of the touch screen, or only in a certain part of one side, the judgment of the region of an input event is adjusted accordingly. The general approach is: regardless of whether the touch screen is rotated, determine the length and width of the C region and hence its coordinate range; when judging, rule regions in or out according to the coordinate range to determine the region in which the input event occurs.
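The criteria above can be condensed into a short C++ sketch, assuming C regions of width Wc along both edges as in FIG. 2 and coordinates reported in the fixed coordinate system of FIG. 3; isEdgeTouch is an illustrative name, not a function of the embodiments.

    enum class Rotation { Deg0, Deg90CW, Deg180CW, Deg270CW };

    // Returns true if the touch point (x, y) falls in the edge region C for
    // the given rotation. At 0/180 degrees the X coordinate is tested
    // against the normal-region interval Wc < x < W - Wc; at 90/270 degrees
    // the Y coordinate is tested against Wc < y < H - Wc.
    bool isEdgeTouch(float x, float y, Rotation r, float W, float H, float Wc) {
        switch (r) {
            case Rotation::Deg0:
            case Rotation::Deg180CW:
                return !(Wc < x && x < W - Wc);
            case Rotation::Deg90CW:
            case Rotation::Deg270CW:
                return !(Wc < y && y < H - Wc);
        }
        return false;
    }

For a terminal with a single C region at one edge, the tested interval would be adjusted as described above (for example, Wc < x < W for a left-edge-only C region).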
The event dispatch module 2035 is configured to report edge input events and/or A-region input events to the third judgment module 2036. In one embodiment, edge input events and A-region input events are reported through different channels; edge input events are reported through a dedicated channel.
In addition, the event dispatch module 2035 is further configured to obtain the current state of the mobile terminal, and to convert and adjust the reported coordinates according to the current state before reporting them.
In the embodiments of the present invention, the current state of the mobile terminal is obtained from the detection result of the motion sensor. The current state includes rotation angles of 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, and so on. It should be understood that, for counterclockwise rotation, 90 degrees counterclockwise is the same as 270 degrees clockwise, 180 degrees counterclockwise is the same as 180 degrees clockwise, and 270 degrees counterclockwise is the same as 90 degrees clockwise.
For the specific implementation of converting and adjusting the coordinates, see the description of step S103 above, which is not repeated here.
In one embodiment, the event dispatch module 2035 is implemented by inputdispatcher::dispatchmotion().
The third judgment module 2036 is configured to determine from the device identifier (ID) whether an event is an edge input event; if it is, the event is reported to the second application module 2038, and otherwise to the first application module 2037.
Specifically, referring to FIG. 11, when judging, the third judgment module 2036 first obtains the device identifier and determines from it whether the device is a touch-screen-type device. If so, it further determines whether the device identifier is the C-region device identifier, i.e., the identifier of the above second input device object; if so, the event is judged to be an edge input event, and if not, a normal input event. It should be understood that, alternatively, after the device is determined to be a touch-screen-type device, it may be further determined whether the device identifier is the A-region device identifier, i.e., the identifier corresponding to the above first input device object; if so, the event is judged to be a normal input event, and if not, an edge input event.
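A minimal sketch of the flow of FIG. 11 follows; the identifier constants are hypothetical, since the embodiments only require that the first input device object (hardware touch screen) and the second input device object (virtual edge device) carry distinct identifiers.

    enum class EventClass { NotTouch, NormalInput, EdgeInput };

    // Hypothetical identifiers for the two input device objects.
    constexpr int kTouchScreenDeviceId = 1;  // first input device object (A region)
    constexpr int kEdgeDeviceId        = 2;  // second input device object (C region, virtual)

    // First checks that the source is a touch-screen-type device, then
    // tests the identifier against the C-region (virtual) device identifier.
    EventClass classifyByDeviceId(int deviceId, bool isTouchType) {
        if (!isTouchType) return EventClass::NotTouch;
        return (deviceId == kEdgeDeviceId) ? EventClass::EdgeInput
                                           : EventClass::NormalInput;
    }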
In the embodiments of the present invention, the first application module 2037 is configured to process input events related to A-region input; specifically, such processing includes processing and recognition according to the touch point coordinates, duration, number, and so on of the input operation, and reporting the recognition result to the application layer. The second application module 2038 is configured to process input events related to C-region input; specifically, such processing includes processing and recognition according to the touch point coordinates, duration, and number of the operation, and reporting the recognition result to the application layer. For example, from the coordinates, duration, and number of the touch points it can be recognized whether the input operation is a tap or a slide in the A region, or a back-and-forth slide along one edge in the C region.
The application layer 204 includes applications such as camera, gallery, and lock screen (application 1, application 2, ...). Input operations in the embodiments of the present invention include application-level and system-level operations; system-level gesture processing is also classified into the application layer. Application-level operations control an application program, for example, opening, closing, and volume control. System-level operations control the mobile terminal, for example, power-on, acceleration, switching between applications, and global back. The application layer can obtain C-region input events for processing by registering a listener for C-region events, and can likewise obtain A-region input events for processing by registering a listener for A-region events, as sketched below.
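The listener registration mentioned above might look like the following sketch; EventHub, registerCListener, and registerAListener are illustrative names, not APIs of the embodiments.

    #include <functional>
    #include <vector>

    struct Event { int fingerId; float x; float y; };
    using Listener = std::function<void(const Event&)>;

    // The application layer registers listeners for C-region (edge) events
    // and for A-region (normal) events; the framework delivers each
    // recognized event to the matching set of listeners.
    class EventHub {
    public:
        void registerCListener(Listener l) { cListeners_.push_back(std::move(l)); }
        void registerAListener(Listener l) { aListeners_.push_back(std::move(l)); }

        void deliver(const Event& e, bool isEdge) {
            auto& listeners = isEdge ? cListeners_ : aListeners_;
            for (auto& l : listeners) l(e);
        }
    private:
        std::vector<Listener> cListeners_, aListeners_;
    };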
In one embodiment, the mobile terminal sets and stores instructions corresponding to different input operations, including instructions corresponding to edge input operations and instructions corresponding to normal input operations. Upon receiving the reported recognition result of an edge input event, the application layer invokes the corresponding instruction according to the edge input operation to respond to it; upon receiving the reported recognition result of a normal input event, the application layer invokes the corresponding instruction according to the normal input operation to respond to it.
It should be understood that the input events of the embodiments of the present invention include input operations only in the A region, input operations only in the C region, and input operations produced in both the A and C regions simultaneously. Accordingly, the instructions also include instructions corresponding to these three types of input events. The embodiments of the present invention can control the mobile terminal through combinations of A-region and C-region input operations; for example, if the input operation is simultaneously tapping corresponding positions in the A region and the C region, and the corresponding instruction is to close an application, the application can be closed through that combined input operation.
The mobile terminal of the embodiments of the present invention can change the edge touch region correspondingly according to the rotation of the touch screen, so as to better adapt to the user's operation and improve the user experience. Moreover, since the operations of distinguishing the A region from the C region are performed only at the application framework layer, and the virtual device is established at the application framework layer, the hardware dependence of distinguishing the A and C regions at the driver layer is avoided. By setting touch point numbers, fingers can be distinguished, and both protocol A and protocol B are supported. Since the functions of the input reader 2030, the first event processing module 2031, the second event processing module 2032, the first judgment module 2033, the second judgment module 2034, the event dispatch module 2035, the third judgment module 2036, the first application module 2037, the second application module 2038, and so on can be integrated into the operating system of the mobile terminal, the scheme is applicable to mobile terminals of different hardware and different types and has good portability. The input reader (Input Reader) automatically saves all the elements of a touch point (the coordinates, number, and so on of the touch point), which facilitates subsequent judgment of edge input (for example, FIT).
Referring to FIG. 12, a flowchart of the input processing method of an embodiment of the present invention, which includes the following steps:
S1: The driver layer acquires an input event generated by the user through the input device and reports it to the application framework layer.
Specifically, the input device receives the user's input operation (i.e., an input event), converts the physical input into an electrical signal, and passes the electrical signal to the driver layer. In the embodiments of the present invention, input events include A-region input events and C-region input events. A-region input events include input operations performed in the A region such as tap, double-tap, and slide. C-region input events include input operations performed in the C region such as slide up along the left edge, slide down along the left edge, slide up along the right edge, slide down along the right edge, slide up along both edges, slide down along both edges, slide back and forth along one edge, squeeze, and one-handed grip.
The driver layer parses the input position according to the received electrical signal to obtain relevant parameters such as the specific coordinates and duration of the touch point. These parameters are reported to the application framework layer.
In addition, if the driver layer uses protocol A to report input events, step S1 further includes:
assigning each touch point a number (ID) for distinguishing fingers.
Thus, if the driver layer uses protocol A to report input events, the reported data include the above relevant parameters as well as the number of the touch point.
S2: The application framework layer determines whether the input event is an edge input event or a normal input event; if it is a normal input event, step S3 is performed; if it is an edge input event, step S4 is performed.
Specifically, the application framework layer can determine from the coordinates among the relevant parameters of the input event whether it is an edge input event or a normal input event. Referring to FIG. 10 above, the horizontal-axis coordinate of the touch point is obtained first, and the horizontal-axis (X-axis) coordinate x of the touch point is then compared with the C-region width Wc and the touch screen width W. If Wc < x < (W - Wc), the touch point is in the A region and the event is a normal input event; otherwise, the event is an edge input event. If the driver layer uses protocol B to report input events, step S2 further specifically includes: assigning each touch point a number (ID) for distinguishing fingers, and storing all the element information of each touch point (coordinates, duration, number, and so on).
It should be understood that when the touch screen rotates, the corresponding judgment is as described above and is not repeated here.
Thus, by assigning touch point numbers, the embodiments of the present invention can distinguish fingers and support both protocol A and protocol B; and all the elements of a touch point (coordinates, number, and so on) are stored, which facilitates subsequent judgment of edge input (for example, FIT).
In one embodiment, edge input events and normal input events are reported through different channels; edge input events use a dedicated channel.
S3: The application framework layer processes and recognizes the normal input event and reports the recognition result to the application layer.
S4: The application framework layer processes and recognizes the edge input event and reports the recognition result to the application layer.
Specifically, processing and recognition include processing and recognizing according to the touch point coordinates, duration, number, and so on of the input operation, to determine the input operation. For example, from the coordinates, duration, and number of the touch points it can be recognized whether the input operation is a tap or slide in the A region, or a back-and-forth slide along one edge in the C region.
S5: The application layer executes a corresponding instruction according to the reported recognition result.
Specifically, the application layer includes applications such as camera, gallery, and lock screen. Input operations in the embodiments of the present invention include application-level and system-level operations; system-level gesture processing is also classified into the application layer. Application-level operations control an application program, for example, opening, closing, and volume control. System-level operations control the mobile terminal, for example, power-on, acceleration, switching between applications, and global back.
In one embodiment, the mobile terminal sets and stores instructions corresponding to different input operations, including instructions corresponding to edge input operations and instructions corresponding to normal input operations. Upon receiving the reported recognition result of an edge input event, the application layer invokes the corresponding instruction according to the edge input operation to respond to it; upon receiving the reported recognition result of a normal input event, the application layer invokes the corresponding instruction according to the normal input operation to respond to it.
It should be understood that the input events of the embodiments of the present invention include input operations only in the A region, input operations only in the C region, and input operations produced in both the A and C regions simultaneously. Accordingly, the instructions also include instructions corresponding to these three types of input events. The embodiments of the present invention can control the mobile terminal through combinations of A-region and C-region input operations; for example, if the input operation is simultaneously tapping corresponding positions in the A region and the C region, and the corresponding instruction is to close an application, the application can be closed through that combined input operation. A sketch of such an instruction table follows.
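As an illustration of the stored correspondence between recognized input operations and instructions, including one for a combined A-region plus C-region operation as described above, here is a C++ sketch; the gesture keys and the handler bodies are hypothetical placeholders.

    #include <functional>
    #include <map>
    #include <string>

    using Instruction = std::function<void()>;

    // Illustrative table mapping recognition results to instructions.
    std::map<std::string, Instruction> makeInstructionTable() {
        std::map<std::string, Instruction> table;
        table["C:tap_area_1010"]      = [] { /* turn on the camera */ };
        table["C:both_edges_slide"]   = [] { /* accelerate the terminal */ };
        table["A+C:simultaneous_tap"] = [] { /* close the current application */ };
        return table;
    }

    // The application layer looks up the reported recognition result and
    // executes the corresponding instruction, if any.
    void dispatch(const std::map<std::string, Instruction>& table,
                  const std::string& recognized) {
        auto it = table.find(recognized);
        if (it != table.end()) it->second();
    }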
In one embodiment, the input processing method of the embodiments of the present invention further includes:
S11: creating, for each input event, an input device object with a device identifier.
Specifically, in one embodiment, a first input device object with a first identifier may be created for normal input events; the first input device object corresponds to the touch screen serving as the input device. The application framework layer sets a second input device object. The second input device object (for example, a FIT device) is a virtual device, i.e., an empty device, with a second identifier for corresponding to edge input events. It should be understood that, alternatively, edge input events may correspond to the first input device object with the first identifier, and normal input events to the second input device object with the second identifier.
In one embodiment, the input processing method of the embodiments of the present invention further includes:
S21: The application framework layer converts and adjusts the reported coordinates according to the current state of the mobile terminal before reporting them.
Specifically, the current state of the mobile terminal includes rotation angles of 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, and so on.
It should be understood that, in the embodiments of the present invention, for counterclockwise rotation, 90 degrees counterclockwise is the same as 270 degrees clockwise, 180 degrees counterclockwise is the same as 180 degrees clockwise, and 270 degrees counterclockwise is the same as 90 degrees clockwise.
For the specific implementation of converting and adjusting the coordinates, see the descriptions of step S103 and the application framework layer above, which are not repeated here.
In one embodiment, step S21 may be implemented by inputdispatcher::dispatchmotion().
S22: determining, according to the device identifier, whether the input event is an edge input event; if so, step S4 is performed, and if not, step S3 is performed.
Specifically, referring to FIG. 11 above, when determining from the device identifier whether the input event is an edge input event, the device identifier is obtained first, and it is determined from the device identifier whether the device is a touch-screen-type device. If so, it is further determined whether the device identifier is the C-region device identifier, i.e., the identifier of the above second input device object; if so, the event is judged to be an edge input event, and if not, a normal input event. It should be understood that, alternatively, after the device is determined to be a touch-screen-type device, it may be further determined whether the device identifier is the A-region device identifier, i.e., the identifier corresponding to the above first input device object; if so, the event is judged to be a normal input event, and if not, an edge input event.
The input processing method of the embodiments of the present invention can change the edge touch region correspondingly according to the rotation of the touch screen, so as to better adapt to the user's operation and improve the user experience. Moreover, since the operations of distinguishing the A region from the C region are performed only at the application framework layer, and the virtual device is established at the application framework layer, the hardware dependence of distinguishing the A and C regions at the driver layer is avoided. By setting touch point numbers, fingers can be distinguished, and both protocol A and protocol B are supported. The scheme can be integrated into the operating system of the mobile terminal, is applicable to different hardware and different types of mobile terminals, and has good portability. All the elements of a touch point (the coordinates, number, and so on of the touch point) are stored, which facilitates subsequent judgment of edge input (for example, FIT).
Referring to FIG. 13, a schematic diagram of the effect of turning on the camera application of a mobile terminal by using the input processing method of an embodiment of the present invention. The figure on the left of FIG. 13 is a schematic diagram of the main interface of the mobile terminal, in which the area 1010 is a touch point preset in the edge input region (C region 101) for an input operation that turns on the camera function. Specifically, tapping the area 1010 turns on the camera. Accordingly, the mobile terminal stores an instruction to turn on the camera, which corresponds to the input operation of tapping the area 1010.
When the camera is needed, the user taps the area 1010 of the touch screen; the driver layer acquires the input event and reports it to the application framework layer. The application framework layer determines from the coordinates of the touch point that the input event is an edge input event. The application framework layer processes and recognizes the edge input event and, from the touch point coordinates, duration, and number, recognizes the input operation as a tap on the area 1010. The application framework layer reports the recognition result to the application layer, and the application layer executes the instruction to turn on the camera.
Referring to FIG. 14, after the mobile terminal is rotated 90 degrees clockwise, the C region 101 and the touch point that can turn on the camera change correspondingly. The flow of turning on the camera after tapping the area 1010 in FIG. 14 is similar to that of FIG. 13 above.
It should be understood that in FIGS. 13 and 14 the C region is not shown after the camera function is turned on, but it still exists; alternatively, per the above description of dividing the C region, the C-region width may be set relatively wide after the camera is opened, and so on, as will be understood by those skilled in the art.
Referring to FIG. 15, a schematic diagram of the touch screen division of the mobile terminal according to the second embodiment of the present invention. In this embodiment, to prevent the recognition accuracy from dropping when the user's input drifts away from the region where the input started, a transition region 103 (T region) is added at the edge of the touch panel of the mobile terminal.
In this embodiment, if an input event starts in the C region and drifts into the T region, the current slide is still considered an edge gesture; if an input event starts in the C region and drifts into the A region, the current edge gesture is considered ended and a normal input event begins; if an input event starts in the T region or the A region, the current slide is considered a normal input event no matter which region of the touch panel it subsequently slides into.
The reporting flow of input events in this embodiment is the same as the interaction control method described in the above embodiments, the only difference being that when the application framework layer processes and recognizes edge input events, it must judge according to the above three cases to determine the exact input event. For example, if the application framework layer determines from the touch points reported for an input event that the input event started in the C region and drifted into the A region (i.e., the touch point coordinates at the start of the input were in the C region, and the coordinates of some touch point during the input were in the A region), then the result of the coordinate judgment by the first judgment module and the second judgment module is that the input event is an edge input event, the current edge input event has ended, and a normal input event begins; the driver layer then starts reporting the next input event. A sketch of these three rules follows.
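The three rules can be sketched as a small state tracker; Region, GestureKind, and GestureTracker are illustrative names, and the classification of each point into the A, C, or T region is assumed to be performed by the judgment modules described above.

    enum class Region { A, C, T };
    enum class GestureKind { Edge, Normal };

    struct GestureTracker {
        bool started = false;
        GestureKind kind = GestureKind::Normal;

        // Called for every touch point of the current slide.
        void onPoint(Region r) {
            if (!started) {
                started = true;
                // Starting in C begins an edge gesture; starting in T or A
                // makes the slide a normal input event wherever it goes later.
                kind = (r == Region::C) ? GestureKind::Edge : GestureKind::Normal;
                return;
            }
            // C -> T keeps the edge gesture; C -> A ends it, and a normal
            // input event begins with the next reported event.
            if (kind == GestureKind::Edge && r == Region::A) {
                kind = GestureKind::Normal;
                started = false;  // edge gesture ended; tracking restarts
            }
        }
    };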
The mobile terminal of the embodiments of the present invention can be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers.
Correspondingly, an embodiment of the present invention further provides a user equipment; FIG. 16 is a schematic diagram of its hardware structure. Referring to FIG. 16, the user equipment 1000 includes a touch screen 2010, a controller 200, a storage device 310, a global positioning system (GPS) chip 320, a communicator 330, a video processor 340, an audio processor 350, a button 360, a microphone 370, a camera 380, a speaker 390, and a motion sensor 906.
The touch screen 2010 may be divided into the A region and the C region, or the A, C, and T regions, as described above. The touch screen 2010 may be implemented as various types of displays, such as a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, and a plasma display panel (PDP). The touch screen 2010 may include a driving circuit, which can be implemented, for example, as an a-Si TFT, a low-temperature polysilicon (LTPS) TFT, or an organic TFT (OTFT), and a backlight unit.
Meanwhile, the touch screen 2010 may include a touch sensor for sensing the user's touch gestures. The touch sensor may be implemented as various types of sensors, such as capacitive, resistive, or piezoelectric. The capacitive type calculates the touch coordinate value by sensing the micro-current excited by the user's body when a part of the user's body (for example, a finger) touches the surface of the touch screen, which is coated with a conductive material. According to the resistive type, the touch screen includes two electrode plates, and the touch coordinate value is calculated by sensing the current that flows when the upper and lower plates at the touch point come into contact as the user touches the touch panel. In addition, when the user equipment 1000 supports a pen input function, the touch screen 2010 may sense user gestures made with an input device other than the user's finger, such as a pen. When the input device is a stylus pen including a coil, the user equipment 1000 may include a magnetic sensor (not shown) for sensing a magnetic field that changes according to the proximity of the coil inside the stylus pen to the magnetic sensor. Thus, in addition to touch gestures, the user equipment 1000 can also sense a proximity gesture, i.e., the stylus pen hovering over the user equipment 1000.
The storage device 310 may store various programs and data required for the operation of the user equipment 1000. For example, the storage device 310 may store programs and data for constructing the various screens to be displayed on the respective regions (for example, the A region and the C region).
The controller 200 displays content on each region of the touch screen 2010 by using the programs and data stored in the storage device 310.
The controller 200 includes a RAM 210, a ROM 220, a CPU 230, a graphics processing unit (GPU) 240, and a bus 250. The RAM 210, the ROM 220, the CPU 230, and the GPU 240 may be connected to one another via the bus 250.
The processor (CPU) 230 accesses the storage device 310 and performs booting using the operating system (OS) stored in the storage device 310. Moreover, the CPU 230 performs various operations by using the various programs, contents, and data stored in the storage device 310.
The ROM 220 stores a command set for system startup. When a turn-on command is input and power is supplied, the CPU 230 copies the OS stored in the storage device 310 to the RAM 210 according to the command set stored in the ROM 220, and starts the system by running the OS. When startup is complete, the CPU 230 copies the various programs stored in the storage device 310 to the RAM 210, and performs various operations by running the copied programs in the RAM 210. Specifically, the GPU 240 can generate a screen including various objects such as icons, images, and text by using a calculator (not shown) and a renderer (not shown). The calculator calculates feature values such as coordinate values, format, size, and color, with which the objects are marked according to the layout of the screen.
The GPS chip 320 is a unit that receives GPS signals from GPS satellites and calculates the current position of the user equipment 1000. When a navigation program is used or when the user's current position is requested, the controller 200 can calculate the user's position by using the GPS chip 320.
The communicator 330 is a unit that performs communication with various types of external devices according to various types of communication methods. The communicator 330 includes a WiFi chip 331, a Bluetooth chip 332, a wireless communication chip 333, and an NFC chip 334. The controller 200 performs communication with various external devices by using the communicator 330.
The WiFi chip 331 and the Bluetooth chip 332 perform communication according to the WiFi method and the Bluetooth method, respectively. When the WiFi chip 331 or the Bluetooth chip 332 is used, various connection information such as a service set identifier (SSID) and a session key may first be transmitted and received, a communication connection may be established by using the connection information, and various information may then be transmitted and received. The wireless communication chip 333 is a chip that performs communication according to various communication standards such as IEEE, Zigbee, third generation (3G), Third Generation Partnership Project (3GPP), and Long Term Evolution (LTE). The NFC chip 334 is a chip that operates according to the near field communication (NFC) method using the 13.56 MHz band among various RFID frequency bands such as 135 kHz, 13.56 MHz, 433 MHz, 860-960 MHz, and 2.45 GHz.
The video processor 340 is a unit that processes video data included in content received through the communicator 330 or content stored in the storage device 310. The video processor 340 can perform various image processing on video data, such as decoding, scaling, noise filtering, frame rate conversion, and resolution conversion.
The audio processor 350 is a unit that processes audio data included in content received through the communicator 330 or content stored in the storage device 310. The audio processor 350 can perform various processing on audio data, such as decoding, amplification, and noise filtering.
When a reproduction program is run for multimedia content, the controller 200 can reproduce the corresponding content by driving the video processor 340 and the audio processor 350.
The speaker 390 outputs the audio data generated in the audio processor 350.
The button 360 may be any of various types of buttons, such as a mechanical button, or a touch pad or touch wheel formed on some region of the main outer body of the user equipment 1000, such as the front, side, or back.
The microphone 370 is a unit that receives user voice or other sounds and converts them into audio data. The controller 200 may use the user voice input through the microphone 370 during a call, or convert it into audio data and store it in the storage device 310.
The camera 380 is a unit that captures still images or video images under the user's control. The camera 380 may be implemented as multiple units, such as a front camera and a rear camera. As described below, the camera 380 may be used as a means of obtaining user images in exemplary embodiments that track the user's gaze.
When the camera 380 and the microphone 370 are provided, the controller 200 can perform control operations according to the user's voice input through the microphone 370 or the user's motion recognized by the camera 380. Thus, the user equipment 1000 can operate in a motion control mode or a voice control mode. When operating in the motion control mode, the controller 200 photographs the user by activating the camera 380, tracks changes in the user's motion, and performs the corresponding operation. When operating in the voice control mode, the controller 200 can operate in a voice recognition mode to analyze the voice input through the microphone 370 and perform control operations according to the analyzed user voice.
In the user equipment 1000 supporting the motion control mode or the voice control mode, voice recognition technology or motion recognition technology is used in the various exemplary embodiments described above. For example, when the user performs a motion such as selecting an object marked on the home screen, or speaks a voice command corresponding to an object, it may be determined that the corresponding object is selected, and a control operation matching that object may be performed.
The motion sensor 906 is a unit that senses the movement of the main body of the user equipment 1000. The user equipment 1000 may rotate or tilt in various directions. The motion sensor 906 may sense movement features such as rotation direction, angle, and slope by using one or more of various sensors such as a geomagnetic sensor, a gyroscope sensor, and an acceleration sensor. It should be understood that when the user equipment rotates, the touch screen rotates correspondingly, and through the same angle as the user equipment.
Moreover, although not shown in FIG. 16, according to exemplary embodiments, the user equipment 1000 may further include a USB port to which a USB connector can be connected, various input ports for connecting various external components such as a headphone, a mouse, a LAN, and a DMB chip for receiving and processing digital multimedia broadcasting (DMB) signals, and various other sensors.
As described above, the storage device 310 may store various programs.
Based on the user equipment shown in FIG. 16, in the embodiments of the present invention, the touch screen is configured to detect a touch signal generated on the touch panel and to identify a touch point according to the touch signal.
The motion sensor is configured to detect the rotation angle of the user equipment.
The processor includes a driver module, an application framework module, and an application module;
the driver module is configured to acquire an input event according to the touch signal and report it to the application framework module;
the application framework module is configured to determine, according to the touch point position of the reported input event and the rotation angle, whether the touch point is located in the edge touch region or the normal touch region; if it is located in the edge touch region, the event is processed and recognized and the recognition result is reported to the application module; if it is located in the normal touch region, the event is processed and recognized and the recognition result is reported to the application module;
the application module is configured to execute the corresponding instruction according to the reported recognition result.
It should be understood that the working principles and details of the modules of the user equipment of this embodiment are the same as those described in the above embodiments and are not repeated here.
The touch control method, user equipment, input processing method, and mobile terminal of the embodiments of the present invention can change the edge touch region correspondingly according to the rotation of the touch screen, so as to better adapt to the user's operation and improve the user experience. Moreover, since the operations of distinguishing the A region from the C region are performed only at the application framework layer, and the virtual device is established at the application framework layer, the hardware dependence of distinguishing the A and C regions at the driver layer is avoided. By setting touch point numbers, fingers can be distinguished, and both protocol A and protocol B are supported. The scheme can be integrated into the operating system of the mobile terminal, is applicable to different hardware and different types of mobile terminals, and has good portability. All the elements of a touch point (the coordinates, number, and so on of the touch point) are stored, which facilitates subsequent judgment of edge input (for example, FIT).
Any process or method description in the flowcharts, or otherwise described in the embodiments of the present invention, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process; and the scope of the embodiments of the present invention includes additional implementations in which functions may be performed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functions involved, as should be understood by those skilled in the art of the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the specific embodiments described above; the specific embodiments described above are merely illustrative rather than restrictive. Under the inspiration of the present invention, those of ordinary skill in the art may devise many other forms without departing from the spirit of the present invention and the scope protected by the claims, and all of these fall within the protection of the present invention.

Claims (24)

  1. A touch control method, comprising:
    detecting a touch signal generated on a touch panel;
    identifying a touch point according to the touch signal;
    detecting a rotation angle of the touch panel;
    determining, according to the identified touch point and the rotation angle, whether the touch point is located in an edge touch region or a normal touch region; and
    executing a corresponding instruction based on the determination result.
  2. The touch control method according to claim 1, wherein the rotation angle comprises: rotation by 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise, and 270 degrees counterclockwise.
  3. The touch control method according to claim 2, wherein determining, according to the identified touch point and the rotation angle, whether the touch point is located in the edge touch region or the normal touch region comprises:
    if the rotation angle is 0 degrees, the touch point is located in the normal touch region when Wc < x < (W - Wc); otherwise, the touch point is located in the edge touch region;
    if the rotation angle is 90 degrees clockwise, the touch point is located in the normal touch region when Wc < y < (H - Wc); otherwise, the touch point is located in the edge touch region;
    if the rotation angle is 180 degrees clockwise, the touch point is located in the normal touch region when Wc < x < (W - Wc); otherwise, the touch point is located in the edge touch region;
    if the rotation angle is 270 degrees clockwise, the touch point is located in the normal touch region when Wc < y < (H - Wc); otherwise, the touch point is located in the edge touch region;
    where x is the horizontal-axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical-axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch region.
  4. A user equipment, comprising: a touch screen, a motion sensor, and a processor;
    the touch screen comprising a touch panel and a touch controller, wherein:
    the touch panel is configured to detect a touch signal generated on the touch panel;
    the touch controller is configured to identify a touch point according to the touch signal;
    the motion sensor is configured to detect a rotation angle of the user equipment;
    the processor comprises a driver module, an application framework module, and an application module, wherein:
    the driver module is configured to acquire an input event according to the touch signal and report it to the application framework module;
    the application framework module is configured to determine, according to the rotation angle and the touch point position of the reported input event, whether the touch point is located in an edge touch region or a normal touch region; and
    the application module is configured to execute a corresponding instruction based on the determination result.
  5. An input processing method, comprising:
    a driver layer acquiring an input event generated by a user through an input device and reporting it to an application framework layer;
    the application framework layer determining, according to a current state of a mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, processing and recognizing the normal input event and reporting the recognition result to an application layer; and if it is an edge input event, processing and recognizing the edge input event and reporting the recognition result to the application layer; and
    the application layer executing a corresponding instruction according to the reported recognition result.
  6. The input processing method according to claim 5, wherein the method further comprises:
    creating, for each input event, an input device object with a device identifier.
  7. The input processing method according to claim 6, wherein creating, for each input event, an input device object with a device identifier comprises:
    associating normal input events with a touch screen having a first device identifier; and
    the application framework layer setting a second input device object having a second device identifier to correspond to edge input events.
  8. The input processing method according to claim 5, wherein the driver layer acquiring the input event generated by the user through the input device and reporting it to the application framework layer comprises:
    the driver layer assigning each touch point a number for distinguishing fingers, and reporting the input event using protocol A.
  9. The input processing method according to claim 5, wherein the driver layer acquiring the input event generated by the user through the input device and reporting it to the application framework layer comprises:
    the driver layer reporting the input event using protocol B;
    the method further comprising:
    the application framework layer assigning each touch point in the input event a number for distinguishing fingers.
  10. The input processing method according to any one of claims 5-9, wherein the current state of the mobile terminal comprises: rotation by 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise, and 270 degrees counterclockwise.
  11. The input processing method according to claim 10, wherein if the rotation angle is 0 degrees, the application framework layer judges the input event to be a normal input event when Wc < x < (W - Wc), and otherwise an edge input event;
    if the rotation angle is 90 degrees clockwise, the application framework layer judges the input event to be a normal input event when Wc < y < (H - Wc), and otherwise an edge input event;
    if the rotation angle is 180 degrees clockwise, the application framework layer judges the input event to be a normal input event when Wc < x < (W - Wc), and otherwise an edge input event;
    if the rotation angle is 270 degrees clockwise, the application framework layer judges the input event to be a normal input event when Wc < y < (H - Wc), and otherwise an edge input event;
    where x is the horizontal-axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical-axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch region.
  12. A mobile terminal, comprising:
    an input device;
    a motion sensor configured to detect a current state of the mobile terminal;
    a driver layer configured to acquire an input event generated by a user through the input device and report it to an application framework layer;
    the application framework layer, configured to determine, according to the current state of the mobile terminal and the reported input event, whether the input event is an edge input event or a normal input event; if it is a normal input event, to process and recognize the normal input event and report the recognition result to an application layer; and if it is an edge input event, to process and recognize the edge input event and report the recognition result to the application layer; and
    the application layer, configured to execute a corresponding instruction according to the reported recognition result.
  13. The mobile terminal according to claim 12, wherein the normal input event corresponds to a first input device object having a first device identifier; and
    the application framework layer is further configured to set a second input device object having a second device identifier, configured to correspond to the edge input event.
  14. The mobile terminal according to claim 12, wherein the driver layer reports input events using protocol A or protocol B; if protocol A is used to report input events, the event acquisition module is further configured to assign each touch point a number for distinguishing fingers; and
    if protocol B is used to report input events, the application framework layer is further configured to assign each touch point a number for distinguishing fingers.
  15. The mobile terminal according to claim 12, wherein the driver layer comprises an event acquisition module configured to acquire input events generated by the user through the input device.
  16. The mobile terminal according to claim 12, wherein the application framework layer comprises an input reader;
    the mobile terminal further comprises a device node disposed between the driver layer and the input reader, configured to notify the input reader to acquire input events; and
    the input reader is configured to traverse the device node, acquire input events, and report them.
  17. The mobile terminal according to claim 12, wherein the current state of the mobile terminal comprises: rotation by 0 degrees, 90 degrees clockwise, 180 degrees clockwise, 270 degrees clockwise, 90 degrees counterclockwise, 180 degrees counterclockwise, and 270 degrees counterclockwise.
  18. The mobile terminal according to claim 17, wherein the application framework layer further comprises: a first event processing module configured to perform coordinate calculation on the input events reported by the input reader and then report them; and
    a first judgment module configured to determine, according to the current state of the mobile terminal and the coordinate values reported by the first event processing module, whether an input event is an edge input event, and if not, to report the input event.
  19. The mobile terminal according to claim 18, wherein the application framework layer further comprises:
    a second event processing module configured to perform coordinate calculation on the input events reported by the input reader and then report them; and
    a second judgment module configured to determine, according to the current state of the mobile terminal and the coordinate values reported by the second event processing module, whether an input event is an edge input event, and if so, to report the input event.
  20. The mobile terminal according to claim 18 or 19, wherein if the rotation angle is 0 degrees, the determination result is that the input event is a normal input event when Wc < x < (W - Wc), and otherwise an edge input event;
    if the rotation angle is 90 degrees clockwise, the determination result is that the input event is a normal input event when Wc < y < (H - Wc), and otherwise an edge input event;
    if the rotation angle is 180 degrees clockwise, the determination result is that the input event is a normal input event when Wc < x < (W - Wc), and otherwise an edge input event;
    if the rotation angle is 270 degrees clockwise, the determination result is that the input event is a normal input event when Wc < y < (H - Wc), and otherwise an edge input event;
    where x is the horizontal-axis coordinate of the touch point in the coordinate system of the touch panel, y is the vertical-axis coordinate of the touch point in that coordinate system, W is the width of the touch panel, H is the height of the touch panel, and Wc is the width of the edge touch region.
  21. The mobile terminal according to claim 20, wherein the application framework layer further comprises:
    an event dispatch module configured to report the events reported by the second judgment module and the first judgment module.
  22. The mobile terminal according to claim 21, wherein the application framework layer further comprises:
    a first application module;
    a second application module; and
    a third judgment module configured to determine, according to the device identifier contained in an event reported by the event dispatch module, whether the event is an edge input event; if so, to report it to the second application module, and otherwise to report it to the first application module;
    the first application module being configured to recognize normal input events according to relevant parameters of the normal input events and to report the recognition result to the application layer; and
    the second application module being configured to recognize edge input events according to relevant parameters of the edge input events and to report the recognition result to the application layer.
  23. The mobile terminal according to claim 12, wherein the input device is a touch screen of the mobile terminal; and
    the touch screen comprises at least one edge input region and at least one normal input region.
  24. The mobile terminal according to claim 12, wherein the input device is a touch screen of the mobile terminal; and
    the touch screen comprises at least one edge input region, at least one normal input region, and at least one transition region.
PCT/CN2016/102777 2015-11-20 2016-10-20 Touch control method, user equipment, input processing method and mobile terminal WO2017084469A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510819757.8 2015-11-20
CN201510819757.8A CN105335007B (zh) 2015-11-20 2015-11-20 Touch control method, user equipment, input processing method and mobile terminal

Publications (1)

Publication Number Publication Date
WO2017084469A1 true WO2017084469A1 (zh) 2017-05-26

Family

ID=55285599

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/102777 WO2017084469A1 (zh) 2015-11-20 2016-10-20 触摸控制方法、用户设备、输入处理方法和移动终端

Country Status (2)

Country Link
CN (1) CN105335007B (zh)
WO (1) WO2017084469A1 (zh)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105335007B (zh) * 2015-11-20 2019-10-08 努比亚技术有限公司 触摸控制方法、用户设备、输入处理方法和移动终端
CN107479745B (zh) * 2017-07-31 2020-07-21 北京雷石天地电子技术有限公司 一种配置触摸屏的方法、模块及操作系统
CN107844220B (zh) * 2017-11-29 2020-02-11 广州视源电子科技股份有限公司 触感信号的处理方法、系统、装置及电子设备


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101676843A (zh) * 2008-09-18 2010-03-24 Lenovo (Beijing) Co., Ltd. Touch input method and touch input device
CN102236468A (zh) * 2010-04-26 2011-11-09 HTC Corporation Sensing method, computer program product and portable device
CN104583903A (zh) * 2013-11-26 2015-04-29 Huawei Technologies Co., Ltd. Method, system and terminal for preventing accidental touch operations
CN104735256A (zh) * 2015-03-27 2015-06-24 Nubia Technology Co., Ltd. Method and device for determining the grip mode of a mobile terminal
CN105335007A (zh) * 2015-11-20 2016-02-17 Nubia Technology Co., Ltd. Touch control method, user equipment, input processing method and mobile terminal
CN105511675A (zh) * 2015-11-20 2016-04-20 Nubia Technology Co., Ltd. Touch control method, user equipment, input processing method, mobile terminal and smart terminal

Also Published As

Publication number Publication date
CN105335007A (zh) 2016-02-17
CN105335007B (zh) 2019-10-08

Similar Documents

Publication Publication Date Title
WO2017097097A1 (zh) Touch control method, user equipment, input processing method, mobile terminal, and smart terminal
US11749151B2 (en) Display apparatus and method for displaying
US11042185B2 (en) User terminal device and displaying method thereof
US11054988B2 (en) Graphical user interface display method and electronic device
US10996834B2 (en) Touchscreen apparatus user interface processing method and touchscreen apparatus
WO2017084470A1 (zh) Mobile terminal, input processing method, user equipment, and computer storage medium
KR102427833B1 (ko) User terminal device and display method
KR101515620B1 (ko) User terminal device and control method thereof
KR102519800B1 (ko) Electronic device
US10067666B2 (en) User terminal device and method for controlling the same
US11157127B2 (en) User terminal apparatus and controlling method thereof
US20160139797A1 (en) Display apparatus and contol method thereof
KR20150094479A (ko) User terminal device and display method thereof
WO2017088694A1 (zh) Gesture calibration method and apparatus, gesture input processing method, and computer storage medium
US10095384B2 (en) Method of receiving user input by detecting movement of user and apparatus therefor
CN105824531A (zh) Numerical value adjustment method and device
KR20150134674A (ko) User terminal, control method thereof, and multimedia system
WO2017084469A1 (zh) Touch control method, user equipment, input processing method, and mobile terminal
US10474335B2 (en) Image selection for setting avatars in communication applications
KR102351634B1 (ko) User terminal device, sound system, and method for controlling the volume of an external speaker
WO2013177761A1 (zh) Display control method and device
KR102492182B1 (ko) User terminal device and control method thereof
EP3287887B1 (en) User terminal apparatus and controlling method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16865647

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16865647

Country of ref document: EP

Kind code of ref document: A1