CN114035721A - Touch screen display method and device and storage medium - Google Patents

Touch screen display method and device and storage medium

Info

Publication number
CN114035721A
CN114035721A
Authority
CN
China
Prior art keywords
terminal
mode
touch
cursor
touch screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210012992.4A
Other languages
Chinese (zh)
Inventor
聂光 (Nie Guang)
高杨 (Gao Yang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202210012992.4A priority Critical patent/CN114035721A/en
Publication of CN114035721A publication Critical patent/CN114035721A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04812: Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F 3/04817: Interaction techniques using icons
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Abstract

Embodiments of this application provide a touch screen display method, a touch screen display device, and a storage medium, applied in the field of terminal technologies. The method includes the following steps: when a terminal device is in a first mode, receiving a trigger operation from a touch object; in response to the trigger operation, the terminal device displays a floating cursor in a second mode; and when the terminal device is in the second mode, if the terminal device receives a second sliding operation on the user interface, the terminal device controls the floating cursor to move with the sliding position of the second sliding operation, without changing the page content in the user interface. This reduces the probability that the terminal device mistakenly responds to a user's touch operation and displays an unrelated interface, improving the user experience.

Description

Touch screen display method and device and storage medium
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a touch screen display method and apparatus, and a storage medium.
Background
With the development of electronic technology, electronic devices equipped with touch screens are widely used in various fields, and users can control the electronic devices through touch operations.
For example, the touch operation may include a click operation and a slide operation. When performing a click operation, a user may click an icon on the touch screen with a finger or a stylus. The icon may correspond to an application (APP), a website link, a text document, or the like, and the electronic device may open the APP, the webpage, or the document in response to the click operation. The user can also slide on the touch screen to turn a page or scroll the screen interface up and down.
However, when a user is giving a presentation or projecting a screen with an electronic device, the user may habitually use a finger or a stylus to point at a specific position on the touch screen in order to direct the audience's attention to particular content. For example, the user may click a position on the touch screen or slide within an area of it. The terminal device may detect such an operation as a trigger and perform an interface jump or the like, interrupting the presentation, which is inconvenient for the presenter and also degrades the audience's viewing experience.
Disclosure of Invention
Embodiments of this application provide a touch screen display method, a touch screen display device, and a storage medium. A second mode is set in the terminal device. In the second mode, a floating cursor can be displayed on the touch screen and can move with a sliding operation on the touch screen without changing the page content displayed by the terminal device. This reduces the probability that the terminal device mistakenly responds to a user's touch operation and displays an unrelated interface, improving the user experience.
In a first aspect, an embodiment of this application provides a touch screen display method, including: when the terminal device is in a first mode, receiving a trigger operation from a touch object; in the first mode, if the terminal device receives a first sliding operation on the user interface, the page content of the user interface changes with the first sliding operation; in response to the trigger operation, the terminal device displays a floating cursor in a second mode; and when the terminal device is in the second mode, if the terminal device receives a second sliding operation on the user interface, the terminal device controls the floating cursor to move with the sliding position of the second sliding operation, without changing the page content in the user interface. In this implementation, the terminal device provides both a first mode and a second mode: the user can operate the terminal device normally in the first mode, and in the second mode the floating cursor reduces the probability that the terminal device displays an unrelated interface after mistakenly responding to a touch operation, improving the user experience.
It should be noted that, for convenience of description, the embodiments of this application use a handwriting mode (also referred to as the first mode) and a cursor mode (also referred to as the second mode) for illustration; in an actual implementation, the terminal device is not necessarily limited to these two modes.
For example, when the terminal device is in the handwriting mode, it can be understood that the terminal device executes a handwriting function: when a touch operation is received on the touch screen, the terminal device changes the content displayed in the user interface based on the touch operation. For example, when the terminal device receives a sliding operation of a stylus or a finger on the screen, it may turn a page, scroll a page, display a sliding track on a page, display a dynamic effect, or display a prompt box for deleting a message; when the terminal device receives a click operation of a stylus or a finger on the screen, it may jump to the page corresponding to the clicked control, and so on.
When the terminal device is in the cursor mode, it can be understood that the terminal device behaves similarly to when it receives mouse input. When a touch operation is received on the touch screen, the terminal device changes the position of the floating cursor based on the touch operation without changing the content displayed in the user interface. For example, in the cursor mode the floating cursor is displayed on the screen; when the touch screen receives a sliding operation of a stylus or a finger, the terminal device controls the floating cursor to move with the sliding operation, and when it receives a click operation, it moves the floating cursor to the position of the click operation, and so on.
In a possible implementation, when the terminal device is in the first mode, if the terminal device receives a click operation on a target control on the user interface, the terminal device jumps to the page corresponding to the target control; and/or, when the terminal device is in the second mode, if the terminal device receives a click operation on a target control on the user interface, the terminal device moves the floating cursor to the position on the touch screen triggered by the click operation. With both modes set in the terminal device, the user can operate the terminal device normally in the first mode, and in the second mode the floating cursor reduces the probability that the terminal device displays an unrelated interface after mistakenly responding to a touch operation, improving the user experience.
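The mode-dependent click behavior above can be sketched as a toy model; this is an illustrative reading, not the patent's implementation, and all class and attribute names are assumptions:

```python
class TouchDispatcher:
    """Toy model of the first-mode / second-mode click handling."""
    HANDWRITING = "first_mode"  # normal touch handling
    CURSOR = "second_mode"      # floating-cursor handling

    def __init__(self):
        self.mode = self.HANDWRITING
        self.cursor_pos = (0, 0)
        self.page = "home"

    def on_click(self, x, y, target_page):
        if self.mode == self.HANDWRITING:
            # First mode: a click on a control jumps to its page.
            self.page = target_page
        else:
            # Second mode: a click only repositions the floating cursor;
            # the displayed page is left unchanged.
            self.cursor_pos = (x, y)
        return self.page, self.cursor_pos
```

A click in the first mode thus changes the displayed page, while the same click in the second mode only moves the cursor.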
In a possible implementation, the controlling, by the terminal device, of the floating cursor to move with the sliding position of the second sliding operation includes: when the terminal device detects a first contact event of the touch object, the terminal device enters a first state; when the terminal device is in the first state, if it detects that the touch object has not left the touch screen and the touch object is displaced on the touch screen, the terminal device enters a second state; and when the terminal device is in the second state, the terminal device controls the floating cursor to move according to the report-point displacement of the touch object. In this way, in the second mode, the first contact event takes the device through the first and second states, and the floating cursor moves in the second state, improving the user experience.
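The state transitions just described (first contact event enters the first state, displacement without lifting enters the second state, and the cursor then follows the displacement) can be sketched roughly as below; the state names and event API are assumptions:

```python
IDLE, FIRST_STATE, SECOND_STATE = 0, 1, 2

class FloatingCursor:
    """Toy state machine for the second sliding operation in the second mode."""

    def __init__(self, x=0, y=0):
        self.state = IDLE
        self.x, self.y = x, y

    def on_contact(self):
        # A first contact event of the touch object: enter the first state.
        self.state = FIRST_STATE

    def on_move(self, dx, dy):
        # Displacement while the touch object stays on the screen:
        # enter the second state and follow the reported displacement.
        if self.state == FIRST_STATE:
            self.state = SECOND_STATE
        if self.state == SECOND_STATE:
            self.x += dx
            self.y += dy

    def on_lift(self):
        self.state = IDLE
```

Only the cursor position changes here; nothing in this path touches the page content, which matches the behavior claimed for the second mode.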
In a possible implementation, the controlling, by the terminal device, of the floating cursor to move according to the report-point displacement of the touch object includes: the terminal device converts the report-point information of the touch object into coordinate information, and controls the floating cursor to move according to that coordinate information. Because the coordinate information locates the floating cursor precisely, the terminal device can accurately control its movement, improving the user experience.
It should be noted that this application does not limit the specific manner in which the terminal device controls the floating cursor to move according to the coordinate information. In one possible implementation, the coordinate information is displacement information, and the terminal device computes the actual cursor position from the displacement and draws the cursor there. In another possible implementation, the coordinate information is absolute coordinate information, that is, the terminal device locates the actual cursor position directly from the coordinates and draws the cursor at that position. In either case, the floating cursor is controlled to move with the report-point displacement of the touch object.
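The two readings of the coordinate information above (relative displacement versus absolute position) can be sketched as follows; the function names are illustrative assumptions:

```python
def cursor_from_displacement(current, delta):
    """Relative reading: the coordinate information is a displacement,
    so the new cursor position is computed from the current one."""
    return (current[0] + delta[0], current[1] + delta[1])

def cursor_from_absolute(coords):
    """Absolute reading: the coordinate information is the actual cursor
    position, so the cursor is drawn there directly."""
    return coords
```

Either way, the cursor ends up tracking the report-point displacement of the touch object.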
In a possible implementation, the converting, by the terminal device, of the report-point information of the touch object into coordinate information includes: the terminal device removes all information other than the coordinate information from the report-point information to obtain the coordinate information. With the coordinate information, the terminal device can accurately locate the floating cursor and control its movement, improving the user experience.
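One way to read "removing all information other than the coordinate information" is a simple field filter over the touch report; the report fields shown here are assumptions, not the patent's format:

```python
def report_to_coords(report: dict) -> tuple:
    """Strip a touch report down to its coordinate fields, discarding
    extras such as pressure, tilt, or tool type."""
    return (report["x"], report["y"])

raw_report = {"x": 120, "y": 340, "pressure": 0.8, "tilt": 15, "tool": "stylus"}
coords = report_to_coords(raw_report)  # (120, 340)
```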
In a possible implementation, when the terminal device is in the first state, if the terminal device detects that the touch object leaves the touch screen without having been displaced on it, the terminal device enters a third state. When the terminal device is in the third state, if it detects a second contact event of the touch object, and the time interval between the second contact event and the first contact event is smaller than a time threshold, and the distance between the touch-screen positions corresponding to the second and first contact events is smaller than a distance threshold, the terminal device enters a fourth state. When the terminal device is in the fourth state, if it detects that the touch object is displaced on the touch screen, the display content at the displaced position on the touch screen is highlighted; or, if it detects that the touch object leaves the touch screen, the terminal device displays a focus cursor at the position of the second contact event. In this way, when the terminal device is in the second mode, double-click and double-click-and-slide operations of the touch object can simulate mouse clicking and left-button dragging, enriching the cursor's applications in the second mode and improving the user experience.
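The four-state flow above (tap, lift, second tap within time and distance thresholds, then either drag-highlight or a focus cursor) resembles ordinary double-tap recognition. A rough sketch, with threshold values and state names chosen arbitrarily for illustration:

```python
import math

TIME_THRESHOLD = 0.3   # max seconds between the two contact events (assumed)
DIST_THRESHOLD = 20.0  # max distance between the two contact positions (assumed)

class DoubleTapTracker:
    def __init__(self):
        self.state = "idle"
        self.first_contact = None  # (time, x, y) of the first contact event

    def on_contact(self, t, x, y):
        if self.state == "idle":
            self.state = "first"                   # first state
            self.first_contact = (t, x, y)
        elif self.state == "third":
            t0, x0, y0 = self.first_contact
            if (t - t0) < TIME_THRESHOLD and math.hypot(x - x0, y - y0) < DIST_THRESHOLD:
                self.state = "fourth"              # fourth state: double tap
            else:
                self.state = "first"               # treat as a fresh first tap
                self.first_contact = (t, x, y)

    def on_lift(self, x, y):
        if self.state == "first":
            self.state = "third"                   # lifted without displacement
            return None
        if self.state == "fourth":
            self.state = "idle"
            return ("focus_cursor", x, y)          # focus cursor at second tap
        return None

    def on_move(self):
        if self.state == "fourth":
            return "highlight"                     # simulate a left-button drag
        return None
```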
In a possible implementation, receiving the trigger operation from the touch object when the terminal device is in the first mode includes: when the terminal device is in the first mode, the terminal device displays a first interface that includes a floating button; when the floating button is triggered, the terminal device expands the floating button on the first interface, the expanded floating button including a first control corresponding to the first mode and a second control corresponding to the second mode; and the terminal device receives a trigger operation on the second control. The floating button thus provides a simple and quick way to switch to the second mode, shortening the mode-switching time and improving the user experience.
In a possible implementation manner, before the terminal device displays the floating cursor in the second mode, the method includes: the terminal device switches from the first mode to the second mode. Therefore, the terminal equipment can realize the switching between the first mode and the second mode, and the user experience is improved.
In a possible implementation, the switching, by the terminal device, from the first mode to the second mode includes: the terminal device registers a virtual cursor device, and switches the module that processes events generated on the touch screen from a handwriting event conversion module to an event adaptation processing module. The handwriting event conversion module processes handwriting events on the touch screen, and the event adaptation processing module processes cursor input events on the touch screen. Because the virtual cursor device is set up at the software layer and only the event-processing module is swapped, switching from the first mode to the second mode does not depend on hardware, does not trigger reconnection of underlying devices or changes to device nodes, and improves the user experience.
In a possible implementation, the registering, by the terminal device, of the virtual cursor device includes: the terminal device creates a virtual device identifier, uses the virtual device identifier to create a virtual input device, and sets the input device to correspond to the touch object. In this way, the virtual cursor device is set up at the software layer, so switching from the first mode to the second mode does not trigger reconnection of underlying devices or changes to device nodes, improving the user experience.
In a possible implementation, the switching, by the terminal device, of the module that processes events generated on the touch screen from the handwriting event conversion module to the event adaptation processing module includes: the terminal device deletes the handwriting event conversion module and adds the event adaptation processing module. When processing cursor events generated on the touch screen, the terminal device thus keeps only the in-use event adaptation processing module and deletes the unused handwriting event conversion module, reducing the memory usage of the terminal device.
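Putting the registration and module-swap steps together, the mode switch might look like the following sketch; every name here is an assumption made for illustration, not the patent's code:

```python
class Terminal:
    def __init__(self):
        self.mode = "handwriting"                   # first mode
        self.input_devices = {}
        self.event_module = "handwriting_event_converter"

    def register_virtual_cursor_device(self):
        # Create a virtual device identifier, then a virtual input device,
        # and associate it with the touch object (software layer only).
        device_id = "virtual-cursor-0"
        self.input_devices[device_id] = {"type": "virtual_input",
                                         "source": "touch_object"}
        return device_id

    def switch_to_cursor_mode(self):
        self.register_virtual_cursor_device()
        # Delete the handwriting event conversion module and add the event
        # adaptation processing module: only the in-use module stays resident.
        self.event_module = "event_adaptation_processor"
        self.mode = "cursor"                        # second mode

    def switch_to_handwriting_mode(self):
        # Unregister the virtual cursor device and restore the handwriting module.
        self.input_devices.pop("virtual-cursor-0", None)
        self.event_module = "handwriting_event_converter"
        self.mode = "handwriting"
```

Because everything happens in software state, no underlying device is reconnected and no device node changes, which is the design point the passage emphasizes.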
In a possible implementation, when the terminal device switches from the second mode to the first mode, the terminal device unregisters the virtual cursor device. The terminal device can thus be switched from the second mode back to the first mode as the user requires, improving the user experience.
In a possible implementation, the unregistering, by the terminal device, of the virtual cursor device includes: the terminal device deletes the event adaptation processing module and adds the handwriting event conversion module. When processing handwriting events generated on the touch screen, the terminal device thus keeps only the in-use handwriting event conversion module and deletes the unused event adaptation processing module, reducing the memory usage of the terminal device.
In a possible implementation, receiving the trigger operation from the touch object when the terminal device is in the first mode includes: when the terminal device is in the first mode, receiving a trigger instruction from a stylus, the trigger instruction being generated when the user performs a single-click, double-click, or long-press operation on a target button of the stylus, or when the stylus performs a preset gesture. When the touch object is a stylus, the user can thus switch from the first mode to the second mode quickly and conveniently, improving the user experience.
In a possible implementation, when the terminal device is in the second mode, if it receives an operation of the touch object for switching to the first mode, the terminal device cancels the display of the floating cursor and enters the first mode. When the terminal device is in the first mode, if it receives a sliding operation on the user interface, the terminal device implements one or more of the following functions based on the sliding operation: page turning, page sliding, displaying a sliding track on a page, displaying a dynamic effect, or displaying a prompt box for deleting a message. The terminal device can thus be switched from the second mode back to the first mode as the user requires, improving the user experience.
In a possible implementation, when the terminal device is in the second mode, the terminal device establishes a connection with a large-screen device and projects the content it displays to the large-screen device; or, after the terminal device is connected to the large-screen device, the terminal device enters the second mode and projects the content it displays to the large-screen device. After connecting to the large-screen device, the user can thus perform touch operations in the second mode of the terminal device while the large-screen device mirrors the terminal device's display interface, improving the user experience.
In a second aspect, an embodiment of the present application provides a touch screen display device, where the touch screen display device may be a terminal device, and may also be a chip or a chip system in the terminal device. The touch screen display device may include a communication unit, a display unit, and a processing unit. When the touch screen display device is a terminal device, the display unit may be a touch screen. The display unit is configured to perform a displaying step, so that the terminal device implements the touch screen display method described in the first aspect or any one of the possible implementation manners of the first aspect. When the touch screen display device is a terminal device, the processing unit may be a processor. The touch screen display device may further include a storage unit, which may be a memory. The storage unit is configured to store an instruction, and the processing unit executes the instruction stored in the storage unit, so that the terminal device implements the touch screen display method described in the first aspect or any one of the possible implementation manners of the first aspect. When the touch screen display device is a chip or a chip system in a terminal device, the processing unit may be a processor. The processing unit executes the instructions stored in the storage unit, so that the terminal device implements the touch screen display method described in the first aspect or any one of the possible implementation manners of the first aspect. The storage unit may be a storage unit (e.g., a register, a buffer, etc.) within the chip, or may be a storage unit (e.g., a read-only memory, a random access memory, etc.) located outside the chip within the terminal device.
Illustratively, the processor is configured to obtain display information of the touch screen relating to the user and push the display information to the touch screen when a condition is met; the touch screen is configured to receive touch operations of the touch object and display the user page.
In a possible implementation, when the terminal device is in the first mode, the touch screen receives a trigger operation from a touch object; in the first mode, if the touch screen receives a first sliding operation, the processor controls the displayed page content to change with the first sliding operation; in response to the trigger operation, the touch screen displays the floating cursor in the second mode; and when the terminal device is in the second mode, if the touch screen receives a second sliding operation on the user interface, the processor controls the floating cursor to move with the sliding position of the second sliding operation, without changing the page content in the user interface displayed on the touch screen.
In a possible implementation manner, when the terminal device is in the first mode, if the touch screen receives a click operation for the target control, the terminal device jumps to a page corresponding to the target control; and/or when the terminal device is in the second mode, if the touch screen receives a click operation aiming at the target control, the processor moves the floating cursor to a position triggered by the click operation in the touch screen.
In a possible implementation, if the touch screen receives the second sliding operation, the controlling, by the processor, of the floating cursor to move with the sliding position of the second sliding operation includes: when the processor detects a first contact event of the touch object, the terminal device enters a first state; when the terminal device is in the first state, if the processor detects that the touch object has not left the touch screen and is displaced on the touch screen, the terminal device enters a second state; and when the terminal device is in the second state, the processor controls the floating cursor on the touch screen to move according to the report-point displacement of the touch object.
In a possible implementation, the controlling, by the processor, of the floating cursor on the touch screen according to the report-point displacement of the touch object includes: the processor converts the report-point information of the touch object into coordinate information, and controls the floating cursor on the touch screen to move according to the coordinate information.
In a possible implementation, the converting, by the processor, of the report-point information of the touch object into coordinate information includes: the processor removes all information other than the coordinate information from the report-point information to obtain the coordinate information.
In a possible implementation, after the terminal device enters the first state, if the processor detects that the touch object leaves the touch screen without having been displaced on it, the terminal device enters a third state. When the terminal device is in the third state, if the processor detects a second contact event of the touch object, and the time interval between the second contact event and the first contact event is smaller than a time threshold, and the distance between the touch-screen positions corresponding to the second and first contact events is smaller than a distance threshold, the terminal device enters a fourth state. When the terminal device is in the fourth state, if the processor detects that the touch object is displaced on the touch screen, the display content at the displaced position on the touch screen is highlighted; or, if the processor detects that the touch object leaves the touch screen, the touch screen displays a focus cursor at the position of the second contact event.
In one possible implementation manner, when the terminal device is in the first mode, the receiving, by the touch screen, a trigger operation from the touch object includes: when the terminal equipment is in a first mode, a touch screen displays a first interface, and the first interface comprises a suspension button; when triggering of the floating button is received, the touch screen expands the floating button in a first interface, and the expanded floating button comprises a first control corresponding to a first mode and a second control corresponding to a second mode; and the touch screen receives the trigger operation of the second control.
In one possible implementation manner, before the touch screen displays the floating cursor in the second mode, the method includes: the terminal device switches from the first mode to the second mode.
In one possible implementation manner, switching, by the terminal device, from the first mode to the second mode includes: the processor registers the virtual cursor device; the processor switches a module for processing the event generated in the touch screen from the handwriting event conversion module to the event adaptation processing module; the handwriting event conversion module is used for processing a handwriting event in the touch screen, and the event adaptation processing module is used for processing a cursor input event in the touch screen.
In one possible implementation, a processor registers a virtual cursor device, comprising: the processor creating a virtual device identifier; creating, by the processor, a virtual input device using the virtual device identifier; the processor sets the input device as a touch object.
In one possible implementation manner, the processor switches a module for processing an event generated in the touch screen from the handwriting event conversion module to the event adaptation processing module, and includes: the processor deletes the handwriting event conversion module and adds an event adaptation processing module.
In one possible implementation, the processor de-registers the virtual cursor device when the terminal device switches from the second mode to the first mode.
In one possible implementation, a processor de-registers a virtual cursor device, comprising: the processor deletes the event adaptation processing module and adds the handwriting event conversion module.
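Taken together, the register and de-register flows above amount to creating or deleting the virtual cursor device and swapping the report point processing module. A minimal sketch in Python, with hypothetical class and field names (the embodiment does not specify an implementation language, and the real flow runs inside the operating system's input framework):

```python
# Illustrative sketch only: all names here are hypothetical.
import uuid


class HandwritingEventConverter:
    """Processes handwriting events (first mode)."""


class EventAdaptationProcessor:
    """Processes cursor input events (second mode)."""


class Terminal:
    def __init__(self):
        self.mode = "handwriting"              # first mode
        self.virtual_cursor_device = None
        self.point_processor = HandwritingEventConverter()

    def switch_to_cursor_mode(self):
        # Register the virtual cursor device: create a virtual device
        # identifier, then create a virtual input device from it and
        # associate it with the touch object.
        device_id = uuid.uuid4().hex
        self.virtual_cursor_device = {"id": device_id, "source": "touch"}
        # Swap the report point processing module: delete the handwriting
        # event conversion module, add the event adaptation processing module.
        self.point_processor = EventAdaptationProcessor()
        self.mode = "cursor"

    def switch_to_handwriting_mode(self):
        # De-register the virtual cursor device, delete the event adaptation
        # processing module, and restore the handwriting event conversion module.
        self.virtual_cursor_device = None
        self.point_processor = HandwritingEventConverter()
        self.mode = "handwriting"
```

The symmetry matters: every state changed when entering the second mode is restored when returning to the first mode, so the two modes can be toggled repeatedly.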
In one possible implementation manner, when the terminal device is in the first mode, the receiving of the trigger operation from the touch object includes: when the terminal device is in the first mode, the interface circuit receives a trigger instruction from the stylus; the trigger instruction is generated when the user performs a single-click operation, a double-click operation, or a long-press operation on a target button of the stylus, or when the stylus performs a preset gesture action.
In a possible implementation manner, when the terminal device is in the second mode, if the touch screen receives an operation of the touch object for switching to the first mode, the touch screen cancels the display of the floating cursor and enters the first mode; when the terminal device is in the first mode, if the touch screen receives a sliding operation on the user interface, the processor implements one or more of the following functions based on the sliding operation: turning a page, sliding a page, displaying a sliding track on a page, displaying a dynamic effect, or displaying a prompt box for deleting a message.
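The contrast between how the two modes handle the same sliding operation can be sketched as follows; `handle_slide` and its return values are hypothetical names used only for illustration:

```python
def handle_slide(mode, slide_points):
    """Hypothetical dispatch of a slide gesture according to the current mode."""
    if mode == "cursor":
        # Second (cursor) mode: the slide only moves the floating cursor
        # to the last reported coordinate; no page turn or interface jump.
        return {"action": "move_cursor", "position": slide_points[-1]}
    # First (handwriting) mode: the slide may turn a page, slide the page,
    # display a sliding track or dynamic effect, or show a delete prompt box.
    return {"action": "page_or_track", "points": slide_points}
```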
In a possible implementation manner, when the terminal device is in the second mode, the interface circuit establishes a connection with the large-screen device and projects the content displayed on the touch screen to the large-screen device; or, after the interface circuit is connected with the large-screen device, the terminal device enters the second mode and projects the content displayed on the touch screen to the large-screen device.
In a third aspect, an embodiment of the present application provides an electronic device, including: a processor and a memory, where the processor is configured to invoke a program in the memory to cause the electronic device to perform the method of the first aspect or any of the possible implementations of the first aspect.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: a processor, a touch screen, and an interface circuit, where the interface circuit is configured to communicate with other devices; the touch screen is configured to receive touch operations of a touch object and perform display; and the processor is configured to execute code instructions to implement the method of the first aspect or any of the possible implementations of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer-readable storage medium storing instructions that, when executed, cause the method of the first aspect or any of the possible implementations of the first aspect to be implemented.
In a sixth aspect, the present application provides a computer program product including a computer program, which when run on a computer causes the computer to perform the method described in the first aspect or any one of the possible implementations of the first aspect.
It should be understood that the second to sixth aspects of the present application correspond to the technical solution of the first aspect, and the beneficial effects achieved by these aspects and their corresponding possible implementations are similar and are not described again.
Drawings
FIG. 1 is a schematic diagram of a touch screen scenario according to a possible implementation;
FIG. 2 is a schematic diagram of another touch screen scenario according to a possible implementation;
fig. 3 is a schematic structural diagram of a terminal device 100 according to an embodiment of the present application;
fig. 4 is a schematic diagram of a software structure of the terminal device 100 according to an embodiment of the present application;
fig. 5 is a schematic interface diagram of entering a cursor mode according to an embodiment of the present application;
fig. 6 is another schematic interface diagram of entering a cursor mode according to an embodiment of the present application;
fig. 7 is a schematic interface diagram in a cursor mode according to an embodiment of the present application;
fig. 8 is a schematic interface diagram of a cursor mode during screen projection according to an embodiment of the present application;
fig. 9 is a schematic interface diagram of a handwriting mode according to an embodiment of the present application;
fig. 10 is a flowchart illustrating a touch screen display method according to an embodiment of the present application;
fig. 11 is a schematic diagram of internal interaction of a terminal device according to an embodiment of the present application;
fig. 12 is a schematic processing flow diagram of a cursor input event according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of a touch screen display device according to an embodiment of the present application.
Detailed Description
In the embodiments of the present application, the words "first", "second", and the like are used to distinguish between identical or similar items having substantially the same functions and effects. For example, a first chip and a second chip are distinguished merely to tell different chips apart, and no sequence is implied. Those skilled in the art will appreciate that the words "first", "second", and the like do not limit quantity or execution order, and do not denote any particular order or importance.
It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate an example, illustration, or explanation. Any embodiment or design described herein as "exemplary" or "for example" is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of the following" or similar expressions refers to any combination of the listed items, including any combination of single or plural items. For example, "at least one of a, b, or c" may represent: a, b, c, a and b, a and c, b and c, or a, b and c, where a, b, and c may each be single or multiple.
Touch screens, which are convenient to operate, are widely used in various fields. A touch screen is usually configured in electronic devices such as mobile phones, computers, and vehicle-mounted terminals, and a user controls the electronic device through touch operations. The touch operations may include a click operation and a slide operation. The user may click an icon on the touch screen with a finger or a stylus, and the electronic device can open an APP, a webpage, a document, or the like in response to the click operation. The user can also slide on the touch screen to turn pages or scroll the screen interface up and down.
However, in a scenario in which the user is giving a presentation or projecting a screen with an electronic device, in order to indicate the content that the user wishes the listeners to pay attention to, the user may habitually point at the specific position of that content on the touch screen with a finger or a stylus. For example, the user may click a certain position of the touch screen or slide in a certain area of the touch screen, and such an operation may cause the terminal device to detect a trigger operation and perform an interface jump or the like, thereby interrupting the presentation.
Illustratively, as shown in fig. 1, during online teaching with an electronic device equipped with a touch screen, when the user explains "shapes common in life include a circle, a rectangle, and a square …" and reaches "square", the user wishes to show the listeners which shape is the square. At this moment, the stylus or finger may mistakenly click the link 101 of the square icon, and the electronic device responds to the click operation, causing the touch screen to change from the displayed "common shapes" interface to a "properties of a square" interface and interrupting the user's teaching.
Illustratively, as shown in fig. 2, the user is lecturing on property 2 of the "properties of a square" interface and slides the stylus on the touch screen along with the content being explained, to point out the words to the listeners. At this moment, the sliding operation of the stylus is interpreted by the terminal device as "turn to the next page", and the touch screen changes from displaying the current "properties of a square" interface to displaying the next page, a "properties of a triangle" interface, interrupting the user's lecture. Therefore, in current presentation scenarios, the electronic device may respond to the corresponding touch operation and mistakenly bring up other interfaces, interrupting the user's explanation and degrading the experience of both the user and the listeners.
In view of this, an embodiment of the present application provides a touch screen display method in which a cursor mode is set in the terminal device, so as to reduce the probability that the terminal device mistakenly responds to a touch operation of the user and the touch screen displays other unrelated interfaces. Optionally, when the terminal device is in the cursor mode, a cursor pointer in a floating state appears on the touch screen, and the user's finger or stylus can simulate a mouse to perform touch operations on the touch screen. After the touch screen receives a click or slide operation from the user, the terminal device executes the cursor event processing flow and moves the floating cursor to the corresponding position on the touch screen, so that the user can point out the content of interest to the listeners in a presentation scenario.
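The intended behavior difference can be illustrated with a small sketch; `on_tap` and its return values are hypothetical, assuming a tap lands on a link such as the square icon in fig. 1:

```python
def on_tap(mode, target, position):
    """Hypothetical tap handling contrasting cursor mode with normal mode."""
    if mode == "cursor":
        # Cursor event processing flow: move the floating cursor to the
        # tapped position; the underlying control is not activated.
        return ("cursor_moved", position)
    # Normal (handwriting) mode: the tap activates the control, which may
    # jump to another interface and interrupt a presentation.
    return ("opened", target)
```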
The electronic device includes a terminal device, which may also be referred to as a terminal (terminal), a User Equipment (UE), a Mobile Station (MS), a Mobile Terminal (MT), and so on. The terminal device may be a mobile phone (mobile phone), a smart tv, a wearable device, a tablet computer (Pad), a computer with a wireless transceiving function, a Virtual Reality (VR) terminal device, an Augmented Reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in remote surgery (remote medical), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and so on. The embodiment of the present application does not limit the specific technology and the specific device form adopted by the terminal device.
In order to better understand the embodiments of the present application, the following describes the structure of the terminal device according to the embodiments of the present application:
fig. 3 shows a schematic structural diagram of the terminal device 100. The terminal device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like.
The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the terminal device 100. In other embodiments of the present application, terminal device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processor (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses and reduces the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
It should be understood that the interface connection relationship between the modules illustrated in the embodiment of the present application is an illustrative description, and does not limit the structure of the terminal device 100. In other embodiments of the present application, the terminal device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the terminal device 100. The charging management module 140 may also supply power to the terminal device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the terminal device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. The antennas in terminal device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the terminal device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the terminal device 100 is coupled to the mobile communication module 150 and the antenna 2 is coupled to the wireless communication module 160 so that the terminal device 100 can communicate with the network and other devices through wireless communication technology. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), Code Division Multiple Access (CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. GNSS may include Global Positioning System (GPS), global navigation satellite system (GLONASS), beidou satellite navigation system (BDS), quasi-zenith satellite system (QZSS), and/or Satellite Based Augmentation System (SBAS).
The terminal device 100 implements a display function by the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the terminal device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The terminal device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the terminal device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the terminal device 100 selects a frequency point, the digital signal processor is used to perform fourier transform or the like on the frequency point energy.
Video codecs are used to compress or decompress digital video. The terminal device 100 may support one or more video codecs. In this way, the terminal device 100 can play or record video in a plurality of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the terminal device 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the terminal device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, a phonebook, etc.) created during use of the terminal device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the terminal device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The terminal device 100 may implement an audio function through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The terminal device 100 can listen to music through the speaker 170A, or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the terminal device 100 answers a call or voice information, it is possible to answer a voice by bringing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a sound signal to the microphone 170C by bringing the mouth close to the microphone 170C and speaking. The terminal device 100 may be provided with at least one microphone 170C. In other embodiments, the terminal device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the terminal device 100 may further include three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, and implement a directional recording function.
The headphone interface 170D is used to connect a wired headphone. The headphone interface 170D may be the USB interface 130, or may be a 3.5mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
Illustratively, the terminal device 100 may further include one or more of a key 190, a motor 191, an indicator 192, a SIM card interface 195 (eSIM card), and the like.
The software system of the terminal device 100 may adopt a hierarchical architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, a cloud architecture, or the like. The embodiment of the present application takes an Android system with a layered architecture as an example, and exemplarily illustrates a software structure of the terminal device 100.
Fig. 4 is a block diagram of a software configuration of the terminal device 100 according to the embodiment of the present application.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 4, the application packages may include camera, calendar, phone, map, music, settings, mailbox, video, stylus applications, and the like.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in fig. 4, the application framework layer may include an input management server and an input event reader. The input management server includes an input event dispatcher and an input management service interface, and the input event reader includes an Event Hub input device manager and a report point processing module.
And the input management service interface is used for receiving the input event sent by the application program layer and informing other processing modules of the input event to perform a specific processing flow.
And the Event hub input device manager is used for creating and managing the input and output devices.
The report point processing module is configured to receive an input event and execute a corresponding report point conversion processing flow. The report point processing module may include an event adaptation processing module and/or a handwriting event conversion module.
The event adaptation processing module is configured to process cursor input events in the touch screen. For example, when the event adaptation processing module receives a sliding operation in the touch screen, it can process the report point information of the sliding operation to obtain coordinate information and assign the coordinate information to the floating cursor in the touch screen, so that the floating cursor moves along with the sliding operation. When the event adaptation processing module receives a click operation in the touch screen, it can process the report point information of the click operation to obtain coordinate information and assign the coordinate information to the floating cursor, so that the floating cursor moves to the position of the click operation.
The handwriting event conversion module is used for processing a handwriting event in the touch screen. For example, when the handwriting event conversion module receives a sliding operation in the touch screen, it can process the report point information of the sliding operation to obtain information for realizing page turning, page sliding, displaying a sliding track on a page, and the like, thereby realizing page turning, page sliding, displaying a sliding track on a page, displaying a dynamic effect, or displaying a prompt box for deleting a message. When the handwriting event conversion module receives a click operation in the touch screen, it can obtain the control corresponding to the click point based on the click information of the click operation, thereby realizing a jump to the page corresponding to the control, and the like.
The input event dispatcher is used for dispatching the results processed by the report point processing module to each thread for corresponding processing.
The Android runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part is the function libraries that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used for performing functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer at least comprises a display driver, a camera driver, an audio driver, and a sensor driver.
The following describes an exemplary workflow of software and hardware of the terminal device 100 in conjunction with a scenario where an application is started or an interface is switched in the application.
When the touch sensor 180K receives a touch operation, a corresponding hardware interrupt is sent to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates, touch force, and timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event. The application then calls an interface of the application framework layer to start the application, and starts the display driver by calling the kernel layer, so as to display the functional interface of the application.
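The kernel-to-framework flow above can be sketched as a simplified simulation. The function names, the event dictionary, and the rectangle-based control lookup are all assumptions for illustration; the real kernel and framework layers do not work with Python dictionaries.

```python
import time

def kernel_make_raw_event(x, y, pressure):
    # Kernel layer: package the touch into a raw input event containing
    # touch coordinates, touch force, and a timestamp.
    return {"x": x, "y": y, "pressure": pressure, "ts": time.time()}

def framework_resolve_control(raw_event, controls):
    # Framework layer: identify which control's bounding box contains
    # the touch point of the raw input event.
    for name, (x0, y0, x1, y1) in controls.items():
        if x0 <= raw_event["x"] <= x1 and y0 <= raw_event["y"] <= y1:
            return name
    return None  # no control at the touch position
```

A touch inside a control's bounds resolves to that control, which the framework would then use to start the corresponding application interface.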
The following describes in detail a display process of the cursor mode of a terminal device according to an embodiment of the present application with reference to the drawings. It should be noted that "when ..." in the embodiments of the present application may refer to the instant at which a certain condition occurs, or to a certain period of time after the condition occurs, and the embodiments of the present application are not particularly limited in this respect.
The application software capable of implementing the cursor mode is not limited in the embodiments of the present application; for example, it may include a system application of the terminal device, pre-installed third-party application software that cannot be deleted by the user, or third-party application software that the user can install or remove.
It should be noted that, for convenience of description, the embodiment of the present application adopts a handwriting mode (also referred to as a first mode) and a cursor mode (also referred to as a second mode) for illustration, and in an actual implementation, the terminal device does not necessarily have to be limited to the two modes.
For example, when the terminal device is in the handwriting mode, it can be understood that the terminal device executes a handwriting function: when a touch operation is received on the touch screen, the terminal device changes the content displayed in the user interface based on the touch operation. For example, when the terminal device receives a sliding operation of a stylus or a finger on the screen, the terminal device may implement page turning, page sliding, displaying a sliding track on a page, displaying a dynamic effect, or displaying a prompt box for deleting a message, and the like; when the terminal device receives a click operation of a stylus or a finger on the screen, the terminal device may jump to the page corresponding to the clicked control, and the like.
When the terminal device is in the cursor mode, it can be understood that the terminal device performs functions similar to those performed when receiving mouse operations. When a touch operation is received on the touch screen, the terminal device can change the position of the floating cursor based on the touch operation without changing the content displayed in the user interface. For example, when the terminal device is in the cursor mode, the floating cursor may be displayed on the screen; when the touch screen receives a sliding operation of a stylus or a finger, the terminal device may control the floating cursor to move along with the sliding operation; when the terminal device receives a click operation of a stylus or a finger on the screen, the terminal device can move the floating cursor to the position of the click operation, and the like.
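The contrast between the two modes can be sketched as a minimal simulation: the same touch either changes page content (first mode) or only moves the floating cursor (second mode). The class and field names are assumptions, not the actual implementation.

```python
HANDWRITING, CURSOR = "first_mode", "second_mode"

class Terminal:
    """Illustrative sketch of the two touch-handling modes."""

    def __init__(self):
        self.mode = HANDWRITING
        self.cursor_pos = None   # position of the floating cursor, if any
        self.page_strokes = []   # content drawn on the page

    def on_touch(self, x, y):
        if self.mode == CURSOR:
            # Cursor mode: only the floating cursor moves;
            # the displayed content is unchanged.
            self.cursor_pos = (x, y)
        else:
            # Handwriting mode: the touch changes the displayed content.
            self.page_strokes.append((x, y))
```

Switching the mode changes only which branch handles the touch, which mirrors the description that the cursor mode leaves the user interface content untouched.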
There are multiple specific implementations for the terminal device to enter the cursor mode from the handwriting mode, and the embodiments of the present application exemplarily illustrate several possible interface diagrams of the terminal device entering the cursor mode with reference to fig. 5 to fig. 6.
Fig. 5 is a schematic interface diagram for entering a cursor mode according to an embodiment of the present application.
As shown in a in fig. 5, a hover button 501 may be displayed in a first interface of the terminal device, and the hover button 501 may be located at any position of the interface.
Optionally, the display position of the hover button can be adjusted by the user to a blank position in the interface, or detected and adjusted by the terminal device to a blank position of the current interface, so that the hover button does not block other functional icons of the current interface and does not prevent the user from starting other applications.
When the terminal device receives the trigger of the hover button 501 from the user's finger, the terminal device may enter the interface shown in b of fig. 5. It is understood that the figures are illustrated with a finger, which may be replaced by any object capable of triggering a touch screen, such as a stylus.
Optionally, the interface shown in b in fig. 5 includes a hover button 502 in an expanded state. After being expanded, the hover button may include a cursor mode application control 503, a handwriting mode application control 504, and a cursor effect application control 505, and the user may switch between the handwriting mode and the cursor mode based on the hover button 502 in the expanded state. Optionally, the shape and/or color of the floating cursor may be adjustable. For example, the cursor effect displayed by the terminal device may be customized through the cursor effect application control 505, and the cursor effect may include a color, a size, a shape, and the like, and may also be adjusted in other manners, which is not limited herein.
For example, when the terminal device receives an operation of the user triggering the cursor mode application control 503, the terminal device enters the cursor mode interface shown as c in fig. 5. It can be understood that if the terminal device is currently in the handwriting mode and the user clicks the cursor mode application control 503, the terminal device responds to the click touch and switches from the handwriting mode to the cursor mode. If the terminal device is currently in the cursor mode and the user clicks the cursor mode application control 503, the terminal device may not respond to the operation and maintains the current mode. Optionally, if the terminal device is currently in the cursor mode and the user clicks the handwriting mode application control 504 twice in succession, the terminal device responds to the click touches and switches from the cursor mode to the handwriting mode. Optionally, when the terminal device is in the cursor mode, the hover state of a mouse may be simulated through a single click operation, the click state of a mouse may be simulated through two consecutive click operations, and the drag state of a mouse may be simulated through two consecutive clicks followed by a sliding operation. If the terminal device is currently in the handwriting mode and the user clicks the handwriting mode application control 504 twice in succession, the terminal device may not respond to the operation and maintains the current mode.
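The gesture-to-mouse-state mapping described above can be sketched as follows. The gesture labels are an assumed, simplified classification of the touch input, not names used by the system.

```python
def simulate_mouse_state(gesture):
    """Illustrative mapping from touch gestures in cursor mode to
    the simulated mouse state described in the embodiment."""
    mapping = {
        "single_click": "hover",         # single click -> mouse hover state
        "double_click": "click",         # two consecutive clicks -> mouse click state
        "double_click_slide": "drag",    # two clicks then sliding -> mouse drag state
    }
    return mapping.get(gesture, "unknown")
```

Any gesture outside this table would be handled by other processing paths (for example, a plain slide simply moves the floating cursor).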
After the setting operation of b in fig. 5, the terminal device enters the cursor mode, and as shown in c in fig. 5, a cursor pointer in a floating state, referred to as a floating cursor 506 for short, appears in the interface. The initial position of the floating cursor 506 may be the position of the last click touch, such as the position corresponding to the cursor mode application control 503. The initial position may also be the termination position of the floating cursor 506 when the cursor mode was last switched to the handwriting mode. The initial position may also be any random position in the current interface; the present application does not limit the initial position of the floating cursor.
As shown in d in fig. 5, when the terminal device is in the cursor mode and receives a click touch operation from the user, the floating cursor appears at the position on the touch screen corresponding to the position the user clicks. When the terminal device receives a sliding touch operation from the user, the terminal device controls the floating cursor to move along with the position of the touch operation. For example, as shown in e in fig. 5, when the user shows the current time to others through the mobile phone screen and wants to indicate the position of the clock more clearly, the user circles the time "15:19". In the process in which the user slides a finger or a stylus from an initial position A to a termination position B, the position of the floating cursor moves along with the touch position, and after the user stops the sliding touch, the cursor pointer stays at position B. In a possible case, the circling track is only the touch track of the user's sliding touch operation, and no line displaying the sliding track appears in the terminal device interface.
That is to say, in this embodiment of the application, the terminal device may display a first interface including a hover button, and when receiving a trigger on the hover button, the terminal device displays the expanded hover button on the first interface. The expanded hover button comprises an area for triggering the handwriting mode and an area for triggering the cursor mode; when receiving a trigger on the cursor mode, the terminal device switches from the handwriting mode to the cursor mode. Therefore, the terminal device can be switched to the cursor mode simply and quickly through the hover button, which shortens the mode switching time and improves the user experience.
For example, fig. 6 is a schematic interface diagram for entering a cursor mode according to an embodiment of the present application.
As shown in a in fig. 6, an icon of the cursor mode application software 601 may be displayed in the interface of the terminal device, and the icon of the cursor mode application software 601 may be at any position of the interface. It will be appreciated that the cursor mode application software may include the third party application software of the above examples, and may also include system applications in the terminal device. For example, in a setup program of the system application, the user may select to open a handwriting setup menu to cause the terminal device to enter a cursor mode.
When the terminal device receives a trigger from the user to the cursor mode application 601, the terminal device may enter an interface shown as b in fig. 6.
Optionally, the interface shown in b in fig. 6 includes a handwriting setting menu, and the handwriting setting menu may include a switch option 602 for the cursor mode and a switch option 603 for the handwriting mode. The user can switch between the handwriting mode and the cursor mode based on the handwriting setting menu.
For example: when the terminal device is in the handwriting mode, the switch option of the handwriting mode in the interface of the terminal device is displayed in the on state, and the switch option of the cursor mode is displayed in the off state. When the terminal device receives an operation of the user triggering the switch option 602 of the cursor mode, the terminal device enters the interface shown as c in fig. 6. The interface may include the switch option 602 for the cursor mode, the switch option 603 for the handwriting mode, the floating cursor 506, and a cursor effect setting option. It can be understood that the switch option 602 of the cursor mode is then in the on state and the switch option 603 of the handwriting mode is in the off state. Optionally, the cursor effect displayed by the terminal device may be customized through the cursor effect setting option, and the cursor effect may include a color, a size, a shape, and the like, and may also be adjusted in other manners, which is not limited herein.
In one possible implementation, the terminal device is currently in the handwriting mode, and after the user clicks the switch option 602 of the cursor mode, the terminal device responds to the click touch and switches from the handwriting mode to the cursor mode. If the terminal device is currently in the cursor mode and the user clicks the switch option 602 of the cursor mode, the terminal device may not respond to the operation and maintains the current mode. Optionally, the terminal device is currently in the cursor mode, and after the user clicks the switch option 603 of the handwriting mode twice in succession, the terminal device responds to the click touches and switches from the cursor mode to the handwriting mode. If the terminal device is currently in the handwriting mode and the user clicks the switch option 603 of the handwriting mode twice in succession, the terminal device does not respond to the operation and maintains the current mode.
After the setting operation shown in b in fig. 6, the terminal device enters the cursor mode, as shown in c in fig. 6, and a floating cursor 506 appears in the interface. After the user clicks any position on the touch screen, the floating cursor 506 appears at the clicked position. When the user performs a sliding touch on the screen, the position of the floating cursor changes correspondingly with the user's touch position. As shown in d and e in fig. 6, the operation flow of the terminal device in the cursor mode is similar to that of d in fig. 5 and e in fig. 5, and is not repeated here.
It is to be understood that fig. 5 and 6 illustrate ways of switching between the handwriting mode and the cursor mode on the terminal device itself. In a possible implementation, when the terminal device is connected to a stylus through a Bluetooth module or the like, the stylus can notify the terminal device side to start the cursor mode.
In one possible implementation manner, an instruction for switching the cursor mode can be set in the stylus in a user-defined manner. After receiving an operation of the user clicking, double-clicking, or long-pressing a button on the side of the pen body, the stylus reports the instruction for switching the cursor mode to the terminal device through the Bluetooth module. The terminal device enters the cursor mode after receiving the instruction. For example, after receiving the instruction, the terminal device may replace the module related to executing the handwriting mode in the application layer with the module related to executing the cursor mode to execute the cursor processing logic; the specific implementation will be described in detail in the following embodiments and is not repeated here.
In another possible implementation manner, the stylus may also trigger the instruction for switching the cursor mode through a specific gesture motion and send the instruction to the terminal device. The specific gesture motion may include pointing the pen tip upward or rotating the pen body by a specific angle.
In another possible implementation manner, an instruction for waking up the handwriting setting menu can be set in the handwriting pen in a customized manner. And after receiving the operation that a user presses a button of the handwriting pen or executes a specific gesture action, the handwriting pen sends an instruction for awakening the handwriting setting menu to the terminal equipment. And after receiving the instruction, the terminal equipment activates a handwriting setting menu interface, and the user manually starts a cursor mode on the interface by using a handwriting pen. The implementation manner of entering the cursor mode by the terminal device is not limited herein.
It will be appreciated that d and e in fig. 5 and d and e in fig. 6 show one implementation of the terminal device in the cursor mode. In a possible implementation, when the terminal device is in the cursor mode, the following operations may also be performed.
Fig. 7 is an interface schematic diagram of a terminal device in a cursor mode according to an embodiment of the present disclosure.
The terminal device displays an interface as shown in fig. 7, where the interface includes display content. After the terminal device receives two consecutive click operations from the user, the cursor pointer changes from a floating cursor to a focus cursor. If the user clicks the touch screen twice in succession and the finger leaves the touch screen, the terminal device controls the focus cursor to be displayed at the clicked position. If the user slides through a certain displacement after clicking the touch screen twice in succession, the display content along the displacement on the touch screen is highlighted.
For example, as shown in interface a in fig. 7, the terminal device is in the cursor mode, and the display content in the interface includes text. The user has clicked the position of the "cotton" word, and the floating cursor appears below the "cotton" word. As shown in b in fig. 7, the user lifts the finger so that it leaves the touch screen, and the position of the floating cursor is unchanged. The user then clicks the "cotton" word a second time within a short time, as shown in c in fig. 7. After receiving the two consecutive click operations, the terminal device changes the floating cursor into a focus cursor 701, which, as shown in d in fig. 7, is displayed behind the "cotton" word at the clicked position. After the second click operation shown in d in fig. 7, in a possible case, the user's finger does not leave the touch screen but moves a distance on it, sliding from the "cotton" word to the "summer" word; the terminal device then enters the interface shown as e in fig. 7 and highlights the display content "summer to no cotton" covered by the finger's displacement on the touch screen. In another possible case, the user's finger leaves the touch screen, as shown in f in fig. 7; the terminal device receives the user's lifting operation, and the focus cursor stays behind the "cotton" word clicked the second time. It can be understood that the embodiments of the present application provide several possible implementations of entering the cursor mode by the terminal device, and do not limit the specific implementation of the cursor mode.
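The double-click behaviour above can be sketched as a small state machine: a single click places the floating cursor, a second click within a short interval turns it into a focus cursor, and sliding after the second click highlights the text covered by the displacement. The class, the character-index model of positions, and the time threshold are all assumptions for illustration.

```python
DOUBLE_CLICK_WINDOW = 0.3  # seconds between clicks; assumed value

class CursorState:
    """Illustrative sketch of the floating-cursor / focus-cursor transition."""

    def __init__(self):
        self.kind = "floating"     # "floating" or "focus"
        self.last_click_ts = None
        self.pos = 0               # cursor position as a text index
        self.selection = None      # highlighted text, if any

    def click(self, index, ts):
        # A second click within the window promotes the cursor to a focus cursor.
        if self.last_click_ts is not None and ts - self.last_click_ts <= DOUBLE_CLICK_WINDOW:
            self.kind = "focus"
        self.last_click_ts = ts
        self.pos = index

    def slide_to(self, index, text):
        # Sliding after the double click highlights the covered text span.
        if self.kind == "focus":
            lo, hi = sorted((self.pos, index))
            self.selection = text[lo:hi]
```

A single click followed by a lift leaves the floating cursor in place; only the double click followed by a slide produces a highlighted selection.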
Optionally, when the terminal device is in the cursor mode, the terminal device may also be connected with a large-screen device, and the content displayed by the terminal device in the cursor mode may be projected on the large-screen device.
Optionally, after the terminal device is connected with the large-screen device, the terminal device may also be switched from the handwriting mode to the cursor mode. The terminal device can project the content displayed in the cursor mode to the large-screen device.
Fig. 8 is a schematic interface diagram of a cursor mode during screen projection according to an embodiment of the present application.
Optionally, the terminal device may project the content displayed in the terminal device on a large screen device.
As shown in fig. 8, the terminal device may be connected to a large-screen device, which may include a projector, an intelligent appliance, and other electronic devices distinguished from the terminal device. Optionally, for example, a tablet computer with a touch screen as the terminal device is used, the content on the interface of the tablet computer may be projected onto a screen matched with the projector, the tablet computer may be connected with the television device and then projected onto the screen of the television, and the tablet computer may also be projected onto other electronic devices through meeting applet, APP, remote connection, or other manners, which is not limited herein.
Illustratively, as shown in a in fig. 8, the terminal device is in the cursor mode, and after the terminal device establishes a connection with the large-screen device, the user gives a PPT document presentation on the terminal device. When explaining the ellipse among common shapes, the user wants to indicate to the listeners which shape is the ellipse, and clicks the position of the ellipse on the touch screen. The terminal device receives the touch operation and adjusts the position of the floating cursor to the position the user clicked on the touch screen. The display interface of the terminal device is synchronized to the display interface of the large-screen device, and the floating cursor is displayed below the ellipse.
Illustratively, as shown in b in fig. 8, the user habitually circles the ellipse when explaining it to the listeners. The terminal device receives the user's sliding touch, the floating cursor in the display interface changes along with the change of the touch position, and the listeners can clearly see the floating cursor moving around the ellipse on the large-screen device. The dotted line on the large-screen device represents the sliding track of the floating cursor; the listeners do not actually observe a real line of the sliding track.
The above embodiment shows an example of an interface in which the terminal device enters and uses the cursor mode, and it can be understood that the terminal device may also switch from the cursor mode to the handwriting mode.
Optionally, fig. 9 is an interface schematic diagram of a handwriting mode according to an embodiment of the present application.
Illustratively, as shown in a in fig. 9, the terminal device is in a handwriting mode, and after the terminal device receives an operation of clicking a "call" icon 901 by a user, the terminal device responds to the touch operation, and the terminal device enters a call interface shown in b in fig. 9. The user may dial, talk or query contacts, etc. based on the interface.
Illustratively, as shown in c in fig. 9, the terminal device is in a handwriting mode, when a user draws on a "drawing board" interface, the terminal device receives a sliding touch operation from a point a to a point B, and in response to the sliding touch operation, the terminal device enters an interface shown as d in fig. 9. The interface displays a sliding touch track from the point A to the point B.
The above is an application scenario of a partial handwriting mode, and the application does not limit the application scenario of the handwriting mode.
The application scenario of the cursor mode in the embodiment of the present application has been described above, and a flow for executing the above touch screen display method provided in the embodiment of the present application is described below. The touch screen display method comprises the following steps:
S901, when the terminal device is switched from a first mode to a second mode, the terminal device displays a floating cursor; when the terminal device receives an operation directed at the touch screen in the first mode, the terminal device executes handwriting event processing.
In the embodiment of the present application, the first mode may correspond to the handwriting mode, and the second mode may correspond to the cursor mode.
And when the terminal equipment receives the operation aiming at the touch screen in the first mode, the terminal equipment executes the handwriting event processing. The terminal device executing the handwriting event processing may be understood as that after the terminal device receives a touch operation directed to the touch screen, the terminal device determines a position of the touch screen where the touch operation is received, and triggers execution of a corresponding function of an application at the touch position. For example, after the terminal device receives an operation for a call application on the touch screen, the terminal device opens the call application and displays a call interface, thereby implementing a call function of the terminal device. The terminal device may switch from the first mode to the second mode. And when the terminal equipment is switched from the first mode to the second mode, the terminal equipment displays the floating cursor.
In this embodiment of the application, the terminal device may switch from the first mode to the second mode based on a trigger of a user in the terminal device, and specifically, refer to the relevant descriptions of fig. 5 to fig. 6. The terminal device may also switch from the first mode to the second mode based on the trigger of the stylus pen, which is not described herein again.
S902, when the terminal device receives a first touch operation on the touch screen, the terminal device controls the floating cursor to move along with the position of the first touch operation.
For example, the first touch operation may include a sliding operation. When the terminal equipment receives the sliding operation of the touch object on the touch screen, the terminal equipment controls the floating cursor to move along with the sliding position of the touch object.
The touch object may be any object capable of triggering the touch screen, such as a finger or a stylus pen. The touch object performs sliding operation on the touch screen, and the terminal device can control the floating cursor to move synchronously along with the position of the sliding operation based on the received sliding operation.
For example, the first touch operation may include a click operation. When the terminal equipment receives the clicking operation of the touch object on the touch screen, the terminal equipment controls the floating cursor to appear at the position where the touch object is clicked.
And the touch object performs clicking operation on the touch screen, and the terminal equipment can control the floating cursor to appear at the position clicked by the touch object based on the received clicking operation. It can be understood that after the terminal device switches to the second mode, the initial position of the floating cursor may be any position, and after the touch object is clicked, the floating cursor appears at the clicked position of the touch object.
In this embodiment of the application, when the terminal device is in the cursor mode, the terminal device may change the position of the floating cursor based on a touch operation of a user in the terminal device, and specifically, refer to the description related to d in fig. 5 and e in fig. 5 to d in fig. 6 and e in fig. 6. When the terminal device is in the handwriting mode, the terminal device may trigger to execute a corresponding function of the application at the touch position based on a touch operation of the user in the terminal device, which may specifically refer to the related description of fig. 9 and is not described herein again.
According to the embodiment of the present application, by setting the cursor mode in the terminal device, the probability that the terminal device mistakenly responds to a touch operation of the user and then displays other irrelevant interfaces is reduced, and the user experience is improved.
The following provides a detailed description of a method for a terminal device to execute a touch screen display. Fig. 10 is a schematic flowchart of a touch screen display method according to an embodiment of the present disclosure, where the method includes:
s1001, a touch screen of the terminal device receives touch operation of switching from a first mode to a second mode.
In this embodiment of the application, the terminal device may receive a touch operation switched from the first mode to the second mode, and specifically refer to the description related to a in fig. 5 and b in fig. 5 to a in fig. 6 and b in fig. 6. The terminal device may also switch from the first mode to the second mode based on the trigger of the stylus pen, which is not described herein again.
In this embodiment, after the application layer of the terminal device receives the touch operation for switching from the first mode to the second mode, the application layer may notify the system framework layer, through the existing interface capability of the system, to switch the handwriting device to the cursor device, and the input device is then switched in the event hub (Event Hub).
Exemplarily, fig. 11 shows a schematic diagram of terminal device internal interaction.
The setting application of the application layer receives the touch operation of the touch object for switching the terminal device to the second mode, and the application layer notifies the system framework layer, through the input management service interface in the system framework layer, to switch the handwriting device to the cursor device. The Event Hub input device management module switches the handwriting device to the cursor device and manages the report point processing module so that it adopts the event adaptation processing module for report point processing, so that the terminal device enters the cursor mode.
When the terminal device receives a touch operation in the cursor mode, the event adaptation processing module processes the touch operation received by the application layer to obtain a cursor input event. Input event dispatch then distributes the cursor input event to the appropriate threads for processing. The application layer receives the processing result of the cursor input event and displays it accordingly on the touch screen. Because the virtual cursor device is set up in the system framework layer purely in software and only the event adaptation processing module is switched, this embodiment does not depend on hardware and does not trigger reconnection of the underlying device or changes to device nodes.
The detailed flow implementation of fig. 11 can be described with reference to the following steps:
S1002, the terminal device registers the virtual cursor device.
After the touch screen of the terminal device receives the touch operation for switching from the first mode to the second mode, the application layer of the terminal device notifies the application framework layer to prepare to switch to the second mode. Through the system's existing interface capabilities, the terminal device can simulate the connection state of a registered cursor device in the system framework. Taking the Android platform as an example, a virtual cursor device can be added to the Input Reader or the Event Hub, and the system layer can initialize the cursor state.
Illustratively, the terminal device registering the virtual cursor device includes: the terminal device creates and initializes a virtual device identifier; the terminal device uses the virtual device identifier to create a virtual input device; the terminal device sets the input device type to match the touch object, where the touch object may include a finger and a stylus; and the virtual cursor device is added to the system framework layer.
For example, the terminal device creating and initializing an identifier of the virtual device may be implemented based on:
InputDeviceIdentifier identifier; // input device identifier
identifier.name = "Virtual-stylus"; // name of the device identifier
identifier.uniqueId = "<virtual>"; // unique id of the identifier
assignDescriptorLocked(identifier); // assign a descriptor to the identifier
For example, the creation of a virtual input device by the terminal device using a virtual device identifier may be based on:
std::unique_ptr<Device> device =
    std::make_unique<Device>(-1, ReservedInputDeviceId::VIRTUAL_KEYBOARD_ID, "<virtual>", identifier);
For example, the terminal device setting the input device type to match the touch object (where the touch object may include a finger and a stylus) may be implemented based on:
device->classes = InputDeviceClass::STYLUS | InputDeviceClass::VIRTUAL;
device->loadKeyMapLocked();
for example, adding a virtual cursor device to the system framework layer may be implemented based on:
addDeviceLocked(std::move(device));
After the virtual cursor device is successfully registered, the system framework layer of the terminal device can query the cursor device connection and accordingly initialize and display the cursor resources and state. The application layer may also query the cursor device connection. For example, after the terminal device successfully registers the virtual cursor device, the application layer may query the cursor device connection and pop up a prompt window such as "cursor device registered successfully" or "cursor device connected" on the touch screen of the terminal device.
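The registration sequence above (create and initialize an identifier, create a virtual device backed by no real device node, mark it as a stylus-class virtual device, and add it to the framework) can be sketched as a self-contained mock. The `DeviceIdentifier`, `VirtualDevice`, and `InputHub` types below are illustrative stand-ins for the AOSP `InputDeviceIdentifier`, `Device`, and `EventHub` types, not the actual framework API:

```cpp
#include <cassert>
#include <memory>
#include <string>
#include <vector>

// Illustrative stand-in for the AOSP InputDeviceIdentifier.
struct DeviceIdentifier {
    std::string name;
    std::string uniqueId;
    std::string descriptor;
};

struct VirtualDevice {
    int fd = -1;                  // -1: no underlying device node, purely virtual
    int id = 0;
    DeviceIdentifier identifier;
    bool isStylus = false;
    bool isVirtual = false;
};

// Minimal mock of the Event Hub holding registered input devices.
class InputHub {
public:
    void addDeviceLocked(std::unique_ptr<VirtualDevice> device) {
        devices_.push_back(std::move(device));
    }
    bool hasDevice(const std::string& name) const {
        for (const auto& d : devices_) {
            if (d->identifier.name == name) return true;
        }
        return false;
    }
private:
    std::vector<std::unique_ptr<VirtualDevice>> devices_;
};

// Mirrors step S1002: create/initialize the identifier, create the virtual
// device, mark it as a stylus-class virtual device, add it to the hub.
void registerVirtualCursorDevice(InputHub& hub) {
    DeviceIdentifier identifier;
    identifier.name = "Virtual-stylus";
    identifier.uniqueId = "<virtual>";
    identifier.descriptor = "virtual-cursor";  // stand-in for assignDescriptorLocked()

    auto device = std::make_unique<VirtualDevice>();
    device->identifier = identifier;
    device->isStylus = true;
    device->isVirtual = true;

    hub.addDeviceLocked(std::move(device));
}
```

Once the device is present in the hub, the framework layer's "query the cursor device connection" step corresponds to a lookup such as `hub.hasDevice("Virtual-stylus")` succeeding.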
It can be understood that, in this embodiment of the present application, when the terminal device is connected to a stylus device, the touch object is a stylus. Before step S1002 is performed, the method further includes:
Exemplarily, the terminal device determines whether a stylus device is currently connected; if the terminal device recognizes a stylus device, step S1002 is executed; if the terminal device does not recognize a stylus device, the application interface is notified to return a failure.
S1003, the terminal equipment switches a module for processing the event generated in the touch screen from a handwriting event conversion module to an event adaptation processing module; the handwriting event conversion module is used for processing a handwriting event in the touch screen, and the event adaptation processing module is used for processing a cursor input event in the touch screen.
For example, the terminal device may add an "event adaptation processing" module to the report processing module and switch away from the original handwriting event conversion module.
And when the terminal equipment is in the first mode, the terminal equipment processes the handwriting event in the touch screen through the handwriting event conversion module. And when the terminal equipment is in the second mode, the terminal equipment processes the cursor event in the touch screen through the event adaptation processing module.
The terminal device switches a module for processing an event generated in the touch screen from the handwriting event conversion module to the event adaptation processing module, and may include the following possible implementation manners:
In a first possible implementation, the terminal device deletes the handwriting event conversion module and adds the event adaptation processing module. When the terminal device processes cursor events generated in the touch screen, the in-use event adaptation processing module is retained and the unused handwriting event conversion module is deleted, thereby reducing the memory usage of the terminal device.
In a second possible implementation manner, the terminal device reserves a handwriting event conversion module and adds an event adaptation processing module.
In a third possible implementation manner, the terminal device is provided with a handwriting event conversion module and an event adaptation processing module, and an event adaptation processing module does not need to be added newly. When the terminal equipment is switched to the second mode, the terminal equipment switches the module for processing the event generated in the touch screen from the handwriting event conversion module to the event adaptation processing module.
The terminal device retains both the in-use event adaptation processing module and the unused handwriting event conversion module, and when the terminal device switches back from the second mode to the first mode, the terminal device can directly call the handwriting event conversion module, reducing the time needed to bring it back into use.
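The third implementation above, where both modules coexist and only the active one is switched, can be sketched as a simple strategy switch. The class and method names here are illustrative assumptions, not the actual module interfaces:

```cpp
#include <cassert>
#include <string>

struct InputEvent { float x; float y; };

// Common interface for the two event-processing strategies.
class EventProcessor {
public:
    virtual ~EventProcessor() = default;
    virtual std::string process(const InputEvent& e) = 0;
};

// Processes handwriting events (first mode).
class HandwritingEventConverter : public EventProcessor {
public:
    std::string process(const InputEvent&) override { return "handwriting"; }
};

// Processes cursor input events (second mode).
class EventAdaptationProcessor : public EventProcessor {
public:
    std::string process(const InputEvent&) override { return "cursor"; }
};

// Both modules coexist; switching modes only changes the active pointer,
// so no module is created or destroyed on a mode change.
class ReportProcessing {
public:
    void enterCursorMode()      { active_ = &adaptation_; }
    void enterHandwritingMode() { active_ = &handwriting_; }
    std::string handle(const InputEvent& e) { return active_->process(e); }
private:
    HandwritingEventConverter handwriting_;
    EventAdaptationProcessor adaptation_;
    EventProcessor* active_ = &handwriting_;  // first mode is the default
};
```

Keeping both modules resident trades a small amount of memory for fast mode switching, matching the trade-off discussed in the second and third implementations.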
Because the virtual cursor device is set up at the software layer and only the event adaptation processing module is switched, this method does not depend on hardware and does not trigger reconnection of the underlying device or changes to device nodes.
S1004, the terminal device processes the cursor input event based on the event adaptation processing module, so that when the terminal device receives an operation of the touch object on the touch screen, the terminal device executes the cursor input event processing flow.
Illustratively, the terminal device may employ a state machine to process the cursor input event, and fig. 12 shows the processing flow of the cursor input event. As shown in fig. 12, the flow includes:
when the terminal equipment is switched from the first mode to the second mode, the state machine is in an initial state (Init state).
When the terminal device detects a first contact event of the touch object on the touch screen, the terminal device enters a first state; the first state may also be referred to as a pressed state (Down state). In the Down state, a static floating cursor can be displayed on the touch screen of the terminal device.
When the terminal device is in the first state, if the terminal device detects that the touch object has not left the touch screen and the touch object is displaced on the touch screen, the terminal device enters a second state; the second state may also be referred to as a pointer hovering state (Hover state). When the terminal device is in the second state, the terminal device controls the floating cursor to move according to the reported-point displacement of the touch object. In the Hover state, the floating cursor on the touch screen of the terminal device can move from a static state along with the displacement of the touch object. The manner in which the terminal device provided in this embodiment controls the floating cursor to move according to the reported-point displacement of the touch object may specifically refer to the descriptions related to d and e in fig. 5 and d and e in fig. 6, and is not described here again.
Optionally, when the terminal device is in the first state, if the terminal device detects that the touch object has not been displaced on the touch screen and the touch object leaves the touch screen, the terminal device enters a third state; the third state may also be referred to as a transient state (Pending state). In the Pending state, the terminal device has not received a sliding operation of the touch object and has detected that the touch object left the touch screen; at this time, the floating cursor may be in a static state. The third state is used to further determine whether the gesture of the touch object is a continuous double-tap operation.
Optionally, when the terminal device is in the second state, if the terminal device detects that the touch object leaves the touch screen, the terminal device returns to the Init state to wait for the next touch operation of the touch object.
Optionally, when the terminal device is in the third state, if the terminal device detects a second contact event of the touch object on the touch screen, and a time interval between the second contact event and the first contact event is smaller than a time threshold, and a distance between respective positions of the second contact event and the first contact event corresponding to the touch screen is smaller than a distance threshold, the terminal device enters the fourth state. The fourth state may also be referred to as a Drag & Move state. The time threshold and the distance threshold may be automatically set by the system, and may also be manually adjusted by the user, which is not limited herein.
Optionally, when the terminal device is in the third state, if the terminal device does not receive a second contact event of the touch object on the touch screen, or the time interval between the received second contact event and the first contact event is not less than the time threshold, or the distance between the positions of the second contact event and the first contact event on the touch screen is not less than the distance threshold, the terminal device returns to the Init state.
Optionally, when the terminal device is in the fourth state, if the terminal device detects that the touch object is displaced on the touch screen, the display content at the displaced position on the touch screen is highlighted. In the Drag & Move state, the cursor changes from a static floating cursor into a static focus cursor. If the terminal device detects a sliding operation of the touch object, the focus cursor selects the area of the touch screen that the sliding operation passes through, so that the selected display content is highlighted. The highlighting of display content may specifically refer to the related description of e in fig. 7 and is not described here again.
Optionally, when the terminal device is in the fourth state, if the terminal device detects that the touch object leaves the touch screen, the terminal device displays the focus cursor at the position of the second contact event. In the Drag & Move state, if the terminal device detects that the touch object leaves the touch screen, the focus cursor stays at the coordinate position of the touch operation when the terminal device enters the Drag & Move state. The touch object can edit or modify the display content based on the current coordinate position. The focus cursor may specifically refer to the description related to d in fig. 7 and f in fig. 7, and will not be described herein again. And when the terminal equipment detects that the touch object leaves the touch screen, the terminal equipment returns to a Pending state and waits for the next touch operation of the touch object.
When the terminal device enters the fourth state from the third state, in a possible manner, if the terminal device in the third state detects a second contact event of the touch object on the touch screen, the terminal device records a time point and a position coordinate of the second contact event. The terminal device judges whether a time difference value between a time point of the first contact event and a time point of the second contact event is smaller than a preset time threshold value or not, and the terminal device judges whether a distance between a position coordinate of the first contact event and a position coordinate of the second contact event is smaller than a preset distance threshold value or not. And if the time difference value and the distance are both smaller than the preset threshold value, determining that the first contact event and the second contact event are continuous double-click, and enabling the terminal equipment to enter a Drag & Move state. And if the time interval and the distance do not meet the condition that both are smaller than the preset value, the terminal equipment returns from the Pending state to the Init state.
In another possible mode, after the terminal device detects the first contact event of the touch object on the touch screen, the terminal device records the touch position of the touch object and starts timing. And after the terminal equipment detects a second contact event of the touch object on the touch screen, the terminal equipment ends timing. The terminal device determines whether the timing time interval is smaller than a time threshold, and determines whether the touch position of the second contact event is within a preset range of the touch position of the first contact event, where the preset range may be a circular area with the touch position of the first contact event as a circle center and the preset value as a radius. And if the time interval is smaller than the time threshold and the second contact event is within the preset range, determining that the first contact event and the second contact event are continuous double-click, and enabling the terminal equipment to enter a Drag & Move state. And if the time interval and the distance do not meet the condition that both are smaller than the preset value, the terminal equipment returns from the Pending state to the Init state.
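The transitions described above (Init, Down, Hover, Pending, Drag & Move), including the double-tap check against both the time threshold and the distance threshold, can be sketched as a small state machine. The threshold values and event-handler signatures below are illustrative assumptions:

```cpp
#include <cassert>
#include <cmath>

enum class CursorState { Init, Down, Hover, Pending, DragMove };

// Illustrative thresholds; in the embodiment these may be system defaults
// or user-adjustable values.
constexpr long  kDoubleTapMs   = 300;
constexpr float kDoubleTapDist = 24.0f;  // pixels

class CursorStateMachine {
public:
    CursorState state() const { return state_; }

    // The touch object makes contact with the screen.
    void onTouchDown(float x, float y, long tMs) {
        if (state_ == CursorState::Init) {
            state_ = CursorState::Down;          // first contact event
            downX_ = x; downY_ = y; downT_ = tMs;
        } else if (state_ == CursorState::Pending) {
            // Second contact: a continuous double-tap only if it is close
            // enough to the first contact in both time and position.
            float dx = x - downX_, dy = y - downY_;
            bool inTime = (tMs - downT_) < kDoubleTapMs;
            bool inDist = std::sqrt(dx * dx + dy * dy) < kDoubleTapDist;
            state_ = (inTime && inDist) ? CursorState::DragMove
                                        : CursorState::Init;
        }
    }

    // The touch object moves while still in contact.
    void onMove() {
        if (state_ == CursorState::Down) state_ = CursorState::Hover;  // hover/move
    }

    // The touch object leaves the screen.
    void onTouchUp() {
        if (state_ == CursorState::Down)          state_ = CursorState::Pending;
        else if (state_ == CursorState::Hover)    state_ = CursorState::Init;
        else if (state_ == CursorState::DragMove) state_ = CursorState::Pending;
    }

private:
    CursorState state_ = CursorState::Init;
    float downX_ = 0, downY_ = 0;
    long  downT_ = 0;
};
```

The Pending branch of `onTouchDown` corresponds to the second possible manner above: the time and position of the first contact are recorded on the way into the Down state, and the second contact is tested against both thresholds before entering Drag & Move.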
According to the touch screen display method provided in this embodiment, the terminal device controls the state machine to transition among the initial state, the first state, the second state, the third state, and the fourth state based on the touch operations of the touch object. In this way, the terminal device controls the floating cursor to move according to the reported-point displacement of the touch object, highlights the display content at the displaced position on the touch screen, and displays the focus cursor at the position of the second contact event, thereby implementing operations such as cursor movement, clicking, and left-mouse-button dragging in the cursor mode.
S1005, the touch screen of the terminal device receives a touch operation switched from the second mode to the first mode.
When the terminal device is in the second mode, if the user wants to use some handwriting functions of the terminal device to realize convenient handwriting input or click operation, and the like, the user can switch the terminal device from the second mode to the first mode. After the touch screen of the terminal device receives the touch operation switched from the second mode to the first mode, the terminal device can be switched from the second mode to the first mode. The touch screen of the terminal device receives the touch operation method for switching from the second mode to the first mode, which may specifically refer to the related descriptions in fig. 5 and fig. 6, and details are not repeated here.
For example, after the application layer of the terminal device receives the touch operation for switching from the second mode to the first mode, the application layer notifies the system framework layer, through the system's existing interface capabilities, to switch the cursor device back to the handwriting device, and further switches the input device in the Event Hub.
Specific implementations can be described with reference to the following steps.
S1006, when the terminal device switches from the second mode to the first mode, the terminal device unregisters the virtual cursor device.
After the terminal device receives the touch operation switched from the second mode to the first mode, an application program layer of the terminal device informs an application program framework layer to prepare for switching to the first mode. The terminal device can unregister the virtual cursor device in the system framework through the existing interface capability of the system, and the cursor device is removed from the system framework layer.
The system framework layer of the terminal device can query that the cursor device is disconnected and accordingly initialize and display the handwriting resources and state; the application layer can also query that the cursor device is disconnected. For example, after the terminal device successfully unregisters the virtual cursor device, the application layer may query that the cursor device is disconnected and pop up a prompt window such as "cursor device unregistered successfully" or "cursor device removed" on the touch screen of the terminal device.
The terminal device then switches the module for processing events generated in the touch screen from the event adaptation processing module to the handwriting event conversion module.
In a first possible implementation, the terminal device deletes the event adaptation processing module and adds the handwriting event conversion module. When the terminal device processes handwriting events generated in the touch screen, the in-use handwriting event conversion module is retained and the unused event adaptation processing module is deleted, thereby reducing the memory usage of the terminal device.
In a second possible implementation manner, the terminal device reserves an event adaptation processing module and adds a handwriting event conversion module.
In a third possible implementation, the terminal device is provided with both a handwriting event conversion module and an event adaptation processing module, so no handwriting event conversion module needs to be newly added. When the terminal device switches to the first mode, the terminal device switches the module for processing events generated in the touch screen from the event adaptation processing module to the handwriting event conversion module.
S1007, the terminal device processes the handwriting input event based on the handwriting event conversion module.
In this embodiment of the application, the terminal device processes the handwriting input event based on the handwriting event conversion module, which may specifically refer to the related description of fig. 9 and is not described in detail here.
Optionally, in the cursor event processing flow of this embodiment (i.e., in step S1004), the terminal device controlling the floating cursor to move according to the reported-point displacement of the touch object may include: the terminal device converts the report information of the touch object into coordinate information and controls the floating cursor to move according to the coordinate information.
It can be understood that the report information of the touch object acquired by the terminal device includes other information besides the coordinate information. When the terminal device is in the cursor mode, positioning the floating cursor requires only the coordinate information in the report information, so the terminal device can process the report information before performing the subsequent steps. Illustratively, the terminal device removes all information other than the coordinate information from the report information to obtain the coordinate information. For example, the terminal device may retain the coordinate information in the report information of the touch object and discard the rest; the terminal device may then calculate the actual cursor position from parameters such as the coordinate information, the native Android screen resolution, and the screen orientation (landscape or portrait), and draw the cursor at that position through the Pointer Controller cursor display module.
For example, when a touch object performs a touch operation on a touch screen of a terminal device, the terminal device generates a series of report information of the touch object, and the report information may include: the X coordinate of the touch object, the Y coordinate of the touch object, the physical pressure perceived by the touch object or the signal strength of the touch area, the cross-sectional area or width of the touch area or the touch object, the distance between the touch object and the touch screen surface, the inclination of the touch object along the X axis of the touch screen surface, the inclination of the touch object along the Y axis of the touch screen surface, and the like.
When the terminal device uses the event adaptation processing module to process the cursor input event, the terminal device retains the X coordinate and the Y coordinate of the touch object in the report information and discards the other report information. The event adaptation processing module converts the X coordinate and the Y coordinate of the touch object in the report information into the coordinate information of a cursor input event.
Taking the touch object as a stylus as an example, the report information of the stylus may include the following:
ABS_X: (required) reports the X coordinate of the stylus.
ABS_Y: (required) reports the Y coordinate of the stylus.
ABS_PRESSURE: (optional) reports the physical pressure applied to the stylus tip or the signal strength of the touch area.
ABS_TOOL_WIDTH: (optional) reports the cross-sectional area or width of the touch area or of the stylus itself.
ABS_DISTANCE: (optional) reports the distance between the stylus and the touch screen surface.
ABS_TILT_X: (optional) reports the tilt of the stylus along the X axis of the touch screen surface.
ABS_TILT_Y: (optional) reports the tilt of the stylus along the Y axis of the touch screen surface.
After converting the report information of the stylus into coordinate information, the terminal device retains ABS_X and ABS_Y and discards the other ABS_* events.
In one possible implementation, ABS_X and ABS_Y may be displacement information, and the terminal device may calculate the actual cursor position based on parameters such as ABS_X, ABS_Y, the native Android screen resolution, and the screen orientation (landscape or portrait), and then draw the cursor at that position through the Pointer Controller cursor display module. For example, while the stylus slides on the touch screen, the terminal device calculates the actual cursor position at the current moment in real time, and the cursor display module draws the cursor at that coordinate position, so that the cursor moves along with the stylus.
In another possible implementation, ABS_X and ABS_Y may be actual coordinate information, and the terminal device may locate the specific actual cursor position based on ABS_X and ABS_Y and then draw the cursor at that position.
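The filtering and mapping described above for the stylus case can be sketched as follows. The ABS_* code values follow the Linux input subsystem (`<linux/input-event-codes.h>`), reproduced here so the sketch is self-contained; treating the raw values as absolute coordinates and the landscape-mode axis swap are illustrative assumptions, not the framework's actual mapping:

```cpp
#include <cassert>
#include <map>

// Event codes as defined by the Linux input subsystem
// (values from <linux/input-event-codes.h>, reproduced for self-containment).
constexpr int ABS_X        = 0x00;
constexpr int ABS_Y        = 0x01;
constexpr int ABS_PRESSURE = 0x18;
constexpr int ABS_DISTANCE = 0x19;
constexpr int ABS_TILT_X   = 0x1a;
constexpr int ABS_TILT_Y   = 0x1b;

struct CursorPosition { int x; int y; };

// Keep only ABS_X/ABS_Y from a stylus report, discard the other ABS_*
// entries (pressure, distance, tilt, ...), and map the result to screen
// coordinates. screenW is the native portrait screen width; the landscape
// branch swaps the axes as an illustrative treatment of screen orientation.
CursorPosition toCursorPosition(const std::map<int, int>& report,
                                int screenW, bool landscape) {
    int x = report.count(ABS_X) ? report.at(ABS_X) : 0;
    int y = report.count(ABS_Y) ? report.at(ABS_Y) : 0;
    if (landscape) {
        return {y, screenW - x};
    }
    return {x, y};
}
```

The multi-touch (finger) case below is analogous, with ABS_MT_POSITION_X/ABS_MT_POSITION_Y playing the role of ABS_X/ABS_Y.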
Taking the touch object as a finger as an example, the report information of the finger may include the following:
ABS_MT_POSITION_X: (required) reports the X coordinate of the finger.
ABS_MT_POSITION_Y: (required) reports the Y coordinate of the finger.
ABS_MT_PRESSURE: (optional) reports the signal strength of the finger's pressure on the touch screen.
ABS_MT_TRACKING_ID: (optional) reports the event-set ID of the finger from initial touch to release.
ABS_MT_TOUCH_MAJOR: (optional) reports the major-axis length of the main contact surface of the finger contact area.
ABS_MT_TOUCH_MINOR: (optional) reports the minor-axis length of the main contact surface of the finger contact area.
ABS_MT_ORIENTATION: (optional) reports the orientation of the elliptical main contact surface of the finger contact area.
After converting the report information of the finger into coordinate information, the terminal device retains ABS_MT_POSITION_X and ABS_MT_POSITION_Y and discards the other ABS_MT_* events.
In one possible implementation, ABS_MT_POSITION_X and ABS_MT_POSITION_Y may be displacement information, and the terminal device may calculate the actual cursor position based on parameters such as ABS_MT_POSITION_X, ABS_MT_POSITION_Y, the native Android screen resolution, and the screen orientation (landscape or portrait), and then draw the cursor at that position through the Pointer Controller cursor display module. For example, while the finger slides on the touch screen, the terminal device calculates the actual cursor position at the current moment in real time, and the cursor display module draws the cursor at that coordinate position, so that the cursor moves along with the finger.
In another possible implementation, ABS_MT_POSITION_X and ABS_MT_POSITION_Y may be actual coordinate information, and the terminal device may locate the specific actual cursor position based on ABS_MT_POSITION_X and ABS_MT_POSITION_Y and then draw the cursor at that position.
It should be understood that the interface of the terminal device provided in the embodiment of the present application is only an example, and is not limited to the embodiment of the present application.
The method provided by the embodiment of the present application is explained above with reference to fig. 1 to 12, and the apparatus provided by the embodiment of the present application for performing the method is described below. As shown in fig. 13, fig. 13 is a schematic structural diagram of a touch screen display device provided in the embodiment of the present application, where the touch screen display device may be a terminal device in the embodiment of the present application, and may also be a chip or a chip system in the terminal device.
As shown in fig. 13, the touch screen display device 130 may be used in a communication device, a circuit, a hardware component or a chip, and includes: a processor 1302, interface circuitry 1303, and a touch screen 1304. The touch screen 1304 is used for supporting the display steps executed by the touch screen display method; the processor 1302 is configured to support the touch screen display device to perform information processing, and the interface circuit 1303 is configured to support the touch screen display device to perform receiving or transmitting. The touch screen 1304 is used for receiving a touch operation of a touch object, and may also be referred to as a display unit; the processor 1302 may also be referred to as a processing unit and the interface circuit 1303 may also be referred to as a communication unit.
Specifically, in the touch screen display device 130 provided in this embodiment of the present application, when the terminal device is in the first mode, the touch screen 1304 receives a trigger operation from a touch object; in the first mode, if the touch screen 1304 receives a first sliding operation, the processor 1302 controls the content of the page displayed to the user to change along with the first sliding operation; in response to the trigger operation, the touch screen 1304 displays a floating cursor in the second mode; when the terminal device is in the second mode, if the touch screen 1304 receives a second sliding operation on the user interface, the processor 1302 controls the floating cursor to move along with the sliding position of the second sliding operation, and the content of the page in the user interface displayed by the touch screen 1304 does not change.
In a possible implementation manner, when the terminal device is in the first mode, if the touch screen 1304 receives a click operation for the target control, the terminal device jumps to a page corresponding to the target control; and/or when the terminal device is in the second mode, if the touch screen 1304 receives a click operation for the target control, the processor 1302 moves the floating cursor to a position where the click operation is triggered in the touch screen 1304.
In one possible implementation, if the touch screen 1304 receives the second sliding operation, the processor 1302 controlling the floating cursor to move along with the sliding position of the second sliding operation includes: when the processor 1302 detects a first contact event of the touch object, the terminal device enters the first state; when the terminal device is in the first state, if the processor 1302 detects that the touch object has not left the touch screen and the touch object is displaced on the touch screen, the terminal device enters the second state; when the terminal device is in the second state, the processor 1302 controls the floating cursor in the touch screen 1304 to move according to the reported-point displacement of the touch object.
In one possible implementation, the processor 1302 controlling the floating cursor in the touch screen 1304 to move according to the reported-point displacement of the touch object includes: the processor 1302 converts the report information of the touch object into coordinate information; and the processor 1302 controls the floating cursor in the touch screen 1304 to move according to the coordinate information.
In one possible implementation, the processor 1302 converts the hit information of the touch object into coordinate information, including: the processor 1302 removes other information except the coordinate information from the report point information to obtain the coordinate information.
In a possible implementation, after the terminal device enters the first state, when the terminal device is in the first state, if the processor 1302 detects that the touch object has not been displaced on the touch screen 1304 and leaves the touch screen 1304, the terminal device enters the third state; when the terminal device is in the third state, if the processor 1302 detects a second contact event of the touch object, the time interval between the second contact event and the first contact event is smaller than the time threshold, and the distance between the positions of the second contact event and the first contact event on the touch screen 1304 is smaller than the distance threshold, the terminal device enters the fourth state; when the terminal device is in the fourth state, if the processor 1302 detects that the touch object is displaced on the touch screen 1304, the display content at the displaced position on the touch screen 1304 is highlighted; or, if the processor 1302 detects that the touch object leaves the touch screen 1304, the touch screen 1304 displays the focus cursor at the position of the second contact event.
In one possible implementation manner, when the terminal device is in the first mode, the touch screen 1304 receives a trigger operation from a touch object, including: when the terminal device is in the first mode, the touch screen 1304 displays a first interface, and the first interface includes a floating button; when a trigger on the floating button is received, the touch screen 1304 expands the floating button in the first interface, and the expanded floating button includes a first control corresponding to the first mode and a second control corresponding to the second mode; the touch screen 1304 receives a trigger operation on the second control.
In one possible implementation, before the touch screen 1304 displays the floating cursor in the second mode, the method includes: the terminal device switches from the first mode to the second mode.
In one possible implementation manner, switching, by the terminal device, from the first mode to the second mode includes: processor 1302 registers the virtual cursor device; the processor 1302 switches a module for processing an event generated in the touch screen 1304 from the handwriting event conversion module to the event adaptation processing module; the handwriting event conversion module is used for processing a handwriting event in the touch screen 1304, and the event adaptation processing module is used for processing a cursor input event in the touch screen 1304.
In one possible implementation, the processor 1302 registers the virtual cursor device, including: the processor 1302 creates a virtual device identifier; the processor 1302 creates a virtual input device using the virtual device identifier; the processor 1302 sets the virtual input device as the input device corresponding to the touch object.
In one possible implementation, the processor 1302 switches the module for processing the event generated in the touch screen 1304 from the handwriting event conversion module to the event adaptation processing module, including: processor 1302 deletes the handwriting event conversion module and adds the event adaptation processing module.
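The mode switch described in the preceding paragraphs can be modeled as two coupled steps: registering (or unregistering) a virtual cursor device, and swapping which module processes touch-screen events. The Python sketch below is a schematic under assumed names (the module strings and the identifier scheme are hypothetical); a real implementation would use the platform's input-device registration API.

```python
import uuid

class EventPipeline:
    """Schematic model of the first/second mode switch."""

    def __init__(self):
        # First mode default: handwriting events are converted directly.
        self.module = "handwriting_event_conversion"
        self.virtual_cursor_id = None

    def enter_second_mode(self):
        # Register the virtual cursor device: create a virtual device
        # identifier, then a virtual input device bound to the touch object.
        self.virtual_cursor_id = uuid.uuid4().hex
        # Swap the handwriting event conversion module for the event
        # adaptation processing module that handles cursor input events.
        self.module = "event_adaptation_processing"

    def enter_first_mode(self):
        # Unregister the virtual cursor device and restore handwriting
        # event conversion.
        self.virtual_cursor_id = None
        self.module = "handwriting_event_conversion"
```

Keeping the two steps in one transition ensures the pipeline never routes cursor input events to the handwriting module, or vice versa, mid-switch.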
In one possible implementation, processor 1302 de-registers the virtual cursor device when the terminal device switches from the second mode to the first mode.
In one possible implementation, the processor 1302 de-registers the virtual cursor device, including: the processor 1302 deletes the event adaptation processing module and adds the handwriting event conversion module.
In one possible implementation manner, when the terminal device is in the first mode, the receiving of the trigger operation from the touch object includes: when the terminal device is in the first mode, the interface circuit 1303 receives a trigger instruction from a stylus; the trigger instruction is generated when a target button of the stylus receives a single-click operation, a double-click operation, or a long-press operation from the user, or when the stylus performs a preset gesture action.
In a possible implementation manner, when the terminal device is in the second mode, if the touch screen 1304 receives an operation of the touch object for switching to the first mode, the touch screen 1304 cancels the display of the floating cursor and enters the first mode; when the terminal device is in the first mode, if the touch screen 1304 receives a sliding operation on the user interface, the processor 1302 implements one or more of the following functions based on the sliding operation: page turning, page sliding, displaying a sliding track on a page, displaying a dynamic effect, or displaying a prompt box for deleting a message.
In a possible implementation manner, when the terminal device is in the second mode, the interface circuit 1303 establishes a connection with the large-screen device, and projects content displayed in the touch screen 1304 onto the large-screen device; or, after the interface circuit 1303 is connected to the large-screen device, the terminal device enters the second mode, and the content displayed in the touch screen 1304 is projected onto the large-screen device.
In a possible embodiment, the touch screen display device 130 may further include a storage unit 1301. The storage unit 1301, the processor 1302, the interface circuit 1303, and the touch screen 1304 are connected through communication lines.
The storage unit 1301 may include one or more memories, which may be devices, in one or more apparatuses or circuits, for storing programs or data.
The storage unit 1301 may be independent and connected to the processor 1302 of the touch screen display device through a communication line, or the storage unit 1301 may be integrated with the processor 1302.
The storage unit 1301 may store computer-executable instructions of the method in the terminal device, so that the processor 1302 performs the method in the above embodiments.
The storage unit 1301 may be a register, a cache, a random access memory (RAM), or the like, in which case it may be integrated with the processor 1302; or the storage unit 1301 may be a read-only memory (ROM) or another type of static storage device that can store static information and instructions, in which case it may be separate from the processor 1302.
In a possible implementation manner, the computer-executable instructions in the embodiments of the present application may also be referred to as application program code; this is not specifically limited in the embodiments of the present application.
Optionally, the interface circuit 1303 may further include a transmitter and/or a receiver. Optionally, the processor 1302 may include one or more CPUs, or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the methods disclosed in connection with the present application may be embodied directly in a hardware processor, or in a combination of hardware and software modules in the processor.
The embodiments of the present application also provide a computer-readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over a computer-readable medium as one or more instructions or code. Computer-readable media may include computer storage media and communication media, and may include any medium that can transfer a computer program from one place to another. A storage medium may be any target medium that can be accessed by a computer.
In one possible implementation, the computer-readable medium may include RAM, ROM, a compact disc read-only memory (CD-ROM) or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above embodiments are only intended to illustrate the embodiments of the present invention and are not to be construed as limiting its scope; any modification, equivalent substitution, improvement, or the like made on the basis of the embodiments of the present invention shall be included within the scope of the present invention.

Claims (18)

1. A touch screen display method, applied to a terminal device comprising a touch screen, the method comprising:
when the terminal device is in a first mode, receiving a trigger operation from a touch object; wherein in the first mode, if the terminal device receives a first sliding operation on a user interface, the page content of the user interface changes along with the first sliding operation;
in response to the trigger operation, displaying, by the terminal device, a floating cursor in a second mode;
when the terminal device is in the second mode, if the terminal device receives a second sliding operation on the user interface, the terminal device controls the floating cursor to move along with the sliding position of the second sliding operation, and the page content in the user interface is not changed.
2. The method of claim 1, further comprising:
when the terminal device is in the first mode, if the terminal device receives a click operation on a target control on the user interface, the terminal device jumps to a page corresponding to the target control;
and/or when the terminal device is in the second mode, if the terminal device receives a click operation on a target control on the user interface, the terminal device moves the floating cursor to the position in the touch screen where the click operation is triggered.
3. The method according to claim 1 or 2, wherein if the terminal device receives a second sliding operation on the user interface, the terminal device controls the floating cursor to move along with a sliding position of the second sliding operation, including:
when the terminal device detects a first contact event of the touch object, the terminal device enters a first state;
when the terminal device is in the first state, if the terminal device detects that the touch object has not left the touch screen and has generated displacement on the touch screen, the terminal device enters a second state;
and when the terminal device is in the second state, the terminal device controls the floating cursor to move according to the report point displacement of the touch object.
4. The method of claim 3, wherein the terminal device controls the floating cursor to move according to the report point displacement of the touch object, and the method comprises:
the terminal device converts the report point information of the touch object into coordinate information;
and the terminal device controls the floating cursor to move according to the coordinate information.
5. The method of claim 4, wherein the terminal device converts the report point information of the touch object into coordinate information, and comprises:
and the terminal device removes the information other than the coordinate information from the report point information to obtain the coordinate information.
6. The method of claim 3, wherein after the terminal device enters the first state, the method further comprises:
when the terminal device is in the first state, if the terminal device detects that the touch object leaves the touch screen without generating displacement on the touch screen, the terminal device enters a third state;
when the terminal device is in the third state, if the terminal device detects a second contact event of the touch object, a time interval between the second contact event and the first contact event is smaller than a time threshold, and a distance between positions corresponding to the touch screen of the second contact event and the first contact event is smaller than a distance threshold, the terminal device enters a fourth state;
when the terminal device is in the fourth state, if the terminal device detects that the touch object generates displacement on the touch screen, the display content at the position of the displacement on the touch screen is highlighted; or, if the terminal device detects that the touch object leaves the touch screen, the terminal device displays a focus cursor at the position of the second contact event.
7. The method according to claim 1 or 2, wherein the receiving a trigger operation from a touch object when the terminal device is in the first mode comprises:
when the terminal device is in the first mode, the terminal device displays a first interface, and the first interface comprises a floating button;
when a trigger on the floating button is received, the terminal device expands the floating button in the first interface, and the expanded floating button comprises a first control corresponding to the first mode and a second control corresponding to the second mode;
and the terminal device receives the trigger operation on the second control.
8. The method according to claim 1 or 2, wherein before the terminal device displays the floating cursor in the second mode, the method comprises:
the terminal device switches from the first mode to the second mode.
9. The method of claim 8, wherein switching the terminal device from the first mode to the second mode comprises:
the terminal device registers a virtual cursor device;
and the terminal device switches a module for processing the event generated in the touch screen from a handwriting event conversion module to an event adaptation processing module; wherein the handwriting event conversion module is used for processing a handwriting event in the touch screen, and the event adaptation processing module is used for processing a cursor input event in the touch screen.
10. The method of claim 9, wherein the terminal device registering a virtual cursor device comprises:
the terminal device creates a virtual device identifier;
the terminal device creates a virtual input device using the virtual device identifier;
and the terminal device sets the virtual input device as the input device corresponding to the touch object.
11. The method of claim 9, wherein the terminal device switches a module for processing the event generated in the touch screen from a handwriting event conversion module to an event adaptation processing module, and comprises:
and the terminal device deletes the handwriting event conversion module and adds the event adaptation processing module.
12. The method of claim 9, further comprising:
when the terminal device is switched from the second mode to the first mode, the terminal device unregisters the virtual cursor device.
13. The method of claim 12, wherein the terminal device de-registering the virtual cursor device comprises:
and the terminal device deletes the event adaptation processing module and adds the handwriting event conversion module.
14. The method according to claim 1 or 2, wherein the receiving a trigger operation from a touch object when the terminal device is in the first mode comprises:
when the terminal device is in the first mode, receiving a trigger instruction from a stylus; wherein the trigger instruction is generated when a target button of the stylus receives a single-click operation, a double-click operation, or a long-press operation from the user, or when the stylus performs a preset gesture action.
15. The method of claim 1 or 2, further comprising:
when the terminal device is in the second mode, if an operation of the touch object for switching to the first mode is received, the terminal device cancels the display of the floating cursor and enters the first mode;
when the terminal device is in the first mode, if the terminal device receives a sliding operation on the user interface, the terminal device implements one or more of the following functions based on the sliding operation: page turning, page sliding, displaying a sliding track on a page, displaying a dynamic effect, or displaying a prompt box for deleting a message.
16. The method of claim 1 or 2, further comprising:
when the terminal device is in the second mode, the terminal device establishes a connection with a large-screen device and projects content displayed in the terminal device onto the large-screen device;
or after the terminal device is connected to the large-screen device, the terminal device enters the second mode and projects the content displayed in the terminal device onto the large-screen device.
17. An electronic device, comprising: a processor and a memory, the processor to invoke a program in the memory to cause the electronic device to perform the method of any of claims 1-16.
18. A computer-readable storage medium having instructions stored thereon that, when executed, cause a computer to perform the method of any of claims 1-16.
CN202210012992.4A 2022-01-07 2022-01-07 Touch screen display method and device and storage medium Pending CN114035721A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210012992.4A CN114035721A (en) 2022-01-07 2022-01-07 Touch screen display method and device and storage medium


Publications (1)

Publication Number Publication Date
CN114035721A 2022-02-11

Family

ID=80141360

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210012992.4A Pending CN114035721A (en) 2022-01-07 2022-01-07 Touch screen display method and device and storage medium

Country Status (1)

Country Link
CN (1) CN114035721A (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160260410A1 (en) * 2015-03-03 2016-09-08 Seiko Epson Corporation Display apparatus and display control method
WO2016188317A1 (en) * 2016-01-15 2016-12-01 中兴通讯股份有限公司 Projection device, control method thereof and computer readable storage medium
US20170024105A1 (en) * 2015-07-22 2017-01-26 Xiaomi Inc. Method and Apparatus for Single-Hand Operation on Full Screen
CN110058755A (en) * 2019-04-15 2019-07-26 广州视源电子科技股份有限公司 A kind of method, apparatus, terminal device and the storage medium of PowerPoint interaction
CN110995923A (en) * 2019-11-22 2020-04-10 维沃移动通信(杭州)有限公司 Screen projection control method and electronic equipment
WO2020244623A1 (en) * 2019-06-06 2020-12-10 华为技术有限公司 Air-mouse mode implementation method and related device
US20210232294A1 (en) * 2020-01-27 2021-07-29 Fujitsu Limited Display control method and information processing apparatus
CN113641283A (en) * 2021-07-05 2021-11-12 华为技术有限公司 Electronic device, screen writing mode switching method and medium thereof



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination