WO2022218352A1 - Touch operation method and apparatus


Info

Publication number
WO2022218352A1
Authority
WO
WIPO (PCT)
Prior art keywords
input
sliding
movement
contact
contact mark
Prior art date
Application number
PCT/CN2022/086662
Other languages
English (en)
Chinese (zh)
Inventor
董川
Original Assignee
维沃移动通信有限公司
Priority date
Filing date
Publication date
Application filed by 维沃移动通信有限公司
Publication of WO2022218352A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
    • G06F3/0482 Interaction with lists of selectable items, e.g. menus
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Definitions

  • the present application belongs to the field of communication technologies, and in particular relates to a touch operation method and device.
  • touch-sensitive electronic devices such as mobile phones and tablet computers can be seen everywhere in people's lives.
  • when a finger or a stylus is used to perform touch operations on the display screen of an electronic device, if the display screen of the electronic device is small, the finger or the stylus often blocks the user's line of sight, making it impossible for the user to accurately move the finger or the stylus to the to-be-operated position corresponding to the touch operation on the display screen. This causes invalid touch operations and reduces touch operation efficiency.
  • the purpose of the embodiments of the present application is to provide a touch operation method and device, which can solve the problem of low touch operation efficiency.
  • an embodiment of the present application provides a touch operation method, and the method includes:
  • the operation area is an area in the display interface other than the area where the touch point identifier is located;
  • the function corresponding to the third input is executed at the termination position of the movement of the contact mark.
  • an embodiment of the present application provides a control operation device, the device comprising:
  • a receiving module for receiving the first input
  • a display module configured to display a contact identifier on a display interface in response to the first input
  • the receiving module is further configured to receive a second input in an operation area, where the operation area is an area other than the area where the contact mark is located in the display interface;
  • control module for controlling the movement of the contact mark based on the second input in response to the second input
  • the receiving module is further configured to receive a third input in the operation area
  • An execution module configured to, in response to the third input, execute the function corresponding to the third input at the termination position of the movement of the contact mark.
  • embodiments of the present application provide an electronic device, the electronic device includes a processor, a memory, and a program or instruction stored on the memory and executable on the processor, where the steps of the method according to the first aspect are implemented when the program or instruction is executed by the processor.
  • an embodiment of the present application provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or instruction is executed by a processor, the steps of the method according to the first aspect are implemented.
  • an embodiment of the present application provides a chip, the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement the method described in the first aspect.
  • the contact identifier may be displayed on the display interface in response to the first input.
  • the movement of the contact mark is controlled based on the second input, and in response to the third input received in the operation area, the function corresponding to the third input is executed at the termination position of the movement of the contact mark.
  • because the operation area is an area of the display screen other than the area where the contact mark is located, there is a certain distance between the operation area and the contact mark. Therefore, when a finger or a stylus moves the contact mark through the second input performed in the operation area, the finger or the stylus does not block the user's line of sight, and the user can precisely observe the position to which the contact mark moves. The contact mark can therefore be precisely controlled to move to the to-be-operated position, so that the finger or the stylus can precisely trigger the function corresponding to the third input at the to-be-operated position.
  • FIG. 1 is a flowchart of a touch operation method provided by an embodiment of the present application
  • FIG. 2 is a schematic diagram of a target pop-up window interface provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a quick operation interface provided by an embodiment of the present application.
  • FIG. 4 is a flowchart of another touch operation method provided by an embodiment of the present application.
  • FIG. 5 is the first of the operation schematic diagrams provided by an embodiment of the present application.
  • FIG. 6 is the second of the operation schematic diagrams provided by an embodiment of the present application.
  • FIG. 7 is the third of the operation schematic diagrams provided by an embodiment of the present application.
  • FIG. 8 is the fourth of the operation schematic diagrams provided by an embodiment of the present application.
  • FIG. 9 is the fifth of the operation schematic diagrams provided by an embodiment of the present application.
  • FIG. 10 is the sixth of the operation schematic diagrams provided by an embodiment of the present application.
  • FIG. 11 is a block diagram of a touch operation device provided by an embodiment of the present application.
  • FIG. 12 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • FIG. 13 is a schematic diagram of a hardware structure of an electronic device provided by an embodiment of the present application.
  • the terms "first", "second" and the like in the description and claims of the present application are used to distinguish similar objects, and are not used to describe a specific order or sequence. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the present application can be practiced in sequences other than those illustrated or described herein.
  • the objects distinguished by "first", "second", etc. are usually of one type, and the number of objects is not limited.
  • for example, the first object may be one or more than one.
  • "and/or" in the description and claims indicates at least one of the connected objects, and the character "/" generally indicates that the associated objects are in an "or" relationship.
  • touch-sensitive electronic devices such as mobile phones and tablet computers can be seen everywhere in people's lives.
  • the touch electronic device may be a terminal equipped with a touch display screen.
  • the terminal may be a mobile phone, a tablet computer, an e-book reader, smart glasses, a smart watch, a music player, a notebook computer, a laptop portable computer, a desktop computer, and the like.
  • the touch operation method may include:
  • Step 101 Receive a first input.
  • the terminal may have a precise operation mode (also known as a precise operation module).
  • in the precise operation mode, the user can control the finger or the stylus to precisely touch the to-be-operated position, so as to precisely perform the touch operation at the to-be-operated position.
  • the to-be-operated position refers to a position where the user wants to perform a touch operation. For example, the words "Hello everyone" are displayed in the display interface and the user wants to tap the word "everyone"; the position where the word "everyone" is located is then the to-be-operated position.
  • the precise operation mode may include various touch operation modes such as a click operation mode, a long press operation mode, and a slide operation mode.
  • the terminal may receive the first input on the setting page to enable the precise operation mode.
  • the settings page displays a precise operation indicator.
  • the terminal may receive the first input for the precise operation identifier on the setting page to enable the precise operation mode.
  • the setting page may display precise operation identifiers corresponding to different types of precise operation modes one-to-one.
  • the setting page may be any page displayed by the terminal, such as the program page of the target application, the target pop-up window interface, or the target floating frame page.
  • the first input may be a type of input such as a tap, a long press, a swipe, a hovering gesture, or a voice input.
  • for example, the user can slide to draw the letter "W" in a black screen state.
  • the terminal can then receive the sliding input of the letter "W".
  • the user can press and hold anywhere in the program interface of the target application.
  • the terminal may display the target pop-up window interface 201 after receiving the long-press input.
  • the target pop-up window interface 201 includes an operation bar area 202 , and a precise operation identifier 203 is displayed in the operation bar area 202 .
  • the operation bar area 202 may be located on the right side of the target pop-up window interface 201 .
  • the user can tap the precise operation identifier, so that the terminal receives the click input for the precise operation identifier on the target pop-up window interface.
  • the user may slide down from the top of the touch display screen in the display interface to perform a pull-down operation.
  • the terminal may display the shortcut operation interface 301 after receiving the sliding down input.
  • the shortcut operation interface 301 includes a precise operation identifier 302 . That is, a shortcut setting entry for precise operation mode is displayed on the shortcut operation interface.
  • the user can tap the precise operation identifier.
  • the terminal is thereby made to receive the click input for the precise operation identifier on the shortcut operation interface.
  • Step 102 In response to the first input, display the contact identifier on the display interface.
  • the terminal may display the contact identifier on the display interface in response to the first input.
  • the contact identification may be a red dot icon, a finger icon, or a colored arrow icon, or the like.
  • the terminal can display the contact mark anywhere on the display interface.
  • the terminal can display the contact mark at a set position in the display interface. The set position may be the center position of the touch display screen or the like.
  • the terminal may also display the contact identifier on the display interface based on the user's input operation.
  • the process of displaying the contact identifier on the display interface by the terminal may include: the terminal receives a sixth input in the display interface, and in response to the sixth input, displays the contact identifier on the display interface based on the sixth input.
  • the sixth input may be a type of input such as a click, a long press, a slide, a hovering gesture, or a voice input.
  • the terminal displaying the contact identifier on the display interface based on the sixth input may include: the terminal displays the contact identifier, on the display interface, at the operation position corresponding to the sixth input.
  • alternatively, the terminal displaying the contact identifier on the display interface based on the sixth input may include: on the display interface, the terminal displays the contact identifier at the end position of the sixth input.
  • the termination position of the sliding input may be the position when the finger is lifted in the sliding input.
  • for example, the user may tap the screen within the set area where the to-be-operated position is located, that is, near the to-be-operated position.
  • the terminal is thereby made to receive the click input in the display interface and, in response to the click input, display the contact identifier at the click position corresponding to the click input.
  • alternatively, the user can control the finger or the stylus to slide on the touch screen to the set area where the to-be-operated position is located.
  • the terminal is thereby made to receive the sliding input and, in response to the sliding input, display the contact mark at the end position of the sliding input.
  • the touch point identification can be displayed at the set position based on the sixth input. Therefore, the display position of the contact mark can be controlled manually, and the flexibility is high.
  • the user can display the contact mark near the to-be-operated position through the sixth input, thereby reducing the distance over which the contact mark subsequently has to be moved from its current position to the to-be-operated position; that is, the subsequent operation of moving the contact mark is reduced, the operation efficiency is improved, and the user experience is improved.
  • Step 103 Receive a second input in an operation area, where the operation area is an area in the display interface other than the area where the contact mark is located.
  • after the terminal displays the contact mark, its display interface can be divided into the area where the contact mark is located and an operation area, where the operation area is the area other than the area where the contact mark is located.
  • the operation area may be the entire area except the area where the contact mark is located, or the operation area may be a set part area except the area where the contact mark is located.
  • the area where the contact point identifier is located may refer to a circular area centered on the contact point identifier with a radius of a set value, or a rectangular area centered on the contact mark with a set length and width.
  • the operation area is not limited in this embodiment of the present application; it only needs to be ensured that, when the user uses a finger or a stylus to perform an operation in the operation area, the finger or the stylus does not block the user's view of the contact mark.
  • Step 104 In response to the second input, control the movement of the contact mark based on the second input.
  • the second input may be a type of input such as a click, a long press, a slide, a hovering gesture, or a voice input.
  • the following further describes how the terminal controls the movement of the contact mark based on the second input.
  • the second input is a sliding input.
  • the process of the terminal controlling the movement of the contact identifier based on the second input may include: the terminal acquiring the sliding direction information and the track length information of the sliding input. According to the sliding direction information and the track length information, the contact mark is controlled to move the track length along the sliding direction.
  • the terminal may acquire the sliding direction information and track length information of the sliding input in real time. Therefore, it is possible to control the length of the track of the contact mark moving along the sliding direction in real time.
  • the terminal can respond to the sliding input, acquire the sliding direction information and track length information of the finger or the stylus in real time, and control the contact identifier to move the track length along the sliding direction.
  • the contact mark can be made to follow the movement of the finger and the stylus in real time, which is convenient for the user to determine the display position of the contact mark more intuitively, thereby facilitating precise control of the movement of the contact mark to the position to be operated.
  • the process in which the terminal may acquire the sliding direction information and the track length information of the sliding input in real time may include: the terminal may collect the starting position and the ending position of the sliding input; according to the vector from the starting position to the ending position, the sliding direction of the sliding input is the direction of the vector and the track length is the length of the vector, thereby obtaining the sliding direction information and the track length information of the sliding input.
  • the origin of the terminal screen coordinate system is the position of the first pixel in the upper left corner of the display screen.
  • the X coordinate axis of the screen coordinate system is the row direction where the row pixels are located
  • the Y coordinate axis is the column direction where the column pixels are located.
  • the angle between the vector and the horizontal direction is arcsin(3/5) ≈ 36.87°; that is, the sliding direction forms an angle of 36.87° with the positive direction of the X coordinate axis.
  • the terminal then controls the contact mark to move 5 pixels along the direction that forms an angle of 36.87° with the positive direction of the X coordinate axis.
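  • as an illustration only (not part of the patent text), the following Kotlin sketch computes the sliding direction and track length from the start and end positions of a sliding input in the screen coordinate system described above; the Point type and function names are assumptions made for the example.

```kotlin
import kotlin.math.atan2
import kotlin.math.hypot

// Assumed point type in screen pixel coordinates: origin at the top-left pixel,
// X along the row direction, Y along the column direction.
data class Point(val x: Float, val y: Float)

// Treat the slide as the vector from the start position to the end position and
// return (direction angle in degrees w.r.t. the positive X axis, track length in pixels).
fun slidingVector(start: Point, end: Point): Pair<Float, Float> {
    val dx = end.x - start.x
    val dy = end.y - start.y
    val trackLength = hypot(dx, dy)                                    // length of the vector
    val angleDeg = Math.toDegrees(atan2(dy, dx).toDouble()).toFloat()  // direction of the vector
    return angleDeg to trackLength
}

fun main() {
    // A 3-4-5 slide: track length 5 px, direction ≈ 36.87° from the positive X axis,
    // matching the arcsin(3/5) example above.
    val (angle, length) = slidingVector(Point(0f, 0f), Point(4f, 3f))
    println("direction ≈ $angle°, track length = $length px")
}
```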
  • the process of the terminal controlling the contact mark to move the track length along the sliding direction may include: the terminal calculates the target moving distance corresponding to the track length of the sliding input according to a preset scaling ratio between the track length and the moving distance of the contact mark.
  • the terminal then controls the contact mark to move the target moving distance along the sliding direction.
  • the moving distance of the contact mark and the track length have a preset scaling ratio.
  • for example, when the preset scaling ratio is 1:1, the terminal calculates a target moving distance equal to the track length of the sliding input, and the terminal controls the contact mark to move the track length along the sliding direction.
  • when the ratio of the moving distance of the touch point identifier to the track length is less than 1, the calculated target moving distance is reduced relative to the track length of the sliding input according to the preset scaling ratio.
  • in this case, the actual moving distance of the contact mark along the sliding direction is less than the track length of the sliding input, which makes it easier for the user to finely control the actual moving distance of the contact mark.
  • the ratio of the moving distance of the contact mark to the track length may also be greater than 1, which is not limited in this embodiment of the present application.
  • for example, assume the track length is P pixels. If the preset scaling ratio between the track length and the moving distance of the contact mark is 2:1, the target moving distance corresponding to the track length is 1/2 × P pixels; if the preset scaling ratio between the track length and the moving distance of the contact mark is 1:1, the target moving distance corresponding to the track length is P pixels; and if the preset scaling ratio between the track length and the moving distance of the contact mark is 1:2, the target moving distance corresponding to the track length is 2 × P pixels.
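  • as a hedged illustration of the scaling described above (names and ratio values are assumptions, not taken from the patent), the following Kotlin sketch maps a track length to the target moving distance for a given preset scaling ratio:

```kotlin
// Preset scaling ratio is expressed as trackPart : movePart, e.g. 2:1, 1:1 or 1:2.
// The target moving distance is the track length scaled by movePart / trackPart.
fun targetMovingDistance(trackLengthPx: Float, trackPart: Float, movePart: Float): Float =
    trackLengthPx * movePart / trackPart

fun main() {
    val p = 100f                                   // track length P, in pixels
    println(targetMovingDistance(p, 2f, 1f))       // ratio 2:1 ->  50.0 (1/2 × P)
    println(targetMovingDistance(p, 1f, 1f))       // ratio 1:1 -> 100.0 (P)
    println(targetMovingDistance(p, 1f, 2f))       // ratio 1:2 -> 200.0 (2 × P)
}
```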
  • the process of the terminal controlling the movement of the contact identifier based on the second input may further include: acquiring the sliding speed information of the sliding input. Based on this, the process of controlling the contact mark to move the track length along the sliding direction according to the sliding direction information and the track length information may include: the terminal controls the contact mark, according to the sliding direction information, the track length information and the sliding speed information, to move the track length along the sliding direction at the sliding speed. In this way, the contact mark keeps the same speed and movement trajectory as the finger or the stylus, that is, the contact mark moves in parallel with the finger or the stylus while maintaining a constant distance from it.
  • the user can more directly control the movement of the contact mark through the finger and the stylus, and intuitively determine the display position of the contact mark. It is convenient to precisely control the movement of the contact mark to the position to be operated. Improve the visual effect of operation, improve operation efficiency, and improve user experience.
  • the process in which the terminal may acquire the sliding speed information of the sliding input may include: the terminal may collect the starting position, the ending position and the sliding time of the sliding input, and obtain the sliding speed from the length of the vector from the starting position to the ending position and the sliding time, thereby obtaining the sliding speed information.
  • the terminal then controls the contact identifier to move the track length along the sliding direction at the obtained sliding speed.
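  • a minimal sketch of the speed calculation, assuming the terminal records the start position, end position and sliding time of the input (the function name and units are assumptions):

```kotlin
import kotlin.math.hypot

// Sliding speed = length of the vector from the start position to the end position
// divided by the sliding time; the contact identifier can then be moved at this speed.
fun slidingSpeedPxPerMs(startX: Float, startY: Float,
                        endX: Float, endY: Float,
                        slideTimeMs: Long): Float {
    val vectorLength = hypot(endX - startX, endY - startY)   // in pixels
    return if (slideTimeMs > 0) vectorLength / slideTimeMs else 0f
}

fun main() {
    // A 300 px slide completed in 150 ms gives a speed of 2 px per millisecond.
    println(slidingSpeedPxPerMs(0f, 0f, 300f, 0f, 150L))
}
```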
  • the second input is a click input.
  • the process of the terminal controlling the movement of the touch point identifier based on the second input may include: the terminal may acquire the target distance and the movement direction between the click position of the click input and the display position of the touch point identifier.
  • the terminal then controls the contact mark to move the target distance along the moving direction.
  • the moving direction may be the direction from the display position to the click position.
  • the process of the terminal controlling the contact mark to move the target distance along the moving direction may include: the terminal calculates the target moving distance corresponding to the target distance of the click input according to a preset scaling ratio between the target distance and the moving distance of the contact mark.
  • the terminal then controls the contact mark to move the target moving distance along the moving direction.
  • for the way in which the terminal calculates the target moving distance corresponding to the target distance of the click input according to the preset scaling ratio between the target distance and the moving distance of the touch point identifier, reference may be made to the explanation and implementation of how the terminal calculates the target moving distance corresponding to the track length of the sliding input according to the preset scaling ratio between the track length and the moving distance of the touch point identifier, which is not repeated in this embodiment of the present application.
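  • as an illustrative sketch of the click-input case (the Pos type, function name and 2:1 ratio are assumptions), the contact mark is moved from its display position toward the click position by the target distance scaled with the preset ratio:

```kotlin
import kotlin.math.hypot

data class Pos(val x: Float, val y: Float)

// Move the contact identifier from its display position toward the click position.
// The preset scaling ratio is targetPart : movePart between the target distance and
// the moving distance of the contact identifier.
fun moveTowardClick(marker: Pos, click: Pos, targetPart: Float, movePart: Float): Pos {
    val dx = click.x - marker.x
    val dy = click.y - marker.y
    val targetDistance = hypot(dx, dy)            // distance from marker to click position
    if (targetDistance == 0f) return marker
    val movingDistance = targetDistance * movePart / targetPart
    val fraction = movingDistance / targetDistance
    return Pos(marker.x + dx * fraction, marker.y + dy * fraction)
}

fun main() {
    // With a 2:1 ratio the marker covers half of the distance toward the click position.
    println(moveTowardClick(Pos(0f, 0f), Pos(10f, 0f), 2f, 1f))   // Pos(x=5.0, y=0.0)
}
```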
  • the terminal may receive the second input multiple times in the operation area, and in response to receiving the second input each time, control the movement of the contact identifier based on the second input.
  • the method further includes: the terminal receives another second input in the operation area, and in response to the other second input, controlling the contact identifier to continue to move based on the second input.
  • after the terminal receives another second input in the operation area, in response to the other second input, it can control the contact point identifier to continue to move, based on the other second input, from the end position to which the contact point identifier moved based on the previous second input. That is, before the terminal receives the third input in the operation area, after completing one second input and lifting the finger or the stylus off the touch display screen, the user can perform the second input in the operation area again. The terminal thereby receives the second input again and, in response to this second input, controls the contact identifier to continue to move based on it.
  • for example, the terminal controls the contact mark to move to position A based on one second input. Then, after receiving another second input, in response to that second input, the terminal controls the contact identifier to continue moving from position A based on that second input. It should be noted that, for the explanation and implementation of the terminal controlling the movement of the contact point identifier based on the second input, reference may be made to the explanation and implementation in the foregoing step 104, and details are not repeated here.
  • Step 105 Receive a third input in the operation area.
  • when the user considers that the contact mark has reached the desired to-be-operated position, the third input may be performed in the operation area.
  • the terminal may receive the third input in the operating area.
  • the input type of the third input is different from that of the second input. In this way, the terminal can distinguish whether the function corresponding to the third input needs to be executed at the termination position of the movement of the contact mark.
  • for example, the third input may be a type of input such as a click, a long press, a slide, a hovering gesture, or a voice input.
  • Step 106 In response to the third input, execute the function corresponding to the third input at the termination position of the movement of the contact mark.
  • the function corresponding to the third input may be an operation function.
  • the function corresponding to the third input may be a function set according to the actual situation. Wherein, when the third input is a different type of input, the functions corresponding to the third input may be different, thereby realizing different types of precise operation modes.
  • for example, the function corresponding to the third input may be to trigger a click event at the termination position of the movement of the contact mark, so as to perform a real click operation at the termination position and realize the click operation mode of the precise operation mode.
  • alternatively, the function corresponding to the third input may be to trigger a long-press event at the end position of the movement of the contact mark, so as to perform a real long-press operation at the end position and realize the long-press operation mode of the precise operation mode.
  • even when the third input is a trigger operation of the same type, the third input may correspond to different functions.
  • precise operation identifiers corresponding to different precise operation mode types one-to-one are displayed on the setting page.
  • the function corresponding to the third input may be determined according to the precise operation identifier corresponding to the first input.
  • the function corresponding to the third input is to trigger a click event at the end position of the movement of the contact mark, so as to perform a real click operation for the end position, and realize the click operation mode of the precise operation mode.
  • the touch operation method can display the touch point identifier on the display interface in response to the first input.
  • the movement of the contact mark is controlled based on the second input, and in response to the third input received in the operation area, the function corresponding to the third input is executed at the termination position of the movement of the contact mark.
  • because the operation area is an area on the display screen other than the area where the contact mark is located, there is a certain distance between the operation area and the contact mark.
  • when a finger or a stylus is used to move the contact mark in the operation area, the finger or the stylus does not block the user's sight, and the user can precisely observe the position to which the contact mark moves. Therefore, the contact mark can be precisely controlled to move to the to-be-operated position, so that the finger or the stylus can precisely trigger the function corresponding to the third input at the to-be-operated position.
  • invalid touch operations are reduced, the false touch rate of operations is reduced, the touch operation efficiency is improved, and the user experience is improved.
  • FIG. 4 shows a flowchart of another touch operation method provided by an embodiment of the present application.
  • the touch operation method can also be applied to the aforementioned touch electronic device.
  • the touch operation method may include:
  • Step 401 Receive a first input.
  • Step 402 In response to the first input, display the contact identifier on the display interface.
  • Step 403 Receive a second input in an operation area, where the operation area is an area in the display interface other than the area where the contact mark is located.
  • Step 404 In response to the second input, control the movement of the contact mark based on the second input.
  • Step 405 Receive a third input in the operation area.
  • Step 406 In response to the third input, execute the function corresponding to the third input at the termination position of the movement of the contact mark.
  • the precise operation mode may also include a trajectory drawing mode.
  • a third input of a specific input type may correspond to a trajectory drawing mode. Then, when the third input received by the terminal in step 405 is the third input of a specific input type, the function of the third input in step 406 is the function of determining the position of the starting point of drawing.
  • the setting page may display a precise operation identifier corresponding to the trajectory drawing mode, and the precise operation identifier is a trajectory drawing identifier.
  • the function of the third input in step 406 is the function of determining the position of the starting point for drawing.
  • the process of performing the function corresponding to the third input by the terminal at the end position of the movement of the contact mark may include: recording the end position of the movement of the contact mark.
  • the method further includes:
  • Step 407 Receive a fourth input in the operation area.
  • the fourth input may be a type of input such as a click, a long press, a slide, a hovering gesture, or a voice input.
  • the input type of the fourth input is different from that of the third input.
  • the fourth input may be of the same input type as the second input.
  • Step 408 In response to the fourth input, control the movement of the contact mark based on the fourth input, and according to the movement track of the contact mark, draw the set pattern with the end position as the drawing starting point, and display the set pattern.
  • the terminal in response to the fourth input, may draw the set pattern with the end position as the drawing starting point according to the movement trajectory of the movement of the contact mark based on the fourth input, and display the set pattern.
  • the setting pattern may include other shapes such as selection box patterns, line patterns, or other types of patterns.
  • the selection frame pattern may also include a rectangular selection frame pattern or a circular selection frame pattern.
  • the method further includes: the terminal displays pattern selection information, where the pattern selection information may include pattern identifiers corresponding to multiple types of patterns.
  • the terminal receives a seventh input on a target pattern identifier among the plurality of pattern identifiers and, in response to the seventh input, selects the type of the set pattern.
  • subsequently, the terminal draws the set pattern of the selected type with the end position as the drawing starting point.
  • the seventh input may be a type of input such as a click, a long press, a slide, a hovering gesture, or a voice input.
  • the pattern selection information may include a pattern identifier corresponding to the selection frame pattern, and a pattern identifier corresponding to the line pattern.
  • the setting pattern is a selection frame pattern.
  • the process that the terminal draws the set pattern with the termination position as the starting point for drawing according to the movement trajectory of the contact mark may include: using the termination position as the starting point, drawing a selection box pattern with the movement trajectory as the diagonal.
  • the terminal may acquire the position of the movement track point of the contact mark in real time during the process of controlling the movement of the contact mark based on the fourth input.
  • the positions of the four vertices constituting the target rectangle are calculated. Connect the vertices at the four positions to draw a selection box pattern.
  • the target rectangle is a rectangle whose diagonal is the line connecting the end position and the position of the moving track point.
  • for example, the terminal acquires in real time that the pixel coordinates of the position of the first movement track point of the touch point identifier are (2, 2). The pixel coordinates of the positions of the four vertices constituting the target rectangle are then calculated as (1, 1), (1, 2), (2, 2) and (2, 1) in sequence, and the vertices at the four positions are connected to draw the first selection box pattern. The pixel coordinates of the position of the second movement track point of the contact mark acquired in real time are (2, 3).
  • the pixel coordinates of the positions of the four vertices constituting the target rectangle are calculated as (1,1), (1,3), (2,3) and (2,1) in sequence. Connect the four vertices to draw the second selection box pattern and delete the first selection box pattern.
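  • purely as an illustration of the vertex calculation above (types and names are assumed, not from the patent), a Kotlin sketch that derives the four vertices of the selection box from the drawing starting point and the current movement track point:

```kotlin
// The target rectangle has the line from the drawing starting point (the recorded
// termination position) to the current movement track point as its diagonal.
data class Px(val x: Int, val y: Int)

fun selectionBoxVertices(start: Px, current: Px): List<Px> =
    listOf(
        Px(start.x, start.y),       // drawing starting point
        Px(start.x, current.y),     // shares x with the start, y with the current point
        Px(current.x, current.y),   // current movement track point
        Px(current.x, start.y)      // shares x with the current point, y with the start
    )

fun main() {
    // Start (1, 1), first track point (2, 2)  -> (1,1), (1,2), (2,2), (2,1)
    println(selectionBoxVertices(Px(1, 1), Px(2, 2)))
    // Start (1, 1), second track point (2, 3) -> (1,1), (1,3), (2,3), (2,1)
    println(selectionBoxVertices(Px(1, 1), Px(2, 3)))
}
```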
  • the set pattern is a line pattern.
  • the process that the terminal draws the set pattern with the termination position as the drawing starting point according to the movement trajectory of the touch point identifier may include: drawing a line pattern of the movement trajectory.
  • the terminal can acquire the position of each movement trajectory point in the movement trajectory of the contact mark in real time, and use the termination position as the starting point to connect the movement trajectory points at each position, so as to draw a line pattern.
  • the terminal acquires in real time the pixel coordinates of the position of the first movement track point in the movement track of the contact mark as (2, 2).
  • the pixel coordinates of the position of the second movement track point of the contact mark, acquired in real time, are (2, 3).
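  • a minimal sketch of the line-pattern case, assuming the trajectory points are simply accumulated into a polyline starting at the recorded termination position (the class and method names are illustrative):

```kotlin
data class Pt(val x: Int, val y: Int)

// The line pattern is kept as a polyline: the recorded termination position is the
// drawing starting point, and every movement track point acquired in real time is
// appended as a new vertex.
class LinePattern(drawStart: Pt) {
    private val points = mutableListOf(drawStart)

    fun addTrackPoint(p: Pt) { points.add(p) }          // called for each new track point

    // Consecutive point pairs, i.e. the segments to draw on the display interface.
    fun segments(): List<Pair<Pt, Pt>> = points.zipWithNext()
}

fun main() {
    val pattern = LinePattern(Pt(1, 1))   // termination position (1, 1) as drawing start
    pattern.addTrackPoint(Pt(2, 2))       // first movement track point from the example
    pattern.addTrackPoint(Pt(2, 3))       // second movement track point from the example
    println(pattern.segments())           // [(Pt(1,1), Pt(2,2)), (Pt(2,2), Pt(2,3))]
}
```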
  • the terminal may receive the fourth input in the operation area multiple times, and in response to each received fourth input, control the movement of the contact point identifier based on the fourth input.
  • before the terminal receives the fifth input in the operation area and executes the function corresponding to the set pattern, the method further includes: the terminal receives another fourth input in the operation area and, in response to the other fourth input, controls the contact marker to continue to move based on the other fourth input.
  • the terminal continuously draws the line pattern of the moving track according to the moving track of the contact point identification.
  • controlling the contact identifier to continue to move based on another fourth input may mean that the terminal has already controlled the contact identifier to move a certain distance based on one fourth input.
  • the terminal then receives another fourth input and, in response to the other fourth input, controls the contact mark to continue to move from the stop position based on the other fourth input.
  • the stop position is the position at which the contact mark stopped moving after the terminal controlled it to move the certain distance based on the previous fourth input.
  • the pixel coordinates of the termination position recorded by the terminal are (1, 1).
  • the terminal controls the contact identifier to move to the position with pixel coordinates (1, 2) based on one fourth input.
  • the terminal receives another fourth input, and in response to the other fourth input, the terminal controls the contact mark to continue to move from the position where the pixel coordinates are (1, 2).
  • Step 409 Receive a fifth input in the operation area.
  • when the user considers that the desired pattern end position has been reached, the fifth input may be performed in the operation area, so that the terminal can receive the fifth input in the operation area.
  • the input type of the fifth input is different from that of the fourth input. In this way, the terminal can distinguish whether the function corresponding to the set pattern needs to be executed.
  • the fifth input may be a type of input such as a click, a long press, a slide, a hovering gesture, or a voice input.
  • Step 410 In response to the fifth input, execute the function corresponding to the set pattern.
  • the function corresponding to the setting pattern may be an operation function such as a frame selection function or a drawing and display function.
  • the function corresponding to the set pattern may be a function set according to the actual situation.
  • the types of the setting patterns are different, and the functions corresponding to the setting patterns are different.
  • the function corresponding to the setting pattern may be a frame selection function.
  • the function corresponding to the set pattern may be a drawing and display function, that is, displaying the drawn line pattern.
  • the types of setting patterns are different, and the functions corresponding to the setting patterns are the same.
  • the function corresponding to the setting pattern is a selection function, that is, the setting pattern is used to select the function of the information desired by the user.
  • the content of the area where the set pattern is located in the display interface is selected.
  • the terminal may display an operation selection that can be performed on the selected content, for example, a deletion operation selection, a copy operation selection, etc., so that the user can perform subsequent selections on the selected content.
  • the function corresponding to the setting pattern may be a frame selection function.
  • assume the user wants to box-select the displayed characters of "Hello".
  • the termination position of the contact identifier recorded by the terminal may be at the upper left corner of the first character of "Hello".
  • the fourth input is a sliding input
  • the fifth input is a two-click input, that is, a double-click input.
  • the user controls the finger or the stylus to perform a sliding input toward the lower right in the operation area.
  • the terminal receives the sliding input, and in response to the sliding input, controls the contact mark to slide synchronously in the sliding direction of the sliding input.
  • the terminal receives the double-click input, and in response to the double-click input, "Hello" is box-selected.
  • before step 405 and/or step 409, the method may further include: when it is detected that the contact mark stops moving, displaying prompt information.
  • the prompt information is used to prompt the user for the executable input and the corresponding function of the input.
  • the executable operations may include at least one of: a second input and a third input.
  • for example, after step 402 and before step 405, it is assumed that the second input is a sliding input, the third input is a double-click input, and the function corresponding to the double-click input is a click operation.
  • a prompt message can be displayed.
  • the prompt information may include "You can continue to slide the screen to control the movement of the contact mark. Alternatively, you can double-tap the screen to end the movement of the contact mark and perform the click operation".
  • the function of the third input is the function of determining the position of the starting point of drawing
  • the fourth input is sliding input
  • the fifth input is double-clicking input
  • the function corresponding to the double-clicking input is the drawing display function.
  • the prompt information can include "You can continue to slide the screen to control the movement of the contact mark to draw the movement track. Alternatively, you can double-click the screen to end the movement of the contact mark and execute the display track".
  • the embodiment of the present application uses the following two examples to further describe the aforementioned touch operation method.
  • assume the user wants to select a phrase from an article displayed on the terminal in a small font, for example the phrase "high-level security service" framed by the selection box 501 in the display interface shown in FIG. 5. In this case, the user needs to click the starting position of the phrase "high-level security service", which is the position of the word "high", and then click the end position of the phrase, which is the position of the word "service", to complete the selection of the phrase "high-level security service".
  • the first input is a click input
  • the second input is a sliding input
  • the third input is two click inputs within a set duration, that is, a double-click input
  • the contact mark is a red dot icon
  • the function corresponding to the third input is to trigger a click event at the end position of the contact mark movement, so as to perform the actual click operation at the end position.
  • the user can slide down from the top of the touch screen to make the terminal display a shortcut operation interface.
  • the shortcut operation interface includes a precise operation identifier corresponding to the click operation mode. The user clicks the precise operation identifier corresponding to the click operation mode, so that the terminal enables the click operation mode in the precise operation mode.
  • the user can move a finger on the touch screen at will and finally tap the screen at the initial position A near the position of the word "high" in the phrase "high-level security service" displayed on the display interface, that is, put the finger down at the initial position A near the word "high".
  • the terminal receives the click input, and displays a red dot icon at the click position (initial position A) corresponding to the click input. At this time, as shown in FIG. 6 , the user can see that a red dot icon 601 is displayed at the initial position A in the upper right corner direction near the word "high".
  • the user slides a finger toward the lower left corner of the touch screen (the direction indicated by the arrow in FIG. 8) in the operation area at the lower right of the area where the red dot icon is located.
  • the terminal controls the red dot icon to move from the initial position A to the lower left corner in synchronization with the sliding of the finger.
  • the user can lift the finger at any time to end the sliding operation, and the red dot icon stays displayed at the end position of the sliding operation.
  • the user can swipe the finger in the operation area again.
  • the terminal controls the red dot icon to continue to move in response to the sliding input.
  • the user can double-tap the screen in the operation area.
  • the terminal receives the double-click input in the operation area, and in response to the double-click input, performs the click operation at the end position of the movement of the contact mark, that is, the position of the word "high” currently displayed by the red dot icon.
  • the user has completed the operation of clicking the starting position of the phrase "high-level security service", that is, the position of the word "high”.
  • the user can repeat the above process, so that the terminal performs the click operation again at the position of the word "service", completing the selection of the phrase "high-level security service".
  • the user wants to draw an arc-shaped pattern with the position point B as the starting point.
  • the function corresponding to the arc pattern is the drawing display function.
  • the first input is a click input
  • the second input is a sliding input
  • the third input is a double-click input
  • the contact mark is a red dot icon
  • the function corresponding to the third input is the function of recording the starting point of trajectory drawing
  • the fourth input is sliding input
  • the fifth input is a double-click input.
  • the user can slide down from the top of the touch screen to make the terminal display a shortcut operation interface.
  • the shortcut operation interface includes a trajectory drawing mark corresponding to the trajectory drawing mode. The user clicks on the trajectory drawing logo, so that the terminal turns on the trajectory drawing mode in the precise operation mode.
  • the user can move his finger on the touch screen at will, and click the screen at the initial position C near the final position point B.
  • the terminal receives the click input, and displays a red dot icon at the click position (initial position C) corresponding to the click input.
  • a red dot icon 901 is displayed at the initial position C near the position point B.
  • the user slides his finger towards the lower right corner of the touch screen in the lower right operation area where the red dot icon is located.
  • the terminal controls the red dot icon 901 to move from the initial position C to the lower right corner in synchronization with the sliding of the finger.
  • the user double-taps the screen in the operating area.
  • the terminal receives the double-click input in the operation area, and records the position point B in response to the double-click input.
  • the user starts from any position in the operation area and slides his finger in the direction indicated by the arrow in FIG. 10 .
  • the direction indicated by the arrow in FIG. 10 is the arc direction of the circular arc pattern desired by the user.
  • the terminal controls the red dot icon to move from position B in the direction indicated by the arrow in synchronization with the sliding of the finger, and draws and displays the line pattern of the movement track with position B as the drawing starting point.
  • the user can lift the finger at any time to end the sliding operation, and the red dot icon stays displayed at the end position of the sliding operation.
  • the user can swipe the finger in the operation area again.
  • the terminal controls the red dot icon to continue to move in response to the sliding input, and continues to draw the line pattern of the moving trajectory starting from the end position of the last sliding operation.
  • the user determines to complete the operation of drawing the line pattern.
  • the user can double-click the screen in the operation area to make the terminal exit the trajectory drawing mode in the precise operation mode.
  • the terminal receives the double-click input in the operation area, and displays the drawn line pattern in response to the double-click input.
  • the touch operation method can display the touch point identifier on the display interface in response to the first input.
  • the movement of the contact mark is controlled based on the second input, and in response to the third input received in the operation area, the function corresponding to the third input is executed at the termination position of the movement of the contact mark.
  • because the operation area is an area on the display screen other than the area where the contact mark is located, there is a certain distance between the operation area and the contact mark.
  • when a finger or a stylus is used to move the contact mark in the operation area, the finger or the stylus does not block the user's sight, and the user can precisely observe the position to which the contact mark moves. Therefore, the contact mark can be precisely controlled to move to the to-be-operated position, so that the finger or the stylus can precisely trigger the function corresponding to the third input at the to-be-operated position.
  • invalid touch operations are reduced, the false touch rate of operations is reduced, the touch operation efficiency is improved, and the user experience is improved.
  • the execution body may be a touch operation device, or a control module in the touch operation device for executing the method for touch operation.
  • a method for performing a touch operation by a touch operation device is used as an example to describe the device for touch operation provided by the embodiments of the present application.
  • FIG. 11 shows a block diagram of a touch operation device provided by an embodiment of the present application.
  • the touch operation device 1100 includes a receiving module 1101 , a display module 1102 , a control module 1103 and an execution module 1104 .
  • a receiving module 1101, configured to receive a first input
  • a display module 1102 configured to display the contact identifier on the display interface in response to the first input
  • the receiving module 1101 is further configured to receive the second input in the operation area, and the operation area is an area other than the area where the contact mark is located in the display interface;
  • control module 1103 configured to control the movement of the contact identifier based on the second input in response to the second input;
  • the receiving module 1101 is further configured to receive a third input in the operation area
  • the execution module 1104 is configured to, in response to the third input, execute the function corresponding to the third input at the termination position of the movement of the contact mark.
  • the second input includes a sliding input; the control module 1103 is further configured to:
  • the contact mark is controlled to move the track length along the sliding direction.
  • control module 1103 is further configured to:
  • calculate the target moving distance corresponding to the track length of the sliding input;
  • control the contact mark to move the target moving distance along the sliding direction.
  • control module 1103 is further configured to:
  • control the contact identifier to move the track length along the sliding direction at the sliding speed.
  • the display module 1102 is further configured to:
  • display prompt information, where the prompt information is used to prompt the user about the executable operations and the functions corresponding to the operations, and the executable operations include at least one of the following: the second input and the third input.
  • a track drawing marker is displayed, and the first input includes a first input for the track drawing marker; the execution module 1104 is further configured to record the end position of the movement of the contact marker.
  • the receiving module 1101 is further configured to receive the fourth input in the operation area.
  • the control module 1103 is further configured to, in response to the fourth input, control the movement of the contact identifier based on the fourth input;
  • the device also includes: a drawing module, used for drawing a set pattern with the end position as the drawing starting point according to the movement track of the contact mark.
  • the display module 1102 is also used for displaying the setting pattern.
  • the receiving module 1101 is further configured to receive the fifth input in the operation area.
  • the execution module 1104 is further configured to execute a function corresponding to the set pattern in response to the fifth input.
  • the drawing module is further configured to take the end position as a starting point, to draw a selection box pattern with a movement trajectory as a diagonal line; or, to draw a line pattern of the movement trajectory.
  • the touch operation device provided by the embodiments of the present application can, in response to the first input, display the contact mark on the display interface through the display module.
  • In response to the second input received in the operation area, the control module controls the movement of the contact mark based on the second input, so that, in response to the third input received in the operation area, the execution module can execute the function corresponding to the third input at the termination position of the movement of the contact mark.
  • Since the operation area is an area of the display interface other than the area where the contact mark is located, there is a certain distance between the operation area and the contact mark.
  • Therefore, when a finger or a stylus moves the contact mark through the second input in the operation area, the finger or stylus does not block the user's line of sight, and the user can precisely locate the position to which the contact mark moves. The contact mark can therefore be precisely controlled to move to the position to be operated, so that the finger or stylus can precisely trigger the function corresponding to the third input at that position.
  • In this way, invalid touch operations are reduced, the false-touch rate is lowered, touch operation efficiency is improved, and the user experience is improved.
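  • A minimal, hypothetical sketch of this overall flow is given below: the first input displays the contact mark, the second input received in the operation area moves it, and the third input executes a function at the termination position; the event and class names are assumptions made for this example, not part of the application.

```kotlin
// Illustrative sketch only: the Input hierarchy and TouchOperationDevice are
// assumed names used to model the flow, not part of the application.
sealed interface Input
object FirstInput : Input                                     // shows the contact mark
data class SecondInput(val dx: Float, val dy: Float) : Input  // slide in the operation area
object ThirdInput : Input                                     // tap in the operation area

class TouchOperationDevice {
    private var markVisible = false
    private var markX = 0f
    private var markY = 0f

    fun onInput(input: Input) {
        when (input) {
            is FirstInput -> { markVisible = true }           // display the contact mark
            is SecondInput -> if (markVisible) {              // control movement of the mark
                markX += input.dx
                markY += input.dy
            }
            is ThirdInput -> if (markVisible) {
                // execute the function corresponding to the third input at the
                // termination position of the movement of the contact mark
                executeAt(markX, markY)
            }
        }
    }

    private fun executeAt(x: Float, y: Float) =
        println("execute at ($x, $y)")
}

fun main() {
    val device = TouchOperationDevice()
    device.onInput(FirstInput)                 // first input: contact mark displayed
    device.onInput(SecondInput(40f, 80f))      // second input: mark moved in the operation area
    device.onInput(ThirdInput)                 // third input: execute at (40.0, 80.0)
}
```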
  • the touch operation device in the embodiments of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal.
  • the apparatus may be a mobile electronic device or a non-mobile electronic device.
  • the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, an in-vehicle electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), and so on.
  • the non-mobile electronic device can be a server, a network attached storage (Network Attached Storage, NAS), a personal computer (personal computer, PC), a television (television, TV), a teller machine or a self-service machine, etc., which are not specifically limited in the embodiments of the present application.
  • the touch operation device in the embodiment of the present application may be a device with an operating system.
  • the operating system may be an Android (Android) operating system, an iOS operating system, or other possible operating systems, which are not specifically limited in the embodiments of the present application.
  • the touch operation device provided in this embodiment of the present application can implement each process implemented by the method embodiments in FIG. 1 and FIG. 4 , and to avoid repetition, details are not described here.
  • an embodiment of the present application further provides an electronic device 1200, including a processor 1201, a memory 1202, and a program or instruction stored in the memory 1202 and executable on the processor 1201. When the program or instruction is executed by the processor 1201, each process of the above-mentioned embodiments of the touch operation method is implemented, and the same technical effect can be achieved; to avoid repetition, details are not described here again.
  • the electronic devices in the embodiments of the present application include the aforementioned mobile electronic devices and non-mobile electronic devices.
  • FIG. 13 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
  • the electronic device 1300 includes but is not limited to: a radio frequency unit 1301, a network module 1302, an audio output unit 1303, an input unit 1304, a sensor 1305, a display unit 1306, a user input unit 1307, an interface unit 1308, a memory 1309, a processor 1310, and other components.
  • the electronic device 1300 may also include a power source (such as a battery) for supplying power to the various components, and the power source may be logically connected to the processor 1310 through a power management system, so as to implement functions such as charging management, discharging management, and power consumption management through the power management system.
  • the structure of the electronic device shown in FIG. 13 does not constitute a limitation on the electronic device.
  • the electronic device may include more or fewer components than shown, combine some components, or use a different arrangement of components, which will not be repeated here.
  • the processor 1310 is configured to receive a first input.
  • the display unit 1306 is configured to display the contact mark on the display interface in response to the first input.
  • the processor 1310 is further configured to: receive a second input in an operation area, where the operation area is an area of the display interface other than the area where the contact mark is located; control, in response to the second input, the movement of the contact mark based on the second input; receive a third input in the operation area; and execute, in response to the third input, the function corresponding to the third input at the termination position of the movement of the contact mark.
  • Since the operation area is an area of the display interface other than the area where the contact mark is located, there is a certain distance between the operation area and the contact mark. Therefore, when a finger or a stylus moves the contact mark through the second input in the operation area, the finger or stylus does not block the user's line of sight, and the user can precisely locate the position to which the contact mark moves.
  • The contact mark can thus be precisely controlled to move to the position to be operated, so that the finger or stylus can precisely trigger the function corresponding to the third input at that position.
  • In this way, invalid touch operations are reduced, the false-touch rate is lowered, touch operation efficiency is improved, and the user experience is improved.
  • the second input includes a sliding input; the processor 1310 is further configured to acquire sliding direction information and track length information of the sliding input, and, according to the sliding direction information and the track length information, control the contact mark to move by the track length along the sliding direction.
  • the processor 1310 is further configured to calculate, according to a preset scaling ratio between the track length and the movement distance of the contact mark, the target movement distance corresponding to the track length of the sliding input.
  • the contact mark is then controlled to move by the target movement distance along the sliding direction.
  • the processor 1310 is further configured to acquire sliding speed information of the sliding input, and, according to the sliding direction information, the track length information, and the sliding speed information, control the contact mark to move by the track length along the sliding direction at the sliding speed.
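  • For illustration only, the sketch below shows one way the contact mark could be moved by the track length along the sliding direction at the sliding speed, by spreading the movement over a duration of track length divided by sliding speed; the frame stepping and all names are assumptions made for this example rather than the application's implementation.

```kotlin
import kotlin.math.hypot

// Illustrative sketch only: moveAtSlidingSpeed and its parameters are assumed
// for this example; a real implementation would use the platform's animation APIs.
fun moveAtSlidingSpeed(
    startX: Float, startY: Float,       // current position of the contact mark
    dx: Float, dy: Float,               // sliding direction and track length as a vector
    slidingSpeed: Float,                // sliding speed of the input, in px per second
    onFrame: (Float, Float) -> Unit     // callback that redraws the contact mark
) {
    val trackLength = hypot(dx, dy)
    if (trackLength == 0f || slidingSpeed <= 0f) return
    // The mark covers the track length at the sliding speed, so the move lasts
    // trackLength / slidingSpeed seconds.
    val durationMs = (trackLength / slidingSpeed * 1000f).toLong()
    val steps = (durationMs / 16).toInt().coerceAtLeast(1)   // roughly 60 frames per second
    for (i in 1..steps) {
        val t = i.toFloat() / steps                          // constant-speed interpolation
        onFrame(startX + dx * t, startY + dy * t)
        Thread.sleep(16)
    }
}

fun main() {
    moveAtSlidingSpeed(0f, 0f, dx = 120f, dy = 90f, slidingSpeed = 300f) { x, y ->
        println("contact mark at ($x, $y)")
    }
}
```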
  • the processor 1310 is further configured to display prompt information when it is detected that the contact mark stops moving, where the prompt information is used to prompt the user about the operations that can be performed and the functions corresponding to those operations.
  • the executable operation includes at least one of the following: the second input and the third input.
  • a track drawing mark is displayed, and the first input includes a first input on the track drawing mark; the processor 1310 is further configured to record the termination position of the movement of the contact mark;
  • the processor 1310 is further configured to receive a fourth input in the operation area.
  • In response to the fourth input, the contact mark is controlled to move based on the fourth input, and, according to the movement track of the contact mark, a set pattern is drawn with the termination position as the drawing starting point.
  • the display unit 1306 is further configured to display the set pattern.
  • the processor 1310 is further configured to receive a fifth input in the operation area and, in response to the fifth input, execute the function corresponding to the set pattern.
  • the processor 1310 is further configured to draw, with the termination position as the starting point, a selection box pattern whose diagonal is the movement track, or to draw a line pattern that follows the movement track.
  • the input unit 1304 may include a graphics processing unit (Graphics Processing Unit, GPU) 13041 and a microphone 13042, where the graphics processing unit 13041 processes image data of still pictures or videos obtained by an image capture device (such as a camera).
  • the display unit 1306 may include a display panel 13061, which may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • the user input unit 1307 includes a touch panel 13071 and other input devices 13072 .
  • the touch panel 13071 is also called a touch screen.
  • the touch panel 13071 may include two parts, a touch detection device and a touch controller.
  • Other input devices 13072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which will not be repeated here.
  • Memory 1309 may be used to store software programs as well as various data, including but not limited to application programs and operating systems.
  • the processor 1310 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 1310.
  • the embodiments of the present application further provide a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or instruction is executed by a processor, each process of the above-mentioned embodiments of the touch operation method is implemented with the same technical effect; to avoid repetition, details are not described here again.
  • the processor is the processor in the electronic device described in the foregoing embodiments.
  • the readable storage medium includes a computer-readable storage medium, such as a computer read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk or an optical disk, and the like.
  • An embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the above-mentioned embodiments of the touch operation method with the same technical effect; to avoid repetition, details are not described here again.
  • the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-a-chip, or the like.
  • the disclosed apparatus and method may be implemented in other manners.
  • the apparatus embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection between devices or units may be in electrical, mechanical, or other forms.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The present application relates to a touch operation method and apparatus, belonging to the technical field of communications. The method comprises: receiving a first input; in response to the first input, displaying a contact mark on a display interface; receiving a second input in an operation area, the operation area being an area of the display interface other than the area in which the contact mark is located; in response to the second input, moving the contact mark on the basis of the second input; receiving a third input in the operation area; and, in response to the third input, executing a function corresponding to the third input at the termination position to which the contact mark has moved.
PCT/CN2022/086662 2021-04-16 2022-04-13 Procédé et appareil de fonctionnement tactile WO2022218352A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110413329.0A CN113485590A (zh) 2021-04-16 2021-04-16 触控操作方法及装置
CN202110413329.0 2021-04-16

Publications (1)

Publication Number Publication Date
WO2022218352A1 true WO2022218352A1 (fr) 2022-10-20

Family

ID=77932963

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/086662 WO2022218352A1 (fr) 2021-04-16 2022-04-13 Procédé et appareil de fonctionnement tactile

Country Status (2)

Country Link
CN (1) CN113485590A (fr)
WO (1) WO2022218352A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113485590A (zh) * 2021-04-16 2021-10-08 维沃移动通信有限公司 触控操作方法及装置

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019588A (zh) * 2012-11-26 2013-04-03 中兴通讯股份有限公司 一种触摸定位方法、装置及终端
CN105892786A (zh) * 2015-01-16 2016-08-24 张凯 一种在触摸屏界面上实现文本选择的方法
US20180121076A1 (en) * 2016-10-17 2018-05-03 Gree, Inc. Drawing processing method, drawing program, and drawing device
CN109426410A (zh) * 2017-09-05 2019-03-05 华为终端(东莞)有限公司 控制光标移动的方法、内容选择方法、控制页面滚动的方法及电子设备
CN108132752A (zh) * 2017-12-21 2018-06-08 维沃移动通信有限公司 一种文本编辑方法及移动终端
CN113485590A (zh) * 2021-04-16 2021-10-08 维沃移动通信有限公司 触控操作方法及装置

Also Published As

Publication number Publication date
CN113485590A (zh) 2021-10-08

Similar Documents

Publication Publication Date Title
US11808562B2 (en) Devices and methods for measuring using augmented reality
US11907446B2 (en) Devices and methods for creating calendar events based on hand-drawn inputs at an electronic device with a touch-sensitive display
US11314407B2 (en) Device, method, and graphical user interface for providing feedback for changing activation states of a user interface object
US20200371676A1 (en) Device, Method, and Graphical User Interface for Providing and Interacting with a Virtual Drawing Aid
KR102367838B1 (ko) 동시에 열린 소프트웨어 애플리케이션들을 관리하기 위한 디바이스, 방법, 및 그래픽 사용자 인터페이스
US10437360B2 (en) Method and apparatus for moving contents in terminal
US8806369B2 (en) Device, method, and graphical user interface for managing and interacting with concurrently open software applications
US8976140B2 (en) Touch input processor, information processor, and touch input control method
EP2204729A2 (fr) Appareil de traitement des informations, procédé de traitement des informations et programme
US20130055119A1 (en) Device, Method, and Graphical User Interface for Variable Speed Navigation
WO2021203724A1 (fr) Procédé et appareil de sélection d'écriture manuscrite, dispositif informatique et support d'enregistrement
EP2474896A2 (fr) Appareil et procédé de traitement d'informations et programme informatique
JP2015007949A (ja) 表示装置、表示制御方法及びコンピュータプログラム
US9836211B2 (en) Device, method, and graphical user interface for selection of views in a three-dimensional map based on gesture inputs
US10521101B2 (en) Scroll mode for touch/pointing control
US20140298223A1 (en) Systems and methods for drawing shapes and issuing gesture-based control commands on the same draw grid
WO2019119799A1 (fr) Procédé d'affichage d'icone d'application et dispositif terminal
KR20090102727A (ko) 디스플레이 장치의 화면 크기 제어 방법 및 장치
WO2022218352A1 (fr) Procédé et appareil de fonctionnement tactile
JP2013109529A (ja) 入力装置、入力装置の制御方法、制御プログラム、および記録媒体

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22787574

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE