WO2021203821A1 - Page manipulation method and device, storage medium and terminal - Google Patents

Page manipulation method and device, storage medium and terminal

Info

Publication number
WO2021203821A1
WO2021203821A1 (PCT/CN2021/075278; CN2021075278W)
Authority
WO
WIPO (PCT)
Prior art keywords
control
pressing
controls
floating window
touch
Prior art date
Application number
PCT/CN2021/075278
Other languages
English (en)
Chinese (zh)
Inventor
邓俊杰
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Publication of WO2021203821A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0486 Drag-and-drop

Definitions

  • This application relates to the field of computer technology, and in particular to a page manipulation method, device, storage medium, and terminal.
  • The size of terminal display screens is becoming larger and larger, which improves the display effect of the terminal to a certain extent.
  • However, when the display screen is operated with one hand, some areas are difficult to reach. In this case the grip easily becomes unstable, creating a risk of the terminal being dropped.
  • Current mainstream practices include double-clicking the Home button to pull the overall interface down, triggering a one-handed mode that scales the interface down, or tapping a floating ball on the interface so that a cursor appears at the top of the interface, onto which operations such as moving, clicking, and long-pressing are mapped.
  • The embodiments of this application provide a page manipulation method, device, storage medium, and terminal. The user only needs to perform a touch operation on an edge area of the display screen that a finger can reach, and the corresponding function controls of the page are displayed at the touch position of that operation. The operation is simple and convenient and does not change the size of the actual page.
  • An embodiment of the present application provides a page manipulation method, the method including: receiving a touch command input in an edge area of a display screen, and obtaining a touch position corresponding to the touch command; acquiring a set of function controls in the page currently displayed on the display screen; and displaying the set of function controls at the touch position.
  • an embodiment of the present application provides a page manipulation device, the device including:
  • a position obtaining module configured to receive a touch command input in the edge area of the display screen, and obtain a touch position corresponding to the touch command;
  • a control acquisition module configured to acquire a set of function controls in the page currently displayed on the display screen; and
  • a control display module configured to display the set of function controls at the touch position.
  • an embodiment of the present application provides a computer storage medium that stores a plurality of instructions, and the instructions are suitable for being loaded by a processor and executing the above method steps.
  • an embodiment of the present application provides a terminal, which may include: a processor and a memory; wherein the memory stores a computer program, and the computer program is adapted to be loaded by the processor and execute the above method steps.
  • In the embodiments of this application, the terminal receives a touch command input in the edge area of the display screen, obtains the touch position corresponding to the touch command, acquires the set of function controls in the page currently displayed on the display screen, and then displays the set of function controls at the touch position. The user only needs to perform a touch operation on an edge area of the display screen that a finger can reach, and the corresponding function controls of the page are displayed at the touch position of that operation. The operation is simple and convenient and does not change the size of the displayed page.
  • FIG. 1 is a schematic structural diagram of a terminal provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the structure of an operating system and user space provided by an embodiment of the present application
  • FIG. 3 is an architecture diagram of the Android operating system in FIG. 1;
  • FIG. 4 is a structural diagram of the iOS operating system in FIG. 1;
  • FIG. 5 is a schematic flowchart of a page manipulation method provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of an example of an edge area of a display screen provided by an embodiment of the present application.
  • FIG. 7 is a schematic diagram of an example of a single-handed control terminal provided by an embodiment of the present application.
  • FIG. 8 is a schematic flowchart of a page manipulation method provided by an embodiment of the present application.
  • FIG. 9 is a schematic diagram of an example of a floating window display method provided by an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of a page manipulation device provided by an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of a page manipulation device provided by an embodiment of the present application.
  • In the description of the present application, "plural" means two or more.
  • "And/or" describes the association relationship between associated objects and indicates that three relationships are possible; for example, "A and/or B" can mean: A exists alone, A and B exist at the same time, or B exists alone.
  • The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
  • FIG. 1 shows a structural block diagram of a terminal provided by an exemplary embodiment of the present application.
  • the terminal in this application may include one or more of the following components: a processor 110, a memory 120, an input device 130, an output device 140, and a bus 150.
  • the processor 110, the memory 120, the input device 130, and the output device 140 may be connected by a bus 150.
  • the processor 110 may include one or more processing cores.
  • The processor 110 uses various interfaces and lines to connect the various parts of the entire terminal 100, and implements the various functions of the terminal 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory 120 and by calling data stored in the memory 120.
  • Optionally, the processor 110 may be implemented in at least one hardware form of digital signal processing (DSP), field-programmable gate array (FPGA), or programmable logic array (PLA).
  • the processor 110 may integrate one or a combination of a central processing unit (CPU), a graphics processing unit (GPU), a modem, and the like.
  • the CPU mainly processes the operating system, user pages and applications, etc.; the GPU is used for rendering and drawing of display content; the modem is used for processing wireless communication. It can be understood that the above-mentioned modem may not be integrated into the processor 110, but may be implemented by a communication chip alone.
  • the memory 120 may include random access memory (RAM) or read-only memory (ROM).
  • the memory 120 includes a non-transitory computer-readable storage medium.
  • the memory 120 may be used to store instructions, programs, codes, code sets or instruction sets.
  • The memory 120 may include a program storage area and a data storage area. The program storage area may store instructions for implementing the operating system, instructions for implementing at least one function (such as a touch function, a sound playback function, or an image playback function), instructions for implementing the following method embodiments, and so on. The operating system may be an Android system (including systems developed in depth on the basis of the Android system), an iOS system developed by Apple (including systems developed in depth on the basis of the iOS system), or other systems.
  • The data storage area can also store data created by the terminal during use, such as a phone book, audio and video data, chat records, and so on.
  • the memory 120 can be divided into an operating system space and a user space.
  • the operating system runs in the operating system space, and native and third-party applications run in the user space.
  • the operating system allocates corresponding system resources for different third-party applications.
  • Different application scenarios in the same third-party application also have different requirements for system resources. For example, in a local resource loading scenario, the third-party application has higher requirements for disk reading speed; in an animation rendering scenario, the third-party application has higher requirements for GPU performance.
  • the operating system and third-party applications are independent of each other, and the operating system often cannot perceive the current application scenarios of the third-party applications in time, resulting in the operating system being unable to perform targeted system resource adaptation according to the specific application scenarios of the third-party applications.
  • the memory 120 may store a Linux kernel layer 320, a system runtime library layer 340, an application framework layer 360, and an application layer 380.
  • the Linux kernel layer 320, the system runtime library layer 340, and the application framework layer 360 belong to the operating system space
  • the application layer 380 belongs to the user space.
  • the Linux kernel layer 320 provides low-level drivers for various hardware of the terminal, such as display drivers, audio drivers, camera drivers, Bluetooth drivers, Wi-Fi drivers, power management, and so on.
  • The system runtime library layer 340 provides major feature support for the Android system through a number of C/C++ libraries. For example, the SQLite library provides database support, the OpenGL/ES library provides 3D drawing support, and the WebKit library provides browser kernel support.
  • An Android runtime library (Android runtime) is also provided in the system runtime library layer 340, which mainly provides some core libraries that can allow developers to write Android applications in Java language.
  • The application framework layer 360 provides the various APIs that may be used when building applications. Developers can use these APIs to build their own applications, for example for activity management, window management, view management, notification management, content providers, package management, call management, resource management, and location management.
  • At least one application program runs in the application layer 380.
  • These applications can be native applications of the operating system, such as a contacts program, an SMS program, a clock program, or a camera application; they can also be third-party applications developed by third-party developers, such as game applications, instant messaging programs, photo beautification programs, or text translation programs.
  • The iOS system includes four layers: the core operating system layer 420 (Core OS layer), the core service layer 440 (Core Services layer), the media layer 460 (Media layer), and the touchable layer 480 (Cocoa Touch layer).
  • the core operating system layer 420 includes an operating system kernel, drivers, and underlying program frameworks. These underlying program frameworks provide functions closer to hardware for use by the program frameworks located in the core service layer 440.
  • The core service layer 440 provides system services and/or program frameworks required by the application program, such as a foundation framework, an account framework, an advertisement framework, a data storage framework, a network connection framework, a geographic location framework, a motion framework, and so on.
  • the media layer 460 provides audio-visual interfaces for applications, such as graphics and image-related interfaces, audio technology-related interfaces, video technology-related interfaces, and audio and video transmission technology wireless playback (AirPlay) interfaces.
  • the touchable layer 480 provides various commonly used page-related frameworks for application development, and the touchable layer 480 is responsible for the user's touch interaction operations on the terminal. For example, local notification service, remote push service, advertising framework, game tool framework, message user interface (UI) framework, user page UIKit framework, map framework, and so on.
  • frameworks related to most applications include but are not limited to: the basic framework in the core service layer 440 and the UIKit framework in the touchable layer 480.
  • the basic framework provides many basic object classes and data types, provides the most basic system services for all applications, and has nothing to do with UI.
  • The classes provided by the UIKit framework are the basic UI class libraries used to create touch-based user pages. iOS applications provide their UI based on the UIKit framework, so it supplies the application's basic architecture for building user pages, drawing, handling user interaction events, responding to gestures, and so on.
  • the method and principle of implementing data communication between a third-party application program and an operating system in the IOS system can be referred to the Android system, which will not be repeated in this application.
  • the input device 130 is used to receive input instructions or data.
  • the input device 130 includes, but is not limited to, a keyboard, a mouse, a camera, a microphone, or a touch device.
  • the output device 140 is used to output instructions or data.
  • the output device 140 includes but is not limited to a display device and a speaker.
  • In some embodiments, the input device 130 and the output device 140 may be combined: both are implemented as a touch display screen, which is used to receive touch operations performed on or near it by the user's finger, a stylus, or any other suitable object, and to display the user page of each application.
  • the touch screen is usually set on the front panel of the terminal.
  • the touch screen can be designed as a full screen, curved screen or special-shaped screen.
  • the touch display screen can also be designed as a combination of a full screen and a curved screen, or a combination of a special-shaped screen and a curved screen, which is not limited in the embodiments of the present application.
  • The structure of the terminal shown in the above drawings does not constitute a limitation on the terminal; the terminal may include more or fewer components than those shown in the figures, some components may be combined, or a different arrangement of components may be used.
  • the terminal also includes components such as a radio frequency circuit, an input unit, a sensor, an audio circuit, a wireless fidelity (WiFi) module, a power supply, and a Bluetooth module, which will not be repeated here.
  • the execution subject of each step may be the terminal described above.
  • the execution subject of each step is the operating system of the terminal.
  • the operating system may be an Android system, an IOS system, or other operating systems, which is not limited in the embodiment of the present application.
  • In addition, a display device may also be installed on the terminal.
  • The display device may be any device capable of realizing a display function, such as a cathode ray tube (CRT) display, a light-emitting diode (LED) display, an electronic ink screen, a liquid crystal display (LCD), a plasma display panel (PDP), and so on.
  • The terminal may be a smart phone, a tablet computer, a game device, an AR (augmented reality) device, a car, a data storage device, an audio playback device, a video playback device, a notebook, a desktop computing device, a wearable device such as an electronic watch, electronic glasses, an electronic helmet, an electronic bracelet, an electronic necklace, or electronic clothing, and other equipment.
  • In the embodiment of the present application, the processor 110 may be used to call an application program stored in the memory 120 and specifically execute the page manipulation method of the embodiments of the present application.
  • In the embodiments of this application, the terminal receives a touch command input in the edge area of the display screen, obtains the touch position corresponding to the touch command, acquires the set of function controls in the page currently displayed on the display screen, and then displays the set of function controls at the touch position. The user only needs to perform a touch operation on an edge area of the display screen that a finger can reach, and the corresponding function controls of the page are displayed at the touch position of that operation. The operation is simple and convenient and does not change the size of the displayed page.
  • A page manipulation method is proposed, which can be implemented by means of a computer program and can run on a page manipulation device based on the von Neumann architecture.
  • the computer program can be integrated in the application or run as an independent tool application.
  • the page manipulation method includes:
  • S101 Receive a touch command input in an edge area of the display screen, and obtain a touch position corresponding to the touch command;
  • In the embodiment of this application, the display screen is a touch screen, that is, an inductive liquid crystal display device that can receive input signals such as contact points.
  • When a graphical button on the screen is touched, the tactile feedback system of the screen can drive the corresponding connected devices according to a pre-programmed program.
  • Such a device can be used to replace a mechanical button panel and can produce vivid audio-visual effects by means of the liquid crystal display.
  • the edge area of the display screen refers to a display area within a certain distance range from the edge of the display screen.
  • the distance range may be preset, customized by the user, or set at the factory.
  • the edge area may include a left edge area, a right edge area, an upper edge area, and a lower edge area, as shown in FIG. 6.
  • The touch command may include, but is not limited to, a pressing command or a screen sliding command, and the pressing command may be a heavy press command, a light press command, a single tap command, a double tap command, and the like.
  • a sensor is arranged in the edge area. After the user inputs a touch operation, the sensor can collect information such as the touch pressure value, touch position, touch fingerprint, and touch trajectory of the touch operation.
  • For example, when the user holds the terminal with one hand and touches the edge area of the display screen with a finger (such as the thumb), the terminal receives the touch command and then collects the coordinate information of the touch position.
  • The position where the touch command is input may also be a side face of the terminal. It can be understood that a light sensor is provided on the side face; when the user holds the terminal in hand, the light sensor can detect that the light is blocked.
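  • A minimal sketch of how this kind of edge-area touch detection could be implemented on Android follows; it is not the applicant's implementation. A touch listener on the page's root view checks whether the touch coordinates fall within an assumed 48 dp edge band and, if so, reports the touch position. The listener name, the callback interface, and the margin value are illustrative assumptions.

```java
import android.content.Context;
import android.util.TypedValue;
import android.view.MotionEvent;
import android.view.View;

// Illustrative sketch only: detects touches that land inside a configurable
// edge margin of the screen and reports their coordinates.
public final class EdgeTouchDetector implements View.OnTouchListener {

    /** Hypothetical callback invoked when a touch lands in the edge area. */
    public interface OnEdgeTouchListener {
        void onEdgeTouch(float x, float y);
    }

    private final int edgeMarginPx;            // width of the edge area in pixels (assumed value)
    private final OnEdgeTouchListener listener;

    public EdgeTouchDetector(Context context, OnEdgeTouchListener listener) {
        // Assume a 48 dp edge band; the text leaves the distance range configurable.
        this.edgeMarginPx = (int) TypedValue.applyDimension(
                TypedValue.COMPLEX_UNIT_DIP, 48, context.getResources().getDisplayMetrics());
        this.listener = listener;
    }

    @Override
    public boolean onTouch(View rootView, MotionEvent event) {
        if (event.getActionMasked() != MotionEvent.ACTION_DOWN) {
            return false;
        }
        float x = event.getX();
        float y = event.getY();
        boolean inEdge = x <= edgeMarginPx || x >= rootView.getWidth() - edgeMarginPx
                || y <= edgeMarginPx || y >= rootView.getHeight() - edgeMarginPx;
        if (inEdge) {
            listener.onEdgeTouch(x, y);        // the "touch position" of the touch command
            return true;
        }
        return false;
    }
}
```
  • In this sketch the page's root view would register the detector with rootView.setOnTouchListener(new EdgeTouchDetector(context, callback)).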
  • S102 Acquire a set of function controls in a page currently displayed on the display screen
  • different display pages include different functional control sets.
  • the different display pages can be display pages of different applications, and can also be understood as display pages of different levels of the same application.
  • For example, the function control set of the home page of application A includes a search control, a scan control, an add-friend control, and a payment control, while the function control set of a page one level below the home page of application A includes a friend permission setting control, a complaint control, a delete control, and so on.
  • After receiving the touch command input in the edge area of the display screen, the terminal collects all the function controls contained in the currently displayed page, or the controls whose frequency of use is greater than a frequency threshold among all the function controls, or the controls ranked highest by frequency of use among all the function controls (such as the top three), and determines the acquired controls as the function control set.
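  • As a rough illustration (an assumption, not the claimed method), one way to gather such a control set on Android is to walk the view hierarchy of the current page and collect the visible, clickable views; the helper name and the use of isClickable() as the selection criterion are assumptions.

```java
import android.view.View;
import android.view.ViewGroup;
import java.util.ArrayList;
import java.util.List;

public final class ControlCollector {

    /** Recursively collects visible, clickable views (candidate "function controls") under root. */
    public static List<View> collectFunctionControls(View root) {
        List<View> controls = new ArrayList<>();
        collect(root, controls);
        return controls;
    }

    private static void collect(View view, List<View> out) {
        if (view.getVisibility() == View.VISIBLE && view.isClickable()) {
            out.add(view);
        }
        if (view instanceof ViewGroup) {
            ViewGroup group = (ViewGroup) view;
            for (int i = 0; i < group.getChildCount(); i++) {
                collect(group.getChildAt(i), out);
            }
        }
    }

    private ControlCollector() {}
}
```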
  • The touch position is located in the edge area of the display screen. This area is small, and displaying many controls there would, on the one hand, occupy a large display space and easily cover the content of the currently displayed page, and on the other hand be inconvenient for the user to operate.
  • Displaying the set of function controls at the touch position may mean displaying them directly at the touch position on the currently displayed page, for example in a single column or in multiple columns parallel to the edge near the touch position, or it may mean generating a floating window layer at the touch position and displaying the set of function controls in the floating window.
  • The set of function controls in the floating window can likewise be displayed in a single column or in multiple columns parallel to the edge near the touch position.
  • the generated floating window can be suspended on the current display page with a certain degree of transparency, so as to avoid obstructing the display content in the current display page.
  • the size of the floating window can be adaptively adjusted according to the number of display controls, and the floating window can also be displayed in a specified shape, such as a rectangle, a circle, a diamond, a sector, an arc, and the like.
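  • A sketch of one possible way to realise such a semi-transparent floating panel inside an application is shown below: a vertical column of proxy buttons is added to the activity's content view at the touch position, so its size grows with the number of controls. The proxy-button approach and the 50% alpha value are assumptions made for illustration, not the applicant's implementation.

```java
import android.app.Activity;
import android.view.View;
import android.view.ViewGroup;
import android.widget.Button;
import android.widget.FrameLayout;
import android.widget.LinearLayout;
import java.util.List;

public final class FloatingPanel {

    /** Shows the collected controls in a semi-transparent column at (x, y). */
    public static LinearLayout show(Activity activity, List<View> controls, int x, int y) {
        LinearLayout panel = new LinearLayout(activity);
        panel.setOrientation(LinearLayout.VERTICAL);   // single column parallel to the edge
        panel.setAlpha(0.5f);                          // default 50% transparency, as in the text

        for (View control : controls) {
            // Lightweight proxies forward clicks to the original controls; a real
            // implementation would more likely render matching icons.
            panel.addView(makeProxy(activity, control));
        }

        FrameLayout.LayoutParams lp = new FrameLayout.LayoutParams(
                ViewGroup.LayoutParams.WRAP_CONTENT, ViewGroup.LayoutParams.WRAP_CONTENT);
        lp.leftMargin = x;
        lp.topMargin = y;

        ViewGroup content = activity.findViewById(android.R.id.content);
        content.addView(panel, lp);                    // size adapts to the number of controls
        return panel;
    }

    private static View makeProxy(Activity activity, View original) {
        Button proxy = new Button(activity);
        proxy.setText(original.getContentDescription());
        proxy.setOnClickListener(v -> original.performClick());
        return proxy;
    }

    private FloatingPanel() {}
}
```
  • The panel returned by FloatingPanel.show(...) is the object that the later drag and close sketches operate on.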
  • In the embodiments of this application, the terminal receives a touch command input in the edge area of the display screen, obtains the touch position corresponding to the touch command, acquires the set of function controls in the page currently displayed on the display screen, and then displays the set of function controls at the touch position. The user only needs to perform a touch operation on an edge area of the display screen that a finger can reach, and the corresponding function controls of the page are displayed at the touch position of that operation. The operation is simple and convenient and does not change the size of the displayed page.
  • FIG. 8 is a schematic flowchart of another embodiment of the page manipulation method proposed in this application. Specifically:
  • S201 Receive a touch command input in an edge area of the display screen, where the touch command is a pressing command;
  • In the embodiment of this application, the display screen is a touch screen, that is, an inductive liquid crystal display device that can receive input signals such as contact points.
  • the entire screen of the display screen is a touchable area, and the display area within a certain distance from the edge of the display screen is the edge area of the display screen, which may include a left edge area, a right edge area, an upper edge area, and a lower edge area.
  • When the user performs a pressing operation on the edge area of the display screen, the terminal receives the pressing instruction.
  • the embodiment of the present application is mainly applied to a scenario where a terminal is held with one hand for operation.
  • a pressure sensor is provided in the edge area of the display screen to collect the pressure value when the user presses.
  • When the pressure value is greater than the pressure threshold, the press is determined to be a heavy press operation; when the pressure value is less than or equal to the pressure threshold, it is determined to be a light press operation.
  • The obtained pressing pressure value is evaluated, and when the pressing pressure value is greater than the pressure threshold and the press is confirmed as a heavy press operation, the coordinate information of the pressed position is collected.
  • the coordinate information can be understood as the coordinates of the center pixel of the fingerprint, or the coordinates of all pixels in the area corresponding to the fingerprint.
  • the pressing duration corresponding to the pressing instruction may also be acquired, and when the pressing duration is greater than a preset duration, the pressing position corresponding to the pressing instruction is triggered to be acquired. That is, when the user performs a long-press operation on the edge area of the display screen, the acquisition of the pressing position corresponding to the pressing instruction is triggered.
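  • The pressure and duration checks described above could be sketched as follows. The 0.6 pressure threshold and 500 ms long-press duration are assumed example values, and MotionEvent.getPressure() returns a normalised, device-dependent reading rather than a calibrated force.

```java
import android.view.MotionEvent;

public final class PressClassifier {

    private static final float PRESSURE_THRESHOLD = 0.6f;   // assumed example threshold
    private static final long LONG_PRESS_MS = 500L;         // assumed preset duration

    private long downTime;
    private float maxPressure;

    /** Feed every MotionEvent from the edge area into this method. Returns true
     *  when the completed press qualifies as a heavy press or a long press. */
    public boolean onTouchEvent(MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                downTime = event.getEventTime();
                maxPressure = event.getPressure();
                return false;
            case MotionEvent.ACTION_MOVE:
                maxPressure = Math.max(maxPressure, event.getPressure());
                return false;
            case MotionEvent.ACTION_UP:
                long pressDuration = event.getEventTime() - downTime;
                boolean heavyPress = maxPressure > PRESSURE_THRESHOLD;
                boolean longPress = pressDuration > LONG_PRESS_MS;
                // Either condition triggers acquisition of the pressing position.
                return heavyPress || longPress;
            default:
                return false;
        }
    }
}
```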
  • S204 Acquire a set of function controls in a page currently displayed on the display screen
  • the floating window can be suspended on the current display page in any shape, such as a rectangle, a circle, a diamond, a fan, an ellipse, an arc, and so on.
  • the size of the floating window can be adjusted according to the number of functional controls.
  • The floating window is displayed with a degree of transparency, and the transparency can be adjusted according to the user's needs; if the user does not adjust it, a default transparency (such as 50%) is used for display.
  • As shown in Fig. 9, the floating window may be arc-shaped and arranged symmetrically about the pressing position.
  • When the user presses and drags the floating window, the terminal receives a movement instruction, acquires the movement track corresponding to the movement instruction, and then controls the floating window to move synchronously along the movement track, for example from the right edge area of the screen to the left edge area so that the user can switch hands, or up and down along the right edge to the position closest to the user's thumb.
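  • Continuing the earlier floating-panel sketch, dragging could be handled by updating the panel's layout margins as move events arrive; this is an assumed implementation detail. The panel from the earlier sketch would register this listener with panel.setOnTouchListener(new PanelDragger()).

```java
import android.view.MotionEvent;
import android.view.View;
import android.widget.FrameLayout;

/** Lets the floating panel follow the finger while it is being dragged (sketch). */
public final class PanelDragger implements View.OnTouchListener {

    private float lastRawX;
    private float lastRawY;

    @Override
    public boolean onTouch(View panel, MotionEvent event) {
        switch (event.getActionMasked()) {
            case MotionEvent.ACTION_DOWN:
                lastRawX = event.getRawX();
                lastRawY = event.getRawY();
                return true;
            case MotionEvent.ACTION_MOVE:
                FrameLayout.LayoutParams lp = (FrameLayout.LayoutParams) panel.getLayoutParams();
                lp.leftMargin += (int) (event.getRawX() - lastRawX);   // follow the movement track
                lp.topMargin += (int) (event.getRawY() - lastRawY);
                panel.setLayoutParams(lp);
                lastRawX = event.getRawX();
                lastRawY = event.getRawY();
                return true;
            default:
                return false;
        }
    }
}
```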
  • The set of function controls may be all the function controls contained in the currently displayed page, or the controls whose frequency of use is greater than a frequency threshold among all function controls, or the N controls with the highest frequency of use among all function controls, where N is a preset positive integer.
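  • Selecting the N most frequently used controls could be as simple as sorting a usage-count map, as in the hypothetical helper below; where the usage counts come from (for example, locally recorded click statistics) is left open here.

```java
import android.view.View;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Map;

public final class FrequencyFilter {

    /** Returns the n controls with the highest recorded usage count. */
    public static List<View> topNByFrequency(Map<View, Integer> usageCounts, int n) {
        List<Map.Entry<View, Integer>> entries = new ArrayList<>(usageCounts.entrySet());
        // Sort in descending order of usage count.
        Collections.sort(entries, (a, b) -> b.getValue() - a.getValue());
        List<View> result = new ArrayList<>();
        for (int i = 0; i < Math.min(n, entries.size()); i++) {
            result.add(entries.get(i).getKey());
        }
        return result;
    }

    private FrequencyFilter() {}
}
```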
  • the functional controls on the currently displayed page include a scan control, a search control, and a payment control.
  • the three controls are displayed in a row parallel to the edge of the display screen.
  • When the user operates on the displayed function control area, the display area is expanded to display all the function controls in the function control set, thereby making it convenient to operate any of the displayed function controls.
  • S207 Receive a selection instruction for a target function control in the function control set, mark the target function control, and display the title of the target function control at a preset position corresponding to the target function control;
  • When the user clicks the target function control in the floating window, the terminal receives a selection instruction for the target function control, indicating that the function of the control is triggered, and the control is then marked.
  • the method for marking the target function control may be to light up the target function control, light up and enlarge the target function control, light up and adjust the position of the target function control, and so on.
  • the title of the target function control is displayed at a preset position corresponding to the target function control for prompting.
  • For example, when the user selects the search control, the terminal responds to the selection instruction, lights up and enlarges the search control, and displays the title "Search" near the control as a prompt.
  • the selected search control can be located in the middle of the three controls, or can be located above or below, and when selected, it will be adjusted to the middle, which is not specifically limited here.
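  • Marking the selected control by lighting it up, enlarging it, and showing its title nearby might look roughly like the sketch below; the scale factor and the use of a Toast as the title prompt are placeholders for whatever visual treatment an implementation actually chooses.

```java
import android.content.Context;
import android.view.View;
import android.widget.Toast;

public final class ControlHighlighter {

    /** Lights up and enlarges the chosen control and shows its title as a prompt. */
    public static void mark(Context context, View target, CharSequence title) {
        target.setAlpha(1.0f);   // "light up" relative to the translucent panel
        target.animate().scaleX(1.2f).scaleY(1.2f).setDuration(150).start();   // enlarge
        if (title != null) {
            Toast.makeText(context, title, Toast.LENGTH_SHORT).show();         // title prompt
        }
    }

    private ControlHighlighter() {}
}
```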
  • For example, if the target function control is the search control, the terminal responds to the search operation and calls up the search page; for another example, if the target function control is the scan control, the terminal responds to the scan operation and calls up the scan page.
  • S209 Receive an operation completion instruction for the target function control, and close the floating window.
  • After the operation on the target function control is completed, the floating window is closed automatically, or a prompt message for closing the floating window is displayed and the floating window is closed after the user confirms.
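  • Closing the panel when the operation completes then amounts to removing it from its parent view, as in this small companion sketch to the helpers above.

```java
import android.app.Activity;
import android.view.View;
import android.view.ViewGroup;

public final class FloatingPanelCloser {

    /** Removes the floating panel from the activity's content view once the
     *  operation on the target control has completed (sketch, see earlier helpers). */
    public static void close(Activity activity, View panel) {
        ViewGroup content = activity.findViewById(android.R.id.content);
        content.removeView(panel);
    }

    private FloatingPanelCloser() {}
}
```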
  • With the embodiments of this application, the corresponding function controls are simple and convenient to operate without changing the size of the displayed page, which improves the one-handed control experience and reduces the impact on the original experience; the floating window unifies operations and icon styles and serves as an auxiliary system function that enhances the user experience.
  • The size of the display area can also be adaptively adjusted according to the user's operation on the displayed function control area, which makes it convenient for the user to operate the required control; in addition, after the user selects a function control, the control can be marked and a prompt can be shown to avoid misoperation.
  • FIG. 10 shows a schematic structural diagram of a page manipulation apparatus provided by an exemplary embodiment of the present application.
  • the page control device can be implemented as all or a part of the terminal through software, hardware or a combination of the two.
  • the device 1 includes a position acquisition module 11, a control acquisition module 12 and a control display module 13.
  • the position obtaining module 11 is configured to receive a touch command input in the edge area of the display screen, and obtain a touch position corresponding to the touch command;
  • the control acquisition module 12 is configured to acquire a set of functional controls in the page currently displayed on the display screen;
  • the control display module 13 is configured to display the functional control set at the touch position.
  • the touch command is a pressing command
  • the position obtaining module 11 is specifically configured to:
  • the pressing position corresponding to the pressing instruction is acquired.
  • the control display module 13 is specifically configured to:
  • the set of functional controls is displayed on the floating window.
  • the device further includes a floating window moving module 14 for:
  • the device further includes a function trigger module 15 for:
  • the device further includes a floating window closing module 16 for:
  • the device further includes a frequency obtaining module 17, configured to obtain the use frequency of each functional control in the functional control set;
  • the control display module 13 is specifically used for:
  • N is a preset positive integer.
  • the device further includes an area expansion module 18 for:
  • the display area is enlarged to display all the function controls in the function set.
  • With the embodiments of this application, the corresponding function controls are simple and convenient to operate without changing the size of the displayed page, which improves the one-handed control experience and reduces the impact on the original experience; the floating window unifies operations and icon styles and serves as an auxiliary system function that enhances the user experience.
  • The size of the display area can also be adaptively adjusted according to the user's operation on the displayed function control area, which makes it convenient for the user to operate the required control; in addition, after the user selects a function control, the control can be marked and a prompt can be shown to avoid misoperation.
  • the embodiment of the present application also provides a computer storage medium.
  • the computer storage medium may store a plurality of instructions, and the instructions are suitable for being loaded by a processor and executing the method steps of the embodiments shown in FIGS. 1 to 9 above.
  • For the specific execution process, please refer to the specific description of the embodiments shown in FIG. 1 to FIG. 9, which will not be repeated here.
  • This application also provides a computer program product that stores at least one instruction, and the at least one instruction is loaded by the processor to execute the method steps of the embodiments shown in FIG. 1 to FIG. 9; for the specific execution process, please refer to the description of the embodiments shown in FIG. 1 to FIG. 9, which will not be repeated here.
  • The above program can be stored in a computer-readable storage medium, and when executed, the program may include the procedures of the above method embodiments.
  • The storage medium can be a magnetic disk, an optical disc, a read-only memory (ROM), a random access memory (RAM), or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Embodiments of the present invention provide a page manipulation method and device, a storage medium, and a terminal. The method consists of: receiving a touch command input in an edge area of a display screen and obtaining a touch position corresponding to the touch command; obtaining a set of function controls in the page currently displayed on the display screen; and displaying the set of function controls at the touch position. By using the embodiments of the present invention, the corresponding function controls in the page can be displayed at the touch position corresponding to a touch operation simply by performing the touch operation in the edge area of the display screen that can be reached by a finger, and the operation is simple and convenient while the size of the displayed page is not changed.
PCT/CN2021/075278 2020-04-07 2021-02-04 Page manipulation method and device, storage medium and terminal WO2021203821A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010265400.0A CN111443863A (zh) 2020-04-07 2020-04-07 页面操控方法、装置、存储介质及终端
CN202010265400.0 2020-04-07

Publications (1)

Publication Number Publication Date
WO2021203821A1 true WO2021203821A1 (fr) 2021-10-14

Family

ID=71651040

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/075278 WO2021203821A1 (fr) 2020-04-07 2021-02-04 Procédé et dispositif de manipulation de page, support de stockage et terminal

Country Status (2)

Country Link
CN (1) CN111443863A (fr)
WO (1) WO2021203821A1 (fr)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114116089A (zh) * 2021-11-08 2022-03-01 广州鸿大智能科技有限公司 一种数据可视化方法、装置、设备及存储介质
CN114153368A (zh) * 2021-12-07 2022-03-08 Oppo广东移动通信有限公司 应用控制方法和系统
CN114327183A (zh) * 2021-12-24 2022-04-12 Oppo广东移动通信有限公司 应用控制方法、装置、电子设备、芯片及存储介质
CN114518832A (zh) * 2022-02-15 2022-05-20 网易(杭州)网络有限公司 触控终端的显示控制方法、装置及电子设备
CN114661205A (zh) * 2022-03-11 2022-06-24 支付宝(杭州)信息技术有限公司 一种应用提示信息展示方法、装置及设备

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111443863A (zh) * 2020-04-07 2020-07-24 Oppo广东移动通信有限公司 页面操控方法、装置、存储介质及终端
CN111913621B (zh) * 2020-07-29 2022-04-19 海信视像科技股份有限公司 屏幕界面交互显示方法及显示设备
CN111913622B (zh) * 2020-07-29 2022-04-19 海信视像科技股份有限公司 屏幕界面交互显示方法及显示设备
CN112882634A (zh) * 2021-02-10 2021-06-01 维沃移动通信有限公司 控件参数的调整方法及装置
CN112925457A (zh) * 2021-02-19 2021-06-08 深圳市云基航空科技有限责任公司 应用程序的控制方法、装置、存储介质及终端
CN112905296A (zh) * 2021-03-31 2021-06-04 读书郎教育科技有限公司 一种解决全面屏手势导航与应用逻辑冲突的系统及方法
CN112860156B (zh) * 2021-04-06 2023-02-24 展讯通信(天津)有限公司 应用的模块标签交互控制方法及装置、存储介质、计算机设备
CN113419650A (zh) * 2021-06-08 2021-09-21 Oppo广东移动通信有限公司 一种数据移动方法、装置、存储介质及电子设备

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107547750A (zh) * 2017-09-11 2018-01-05 广东欧珀移动通信有限公司 终端的控制方法、装置和存储介质
EP3432130A1 (fr) * 2016-09-09 2019-01-23 HTC Corporation Dispositif électronique portable, procédé de fonctionnement correspondant et support d'enregistrement lisible par ordinateur non transitoire
CN109375863A (zh) * 2018-09-27 2019-02-22 Oppo广东移动通信有限公司 目标功能的触发方法、装置、终端及存储介质
CN109656443A (zh) * 2018-10-31 2019-04-19 百度在线网络技术(北京)有限公司 页面显示方法和装置
CN111443863A (zh) * 2020-04-07 2020-07-24 Oppo广东移动通信有限公司 页面操控方法、装置、存储介质及终端

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102855056B (zh) * 2012-07-09 2015-09-30 宇龙计算机通信科技(深圳)有限公司 终端和终端控制方法
KR20150039293A (ko) * 2013-10-02 2015-04-10 주식회사 엘지유플러스 사용자 인터페이스 제공을 위한 장치, 방법, 및 기록 매체
CN110275658A (zh) * 2019-06-03 2019-09-24 Oppo广东移动通信有限公司 显示控制方法、装置、移动终端以及存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3432130A1 (fr) * 2016-09-09 2019-01-23 HTC Corporation Dispositif électronique portable, procédé de fonctionnement correspondant et support d'enregistrement lisible par ordinateur non transitoire
CN107547750A (zh) * 2017-09-11 2018-01-05 广东欧珀移动通信有限公司 终端的控制方法、装置和存储介质
CN109375863A (zh) * 2018-09-27 2019-02-22 Oppo广东移动通信有限公司 目标功能的触发方法、装置、终端及存储介质
CN109656443A (zh) * 2018-10-31 2019-04-19 百度在线网络技术(北京)有限公司 页面显示方法和装置
CN111443863A (zh) * 2020-04-07 2020-07-24 Oppo广东移动通信有限公司 页面操控方法、装置、存储介质及终端

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114116089A (zh) * 2021-11-08 2022-03-01 广州鸿大智能科技有限公司 一种数据可视化方法、装置、设备及存储介质
CN114153368A (zh) * 2021-12-07 2022-03-08 Oppo广东移动通信有限公司 应用控制方法和系统
CN114327183A (zh) * 2021-12-24 2022-04-12 Oppo广东移动通信有限公司 应用控制方法、装置、电子设备、芯片及存储介质
CN114518832A (zh) * 2022-02-15 2022-05-20 网易(杭州)网络有限公司 触控终端的显示控制方法、装置及电子设备
CN114518832B (zh) * 2022-02-15 2024-05-28 网易(杭州)网络有限公司 触控终端的显示控制方法、装置及电子设备
CN114661205A (zh) * 2022-03-11 2022-06-24 支付宝(杭州)信息技术有限公司 一种应用提示信息展示方法、装置及设备

Also Published As

Publication number Publication date
CN111443863A (zh) 2020-07-24

Similar Documents

Publication Publication Date Title
WO2021203821A1 (fr) Procédé et dispositif de manipulation de page, support de stockage et terminal
US11467715B2 (en) User interface display method, terminal and non-transitory computer-readable storage medium for splitting a display using a multi-finger swipe
US11301131B2 (en) Method for split-screen display, terminal, and non-transitory computer readable storage medium
EP3761161B1 (fr) Procédé et dispositif d'affichage d'interface de méthode de saisie, et terminal et support de stockage
WO2020147665A1 (fr) Procédé et dispositif de traitement de fichiers, terminal et support de stockage
EP3842905B1 (fr) Procédé et appareil d'affichage d'icônes, terminal et support de stockage
WO2019174477A1 (fr) Procédé et dispositif d'affichage d'interface utilisateur, et terminal
WO2020038168A1 (fr) Procédé et dispositif de partage de contenu, terminal et support de stockage
WO2019233306A1 (fr) Procédé, dispositif et terminal d'affichage d'icône
US20150339018A1 (en) User terminal device and method for providing information thereof
WO2019233313A1 (fr) Procédé et dispositif d'affichage d'onglet flottant, terminal et support d'informations
EP3454199B1 (fr) Procédé permettant de répondre à une opération tactile et dispositif électronique
WO2019047147A1 (fr) Procédé et dispositif de déplacement d'icône
KR20150045121A (ko) 멀티윈도우 운용 방법 및 이를 지원하는 전자 장치
WO2019233307A1 (fr) Procédé et appareil d'affichage d'interface utilisateur, terminal et support d'informations
CN107608550B (zh) 触摸操作响应方法及装置
WO2021190184A1 (fr) Procédé et appareil d'assistance à distance, support de stockage et terminal
CN110442267B (zh) 触摸操作响应方法、装置、移动终端及存储介质
US11086442B2 (en) Method for responding to touch operation, mobile terminal, and storage medium
WO2021254201A1 (fr) Procédé et appareil d'affichage de pages, support de stockage et dispositif électronique
KR20180098080A (ko) 멀티태스킹을 위한 인터페이스 제공 방법 및 이를 구현하는 전자 장치
WO2019047231A1 (fr) Procédé et dispositif de réaction aux opérations tactiles
US11194425B2 (en) Method for responding to touch operation, mobile terminal, and storage medium
CN110377220B (zh) 一种指令响应方法、装置、存储介质及电子设备
WO2019047187A1 (fr) Procédé et dispositif de commande de barre de navigation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21783687

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21783687

Country of ref document: EP

Kind code of ref document: A1