WO2020125405A1 - Control method for terminal device and terminal device - Google Patents


Info

Publication number: WO2020125405A1 (PCT/CN2019/122682)
Authority: WIPO (PCT)
Prior art keywords: content, area, interface, terminal device, input
Application number: PCT/CN2019/122682
Other languages: English (en), Chinese (zh)
Inventor: 温号
Original Assignee: 维沃移动通信有限公司
Application filed by 维沃移动通信有限公司
Publication of WO2020125405A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04M: TELEPHONIC COMMUNICATION
    • H04M 1/00: Substation equipment, e.g. for use by subscribers
    • H04M 1/72: Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/725: Cordless telephones

Definitions

  • The embodiments of the present invention relate to the field of terminal technology, and in particular to a control method for a terminal device and a terminal device.
  • The prior art proposes switching the terminal device to a one-handed operation mode when it is operated with one hand.
  • In that mode, the terminal device scales down the interface shown on the full display screen and displays it in an area the user can reach with one hand, so that the user can then reach any position of the interface with one hand.
  • Embodiments of the present invention provide a terminal device control method and a terminal device, to address the problem that implementing one-handed operation on a terminal device sacrifices its visual experience.
  • In a first aspect, an embodiment of the present invention provides a terminal device control method, including:
  • receiving a user's first input on a first interface;
  • in response to the first input, displaying a second interface, where, within the one-handed operation area, the second interface includes at least part of first content, the first content being the display content of the first interface in a target area;
  • the one-handed operation area is located in the display screen of the terminal device, and the target area includes the areas of the display screen other than the one-handed operation area;
  • within the target area, the display content of the second interface is the first content, or includes at least part of second content, the second content being the display content of the first interface in the one-handed operation area.
  • In a second aspect, an embodiment of the present invention provides a terminal device, including:
  • a receiving unit configured to receive a user's first input on a first interface;
  • a display unit configured to display a second interface in response to the first input, where, within the one-handed operation area, the second interface includes at least part of first content, the first content being the display content of the first interface in a target area;
  • the one-handed operation area is located in the display screen of the terminal device, and the target area includes the areas of the display screen other than the one-handed operation area;
  • within the target area, the display content of the second interface is the first content, or includes at least part of second content, the second content being the display content of the first interface in the one-handed operation area.
  • In a third aspect, an embodiment of the present invention provides a terminal device, including a processor, a memory, and a computer program stored in the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the terminal device control method described in the first aspect.
  • In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program, where the computer program, when executed by a processor, implements the steps of the terminal device control method described in the first aspect.
  • The terminal device control method provided by an embodiment of the present invention first receives a user's first input on a first interface in the one-handed operation area, and then displays a second interface in response to the first input. Because, within the one-handed operation area, the second interface includes at least part of the first content, the first content is the display content of the first interface in the target area, the one-handed operation area is located in the display screen of the terminal device, and the target area includes the areas of the display screen other than the one-handed operation area, the embodiment of the present invention can display, in the one-handed operation area, the display content of the display areas outside the one-handed operation area, thereby achieving one-handed operation of the terminal device.
  • Moreover, the embodiments of the present invention can reduce the impact on the visual experience of the terminal device.
  • Therefore, the embodiments of the present invention can solve the problem that implementing one-handed operation on a terminal device sacrifices its visual experience.
  • FIG. 1 is an architectural diagram of an Android operating system provided by an embodiment of the present invention
  • FIG. 2 is a flowchart of steps of a method for controlling a terminal device according to an embodiment of the present invention
  • FIG. 3 is one of interface diagrams of application scenarios of a method for controlling a terminal device according to an embodiment of the present invention
  • FIG. 4 is a second interface diagram of an application scenario of a method for controlling a terminal device according to an embodiment of the present invention.
  • FIG. 5 is a third interface diagram of an application scenario of a control method of a terminal device according to an embodiment of the present invention.
  • FIG. 6 is a fourth interface diagram of an application scenario of a method for controlling a terminal device according to an embodiment of the present invention.
  • FIG. 7 is a schematic structural diagram of a terminal device according to an embodiment of the present invention.
  • FIG. 8 is a schematic diagram of a hardware structure of a terminal device provided by an embodiment of the present invention.
  • In the embodiments of the present invention, the words "first" and "second" are used to distinguish between items whose functions are substantially the same. Those skilled in the art can understand that "first" and "second" do not limit quantity or order of execution.
  • In the embodiments of the present invention, the words "exemplary" or "for example" are used to present an example, instance, or illustration. Any embodiment or design described as "exemplary" or "for example" should not be construed as more preferred or advantageous than other embodiments or designs; rather, these words are intended to present related concepts in a concrete manner. Unless otherwise specified, "plurality" means two or more.
  • Embodiments of the present invention provide a terminal device control method and a terminal device.
  • The control method first receives a user's first input on a first interface in the one-handed operation area, and then displays a second interface in response to the first input. Within the one-handed operation area, the second interface includes at least part of the first content, the first content being the display content of the first interface in the target area.
  • The one-handed operation area is located in the display screen of the terminal device, and the target area includes the areas of the display screen other than the one-handed operation area, so the display content of the areas outside the one-handed operation area can first be displayed in the one-handed operation area, thereby achieving one-handed operation of the terminal device.
  • Within the target area, the display content of the second interface is the first content, or includes at least part of second content, the second content being the display content of the first interface in the one-handed operation area; that is, the display areas outside the one-handed operation area continue to display content.
  • In this way, the embodiments of the present invention can also reduce the impact on the visual experience of the terminal device, and thus solve the problem that implementing one-handed operation on a terminal device sacrifices its visual experience.
  • The terminal device in the embodiment of the present invention may be a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a smart watch, or a smart wristband, or it may be another type of terminal device, which is not limited in the embodiments of the present application.
  • the following uses the Android operating system as an example to introduce the software environment to which the method for controlling a terminal device provided by the embodiments of the present application is applied.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system applied to the method for controlling a terminal device provided by an embodiment of the present application.
  • the architecture of the Android operating system includes four layers, namely: an application program layer, an application program framework layer, a system runtime library layer, and a kernel layer (specifically, a Linux kernel layer).
  • the application layer includes various applications in the Android operating system (including system applications and third-party applications).
  • the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while observing the development principles of the application framework.
  • the system runtime library layer includes a library (also called a system library) and an Android operating system operating environment.
  • the library mainly provides various resources required by the Android operating system.
  • the operating environment of the Android operating system is used to provide a software environment for the Android operating system.
  • the kernel layer is the operating system layer of the Android operating system, and belongs to the bottom layer of the Android operating system software layer.
  • the kernel layer provides core system services and hardware-related drivers for the Android operating system based on the Linux kernel.
  • Based on the system architecture of the Android operating system shown in FIG. 1, a developer may develop a software program implementing the control method of the terminal device provided in the embodiments of the present application, so that the control method can run on the Android operating system shown in FIG. 1. That is, the processor or the terminal device implements the control method by running the software program in the Android operating system.
  • An embodiment of the present invention provides a terminal device control method. Specifically, referring to FIG. 2, the method includes the following steps 201 and 202.
  • Step 201: Receive a user's first input on the first interface.
  • the first input may be an input in the one-hand operation area.
  • the first input in the embodiment of the present invention may be a touch click input to the first interface, or a specific voice instruction, or a specific gesture, or a press input to a physical key of the terminal device.
  • the specific gesture may be any one of a click gesture, a sliding gesture, a pressure recognition gesture, a long-press gesture, an area change gesture, a double-press gesture, and a double-click gesture.
  • Step 202: In response to the first input, display a second interface.
  • Within the one-handed operation area, the second interface includes at least part of the first content, the first content is the display content of the first interface in the target area, and the target area is the display area other than the one-handed operation area;
  • within the target area, the display content of the second interface is the first content, or includes at least part of second content, the second content being the display content of the first interface in the one-handed operation area.
  • The one-handed operation area in the embodiment of the present invention is the area that the hand holding the terminal device can still reach when the terminal device is held with one hand.
  • The size, shape, and position of the one-handed operation area may be predefined, for example, as 70% of the screen width and 50% of the screen height. Alternatively, they may be determined from the user's holding gesture and the area reachable with one hand, both collected while the user uses the device.
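As an illustration of the predefined case, the reachable rectangle can be derived directly from the screen dimensions. A minimal sketch follows; the bottom-corner anchoring and the `right_handed` flag are assumptions for the sketch, since the text only gives the 70%/50% ratios:

```python
def one_handed_area(screen_w, screen_h, w_ratio=0.70, h_ratio=0.50, right_handed=True):
    """Return the one-handed operation area as (left, top, width, height).

    The bottom-corner anchoring and the handedness flag are illustrative
    assumptions; the embodiment only specifies the 70%/50% ratios.
    """
    w = int(screen_w * w_ratio)
    h = int(screen_h * h_ratio)
    left = screen_w - w if right_handed else 0
    top = screen_h - h  # thumb-reachable region hugs the bottom edge
    return (left, top, w, h)
```

For a 1080 x 2340 screen this yields a 756 x 1170 region in a bottom corner; the target area is then the remainder of the display.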
  • Optionally, the terminal device control method provided by the embodiment of the present invention further includes: receiving a user's second input to a target mark, and determining the one-handed operation area in response to the second input.
  • For example, a first mark 31, a second mark 32, a third mark 33, and a fourth mark 34 are displayed on the boundary of the one-handed operation area 300 (that is, the target mark includes the first mark 31, the second mark 32, the third mark 33, and the fourth mark 34). Operating the first mark 31 adjusts the height of the one-handed operation area 300, operating the second mark 32 adjusts its width, operating the third mark 33 adjusts its height and width simultaneously, and operating the fourth mark 34 adjusts its overall position.
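A minimal sketch of how the four marks could update the area rectangle during a drag. Which edge each mark sits on is an assumption; the text only states which dimensions each mark adjusts:

```python
def adjust_area(area, mark, dx, dy):
    """Apply a drag of (dx, dy) pixels to one of four boundary marks.

    `area` is [left, top, width, height]. The edge assignments below are
    illustrative assumptions: mark 1 adjusts height, mark 2 width,
    mark 3 both, and mark 4 the overall position, as described above.
    """
    left, top, width, height = area
    if mark == 1:               # adjust height by dragging the top edge
        return [left, top + dy, width, height - dy]
    if mark == 2:               # adjust width by dragging the left edge
        return [left + dx, top, width - dx, height]
    if mark == 3:               # adjust height and width together (corner)
        return [left + dx, top + dy, width - dx, height - dy]
    if mark == 4:               # move the whole area without resizing
        return [left + dx, top + dy, width, height]
    raise ValueError(f"unknown mark: {mark}")
```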
  • The terminal device control method provided by an embodiment of the present invention first receives a user's first input on a first interface in the one-handed operation area, and then displays a second interface in response to the first input. Because, within the one-handed operation area, the second interface includes at least part of the first content, the first content is the display content of the first interface in the target area, the one-handed operation area is located in the display screen of the terminal device, and the target area includes the areas of the display screen other than the one-handed operation area, the embodiment of the present invention can display, in the one-handed operation area, the display content of the display areas outside the one-handed operation area, thereby achieving one-handed operation of the terminal device.
  • Moreover, the embodiments of the present invention can reduce the impact on the visual experience of the terminal device.
  • Therefore, the embodiments of the present invention can solve the problem that implementing one-handed operation on a terminal device sacrifices its visual experience.
  • The display manner of step 202 in the foregoing embodiment is described in detail below.
  • Manner 1: in response to the first input, the first content scaled according to a preset ratio is displayed in the one-handed operation area, and the first content is displayed in the target area.
  • Optionally, the first input in manner 1 of the foregoing embodiment may be a leftward or upward sliding operation input in the one-handed operation area. In this way, the user can switch the display content of the one-handed operation area with an action similar to turning a page on the screen, improving the user experience.
  • For example, a second interface is displayed, where the content displayed in the one-handed operation area 41 is the scaled display content of the first interface in the target area 42, and the content displayed in the target area 42 is the same as the content displayed there when the first interface was displayed.
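The scaling in manner 1 is a plain linear mapping from the target area into the one-handed operation area. The sketch below maps a point of the first interface into the scaled copy; independent per-axis ratios are an assumption, since the text does not say whether the aspect ratio is preserved:

```python
def map_point(pt, target_area, handed_area):
    """Map a point in the target area to the scaled copy shown in the
    one-handed operation area (manner 1). Areas are (left, top, w, h)."""
    x, y = pt
    tl, tt, tw, th = target_area
    hl, ht, hw, hh = handed_area
    # Linear scaling: position relative to the target area, rescaled
    # into the one-handed operation area.
    return (hl + (x - tl) * hw / tw, ht + (y - tt) * hh / th)
```

A tap on the scaled copy can be mapped back to the full interface by inverting the same formula, which is how input at "any position" becomes reachable.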
  • Manner 2: the first input includes a first operation and a second operation;
  • displaying a second interface in response to the first input includes:
  • in response to the first operation, dividing the display content of the first interface into a content block array of m rows * n columns; and, in response to the second operation, moving at least one content block in the one-handed operation area to the target area, and moving at least one content block in the target area to the one-handed operation area.
  • Optionally, moving the content blocks includes: in response to the second operation, controlling each content block in the i-th row of the content block array to perform a cyclic shift, and/or controlling each content block in the j-th column of the content block array to perform a cyclic shift, where i and j are positive integers, i is less than or equal to m, and j is less than or equal to n.
  • In the following example, the display content of the first interface is divided into a content block array of 4 rows * 4 columns, and the second operation is a drag operation, along the column direction of the content block array, on the content block in the third row and third column.
  • In response, each content block in the third column of the content block array is cyclically shifted.
  • After the shift, the content block originally located in the first row and third column in the target area 52 is shifted into the one-handed operation area 51, and the content block originally located in the fourth row and third column in the one-handed operation area 51 is shifted into the target area 52. That is, after the cyclic shift is completed, in the one-handed operation area the second interface includes at least part of the first content, and in the target area the display content of the second interface includes at least part of the second content.
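The shift in this example can be sketched as an in-place rotation of one column of the block array. Rotating upward (the top block wraps to the bottom) reproduces the described result, where the row-1 block ends up in the one-handed operation area at the bottom and the row-4 block moves into the target area; treating the drag as a one-step upward rotation is an assumption of this sketch:

```python
def cyclic_shift_column(grid, col, steps=1):
    """Cyclically shift one column of an m-row content block array upward
    by `steps` positions, in place: each block moves up one row per step
    and the top block wraps around to the bottom row."""
    m = len(grid)
    column = [grid[r][col] for r in range(m)]
    for r in range(m):
        grid[r][col] = column[(r + steps) % m]
    return grid
```

A row shift (the i-th-row case in the text) is the same operation applied to `grid[i]` along the other axis.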
  • Manner 3: the first input includes a first operation and a second operation;
  • displaying a second interface in response to the first input includes:
  • in response to the first operation, displaying the first area of the first interface with a preset transparency in the one-handed operation area, the first area of the first interface being the area of the first interface corresponding to the one-handed operation area;
  • the preset transparency may be any transparency between 0% and 100%, which is not limited in the embodiment of the present invention.
  • the user can adjust the transparency according to actual needs during use.
  • the second operation may be a drag operation on the first interface displayed with preset transparency.
  • For example, the first area 601 of the first interface is displayed with the preset transparency in the one-handed operation area 61. When a downward sliding operation input by the user in the one-handed operation area is received, the first interface display area moves downward, and the second area 602 of the first interface is displayed with the preset transparency in the one-handed operation area 61, while the content displayed in the target area 62 remains unchanged.
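Displaying content "with preset transparency" amounts to standard alpha compositing of the overlaid first-interface area over whatever lies beneath it. A per-pixel sketch, purely illustrative (real rendering is done by the UI toolkit):

```python
def blend(fg, bg, transparency):
    """Composite a foreground RGB pixel over a background RGB pixel.

    `transparency` follows the text's 0%..100% convention: 0.0 shows the
    foreground fully opaque, 1.0 makes the foreground invisible.
    """
    a = 1.0 - transparency              # opacity of the overlaid content
    return tuple(round(a * f + (1.0 - a) * b) for f, b in zip(fg, bg))
```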
  • the embodiments of the present application may divide the functional modules of the terminal device and the like according to the above method examples.
  • each function module may be divided corresponding to each function, or two or more functions may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or software function modules. It should be noted that the division of the modules in the embodiments of the present application is schematic, and is only a division of logical functions. In actual implementation, there may be another division manner.
  • FIG. 7 shows a possible structural schematic diagram of the terminal device involved in the first embodiment, and the terminal device 700 includes:
  • the receiving unit 71 is configured to receive a user's first input to the first interface
  • the display unit 72 is configured to display a second interface in response to the first input
  • within the one-handed operation area, the second interface includes at least part of the first content;
  • the first content is the display content of the first interface in the target area;
  • the one-handed operation area is located in the display screen of the terminal device;
  • the target area includes the areas of the display screen of the terminal device other than the one-handed operation area;
  • within the target area, the display content of the second interface is the first content, or includes at least part of second content, the second content being the display content of the first interface in the one-handed operation area.
  • Optionally, the display unit 72 is specifically configured to, in response to the first input, display the first content scaled according to a preset ratio in the one-handed operation area, and display the first content in the target area.
  • Optionally, the first input includes a first operation and a second operation;
  • the display unit 72 is specifically configured to, in response to the first operation, divide the display content of the first interface into a content block array of m rows * n columns, and, in response to the second operation, move at least one content block in the one-handed operation area to the target area and at least one content block in the target area to the one-handed operation area;
  • m and n are positive integers.
  • Optionally, the display unit 72 is specifically configured to, in response to the second operation, cyclically shift each content block in the i-th row of the content block array, and/or cyclically shift each content block in the j-th column of the content block array;
  • i, j are positive integers, i is less than or equal to m, j is less than or equal to n.
  • Optionally, the first input includes a first operation and a second operation;
  • the display unit 72 is specifically configured to, in response to the first operation, display the first area of the first interface with preset transparency in the one-handed operation area, and, in response to the second operation, display the second area of the first interface with preset transparency in the one-handed operation area;
  • the first area of the first interface is the area of the first interface corresponding to the one-handed operation area, and the second area of the first interface is an area different from the first area.
  • Optionally, the receiving unit 71 is further configured to receive a user's second input to a target mark, the target mark being used to identify the one-handed operation area;
  • the display unit 72 is further configured to determine the one-handed operation area based on the second input.
  • The terminal device provided by the embodiment of the present invention first receives the user's first input on the first interface in the one-handed operation area, and then displays the second interface in response to the first input. Because, within the one-handed operation area, the second interface includes at least part of the first content, the first content is the display content of the first interface in the target area, the one-handed operation area is located in the display screen of the terminal device, and the target area includes the areas of the display screen other than the one-handed operation area, the embodiment of the present invention can display, in the one-handed operation area, the display content of the display areas outside the one-handed operation area, thereby achieving one-handed operation of the terminal device.
  • Moreover, the embodiment of the present invention can reduce the impact on the visual experience of the terminal device.
  • Therefore, the embodiments of the present invention can solve the problem that implementing one-handed operation on a terminal device sacrifices its visual experience.
  • the terminal device 800 includes but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, and a display unit 106 , User input unit 107, interface unit 108, memory 109, processor 110, power supply 111 and other components.
  • terminal devices include but are not limited to mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminals, wearable devices, and pedometers.
  • the user input unit 107 is configured to receive the user's first input to the first interface
  • the display unit 106 is configured to display a second interface in response to the first input
  • within the one-handed operation area, the second interface includes at least part of the first content;
  • the first content is the display content of the first interface in the target area;
  • the one-handed operation area is located in the display screen of the terminal device;
  • the target area includes the areas of the display screen of the terminal device other than the one-handed operation area;
  • within the target area, the display content of the second interface is the first content, or includes at least part of second content, the second content being the display content of the first interface in the one-handed operation area.
  • The terminal device provided by the embodiment of the present invention first receives the user's first input on the first interface in the one-handed operation area, and then displays the second interface in response to the first input. Because, within the one-handed operation area, the second interface includes at least part of the first content, the first content is the display content of the first interface in the target area, the one-handed operation area is located in the display screen of the terminal device, and the target area includes the areas of the display screen other than the one-handed operation area, the embodiment of the present invention can display, in the one-handed operation area, the display content of the display areas outside the one-handed operation area, thereby achieving one-handed operation of the terminal device.
  • Moreover, the embodiment of the present invention can reduce the impact on the visual experience of the terminal device.
  • Therefore, the embodiments of the present invention can solve the problem that implementing one-handed operation on a terminal device sacrifices its visual experience.
  • The radio frequency unit 101 may be used to receive and send signals when sending and receiving information or during a call. Specifically, after downlink data is received from a base station, it is delivered to the processor 110 for processing; uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides wireless broadband Internet access to the user through the network module 102, such as helping the user to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 may convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Moreover, the audio output unit 103 can also provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a specific function performed by the terminal device.
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in video capture mode or image capture mode.
  • the processed image frame may be displayed on the display unit 106.
  • the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • In a phone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and then output.
  • the terminal device further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • The light sensor includes an ambient light sensor and a proximity sensor. The ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device is moved to the ear.
  • As a type of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in multiple directions (generally three axes) and, at rest, the magnitude and direction of gravity, and can be used to identify the posture of the terminal device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration recognition functions (such as pedometer and tap detection). The sensor 105 may also include a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer, infrared sensor, and the like, which will not be repeated here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like.
  • the user input unit 107 may be used to receive input numeric or character information and generate key signal input related to user settings and function control of the terminal device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • the touch panel 1071, also known as a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include a touch detection device and a touch controller.
  • the touch detection device detects the user's touch orientation, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110.
  • the touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, physical keyboards, function keys (such as volume control keys, switch keys, etc.), trackballs, mice, and joysticks, which will not be repeated here.
  • the touch panel 1071 may be overlaid on the display panel 1061.
  • when the touch panel 1071 detects a touch operation on or near it, the operation is transmitted to the processor 110 to determine the type of touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of touch event.
  • although the touch panel 1071 and the display panel 1061 are implemented as two independent components to realize the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to realize the input and output functions of the terminal device, which is not specifically limited here.
  • the interface unit 108 is an interface for connecting an external device with a terminal device.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
  • the interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal device, or may be used to transfer data between the terminal device and the external device.
  • the memory 109 may be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, application programs required for at least one function (such as a sound playback function, an image playback function, etc.), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phonebook, etc.), and the like.
  • the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
  • the processor 110 is the control center of the terminal device; it connects the various parts of the entire terminal device using various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the terminal device as a whole.
  • the processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, etc., and the modem processor mainly handles wireless communication; it can be understood that the foregoing modem processor may alternatively not be integrated into the processor 110.
  • the terminal device can also include a power supply 111 (such as a battery) that supplies power to the various components.
  • optionally, the power supply 111 can be logically connected to the processor 110 through a power management system, so that functions such as charge management, discharge management, and power consumption management are implemented through the power management system.
  • in addition, the terminal device includes some functional modules that are not shown, which will not be repeated here.
  • Embodiments of the present invention also provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the processes of the control method for a terminal device described in Embodiment 1 above are implemented and the same technical effects can be achieved; to avoid repetition, details are not repeated here.
  • the computer-readable storage medium may be, for example, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disk.
  • the methods in the above embodiments can be implemented by means of software plus a necessary general hardware platform, and of course can also be implemented by hardware, but in many cases the former is the better implementation.
  • the technical solution of the present application can essentially be embodied in the form of a software product; the computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions to enable a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network-side device, etc.) to execute the methods described in the embodiments of the present application.
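The accelerometer-based posture recognition mentioned in the description above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the claimed implementation: the function name `classify_posture`, the 30° tilt threshold, and the screen-axis convention (x across the screen, y up the screen, z out of the screen) are all ours.

```python
import math

def classify_posture(ax, ay, az, threshold_deg=30.0):
    """Classify device posture from a 3-axis accelerometer reading taken
    while the device is roughly at rest, so the measured vector is
    dominated by gravity (~9.8 m/s^2)."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    if g < 1e-6:
        return "unknown"   # free fall or sensor error
    # If gravity projects mainly onto the z axis, the screen faces up/down.
    if abs(az) / g > math.cos(math.radians(threshold_deg)):
        return "flat"
    # Otherwise compare the in-plane components: gravity mainly along the
    # y axis means the device is upright (portrait); mainly along the
    # x axis means it lies on its side (landscape).
    return "portrait" if abs(ay) >= abs(ax) else "landscape"

print(classify_posture(0.0, -9.8, 0.0))   # held upright -> portrait
print(classify_posture(9.8, 0.0, 0.0))    # turned on its side -> landscape
```

The same gravity-projection idea underlies the horizontal/vertical screen switching mentioned above: the display orientation follows whichever screen axis gravity currently dominates.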

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are a control method for a terminal device and a terminal device. The method comprises: receiving a first input from a user on a first interface (201); and displaying a second interface in response to the first input (202). In a one-handed operation area, the second interface comprises at least part of first content, the first content being display content of the first interface in a target area; the one-handed operation area is located within the display screen of the terminal device, and the target area comprises the area of the display screen of the terminal device remaining outside the one-handed operation area. In the target area, the display content of the second interface is the first content, or the display content of the second interface comprises at least part of second content, the second content being display content of the first interface in the one-handed operation area.
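The layout described in the abstract can be made concrete with a small geometry sketch. This is an illustrative sketch only, not the claimed implementation: the assumptions that the one-handed operation area is the bottom strip of the display screen and that the first content is scaled uniformly to fit it, as well as the names `map_to_one_hand_area` and `one_hand_h`, are ours.

```python
def map_to_one_hand_area(screen_w, screen_h, one_hand_h):
    """Return the rectangle, inside the one-handed operation area, into
    which the first content (the display content of the target area) is
    scaled so it becomes reachable by the thumb.

    Assumed geometry: the one-handed operation area is the bottom strip
    of height one_hand_h; the target area is the remaining upper part of
    the screen. Scaling preserves the target area's aspect ratio."""
    target_h = screen_h - one_hand_h          # height of the target area
    scale = min(1.0, one_hand_h / target_h)   # shrink only, never enlarge
    scaled_w, scaled_h = screen_w * scale, target_h * scale
    # Centre the scaled first content inside the bottom strip.
    x = (screen_w - scaled_w) / 2
    y = target_h + (one_hand_h - scaled_h) / 2
    return {"x": x, "y": y, "w": scaled_w, "h": scaled_h, "scale": scale}

# A 1080 x 2340 screen whose bottom 780 px form the one-handed area:
# the 1080 x 1560 target-area content is shown at half size (540 x 780).
rect = map_to_one_hand_area(1080, 2340, 780)
```

In this sketch the target area keeps showing either the first content or part of the second content, as the abstract states; only the copy of the first content placed inside the one-handed area is rescaled.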
PCT/CN2019/122682 2018-12-21 2019-12-03 Control method for terminal device and terminal device WO2020125405A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811574630.4A CN109857317A (zh) 2018-12-21 2018-12-21 一种终端设备的控制方法及终端设备
CN201811574630.4 2018-12-21

Publications (1)

Publication Number Publication Date
WO2020125405A1 true WO2020125405A1 (fr) 2020-06-25

Family

ID=66891953

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/122682 WO2020125405A1 (fr) 2018-12-21 2019-12-03 Control method for terminal device and terminal device

Country Status (2)

Country Link
CN (1) CN109857317A (fr)
WO (1) WO2020125405A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112486330A (zh) * 2020-11-30 2021-03-12 Vivo Mobile Communication Co., Ltd. Display control method, apparatus and device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109857317A (zh) * 2018-12-21 2019-06-07 Vivo Mobile Communication Co., Ltd. Control method for terminal device and terminal device
CN111078114A (zh) * 2019-12-26 2020-04-28 Shanghai Transsion Information Technology Co., Ltd. One-handed control method, control apparatus and terminal device
CN112800273B (zh) * 2021-02-05 2024-08-09 Beijing ByteDance Network Technology Co., Ltd. Page content display method and terminal device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106445354A (zh) * 2016-11-24 2017-02-22 Beijing Xiaomi Mobile Software Co., Ltd. Touch control method and apparatus for terminal device
US20180039403A1 (en) * 2016-08-05 2018-02-08 Beijing Xiaomi Mobile Software Co., Ltd. Terminal control method, terminal, and storage medium
CN108710459A (zh) * 2018-05-11 2018-10-26 Vivo Mobile Communication Co., Ltd. Interface operation method and mobile terminal
CN108733282A (zh) * 2018-04-16 2018-11-02 Vivo Mobile Communication Co., Ltd. Page moving method and terminal device
CN109857317A (zh) * 2018-12-21 2019-06-07 Vivo Mobile Communication Co., Ltd. Control method for terminal device and terminal device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103019545B (zh) * 2012-12-10 2015-08-12 Guangdong OPPO Mobile Telecommunications Co., Ltd. Scaling method for the touch-screen display interface of an electronic device
JP6147830B2 (ja) * 2015-10-28 2017-06-14 Kyocera Corporation Portable electronic device and display method for portable electronic device
CN105867821A (zh) * 2016-03-31 2016-08-17 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Icon arrangement method, icon arrangement apparatus and terminal
CN105975163A (zh) * 2016-07-12 2016-09-28 Wuxi Tianmai Juyuan Media Technology Co., Ltd. Method and apparatus for adjusting a mobile terminal interface
CN106445311A (zh) * 2016-11-08 2017-02-22 Yulong Computer Telecommunication Scientific (Shenzhen) Co., Ltd. Icon display method, icon display apparatus and terminal
WO2018133285A1 (fr) * 2017-01-22 2018-07-26 Huawei Technologies Co., Ltd. Display method and terminal
CN108733275A (zh) * 2018-04-28 2018-11-02 Vivo Mobile Communication Co., Ltd. Object display method and terminal
CN108933861B (zh) * 2018-06-12 2021-01-08 Qiku Internet Network Technology (Shenzhen) Co., Ltd. Application icon sorting method and apparatus, readable storage medium, and intelligent terminal

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180039403A1 (en) * 2016-08-05 2018-02-08 Beijing Xiaomi Mobile Software Co., Ltd. Terminal control method, terminal, and storage medium
CN106445354A (zh) * 2016-11-24 2017-02-22 Beijing Xiaomi Mobile Software Co., Ltd. Touch control method and apparatus for terminal device
CN108733282A (zh) * 2018-04-16 2018-11-02 Vivo Mobile Communication Co., Ltd. Page moving method and terminal device
CN108710459A (zh) * 2018-05-11 2018-10-26 Vivo Mobile Communication Co., Ltd. Interface operation method and mobile terminal
CN109857317A (zh) * 2018-12-21 2019-06-07 Vivo Mobile Communication Co., Ltd. Control method for terminal device and terminal device

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112486330A (zh) * 2020-11-30 2021-03-12 Vivo Mobile Communication Co., Ltd. Display control method, apparatus and device
CN112486330B (zh) * 2020-11-30 2024-06-04 Vivo Mobile Communication Co., Ltd. Display control method, apparatus and device

Also Published As

Publication number Publication date
CN109857317A (zh) 2019-06-07

Similar Documents

Publication Publication Date Title
WO2020258929A1 (fr) Folder interface switching method and terminal device
WO2021083132A1 (fr) Icon moving method and electronic device
WO2021057337A1 (fr) Operation method and electronic device
EP3855714B1 (fr) Information processing method and terminal
WO2020125405A1 (fr) Control method for terminal device and terminal device
WO2020181955A1 (fr) Interface control method and terminal device
WO2020215932A1 (fr) Unread message display method and terminal device
WO2020151525A1 (fr) Message sending method and terminal device
CN111124245B (zh) Control method and electronic device
CN109828705B (zh) Method for displaying icons and terminal device
CN109240783B (zh) Interface display method and terminal device
CN109032486B (zh) Display control method and terminal device
WO2020192324A1 (fr) Interface display method and terminal device
WO2020192299A1 (fr) Information display method and terminal device
WO2020057257A1 (fr) Application interface switching method and mobile terminal
WO2021083091A1 (fr) Screenshot method and terminal device
WO2020192282A1 (fr) Notification message display method and terminal device
CN111064848B (zh) Picture display method and electronic device
CN109614061A Display method and terminal
WO2020220893A1 (fr) Screenshot method and mobile terminal
WO2020192297A1 (fr) Screen interface switching method and terminal device
WO2021057290A1 (fr) Information control method and electronic device
WO2020215982A1 (fr) Desktop icon management method and terminal device
WO2020192322A1 (fr) Display method and terminal device
WO2020135175A1 (fr) Information reminding method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19898906

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19898906

Country of ref document: EP

Kind code of ref document: A1