WO2020181955A1 - Interface control method and terminal device - Google Patents


Publication number: WO2020181955A1
Authority: WIPO (PCT)
Prior art keywords: input, terminal device, target, interface, control
Application number: PCT/CN2020/075378
Other languages: English (en), Chinese (zh)
Inventor: 张堡霖
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Publication of WO2020181955A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481: Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser

Definitions

  • Terminal devices are increasingly widely used, and their screens are getting larger. When the display interface is shrunk for one-handed use, the effective size of the interface and of the icons in it is reduced, which lowers the sensitivity with which the terminal device responds to the user's operations in the display interface and results in poor one-handed operation performance.
  • The embodiments of the present disclosure provide an interface control method and a terminal device to solve the problem of poor one-handed operation performance of terminal devices.
  • In a first aspect, an embodiment of the present disclosure provides an interface control method. The method includes: receiving a user's first input on a target control; and, in response to the first input, executing a target action. The target control is displayed in the user's one-handed operation area on the display screen of the terminal device, and the target action includes either of the following: controlling a target object in the first interface currently displayed on the display screen, or adjusting an output parameter of the terminal device, the output parameter being a parameter used when the terminal device outputs the target object through the first interface.
  • In a second aspect, an embodiment of the present disclosure provides a terminal device that includes a receiving module and an execution module. The receiving module is configured to receive the user's first input on the target control, and the execution module is configured to execute the target action in response to the first input received by the receiving module. The target control is displayed in the user's one-handed operation area on the display screen of the terminal device, and the target action includes either of the following: controlling the target object in the first interface currently displayed on the display screen, or adjusting an output parameter of the terminal device, the output parameter being a parameter used when the terminal device outputs the target object through the first interface.
  • In a third aspect, an embodiment of the present disclosure provides a terminal device that includes a processor, a memory, and a computer program stored in the memory and runnable on the processor. When the computer program is executed by the processor, the steps of the interface control method of the first aspect are implemented.
  • In a fourth aspect, an embodiment of the present disclosure provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the steps of the interface control method of the first aspect are implemented.
  • In the embodiments of the present disclosure, the user's first input on the target control (displayed in the user's one-handed operation area on the display screen of the terminal device) can be received, and in response to the first input, the target action is executed (either controlling the target object in the first interface currently displayed on the display screen, or adjusting an output parameter of the terminal device, the output parameter being a parameter used when the terminal device outputs the target object through the first interface). Because the target control is displayed in the user's one-handed operation area of the display screen, the user can, by an input on the target control, trigger the terminal device to control the target object in the first interface or to adjust the parameters it uses when outputting the target object through the first interface. The embodiments of the present disclosure therefore meet the user's need for one-handed operation without reducing the effective size of the currently displayed interface or of the icons in it, and the one-handed operation performance of the terminal device can be improved.
  • FIG. 1 is a schematic diagram of the architecture of an Android operating system provided by an embodiment of the disclosure;
  • FIG. 2 is a schematic diagram of the interface control method provided by an embodiment of the disclosure;
  • FIG. 3 is the first of the schematic diagrams of interfaces to which the interface control method provided by an embodiment of the disclosure is applied;
  • FIG. 5 is the second of the schematic diagrams of interfaces to which the interface control method is applied;
  • FIG. 11 is the sixth of the schematic diagrams of interfaces to which the interface control method is applied;
  • FIG. 15 is the first of the schematic structural diagrams of a terminal device provided by an embodiment of the disclosure;
  • FIG. 16 is the second of the schematic structural diagrams of a terminal device;
  • FIG. 17 is the third of the schematic structural diagrams of a terminal device.
  • The terms “first” and “second” in the specification and claims of the present disclosure are used to distinguish different objects rather than to describe a specific order of objects. For example, the first input and the second input are used to distinguish different inputs rather than to describe a specific order of inputs.
  • Words such as “exemplary” or “for example” are used to present examples, illustrations, or explanations. Any embodiment or design described as “exemplary” or “for example” in the embodiments of the present disclosure should not be construed as preferable to or more advantageous than other embodiments or designs; rather, such words are used to present related concepts in a concrete manner.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present disclosure.
  • the architecture of the Android operating system includes 4 layers, namely: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • The kernel layer is the operating system layer of the Android operating system and is the lowest layer in the Android software stack.
  • the kernel layer is based on the Linux kernel to provide core system services and hardware-related drivers for the Android operating system.
  • Developers can, based on the system architecture of the Android operating system shown in FIG. 1, develop software programs that implement the interface control method provided by the embodiments of the present disclosure, so that the interface control method can run on the Android operating system shown in FIG. 1. That is, the processor of the terminal device can implement the interface control method by running such a software program in the Android operating system.
  • The execution subject of the interface control method provided by the embodiments of the present disclosure may be the above-mentioned terminal device, or a functional module and/or functional entity in the terminal device capable of implementing the method; this can be determined according to actual usage requirements and is not limited by the embodiments of the present disclosure. The following takes a terminal device as an example to illustrate the interface control method provided by the embodiments of the present disclosure.
  • In the embodiments of the present disclosure, the terminal device can display a control used to operate the interface currently displayed on its display screen. By operating this control, the user can control the terminal device and thereby realize one-handed operation of the device.
  • An embodiment of the present disclosure provides an interface control method, which may include the following S201 and S202.
  • S201: The terminal device receives a user's first input on the target control.
  • the aforementioned target control may be displayed in the user's one-handed operation area on the display screen of the terminal device.
  • The target control can be used to control objects in the interface currently displayed on the display screen of the terminal device (hereinafter referred to as the first interface), or to adjust the output parameters with which the terminal device outputs the target object through the first interface.
  • After the terminal device enters the one-handed operation mode, it can display the above-mentioned target control, and the user can drag the target control into the user's one-handed operation area on the display screen. In this way, when the user holds the terminal device with one hand, the user can control it through an input on the target control (that is, the above-mentioned first input).
  • The display shape of the above-mentioned target control may be a fan shape, a semicircle, or a quadrilateral (for example, a rectangle (specifically, a rounded rectangle, etc.) or a square), that is, whichever shape gives the finger of the user's hand holding the terminal device the largest reachable range. It can be determined according to actual usage requirements and is not limited by the embodiment of the present disclosure.
  • FIG. 3 is a schematic diagram of the display effect of the foregoing target control provided by an embodiment of the present disclosure. In FIG. 3, (a) shows the target control displayed with a fan-shaped effect, (b) shows it displayed with a semicircular effect, and (c) shows it displayed with a rounded-rectangle effect.
  • The display shape of the target control may be a shape preset in the terminal device, set either by the manufacturer of the terminal device or by the user through a setting interface provided by the manufacturer; it can be determined according to actual usage requirements and is not limited by the embodiment of the present disclosure. Likewise, the area of the target control may be a value preset in the terminal device, set either by the manufacturer or by the user through a setting interface provided by the manufacturer.
  • The pressure threshold may be a value preset in the terminal device by the manufacturer according to the performance of the device; it can be determined according to actual usage requirements and is not limited by the embodiment of the present disclosure.
  • The aforementioned long-press input may be an input in which the user presses on the screen of the terminal device for a duration greater than or equal to a time threshold. Like the pressure threshold, the time threshold may be preset in the terminal device by the manufacturer according to the performance of the device.
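The threshold-based classification described above can be sketched in code. The following is an illustrative sketch only: the function name, the threshold values, and the pressure units are hypothetical assumptions for illustration, not taken from the disclosure.

```python
# Hypothetical thresholds; in practice these would be preset by the
# manufacturer according to the performance of the device.
PRESSURE_THRESHOLD = 0.6   # assumed normalized pressure in [0, 1]
TIME_THRESHOLD_S = 0.5     # assumed seconds

def classify_input(pressure: float, duration_s: float) -> str:
    """Classify a touch as a re-press, a long-press, or a click."""
    if pressure >= PRESSURE_THRESHOLD:
        return "press"       # pressure at or above the pressure threshold
    if duration_s >= TIME_THRESHOLD_S:
        return "long-press"  # held at least the time threshold
    return "click"
```

A light touch held for 0.7 s would classify as a long-press, while a hard, brief touch would classify as a press.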
  • S202: The terminal device executes a target action in response to the first input.
  • The above-mentioned target action may include either of the following: controlling the target object in the first interface currently displayed on the display screen of the terminal device, or adjusting an output parameter of the terminal device, the output parameter being the parameter used when the terminal device outputs the target object through the first interface.
  • In the embodiment of the present disclosure, after receiving the first input, the terminal device may, in response, control the target object in the first interface currently displayed on its display screen, or adjust the parameters used when it outputs the target object through the first interface (that is, execute the above-mentioned target action). This avoids the problem in the related art that the one-handed operation performance of the terminal device suffers because the interface and icons are shrunk after entering the one-handed operation mode.
  • The first interface currently displayed on the display screen of the terminal device may be a streaming media playback interface or a non-streaming media playback interface.
  • The non-streaming media playback interface may include the main interface of the terminal device, the file management interface of the terminal device, and any interface in which the terminal device displays content such as files, text, pictures, or links.
  • Depending on the first interface, the target action differs, and so does the target object. The combinations of the first interface, the target action, and the target object can be divided into the following two cases, Case 1 and Case 2, which are illustrated below.
  • Case 1: the first interface currently displayed on the display screen of the terminal device is any interface other than a streaming media playback interface (that is, the aforementioned non-streaming media playback interface), the target object is content in the first interface, and the target action includes controlling the target object in the first interface currently displayed on the display screen of the terminal device.
  • In this case, the terminal device may determine the target object in the first interface according to the input parameters of the first input, and after the target object is determined, perform the target action on it.
  • Specifically, the interface control method provided by the embodiment of the present disclosure may further include the following S203, and the above S202 may be implemented by the following S202a.
  • S203: The terminal device determines the target object in the first interface according to the input parameters of the first input.
  • S202a: The terminal device executes the target action on the target object.
  • It should be noted that both the step of determining the target object in the first interface according to the input parameters of the first input in S203 and the step S202a are performed in response to the first input; for clarity, only “in response to the first input” is illustrated at S203 in FIG. 4.
  • After the terminal device receives the first input, in response to the first input it can determine the target object from the content displayed in the first interface according to the input parameters of the first input; after the target object is determined, the terminal device can execute the target action on it (that is, control the target object in the first interface currently displayed on the display screen of the terminal device).
  • The content displayed in the first interface may include icons (for example, application icons, control icons, etc.), files, text, pictures, and links; it can be determined according to actual usage requirements and is not limited by the embodiment of the present disclosure.
  • Correspondingly, the target object may be an icon (for example, an application icon or a control icon), a file, text, a picture, or a link in the first interface. The specific content of the target action can likewise be determined according to actual usage requirements and is not limited by the embodiment of the present disclosure.
  • Displaying the interface corresponding to the target object may mean that the terminal device updates the first interface currently displayed on its display screen to the interface corresponding to the target object. For example, if the target object is the application icon “photo”, the terminal device can update the first interface currently displayed on the display screen to the interface corresponding to the application icon “photo”.
  • The start position of the first input may be the start position of the user's sliding input; the end position of the first input may be the end position of the sliding input; and the input track of the first input may be the track of the sliding input.
  • The terminal device may store the proportional relationship between the target control and the first interface (for example, the coordinate proportional relationship between them). After the terminal device receives the user's sliding input in the target control (that is, the first input), it can determine the target object in the first interface according to the input parameters of the first input and this proportional relationship. The proportional relationship may be a set of values or a mapping preset in the terminal device by the manufacturer according to the performance of the device; it can be determined according to actual usage requirements and is not limited by the embodiment of the present disclosure.
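The coordinate proportional relationship described above might be applied as in the following sketch (the function names, sizes, and object layout are hypothetical assumptions for illustration, not part of the disclosure): a point in the target control is scaled up to a point in the first interface, and the object whose bounds contain the scaled point is taken as the target object.

```python
def map_to_interface(x, y, control_size, interface_size):
    """Scale control-local coordinates to first-interface coordinates
    using the preset coordinate proportional relationship."""
    sx = interface_size[0] / control_size[0]
    sy = interface_size[1] / control_size[1]
    return (x * sx, y * sy)

def find_target_object(x, y, control_size, interface_size, objects):
    """objects: list of (name, left, top, right, bottom) in interface
    coordinates; return the object containing the scaled point, if any."""
    ix, iy = map_to_interface(x, y, control_size, interface_size)
    for name, left, top, right, bottom in objects:
        if left <= ix <= right and top <= iy <= bottom:
            return name
    return None
```

For example, with a 200x300 target control mapped onto a 1080x1920 interface, an input ending at (100, 150) in the control scales to roughly the center of the screen, and the object whose bounds contain that point is selected.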
  • Optionally, the terminal device may also display a pointer control in the first interface, which can be used to indicate to the user the object in the first interface corresponding to the input parameters of the first input. When the user, through the first input, triggers the terminal device to move the pointer control so that it points to an object in the first interface, the terminal device can determine that this object is the target object.
  • Optionally, the target control may also include a trigger button. After the terminal device determines the target object, the user can, by an input on the trigger button, trigger the terminal device to perform the above-mentioned target action on the target object.
  • The user can also trigger the terminal device to perform the above-mentioned target action on the target object through a single-click input, double-click input, long-press input, or re-press input in the target control; this can be determined according to actual usage requirements and is not limited by the embodiment of the present disclosure.
  • The above-mentioned first input may include two sub-inputs, a first sub-input and a second sub-input: the first sub-input may be used to trigger the terminal device to determine the above-mentioned target object, and the second sub-input may be used to trigger the terminal device to perform the above-mentioned target action.
  • the first sub-input may be a user's sliding input in the blank area of the target control.
  • When the input object of the second sub-input is the trigger button, the second sub-input may be the user's single-click input, double-click input, long-press input, re-press input, or any other possible input on the trigger button. When the input object of the second sub-input is the position where the first sub-input ends, the second sub-input may be the user's single-click input, double-click input, long-press input, re-press input, or any other possible input at that position; this can be determined according to actual usage requirements and is not limited by the embodiment of the present disclosure.
  • For example, when the terminal device displays the pointer control in the first interface and the user slides in the target control, the terminal device can move the pointer control in the first interface following the track of the user's slide in the target control (that is, the parameters of the first sub-input) until the pointer control reaches the object the user wants to select. The terminal device can then determine that the object pointed to by the pointer control is the above-mentioned target object, after which the user can trigger the terminal device to perform the target action on the target object by an input on the trigger button (that is, the second sub-input).
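As an illustrative sketch of the two sub-inputs (the class and method names are hypothetical, not from the disclosure), the pointer control follows the slide track of the first sub-input, and the trigger-button click of the second sub-input confirms the object currently under the pointer:

```python
class Pointer:
    """Hypothetical pointer control displayed in the first interface."""

    def __init__(self, scale_x, scale_y):
        # Slide deltas in the target control are scaled into
        # pointer movement in the (larger) first interface.
        self.x = 0.0
        self.y = 0.0
        self.scale = (scale_x, scale_y)

    def follow(self, track):
        """First sub-input: successive (dx, dy) slide deltas in the
        target control move the pointer across the first interface."""
        for dx, dy in track:
            self.x += dx * self.scale[0]
            self.y += dy * self.scale[1]

    def confirm(self, objects):
        """Second sub-input (trigger-button click): return the object
        currently under the pointer, which becomes the target object."""
        for name, left, top, right, bottom in objects:
            if left <= self.x <= right and top <= self.y <= bottom:
                return name
        return None
```

Two short slide deltas of (10, 10) with a 5x/6x scale move the pointer to (100, 120); a trigger-button click then selects whichever object's bounds contain that point.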
  • As shown in FIG. 5, the first interface is the main interface of the terminal device, the terminal device also displays the pointer control in the first interface, and the target control includes the trigger button described above. The first input includes the first sub-input and the second sub-input, where the first sub-input is the user's sliding input in the blank area of the target control and the second sub-input is the user's click input on the trigger button; the target object is the application icon “photo”, and the target action is to display the interface corresponding to the target object.
  • The schematic diagram in FIG. 5 takes the first interface being the main interface of the terminal device as an example. When the first interface is the file management interface of the terminal device, or an interface in which the terminal device displays files, text, pictures, links, or other content, the control method is similar to that shown in FIG. 5 and is not repeated here.
  • In the embodiment of the present disclosure, when the first interface currently displayed on the display screen of the terminal device is any interface other than a streaming media playback interface, the above-mentioned target object is content in the first interface, and the above-mentioned target action includes controlling that target object. The terminal device may first determine the target object in the first interface according to the input parameters of the first input and then perform the target action on it, so that the accuracy with which the terminal device responds to the user's input in one-handed operation mode can be improved, thereby improving the one-handed operation performance of the terminal device.
  • Case 2: the first interface is a streaming media playback interface, the target object is the streaming media played through the first interface, and the target action includes adjusting the output parameters used when the terminal device outputs the target object through the first interface.
  • In this case, the terminal device may determine a value (called the target value in the embodiment of the present disclosure) according to the input parameters of the first input; after the value is determined, the terminal device adjusts the value of the output parameter with which it outputs the target object to this value.
  • Specifically, the interface control method provided by the embodiment of the present disclosure may further include the following S204, and the above S202 may be implemented by the following S202b.
  • S204: The terminal device determines the target value according to the input parameters of the first input.
  • S202b: The terminal device adjusts, to the target value, the value of the output parameter with which it outputs the target object through the first interface.
  • The input parameters of the first input include at least one of the following: the start position of the first input, the end position of the first input, and the input track of the first input.
  • It should be noted that both the step of determining the target value according to the input parameters of the first input in S204 and the step S202b are performed in response to the first input; for clarity, only “in response to the first input” is illustrated at S204 in FIG. 6.
  • The terminal device may determine a value (that is, the target value) according to the input parameters of the first input; after the target value is determined, the terminal device can adjust the value of the output parameter of the target object to the target value.
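S204 and S202b might be sketched as follows (function and parameter names are hypothetical assumptions for illustration, not from the disclosure): the end position of the slide, one of the input parameters of the first input, is mapped to a target value within the output parameter's range, and the parameter is then set to that value.

```python
def determine_target_value(end_x, control_width, param_min, param_max):
    """S204 sketch: map the slide's end position (a first-input
    parameter) to a target value within [param_min, param_max]."""
    fraction = min(max(end_x / control_width, 0.0), 1.0)
    return param_min + fraction * (param_max - param_min)

def adjust_output_parameter(params, name, end_x, control_width):
    """S202b sketch: set the named output parameter (e.g. volume,
    brightness, playback progress) to the target value.
    params maps a name to (min, max, current_value)."""
    lo, hi, _ = params[name]
    value = determine_target_value(end_x, control_width, lo, hi)
    params[name] = (lo, hi, value)
    return value
```

A slide ending halfway across a 300-unit-wide control, applied to a 0-100 volume range, would set the volume to 50.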
  • The output parameters with which the terminal device outputs the target object may include any of the following: the brightness at which the terminal device displays the target object, the volume at which it plays the target object, and the progress of its playback of the target object.
  • The above-mentioned target control may include at least one sub-control, where each sub-control can be used to adjust one output parameter, and the above-mentioned first input may be the user's input on one of the at least one sub-control.
  • The aforementioned streaming media playback control may be one control or two controls. When it is one control, it can be used to control both the progress of streaming media playback and its start and stop. When it is two controls, they are a playback progress control and a playback start-stop control, where the playback progress control is used to control the progress of streaming media playback and the playback start-stop control is used to control its start and stop.
  • Taking the first interface (a streaming media playback interface) displayed on the vertical screen of the terminal device as an example, suppose the target control includes four sub-controls: a playback progress control, a playback start-stop control, a brightness control, and a volume control. When the target control shows the playback progress control 71 and the playback start-stop control 72 as in (a) of FIG. 7, the user can slide in the blank area of the target control, and in response the terminal device can update the sub-controls in the target control: it can replace the playback progress control 71 and the playback start-stop control 72 with the brightness control 73, the volume control 74, and the playback progress control 71.
  • Similarly, taking the first interface (a streaming media playback interface) displayed on the horizontal screen of the terminal device as an example, suppose the target control includes the same four sub-controls. When the target control shows the playback progress control 71 and the playback start-stop control 72 as in (a) of FIG. 8, the user can slide in the blank area of the target control, and in response the terminal device can update the sub-controls in the target control: it can replace the playback progress control 71 and the playback start-stop control 72 with the brightness control 73, the volume control 74, and the playback progress control 71.
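The page-switching behavior of the sub-controls described for FIGS. 7 and 8 can be sketched as follows (the class and the page layouts are hypothetical illustrations based on the figure description above, not code from the disclosure):

```python
# Two hypothetical pages of sub-controls; the playback progress control
# appears on both pages, as in the figures described above.
PAGES = [
    ["playback_progress", "playback_start_stop"],   # page shown in (a)
    ["brightness", "volume", "playback_progress"],  # page shown after the slide
]

class TargetControl:
    """Hypothetical target control that swaps its sub-controls when the
    user slides in its blank area."""

    def __init__(self):
        self.page = 0

    def slide_blank_area(self):
        """A slide in the blank area toggles to the other page and
        returns the sub-controls now displayed."""
        self.page = 1 - self.page
        return PAGES[self.page]

    def sub_controls(self):
        return PAGES[self.page]
```

Starting from the progress/start-stop page, one slide in the blank area swaps in the brightness, volume, and progress controls; a second slide swaps back.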
  • FIGS. 7 and 8 above are illustrated with the playback progress control 71 displayed on both pages of the target control as an example; the playback progress control 71 may also be displayed on only one page of the target control.
  • In the embodiment of the present disclosure, when the first interface currently displayed on the display screen of the terminal device is a streaming media playback interface, the above-mentioned target object is the streaming media played in the first interface, and the above-mentioned target action includes adjusting the output parameters with which the terminal device outputs the target object. The terminal device may first determine the target value according to the input parameters of the first input and then adjust the value of the output parameter of the target object to that value, which can improve the accuracy with which the terminal device responds to the user's input in one-handed operation mode, thereby improving the one-handed operation performance of the terminal device.
  • Optionally, before the terminal device receives the user's first input on the above-mentioned target control, the user can trigger the terminal device to enter the one-handed operation mode through an input (for example, the second input in the embodiment of the present disclosure), and the terminal device may then display the target control in a certain area of its display screen (for example, the second area in the embodiment of the present disclosure).
  • the current display interface (i.e., the above-mentioned first interface)
  • the target control may cover these objects.
  • the terminal device can display these objects in the target control so that the user can operate on these objects.
  • the target control displays the objects included in the area of the first interface (for example, the first area in the embodiment of the present disclosure) that is blocked by the target control.
  • the interface control method provided by the embodiment of the present disclosure may further include the following S205 and S206.
  • The above-mentioned M objects may be the objects included in the first area; the first area may be the area of the first interface corresponding to the second area, and the second area may be the area where the target control is displayed on the display screen of the terminal device.
  • M is a positive integer.
  • the first area may be an area in the first interface that is blocked by the target control.
  • the user can trigger the terminal device to enter the one-handed operation mode through the above-mentioned second input, thereby triggering the terminal device to display the above-mentioned target control on the display screen of the terminal device and display the above-mentioned M objects in the target control.
  • In this way, the user can not only trigger the terminal device to perform the aforementioned target action through an input to the target control (i.e., the aforementioned first input), but also make the M objects visible so that the user can conveniently operate these objects.
  • The above-mentioned second input may be any possible input such as a double-click input, a triple-click input, or a long-press input performed by the user in a blank area of the interface currently displayed on the display screen of the terminal device.
  • The second input may also be any possible input such as a double-click input, a long-press input, or a double-press input on a specific key of the terminal device. Specifically, it can be determined according to actual usage requirements, and the embodiment of the present disclosure does not limit it.
  • the aforementioned specific button may be an artificial intelligence button, or any possible button in a terminal device such as a virtual navigation button. Specifically, it can be determined according to actual usage requirements, and the embodiment of the present disclosure does not limit it.
  • the above-mentioned target control may be displayed floating on the display screen of the terminal device, or may be displayed in the above-mentioned first interface. Specifically, it can be determined according to actual usage requirements, and the embodiment of the present disclosure does not limit it.
  • When the above-mentioned target control is displayed floating on the display screen of the terminal device, that is, when the target control is a draggable control, the above-mentioned second area may be the area on the display screen of the terminal device where the target control is displayed, and the first area may be the area of the first interface that overlaps the second area.
  • When the target control is displayed in the first interface, that is, when the target control is a non-draggable control, the first area and the second area may be the same area, namely the area in the first interface where the target control is displayed.
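The correspondence between the first area and the second area described above can be sketched as a rectangle intersection. The (left, top, right, bottom) coordinate convention and the function name are assumptions for illustration:

```python
def first_area_for(control_rect, interface_rect):
    """For a floating (draggable) target control, the first area is the part
    of the first interface overlapped by the control; rectangles are
    (left, top, right, bottom) in screen coordinates. Returns None when the
    control does not overlap the interface at all."""
    left = max(control_rect[0], interface_rect[0])
    top = max(control_rect[1], interface_rect[1])
    right = min(control_rect[2], interface_rect[2])
    bottom = min(control_rect[3], interface_rect[3])
    if right <= left or bottom <= top:
        return None  # no overlap, so the control blocks no objects
    return (left, top, right, bottom)
```

For a non-draggable control the two areas coincide, so the same computation simply returns the control's own rectangle.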
  • Suppose the terminal device displays the above-mentioned target control in the lower left corner of its display screen in response to the user's second input, while the user is holding the terminal device with the right hand. Then, as shown in Figure 10, the user can drag the target control to the right, and the terminal device, in response to the input, can display the target control in the lower right corner of the display screen. The right arrow in (a) of Figure 10 indicates that the user drags the target control to the right.
  • After the terminal device displays the target control on the display screen of the terminal device in response to the second input, if the terminal device does not receive any user operation on the target control within a target duration, the terminal device can hide the target control.
  • the foregoing target duration may be a value preset in the terminal device.
  • The target duration can be preset in the terminal device by the manufacturer of the terminal device, or the manufacturer can provide a setting interface through which the user presets it in the terminal device. Specifically, it can be determined according to actual usage requirements, and the embodiment of the present disclosure does not limit it.
  • the foregoing target duration may be any possible value such as 30 seconds, 1 minute, or 2 minutes.
  • After the target control is hidden, the user can trigger the terminal device to display the target control again through a click input or a double-click input on a blank area of the display screen of the terminal device.
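The hide-on-inactivity behaviour described above can be sketched as follows. The class shape and the injectable clock are assumptions for illustration; the disclosure only specifies the target duration and the hide/redisplay behaviour:

```python
import time

class TargetControl:
    """Hide the target control when no user operation arrives within the
    preset target duration (e.g. 30 seconds); a later input shows it again."""

    def __init__(self, target_duration=30.0, clock=time.monotonic):
        self.target_duration = target_duration
        self.clock = clock  # injectable for testing
        self.visible = False
        self._last_operation = 0.0

    def show(self):
        self.visible = True
        self._last_operation = self.clock()

    def on_user_operation(self):
        # any operation on the visible control resets the inactivity timer
        if self.visible:
            self._last_operation = self.clock()

    def tick(self):
        """Called periodically; hides the control once the duration elapses."""
        if self.visible and self.clock() - self._last_operation >= self.target_duration:
            self.visible = False
```

With a 30-second target duration, a control last touched at t = 20 s is still visible at t = 45 s and hidden at t = 50 s.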
  • After hiding the target control, the terminal device may cancel the display of the M objects in the first interface currently displayed by the terminal device, or continue to display the M objects in the first interface currently displayed by the terminal device.
  • The above-mentioned target control may include two target areas, namely a first target area and a second target area.
  • the first target area may include the foregoing M objects
  • the second target area may include the foregoing trigger button and a blank area operated by the user.
  • The area of the first target area and the area of the second target area may be determined according to the area of the target control. Specifically, the ratio of the area of the first target area to the area of the second target area can be preset in the terminal device; once the area of the target control is determined, the area of the first target area and the area of the second target area can also be determined.
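The relationship described above can be sketched with simple arithmetic. The 3:1 ratio below is a hypothetical example; the disclosure only states that some ratio is preset:

```python
def split_target_control_area(total_area, ratio=(3, 1)):
    """Given the target control's total area and the preset ratio of the
    first target area to the second target area, derive both areas."""
    first_part, second_part = ratio
    first_area = total_area * first_part / (first_part + second_part)
    return first_area, total_area - first_area
```

With a total area of 100 units and a 3:1 ratio, the first target area gets 75 units and the second target area 25.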
  • the display shape of the above-mentioned target control is a fan shape
  • the current display interface of the display screen of the terminal device is the main interface of the terminal device
  • the above-mentioned second input is the user's long-press input in a blank area of the interface currently displayed on the display screen of the terminal device, and the above-mentioned first area S1 includes four objects, namely, as shown in (a) of Figure 11, the application icon "Browser", the application icon "SMS", the application icon "Communication", and the application icon "Music".
  • When the user long-presses the main interface of the terminal device, in response to the input, as shown in (b) of Figure 11, the terminal device displays the above-mentioned target control in the area of its display screen corresponding to S1 (i.e., the first area), that is, in the above-mentioned second area.
  • Moreover, the terminal device displays these four objects (i.e., the application icon "Browser", the application icon "SMS", the application icon "Communication", and the application icon "Music") in the first target area S11 of the target control, and displays the above-mentioned trigger button in the second target area S12 of the target control.
  • the first target area and the second target area may be two connected areas.
  • The dotted line L1 in Figure 11 is used only to clearly indicate the first target area S11 and the second target area S12; in a specific implementation, the target control may not include the dotted line L1.
  • the display shape of the target control is semicircular
  • the current display interface of the display screen of the terminal device is the main interface of the terminal device
  • the second input is the user's long-press input in a blank area of the interface currently displayed on the display screen of the terminal device, and the above-mentioned first area S1 includes five objects, namely, as shown in (a) of Figure 12, the application icon "Contacts", the application icon "Browser", the application icon "SMS", the application icon "Communication", and the application icon "Payment".
  • the terminal device displays the above-mentioned target control in the area of its display screen corresponding to S1 (i.e., the first area), that is, in the above-mentioned second area.
  • Moreover, the terminal device displays these five objects (i.e., the application icon "Contacts", the application icon "Browser", the application icon "SMS", the application icon "Communication", and the application icon "Payment") in the first target area S11 of the target control, and displays the above-mentioned trigger button in the second target area S12 of the target control.
  • the first target area and the second target area may be two connected areas.
  • The dotted line L2 in Figure 12 is used only to clearly indicate the first target area S11 and the second target area S12; in a specific implementation, the target control may not include the dotted line L2.
  • the display shape of the above-mentioned target control is a rounded rectangle
  • the current display interface of the display screen of the terminal device is the main interface of the terminal device
  • the above-mentioned second input is the user's long-press input in a blank area of the interface currently displayed on the display screen of the terminal device, and the above-mentioned first area S1 includes four objects, namely, as shown in (a) of Figure 13, the application icon "Browser", the application icon "SMS", the application icon "Communication", and the application icon "Music".
  • the terminal device displays the above-mentioned target control in the area of its display screen corresponding to S1 (i.e., the first area), that is, in the above-mentioned second area.
  • Moreover, the terminal device displays these four objects (i.e., the application icon "Browser", the application icon "SMS", the application icon "Communication", and the application icon "Music") in the first target area S11 of the target control.
  • the above-mentioned trigger button is displayed in the second target area S12 of the target control.
  • the first target area and the second target area may be two independent areas of the target control.
  • The first target area S11 in (b) of Figure 13 is illustrated only by way of example with the application icon "Browser" and the application icon "SMS" displayed; in a specific implementation, the user can trigger the terminal device to display the application icon "Communication" and the application icon "Music" in the first target area S11 through a sliding input in the first target area S11.
  • When the user holds the terminal device with one hand, the user can trigger the terminal device to display the target control through the above-mentioned second input, so that the user can trigger the terminal device to enter the one-handed operation mode with a single input, which can improve the convenience of the terminal device entering the one-handed operation mode.
  • the first area corresponds to the second area
  • the target control may cover the M objects included in the first area.
  • the terminal device can display the M objects in the target control, so that the user can operate these objects in the target control, thereby improving the convenience of one-handed operation of the terminal device.
  • The terminal device can first display part of the M objects in the target control (for example, the N first objects in the embodiment of the present disclosure), and the user can then, through an input to the target control, trigger the terminal device to display another part of the M objects in the target control (for example, the K second objects in the embodiment of the present disclosure). In this way, the user can operate these M objects in the target control.
  • the interface display method provided by the embodiment of the present disclosure may further include the following S207 and S208.
  • S206 can be specifically implemented by the following S206a.
  • In response to the second input, the terminal device displays the target control on the display screen of the terminal device, and when the number of objects allowed to be visible to the user in the target control is less than M, displays N first objects in the target control.
  • the terminal device receives a third input from the user to the target control.
  • the terminal device updates the N first objects in the target control to K second objects.
  • the N first objects and the K second objects may be objects in the M objects, and both N and K are positive integers less than M.
  • the terminal device may first display the above N first objects in the target control, and then the user can trigger the terminal device to update the N first objects to the above K second objects through the third input to the target control.
  • the aforementioned third input may be a user's sliding input in the aforementioned target control.
  • the third input may be any possible sliding input of the user, such as a leftward slide, a rightward slide, a clockwise slide, or a counterclockwise slide on the target control.
  • it can be determined according to actual usage requirements, and the embodiment of the present disclosure does not limit it.
  • The objects in the above N first objects and the objects in the K second objects may be completely different or partially the same. Specifically, it can be determined according to actual usage requirements, and the embodiment of the present disclosure does not limit it.
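The update from N first objects to K second objects can be sketched as paging through the M objects. The function names, the page model, and the slide directions are assumptions for illustration; the icon names echo the Figure 12 example:

```python
def visible_objects(objects, page, capacity):
    """Return the objects shown in the target control for a given page when
    the control can show fewer than M objects at once."""
    start = page * capacity
    return objects[start:start + capacity]

def apply_third_input(objects, page, capacity, direction="left"):
    """A third input (slide) replaces the N visible objects with the next or
    previous K objects; N and K may differ when M is not a multiple of the
    capacity, matching the note that the two sets may only partially overlap
    in size."""
    pages = -(-len(objects) // capacity)  # ceiling division
    step = 1 if direction == "left" else -1
    page = (page + step) % pages
    return page, visible_objects(objects, page, capacity)
```

With M = 5 icons and a capacity of 2, the first page shows two objects and the last page shows one, and successive slides cycle through all M objects.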
  • The execution order of S201-S202 and S207-S208 is not limited in the embodiments of the present disclosure. That is, S201-S202 may be executed first and then S207-S208, or S207-S208 may be executed first and then S201-S202.
  • FIG. 14 illustrates, as an example, the case where S207-S208 are performed first and then S201-S202.
  • an embodiment of the present disclosure provides a terminal device 1500.
  • the terminal device 1500 includes a receiving module 1501 and an executing module 1502.
  • the receiving module 1501 is used to receive the user's first input on the target control;
  • the execution module 1502 is used to perform the target action in response to the first input received by the receiving module 1501.
  • The target control is displayed in the user's one-handed operation area on the display screen of the terminal device, and the target action includes any one of the following: controlling the target object in the first interface currently displayed on the display screen, or adjusting an output parameter of the terminal device, where the output parameter is a parameter used when the terminal device outputs the target object through the first interface.
  • the first interface currently displayed on the display screen of the terminal device is any interface other than the streaming media playback interface
  • the target object is the content in the first interface
  • the target action includes: controlling the display of the target object in the first interface currently displayed on the display screen of the terminal device. In conjunction with FIG. 15, as shown in FIG. 16, the terminal device 1500 further includes a determining module 1503. The determining module 1503 is configured to, before the execution module 1502 performs the target action, determine the target object in the first interface according to the input parameter of the first input received by the receiving module 1501, where the input parameter of the first input includes at least one of the following: the start position of the first input, the end position of the first input, and the input trajectory of the first input.
  • controlling the target object in the first interface by the terminal device includes any of the following: moving the target object, deleting the target object, displaying an interface corresponding to the target object, copying the target object, and sharing the target object.
  • the output parameter of the terminal device outputting the target object includes any one of the following: the brightness of the terminal device displaying the target object, the volume of the terminal device playing the target object, and the progress of the terminal device playing the target object.
  • the aforementioned target control includes at least one sub-control, each sub-control is used to adjust an output parameter, and the first input is an input to one of the at least one sub-control.
  • the terminal device 1500 further includes a display module 1504.
  • The receiving module 1501 is further configured to receive the user's second input before receiving the user's first input on the target control; the display module 1504 is configured to, in response to the second input received by the receiving module 1501, display the target control on the display screen and display M objects in the target control.
  • the M objects are objects included in the first area
  • the first area is the area corresponding to the second area in the first interface
  • the second area is the area where the target control is displayed on the display screen
  • M is a positive integer.
  • M is an integer greater than or equal to 2.
  • The display module 1504 is specifically configured to display N first objects in the target control when the number of objects allowed to be visible to the user in the target control is less than M; the receiving module 1501 is further configured to receive the user's third input on the target control; the display module 1504 is further configured to update the N first objects in the target control to K second objects in response to the third input received by the receiving module 1501.
  • the N first objects and the K second objects are objects in M objects, and both N and K are positive integers less than M.
  • the terminal device provided by the embodiment of the present disclosure can implement the various processes performed by the terminal device in the above-mentioned interface control method embodiment, and can achieve the same technical effect. To avoid repetition, details are not described here.
  • The embodiment of the present disclosure provides a terminal device. Since a target control is displayed in the user's one-handed operation area on the display screen of the terminal device, the user can, through an input on the target control, trigger the terminal device to control the target object in the first interface currently displayed on the display screen, or trigger the terminal device to adjust its output parameter when outputting the target object through the first interface. In this way, when the user holds the terminal device with one hand, the user can control the terminal device through inputs on the target control. That is, the embodiment of the present disclosure can meet the user's need for one-handed operation of the terminal device without reducing the effective size of the interface currently displayed on the display screen of the terminal device or of the icons in that interface, thereby improving the one-handed operation performance of the terminal device.
  • The radio frequency unit 101 can be used for receiving and sending signals in the process of sending and receiving information or during a call. Specifically, downlink data from the base station is received and delivered to the processor 110 for processing; in addition, uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into audio signals and output them as sounds. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042.
  • The graphics processing unit 1041 processes image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frames processed by the graphics processor 1041 can be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • In the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 and output.
  • The accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the terminal device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration) and for vibration-recognition-related functions (such as a pedometer or tapping). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
  • the interface unit 108 is an interface for connecting an external device with the terminal device 100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the terminal device 100 or can be used to connect to the terminal device 100 and external Transfer data between devices.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area.
  • The program storage area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the data storage area may store data (such as audio data and a phone book) created according to the use of the mobile phone.
  • the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
  • The processor 110 is the control center of the terminal device. It uses various interfaces and lines to connect the various parts of the entire terminal device, runs or executes the software programs and/or modules stored in the memory 109, and calls data stored in the memory 109 to perform the various functions of the terminal device and process data, so as to monitor the terminal device as a whole.
  • The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 110.
  • the terminal device 100 includes some functional modules not shown, which will not be repeated here.
  • An embodiment of the present disclosure further provides a terminal device, including a processor 110 as shown in FIG. 18, a memory 109, and a computer program stored in the memory 109 and runnable on the processor 110. When the computer program is executed by the processor 110, each process of the foregoing interface control method embodiment is implemented, and the same technical effect can be achieved; to avoid repetition, details are not described herein again.
  • The embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored. When the computer program is executed by a processor, each process of the above-mentioned interface control method embodiment is realized, and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the computer-readable storage medium may include read-only memory (ROM), random access memory (RAM), magnetic disk or optical disk, etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Interface control method and terminal device. The method comprises: receiving a first input performed by a user on a target control (S201), the target control being displayed in a one-handed operation area for the user on a display screen of a terminal device; and performing a target action in response to the first input (S202), the target action comprising any one of the following: controlling a target object in a first interface currently displayed on the display screen; and adjusting an output parameter of the terminal device, the output parameter being a parameter used by the terminal device to output the target object by means of the first interface.
PCT/CN2020/075378 2019-03-14 2020-02-14 Interface control method and terminal device WO2020181955A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910194690.1A CN110069178B (zh) 2019-03-14 2019-03-14 Interface control method and terminal device
CN201910194690.1 2019-03-14

Publications (1)

Publication Number Publication Date
WO2020181955A1 true WO2020181955A1 (fr) 2020-09-17

Family

ID=67366149

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/075378 WO2020181955A1 (fr) 2019-03-14 2020-02-14 Interface control method and terminal device

Country Status (2)

Country Link
CN (1) CN110069178B (fr)
WO (1) WO2020181955A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110069178B (zh) * 2019-03-14 2021-04-02 维沃移动通信有限公司 Interface control method and terminal device
CN110460907B (zh) * 2019-08-16 2021-04-13 维沃移动通信有限公司 A video playback control method and terminal
CN110898424B (zh) * 2019-10-21 2023-10-20 维沃移动通信有限公司 A display control method and electronic device
CN111372140A (zh) * 2020-03-04 2020-07-03 网易(杭州)网络有限公司 Bullet-screen comment adjustment method and apparatus, computer-readable storage medium and electronic device
CN111694494B (zh) * 2020-06-10 2022-04-26 维沃移动通信有限公司 Control method and apparatus
CN112148172B (zh) * 2020-09-29 2022-11-11 维沃移动通信有限公司 Operation control method and apparatus
CN113641275A (zh) * 2021-08-24 2021-11-12 维沃移动通信有限公司 Interface control method and electronic device
CN114594897A (zh) * 2022-03-10 2022-06-07 维沃移动通信有限公司 One-handed control method for a touch screen, control apparatus, electronic device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150074598A1 (en) * 2013-09-09 2015-03-12 Lenovo (Beijing) Limited Information processing methods and electronic devices
CN107621914A (zh) * 2017-08-02 2018-01-23 努比亚技术有限公司 Display method for terminal function control keys, terminal and computer-readable storage medium
CN108733282A (zh) * 2018-04-16 2018-11-02 维沃移动通信有限公司 A page moving method and terminal device
CN108762634A (zh) * 2018-05-15 2018-11-06 维沃移动通信有限公司 A control method and terminal
CN110069178A (zh) * 2019-03-14 2019-07-30 维沃移动通信有限公司 Interface control method and terminal device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102830917A (zh) * 2012-08-02 2012-12-19 上海华勤通讯技术有限公司 Mobile terminal and touch-control establishment method therefor
CN103019568B (zh) * 2012-12-21 2015-09-30 东莞宇龙通信科技有限公司 Terminal and icon display method
CN103019604B (zh) * 2012-12-24 2016-08-03 东莞宇龙通信科技有限公司 Terminal and terminal control method
CN104077067A (zh) * 2013-03-28 2014-10-01 深圳市快播科技有限公司 Playback method and system based on a device having a touch screen
CN104238746B (zh) * 2014-08-25 2019-02-05 联想(北京)有限公司 Information processing method and electronic device
CN104866228B (zh) * 2015-02-17 2022-02-01 顾红波 A system and method for operating a portable smart device held in a pinch grip
CN107924274A (zh) * 2015-07-31 2018-04-17 麦克赛尔株式会社 Information terminal device
CN107908346A (zh) * 2017-11-16 2018-04-13 北京小米移动软件有限公司 Method and apparatus for controlling screen display

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150074598A1 (en) * 2013-09-09 2015-03-12 Lenovo (Beijing) Limited Information processing methods and electronic devices
CN107621914A (zh) * 2017-08-02 2018-01-23 努比亚技术有限公司 Display method for terminal function control keys, terminal and computer-readable storage medium
CN108733282A (zh) * 2018-04-16 2018-11-02 维沃移动通信有限公司 A page moving method and terminal device
CN108762634A (zh) * 2018-05-15 2018-11-06 维沃移动通信有限公司 A control method and terminal
CN110069178A (zh) * 2019-03-14 2019-07-30 维沃移动通信有限公司 Interface control method and terminal device

Also Published As

Publication number Publication date
CN110069178A (zh) 2019-07-30
CN110069178B (zh) 2021-04-02

Similar Documents

Publication Publication Date Title
WO2020181955A1 (fr) Interface control method and terminal device
WO2021104365A1 (fr) Object sharing method and electronic device
WO2019174611A1 (fr) Application configuration method and mobile terminal
WO2019228293A1 (fr) Display control method and terminal
WO2020258929A1 (fr) Folder interface switching method and terminal device
WO2021057337A1 (fr) Operation method and electronic device
WO2020063091A1 (fr) Image processing method and terminal device
WO2021083132A1 (fr) Icon moving method and electronic device
CN108762634B (zh) A control method and terminal
CN109032486B (zh) A display control method and terminal device
CN108762705B (zh) An information display method, mobile terminal and computer-readable storage medium
WO2021012927A1 (fr) Icon display method and terminal device
WO2021129536A1 (fr) Icon moving method and electronic device
WO2021129538A1 (fr) Control method and electronic device
WO2020192299A1 (fr) Information display method and terminal device
WO2021004327A1 (fr) Application permission setting method and terminal device
WO2020151460A1 (fr) Object processing method and terminal device
WO2021104163A1 (fr) Icon arrangement method and electronic device
WO2020057257A1 (fr) Application interface switching method and mobile terminal
WO2021004306A1 (fr) Operation control method and terminal
WO2021057290A1 (fr) Information control method and electronic device
WO2020192297A1 (fr) Screen interface switching method and terminal device
WO2020173235A1 (fr) Task switching method and terminal device
WO2020078234A1 (fr) Display control method and terminal
WO2021017738A1 (fr) Interface display method and electronic device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20768954

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20768954

Country of ref document: EP

Kind code of ref document: A1