CN108762634B - Control method and terminal - Google Patents


Info

Publication number
CN108762634B
CN108762634B (grant of application CN201810464385.5A)
Authority
CN
China
Prior art keywords
sub
input
display
terminal
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810464385.5A
Other languages
Chinese (zh)
Other versions
CN108762634A (en)
Inventor
谢锦洋 (Xie Jinyang)
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201810464385.5A
Publication of CN108762634A
Application granted
Publication of CN108762634B
Status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G06F 3/0487: Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques for inputting data by handwriting, e.g. gesture or text

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention provides a control method and a terminal, applicable to the field of communication technology, which address the poor convenience of operating virtual keys. The method is applied to a terminal and comprises the following steps: receiving a first input from a user; in response to the first input, displaying a target control in the operation area of the first input; receiving a second input from the user on the target control; and in response to the second input, executing the control operation corresponding to the second input. The method is applied in particular to the process by which the terminal displays a virtual control on its display screen.

Description

Control method and terminal
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a control method and a terminal.
Background
As the screen-to-body ratio of terminal display screens increases, terminals tend to omit physical keys in favor of virtual keys displayed on the display screen, so that the corresponding functions are performed through the virtual keys.
In the prior art, the terminal usually displays the virtual keys in a fixed area of the display screen, for example along an edge (e.g., the bottom edge). In some usage scenarios, such as one-handed operation, the user must therefore adjust the gesture for holding the terminal in order to reach the virtual keys displayed on the display screen. As a result, operating the virtual keys is inconvenient for the user.
Disclosure of Invention
The embodiment of the invention provides a control method and a terminal, aiming to solve the problem that operating a virtual key is inconvenient for the user.
In order to solve the above technical problem, the embodiment of the present invention is implemented as follows:
in a first aspect, an embodiment of the present invention provides a control method, which is applied to a terminal, and the method includes: receiving a first input of a user; responding to the first input, and displaying a target control in an operation area of the first input; receiving a second input of the user on the target control; and responding to the second input, and executing the control operation corresponding to the second input.
In a second aspect, an embodiment of the present invention further provides a terminal, where the terminal includes: the device comprises a receiving module, a display module and an execution module; the receiving module is used for receiving a first input of a user; the display module is used for responding to the first input received by the receiving module and displaying the target control in the operation area of the first input; the receiving module is further used for receiving a second input of the user on the target control; and the execution module is used for responding to the second input received by the receiving module and executing the control operation corresponding to the second input.
In a third aspect, an embodiment of the present invention provides a terminal, including a processor, a memory, and a computer program stored on the memory and operable on the processor, where the computer program, when executed by the processor, implements the steps of the control method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the control method according to the first aspect.
In the embodiment of the present invention, according to the control method provided above, after receiving the first input of the user, the terminal displays the target control in the operation area of the first input on the display screen, rather than in other areas of the display screen. The user can therefore operate the target control without adjusting the gesture with which the terminal is held. This improves the flexibility with which the terminal displays the target control and the convenience with which the user operates the target control (i.e., the virtual key).
Drawings
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating a control method according to an embodiment of the present invention;
FIG. 3 is a second flowchart of a control method according to an embodiment of the present invention;
fig. 4 is one of schematic diagrams of a display interface of a terminal according to an embodiment of the present invention;
fig. 5 is a second schematic diagram of a display interface of the terminal according to the embodiment of the present invention;
fig. 6 is a third schematic view of a display interface of the terminal according to the embodiment of the present invention;
fig. 7 is a fourth schematic view of a display interface of a terminal according to an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a possible terminal according to an embodiment of the present invention;
fig. 9 is a schematic diagram of a hardware structure of a terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that "/" in this context means "or"; for example, A/B may mean A or B. "And/or" herein merely describes an association between objects, indicating that three relationships are possible; for example, "A and/or B" may mean: A alone, both A and B, or B alone. "Plurality" means two or more.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate examples, illustrations, or explanations. Any embodiment or design described as "exemplary" or "for example" is not to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the words "exemplary" or "for example" is intended to present related concepts in a concrete fashion.
The terms "first" and "second," and the like, in the description and in the claims of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first input and the second input, etc. are for distinguishing different inputs, rather than for describing a particular order of inputs.
The control method provided by the embodiment of the invention is applied to a terminal. When the user triggers the terminal to display the target control (also called a virtual key), the terminal displays the target control in the operation area of the user's first input on the display screen, rather than in other areas of the display screen. This improves the flexibility with which the terminal displays the target control and the convenience with which the user operates the target control (i.e., the virtual key).
The terminal in the embodiment of the present invention may be a terminal having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present invention are not limited in particular.
It should be noted that, in the control method provided in the embodiment of the present invention, the execution body may be the terminal, a Central Processing Unit (CPU) of the terminal, or a control module in the terminal for executing the control method; this is not specifically limited in the embodiment of the present invention.
The following describes a software environment to which the control method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application. For example, applications such as a system setup application, a system chat application, and a system camera application. And the third-party setting application, the third-party camera application, the third-party chatting application and other application programs.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the control method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the control method may operate based on the android operating system shown in fig. 1. Namely, the processor or the terminal device can implement the control method provided by the embodiment of the invention by running the software program in the android operating system.
The following describes the control method provided by the embodiment of the present invention in detail with reference to the flowchart shown in fig. 2. Although a logical order is shown in the method flowcharts, in some cases the steps shown or described may be performed in an order different from that given here. For example, the control method shown in fig. 2 may include steps 201 to 204:
step 201, receiving a first input of a user.
Optionally, the terminal provided in the embodiment of the present invention may include a display screen that supports touch input or fingerprint input. Specifically, the first input may be a touch-screen input or a fingerprint input. The touch-screen input may be a pressing input, long-press input, sliding input, clicking input, or hovering input (input made by the user near the touch screen without contact); the fingerprint input may be a sliding fingerprint, long-press fingerprint, single-click fingerprint, or double-click fingerprint on the fingerprint identifier of the terminal.
In addition, the gesture of the first input may be at least one of: pressure recognition gestures, long press gestures, area change gestures, multi-touch gestures, slide gestures, double press gestures, double tap gestures, tangential gestures, designated area gestures. Specifically, the manner of the first input is not limited in the embodiment of the present invention, and may be any realizable manner.
It is understood that a display screen supporting hover input carries two types of capacitive sensors: mutual-capacitance sensors and self-capacitance sensors. The self-capacitance sensor generates a stronger signal than the mutual-capacitance sensor and can detect a finger at a greater distance, for example up to 20 mm. Hover input is realized by operating both sensor types on one display screen: the mutual-capacitance sensor performs normal touch sensing, including multi-point touch input, while the self-capacitance sensor detects a finger hovering above the screen. A display screen supporting hover input may also be implemented with infrared sensing, ultrasonic waves, and the like.
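As a rough illustration of the sensing scheme described above, the following sketch classifies a sensed finger with hypothetical threshold logic. The function, its arguments, and the decision thresholds are assumptions for illustration; only the roughly 20 mm self-capacitance detection range comes from the text.

```python
# Hypothetical sketch: a panel combining mutual- and self-capacitance sensing.
# Mutual capacitance handles contact (including multi-touch), while the
# stronger self-capacitance signal can register a finger hovering above the
# glass, up to the detection range cited in the text.

HOVER_RANGE_MM = 20  # self-capacitance finger detection range from the text

def classify_touch_event(contact_detected, finger_height_mm):
    """Return 'touch', 'hover', or None for one sensed finger."""
    if contact_detected:
        return "touch"   # mutual-capacitance path: normal touch input
    if finger_height_mm is not None and finger_height_mm <= HOVER_RANGE_MM:
        return "hover"   # self-capacitance path: hovering (floating) input
    return None          # finger out of range, no event

print(classify_touch_event(True, 0))     # touch
print(classify_touch_event(False, 12))   # hover
print(classify_touch_event(False, 35))   # None
```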
Optionally, the first input provided by the embodiment of the present invention may be input by a user when the terminal is in an unlocked state, a screen-locked state, or a screen-saving state.
Step 202, responding to the first input, and displaying the target control in the operation area of the first input.
It should be emphasized that the control (such as the above target control) provided by the embodiment of the present invention may be a floating key displayed on the display screen of the terminal in a floating manner.
Illustratively, the target control described above may be used to adjust the volume.
It is emphasized that the operation region of the first input may be a region in the display screen of the terminal where the first input is detected.
And step 203, receiving a second input of the user on the target control.
Similarly, for the description of the second input in the embodiment of the present invention, reference may be made to the description related to the first input in the foregoing embodiment, and details are not described in the embodiment of the present invention.
And step 204, responding to the second input, and executing the control operation corresponding to the second input.
The terminal executes the control operation corresponding to the second input, so that the target control performs its corresponding function.
For example, a second input by the user to the target control may trigger the terminal to adjust the volume.
Optionally, the second input of the user to the target control may be an input of the user to a center point of the target control.
It is understood that a user usually holds the terminal in one gesture, for example with one hand. As the screen area of terminal displays increases, the user may be able to reach only part of the display screen while using the terminal. Thus, after the user performs the first input in its operation area on the display screen, if the terminal displayed the target control in some other area, the user might need to change the holding gesture in order to operate the target control and have the terminal execute the function corresponding to it.
In the control method provided by the embodiment of the invention, even if a physical key, such as a volume adjusting key, is not arranged in the terminal, the terminal can generate one or more controls supporting the function of the physical key.
It should be noted that, according to the control method provided in the embodiment of the present invention, after receiving the first input of the user, the terminal displays the target control in the operation area of the first input on the display screen rather than in other areas, so that the user can operate the target control in that same area. That is, the user can operate the target control without adjusting the gesture of holding the terminal. This improves the flexibility with which the terminal displays the target control and the convenience with which the user operates the target control (i.e., the virtual key).
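The flow of steps 201 to 204 can be sketched as a minimal model. All names, the region representation, and the volume action below are illustrative assumptions for clarity, not part of the patent:

```python
class Terminal:
    """Illustrative model of steps 201-204: display a target control in the
    operation area of the first input, then execute the control operation
    selected by the second input on that control."""

    def __init__(self):
        self.displayed_controls = {}  # operation region -> control name
        self.volume = 5

    def on_first_input(self, region):
        # Steps 201/202: display the target control in the operation area
        # where the first input was detected (not in a fixed screen area).
        self.displayed_controls[region] = "volume_control"

    def on_second_input(self, region, gesture):
        # Steps 203/204: execute the control operation corresponding to the
        # second input performed on the displayed target control.
        if self.displayed_controls.get(region) == "volume_control":
            self.volume += 1 if gesture == "up" else -1


terminal = Terminal()
terminal.on_first_input(region=(120, 640))   # first input near the thumb
terminal.on_second_input((120, 640), "up")   # second input on the control
print(terminal.volume)                       # volume raised from 5 to 6
```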
In a possible implementation manner, as shown in fig. 3, another schematic flow chart of the control method provided in the embodiment of the present invention is shown. Specifically, with reference to fig. 2, in the control method shown in fig. 3, step 205 and step 206 may be further included before step 202, and for example, step 205 and step 206 are further included between step 201 and step 202:
and step 205, acquiring the program attribute information of the currently running target application program.
The target application program is an application program running in a foreground or a background.
Optionally, when the terminal is in an unlocked state and the display screen shows the main interface, the target application is the application most recently switched to run in the background; when the terminal is in an unlocked state and the display screen shows the interface of the target application, the target application is the application running in the foreground of the terminal; and when the terminal is in a screen-off or screen-locked state, the target application is the application that ran in the foreground before the screen was turned off or locked.
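The three cases above can be sketched as a simple selection function. The state names, argument layout, and app names are illustrative assumptions, not defined by the patent:

```python
def target_application(state, foreground_app, background_apps,
                       last_foreground_app):
    """Pick the target application per the three cases described above."""
    if state == "unlocked_home":
        # Main interface shown: the last app switched to the background.
        return background_apps[-1] if background_apps else None
    if state == "unlocked_app":
        # An app's interface is shown: the foreground app itself.
        return foreground_app
    if state in ("screen_off", "locked"):
        # The app that ran in the foreground before screen-off / lock.
        return last_foreground_app
    return None

# Example: home screen shown, "music" was backgrounded most recently.
print(target_application("unlocked_home", None, ["mail", "music"], None))
```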
And step 206, generating a target control according to the program attribute information.
Wherein the program attribute information includes at least one of a program type and a program priority.
Illustratively, when the terminal is in an unlocked state and the display screen shows the main interface, the target application is the background application with the highest priority. The program priority of each application in the terminal may be preset; for example, the priority of application 1 installed in the terminal may be greater than that of application 2.
The target control generated by the terminal changes along with the target application program running in the terminal.
It can be understood that the terminal may preset some target controls corresponding to the application program, or generate the target controls corresponding to the application program in real time. For example, the target controls generated by the target applications that the terminal installs named "a player", "B player", and "C game" all include volume adjustment keys.
Specifically, the program types of an application program may include a video type, a music type, a chat type, an office type, and the like. For example, the target controls set by the terminal for the target applications installed in the terminal and having the video type, the music type and the chat type comprise volume adjusting keys.
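Steps 205 and 206 might then be sketched as a lookup from program attribute information to the sub-controls of the generated target control. The type names, control lists, and tuple layout below are illustrative assumptions:

```python
# Illustrative mapping from program type to generated sub-controls; the
# patent gives volume keys for video/music/chat apps as an example.
CONTROLS_BY_TYPE = {
    "video": ["volume_up", "volume_down"],
    "music": ["volume_up", "volume_down", "play", "pause",
              "fast_forward", "rewind"],
    "chat":  ["volume_up", "volume_down"],
}

def generate_target_control(apps):
    """Given candidate apps as (name, program_type, priority) tuples,
    build the target control for the highest-priority app (steps 205/206)."""
    if not apps:
        return None, []
    name, program_type, _ = max(apps, key=lambda app: app[2])
    return name, CONTROLS_BY_TYPE.get(program_type, [])

# Example: "A player" (music, priority 2) outranks a video app (priority 1),
# so the generated control includes playback keys as well as volume keys.
app, controls = generate_target_control(
    [("A player", "music", 2), ("B player", "video", 1)]
)
```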
Exemplarily, as shown in fig. 4, a schematic diagram of a terminal display interface provided in an embodiment of the present invention is shown. Fig. 4 shows an interface 401 displayed on a terminal. The interface 401 includes a volume adjustment key 4011, which comprises a volume up key 4011a and a volume down key 4011b; the volume up key 4011a and the volume down key 4011b are two different touch positions within the volume adjustment key 4011.
In the interface 401 shown in fig. 4, the operation area A1 where the volume adjustment key 4011 is located is the operation area of the first input.
It should be noted that the embodiment of the present invention does not particularly limit the area or shape of the operation region of an input; these may be set according to the actual application. The operation region is merely illustrated with a dashed box in fig. 4.
Alternatively, the first input that triggers the terminal to display the interface 401 shown in fig. 4 may be the two-finger multi-touch operation of the user shown in fig. 4.
Optionally, when the target application is a music player, the target control may be a fast forward key, a fast rewind key, a play key, a pause key, and the like, in addition to the volume up key and the volume down key.
Similarly, in the control method provided in the embodiment of the present invention, the description of the target control generated by the terminal according to the program type and the program priority of the target application may refer to the description of the terminal according to the program type of the target application and the description of the terminal according to the program priority of the target application in the foregoing embodiment.
It can be understood that, in the case that the target application is an application running in the background in the terminal unlock state, the target control is displayed in the main interface; when the target application program is an application program which runs in the foreground when the terminal is in an unlocked state, the target control is displayed in an interface of the target application program; and under the condition that the target application program is an application program which runs in the foreground before the screen of the terminal is turned off or locked, displaying the target control in a screen locking interface of the terminal.
It should be noted that, with the control method provided in the embodiment of the present invention, when the user uses the terminal, the terminal may generate the target control for the target application in real time, so that the user may conveniently operate the target application through the target control. Therefore, the flexibility of the terminal for displaying the target control can be further improved, and the convenience of the user for operating the target control is further improved.
In a possible implementation manner, in the control method provided in the embodiment of the present invention, the first input includes N sub-inputs, and one sub-input corresponds to one sub-operation region. Specifically, in the control method provided in the embodiment of the present invention, the step 202 may include steps 202a and 202 b:
step 202a, acquiring N sub-operation regions corresponding to the N sub-inputs.
And 202b, displaying N sub-controls in the N sub-operation areas.
The target control comprises N sub-controls, one sub-operation area correspondingly displays one sub-control, and N is an integer larger than 1.
Illustratively, as shown in fig. 5, a schematic diagram of another terminal display interface provided in the embodiment of the present invention is shown. Fig. 5 shows an interface 501 of a terminal, in which interface 501 a volume up key 5011 and a volume down key 5012 are included. At this time, the above N sub-controls may include a volume up key 5011 and a volume down key 5012, i.e., N is equal to 2.
Alternatively, the user may trigger the volume up key 5011 in the terminal display interface 501 through one input (denoted as sub-input 1) and trigger the volume down key 5012 in the terminal display interface 501 through another input (denoted as sub-input 2). At this time, the first input may include sub input 1 and sub input 2.
Specifically, the area where the volume up key 5011 is located in interface 501 is the operation area A2 corresponding to sub-input 1, i.e., the area of the terminal's display screen where sub-input 1 is detected; the area where the volume down key 5012 is located in interface 501 is the operation area A3 corresponding to sub-input 2, i.e., the area of the display screen where sub-input 2 is detected.
It should be noted that, in the control method provided in the embodiment of the present invention, each control set in the terminal corresponds to one trigger input, so that the user can trigger the display of each control through its corresponding input, and the terminal displays each control in the operation region of that input. For example, the terminal may display the sub-control corresponding to one sub-input of the first input in the sub-operation region corresponding to that sub-input, thereby displaying the target control. This further improves the flexibility with which the terminal displays the target control and the convenience with which the user operates it.
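Steps 202a and 202b can be sketched as a simple mapping from sub-inputs to sub-operation regions. The region labels, key names, and dictionary representation are illustrative:

```python
def display_sub_controls(sub_inputs, sub_controls):
    """Steps 202a/202b: each of the N sub-inputs has its own sub-operation
    region, and one sub-control is displayed in each region."""
    assert len(sub_inputs) == len(sub_controls)
    # Step 202a: acquire the N sub-operation regions of the N sub-inputs.
    regions = [sub_input["region"] for sub_input in sub_inputs]
    # Step 202b: display one sub-control per sub-operation region.
    return dict(zip(regions, sub_controls))

# Example mirroring fig. 5: sub-input 1 in region A2 gets the volume up key,
# sub-input 2 in region A3 gets the volume down key (N = 2).
layout = display_sub_controls(
    [{"region": "A2"}, {"region": "A3"}],
    ["volume_up_5011", "volume_down_5012"],
)
```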
In a possible implementation manner, the control method provided in the embodiment of the present invention, after the step 202b, may further include a step 207 and a step 208:
and step 207, receiving a third input of the user under the condition that the first sub-control is displayed in the first sub-operation area and the second sub-control is displayed in the second sub-operation area.
The first sub-control and the second sub-control are contained in the N sub-controls, and the first sub-operation area and the second sub-operation area are contained in the N sub-operation areas.
Illustratively, in connection with the interface 501 of the terminal shown in fig. 5 described above, the first sub-control may be a volume increase key 5011 shown in the interface 501, and the second sub-control may be a volume decrease key 5012 shown in the interface 501. The first sub-operation region may be the sub-operation region a2, and the second sub-operation region may be the sub-operation region A3.
Similarly, in the control method provided in the embodiment of the present invention, the description of the third input may refer to the description of the first input in the embodiment.
It is understood that the third input is used for triggering the terminal to exchange the sub operation regions where the first sub control and the second sub control are located.
For example, the above-described third input may be an input in which the user drags the volume up key 5011 in the interface 501 of the terminal shown in fig. 5 from the sub manipulation area a2 to the sub manipulation area A3 in which the volume down key 5012 is located.
And step 208, responding to the third input, updating the display content in the first sub-operation area to be the second sub-control, and updating the display content in the second sub-operation area to be the first sub-control.
Exemplarily, as shown in fig. 6, a schematic diagram of a display interface of another terminal provided in the embodiment of the present invention is shown.
In conjunction with fig. 5, after the user performs the third input of dragging the volume up key 5011 from the sub-operation area A2 to the sub-operation area A3 where the volume down key 5012 is located, the terminal may display the interface 601 shown in fig. 6, in which the volume up key 5011 is displayed in the sub-operation area A3 and the volume down key 5012 is displayed in the sub-operation area A2.
It is understood that a user may be accustomed to operating controls in a particular area of the terminal's display interface while using the terminal, and may therefore desire the terminal to move or swap one or more of the plurality of sub-controls it displays.
It should be noted that, with the control method provided in the embodiment of the present invention, the terminal may move or exchange the area where the plurality of sub-controls displayed by the terminal are located, so that the plurality of sub-controls displayed by the terminal better conform to the usage habit of the user. Therefore, the flexibility of the terminal for displaying the target control can be further improved, and the convenience of the user for operating the target control is further improved.
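The swap described in step 208 can be sketched as a small function; the names (`swap_sub_controls`, the area ids, the control labels) are illustrative and not part of the patent:

```python
def swap_sub_controls(area_map, area_a, area_b):
    """Exchange the sub-controls displayed in two sub-operation areas.

    area_map maps a sub-operation area id (e.g. "A2") to the sub-control
    currently displayed there. After the third input, each area shows the
    control previously shown in the other.
    """
    area_map[area_a], area_map[area_b] = area_map[area_b], area_map[area_a]
    return area_map


# Third input: drag the volume up key 5011 from area A2 onto area A3.
layout = {"A2": "volume_up_5011", "A3": "volume_down_5012"}
swap_sub_controls(layout, "A2", "A3")
print(layout)  # {'A2': 'volume_down_5012', 'A3': 'volume_up_5011'}
```

The swap touches only the two sub-operation areas named in the third input; the remaining N-2 sub-controls keep their areas.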
In a possible implementation manner, in the control method provided in the embodiment of the present invention, the display screen of the terminal includes M sub-display screens, where M is an integer greater than 1. Specifically, the control method provided in the embodiment of the present invention may further include, before the step 202, a step 209 and a step 210, where the step 202 may be replaced by a step 202 c:
and step 209, acquiring the display attribute of the target sub-display screen where the first input operation area is located.
The display attribute comprises at least one of the area of the sub display screen, the shape of the sub display screen and the color of the interface currently displayed by the sub display screen.
Alternatively, the display attributes of different sub-display screens may be different. For example, one sub-display screen may be a liquid crystal display and another may be an e-ink display, in which case the display attributes of the two sub-display screens are different.
Optionally, the display screen provided by the embodiment of the invention may be a full screen, a semi-enclosure screen or a full enclosure screen.
Optionally, the display screen provided by the embodiment of the invention may be a flexible screen or a non-flexible screen.
For example, one sub-display screen may be curved or flat in shape; one sub-display screen may display in color while another displays in black and white.
And step 210, determining target display parameters of the target control according to the display attributes.
Wherein the target display parameter comprises at least one of area, shape and color.
In addition, the display parameters of a control in a sub-display screen may include the pixel values with which the control is displayed, the line style with which the control is drawn, the transparency of the control, and the like.
It can be understood that, in the embodiment of the present invention, in the target display parameters of the target control, the area may be a preset area 1, the shape may be a preset shape 1, and the color may be a preset color 1.
And step 202c, displaying the target control in the first input operation area according to the target display parameters.
For example, the terminal may display the target control in the operation area of the first input with the preset area 1, the preset shape 1, and the preset color 1.
It should be noted that, with the control method provided in the embodiment of the present invention, the terminal may determine the target display parameters of the target control in different sub-display screens according to the display attributes of the sub-display screens. Therefore, even if the display attributes of the plurality of sub-display screens are different, the terminal can display the target control corresponding to the same function in the different sub-display screens. Therefore, the flexibility of the terminal for displaying the target control can be further improved, and the convenience of the user for operating the target control is further improved.
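Steps 209, 210, and 202c together form a small mapping from the display attributes of the target sub-display screen to the target display parameters of the control. The sketch below illustrates one such mapping; the specific scaling factor, shape rule, and color rule are assumptions for illustration, not rules stated in the patent:

```python
def target_display_parameters(display_attrs):
    """Derive target display parameters (area, shape, color) for the target
    control from the display attributes of the sub-display screen in which
    the operation area of the first input lies."""
    params = {}
    # Assumption: size the control as a fixed fraction of the sub-screen area.
    params["area"] = 0.05 * display_attrs["screen_area"]
    # Assumption: match the control's corner style to the sub-screen's shape.
    params["shape"] = "rounded" if display_attrs["screen_shape"] == "curved" else "rectangular"
    # Assumption: pick a color contrasting with the currently displayed interface.
    params["color"] = "white" if display_attrs["interface_color"] == "dark" else "black"
    return params


p = target_display_parameters(
    {"screen_area": 9600.0, "screen_shape": "curved", "interface_color": "dark"})
print(p)  # {'area': 480.0, 'shape': 'rounded', 'color': 'white'}
```

Because the parameters are recomputed per sub-display screen, the same target control can be shown consistently even when the M sub-display screens differ in area, shape, or interface color.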
In a possible implementation manner, the control method provided in the embodiment of the present invention, after the step 202, may further include steps 211 and 212:
and step 211, receiving a fourth input of the user sliding from the first operation area to the second operation area.
Similarly, for a description of the fourth input in the embodiment of the present invention, reference may be made to the description of the first input in the above embodiment.
Illustratively, fig. 7 is a schematic diagram of another terminal display interface provided in an embodiment of the present invention. The interface 701 shown in fig. 7 includes a volume adjustment key 7011, which is displayed in the operation region P3.
At this time, as shown in the interface 701 in fig. 7, the fourth input by the user may be a slide input from the operation region P1 to the operation region P2.
And step 212, responding to the fourth input, and moving the target control from the operation area of the first input to a third operation area.
Specifically, the first operation area corresponds to the operation area of the first input, and the second operation area corresponds to the third operation area; that is, the user's slide between the first and second operation areas is mapped onto a movement of the target control between the corresponding areas.
Specifically, while the user performs the slide input from the operation region P1 to the operation region P2, the volume adjustment key 7011 may move between the operation region P3 and the operation region P4 as indicated by the arrow in fig. 7, so that the volume adjustment key 7011 moves from the operation region P3 into the operation region P4. The terminal thus displays the interface 702 shown in fig. 7, in which the volume adjustment key 7011 is displayed in the operation region P4.
Optionally, the first operation area and the second operation area where the user's finger is located may be different from the first input operation area and the third operation area where the target control is located.
For example, the target control may be displayed in a user-inoperable region of the display screen of the terminal, that is, the operation region of the first input and the third operation region may be in a user-inoperable region of the display screen of the terminal, and the first operation region and the second operation region in which the finger of the user is located may be in a user-operable region of the display screen of the terminal.
For example, in a scene of one-handed operation by the user, the operation region of the first input and the third operation region may be in a region that is not operable when the user performs one-handed operation on the display screen of the terminal, and the first operation region and the second operation region may be in a region that is operable when the user performs one-handed operation on the display screen of the terminal.
In this way, even if the target control is displayed in an area of the terminal's display screen that cannot be operated by the user's finger (e.g., a non-holding area), the terminal can still move the target control upon receiving an input in an area that the user's finger can operate (e.g., a holding area), such as an input in the first operation area and the second operation area.
Of course, the first operation area and the second operation area where the fingers of the user are located may be areas that the user is accustomed to operating, while the operation area of the first input and the third operation area where the target control is located may be areas that the user is not accustomed to operating. In this way, the user can control the terminal to move the target control from within the area of the display screen that conforms to the user's usage habits.
In addition, in the process of performing the fourth input on the target control by the user, the display parameters of the target control before moving may be different from the display parameters in the moving process.
For example, the display parameters of the target control before and after movement indicate that the transparency of the target control displayed by the terminal is 0%, that is, the target control is in an opaque state; the display parameter of the moving process of the target control indicates that the transparency of the target control displayed by the terminal is 50%, namely the target control is in a semitransparent state. Or the display parameters of the target control before and after the movement indicate that the line of the target control displayed by the terminal is a solid line; and the display parameter of the moving process of the target control indicates that the line of the target control displayed by the terminal is a dotted line.
It should be noted that, with the control method provided in the embodiment of the present invention, the terminal may move the target control from one operation region to another operation region, so as to display the target control in the region required by the user. And even if the target control is displayed in the area which can not be operated by the user on the display screen of the terminal, the terminal receives the input of the user in the first operation area and the second operation area, and the target control can be moved. Therefore, the flexibility of the terminal for displaying the target control can be further improved, and the convenience of the user for operating the target control is further improved.
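The indirect move of steps 211 and 212 — a slide in a reachable finger area driving a control located in an unreachable area — can be sketched as follows. The pairing between finger areas and control areas (P1 with P3, P2 with P4) follows the fig. 7 example; the function name and the pairing table are illustrative assumptions:

```python
# Assumed pairing: each finger-operable area maps to a paired control area.
FINGER_TO_CONTROL = {"P1": "P3", "P2": "P4"}


def move_control_by_slide(control_area, slide_from, slide_to):
    """Return the target control's new area after a fourth input that slides
    from slide_from to slide_to. The slide addresses the control only when it
    starts in the finger area paired with the control's current area."""
    if FINGER_TO_CONTROL.get(slide_from) != control_area:
        return control_area  # slide does not address this control; no move
    return FINGER_TO_CONTROL[slide_to]


# Volume key 7011 sits in P3; the user slides from P1 to P2.
print(move_control_by_slide("P3", "P1", "P2"))  # P4
```

A slide that starts in an unpaired area leaves the control where it is, which matches the idea that only the first operation area drives the control in the operation area of the first input.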
In a possible implementation manner, the control method provided in the embodiment of the present invention, after the step 202, may further include a step 213:
and step 213, eliminating the display of the target control under the condition that the user input is not received within the preset time length.
For example, the preset time period may be 2 minutes.
It can be understood that when the user does not need to operate the target control displayed by the terminal, the user may not perform any input on the target control for a long time.
In addition, after the user triggers the terminal to display the target control, the target control may block other content in an application program displayed by the terminal. For example, if a user controls the terminal to play a video, the target control displayed by the terminal may block content in the video. Thus, the user may require the terminal to cancel displaying the target control.
It should be noted that, with the control method provided in the embodiment of the present invention, the terminal may eliminate the display of the target control, so that the target control does not affect other contents displayed in the terminal. And when the user does not need to operate the target control, the space on the display screen can be reasonably utilized. Therefore, the flexibility of the terminal for displaying the target control can be further improved.
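The timeout behavior of step 213 can be sketched with a small state holder driven by an injected clock (so the logic is deterministic); the class name and the periodic `tick` hook are illustrative assumptions, and the preset duration defaults to the 2-minute example given above:

```python
class TargetControlDisplay:
    """Hides the target control when no user input arrives within a preset
    duration. Timestamps are supplied by the caller (e.g. a monotonic clock)."""

    def __init__(self, timeout_s=120.0, now=0.0):
        self.timeout_s = timeout_s
        self.visible = True
        self._last_input = now

    def on_user_input(self, now):
        # Any input on the control resets the inactivity timer.
        self._last_input = now

    def tick(self, now):
        """Called periodically; cancels the display once the timeout elapses."""
        if self.visible and now - self._last_input >= self.timeout_s:
            self.visible = False
        return self.visible


ctrl = TargetControlDisplay(timeout_s=120.0)
ctrl.on_user_input(now=30.0)
print(ctrl.tick(now=100.0))  # True  (only 70 s since the last input)
print(ctrl.tick(now=151.0))  # False (>= 120 s elapsed; display is canceled)
```

Injecting the timestamp rather than calling a clock inside the class keeps the cancel condition easy to verify and mirrors how a UI framework would drive the check from its own timer.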
In a possible implementation manner, as shown in fig. 8, a schematic diagram of a possible structure of a terminal is provided for the embodiment of the present invention. Fig. 8 shows a terminal 80 comprising a receiving module 801, a display module 802 and an executing module 803; a receiving module 801, configured to receive a first input of a user; a display module 802, configured to, in response to the first input received by the receiving module 801, display a target control in an operation area of the first input; the receiving module 801 is further configured to receive a second input of the user on the target control; and an executing module 803, configured to, in response to the second input received by the receiving module 801, execute a control operation corresponding to the second input.
Optionally, the terminal 80 further includes: the device comprises an acquisition module and a generation module; an obtaining module, configured to obtain, by the display module 802, program attribute information of a currently running target application program in a first input operation area before a target control is displayed; the generating module is used for generating a target control according to the program attribute information acquired by the acquiring module; the target application program is an application program running in a foreground or a background; the program attribute information includes at least one of a program type and a program priority.
Optionally, the first input includes N sub-inputs, and one sub-input corresponds to one sub-operation region; the display module 802 is specifically configured to obtain N sub-operation regions corresponding to N sub-inputs; displaying N sub-controls in N sub-operation areas; the target control comprises N sub-controls, one sub-operation area correspondingly displays one sub-control, and N is an integer larger than 1.
Optionally, the receiving module 801 is further configured to receive a third input of the user in a case where the display module 802 displays the N sub-controls in the N sub-operation areas, with a first sub-control displayed in a first sub-operation area and a second sub-control displayed in a second sub-operation area; the display module 802 is further configured to, in response to the third input received by the receiving module 801, update the display content in the first sub-operation area to the second sub-control, and update the display content in the second sub-operation area to the first sub-control; the first sub-control and the second sub-control are contained in the N sub-controls, and the first sub-operation area and the second sub-operation area are contained in the N sub-operation areas.
Optionally, the display screen of the terminal 80 includes M sub-display screens, where M is an integer greater than 1; the terminal 80 further includes: a determination module; a determining module, configured to obtain, by the display module 802, a display attribute of a target sub-display screen in which the first input operation area is located before the target control is displayed in the first input operation area; determining target display parameters of the target control according to the display attributes; the display module 802 is specifically configured to display a target control in the first input operation area according to the target display parameter; the display attribute comprises at least one of the area of the sub display screen, the shape of the sub display screen and the color of the interface currently displayed by the sub display screen; the target display parameter includes at least one of an area, a shape, and a color.
Optionally, the receiving module 801 is further configured to receive a fourth input that the user slides from the first operation area to the second operation area after the display module 802 displays the target control in the first input operation area; the display module 802 is further configured to, in response to the fourth input received by the receiving module 801, move the target control from the operation region of the first input to the third operation region.
Optionally, the display module 802 is further configured to, after the target control is displayed in the first input operation area, eliminate the display of the target control when the user input is not received within a preset time period.
It should be noted that, according to the terminal provided in the embodiment of the present invention, after receiving the first input of the user, the terminal may display the target control in the operation area of the first input on the display screen of the terminal, and may not display the target control in other areas in the display screen, so that the terminal supports the user to operate the target control on the operation area of the first input on the display screen. Namely, the user can operate the target control without adjusting the gesture of holding the terminal. Therefore, the flexibility of the terminal for displaying the target control can be improved, and the convenience of operating the target control (namely the virtual key) by the user is improved.
The terminal 80 provided in the embodiment of the present invention can implement each process implemented by the terminal in the foregoing method embodiments, and is not described here again to avoid repetition.
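The receive-display-receive-execute flow carried by the modules of fig. 8 can be sketched end to end; the class, area names, and the volume operations used as second inputs are illustrative assumptions, not elements of the patent:

```python
class Terminal80:
    """Sketch of fig. 8: receiving module -> display module -> executing module."""

    def __init__(self):
        self.displayed = {}  # operation area -> control shown there
        # Assumed control operations keyed by the second input.
        self.operations = {
            "tap_volume_up": lambda vol: vol + 1,
            "tap_volume_down": lambda vol: vol - 1,
        }

    def receive_first_input(self, operation_area):
        # Display module: show the target control where the first input landed.
        self.displayed[operation_area] = "target_control"

    def receive_second_input(self, operation_area, second_input, volume):
        # Executing module: run the control operation matching the second input,
        # but only if the target control is actually displayed in that area.
        if self.displayed.get(operation_area) != "target_control":
            return volume
        return self.operations[second_input](volume)


t = Terminal80()
t.receive_first_input("lower_right")
print(t.receive_second_input("lower_right", "tap_volume_up", 5))  # 6
```

The key property, matching the note above, is that the control appears exactly where the first input was detected, so the second input can be performed without adjusting the grip.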
Fig. 9 is a schematic diagram of a hardware structure of a terminal according to an embodiment of the present invention, where the terminal 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the terminal configuration shown in fig. 9 is not intended to be limiting, and that the terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 107 is used for receiving a first input of a user; a display unit 106 for displaying a target control in an operation area of a first input in response to the first input received by the user input unit 107; the user input unit 107 is further used for receiving a second input of the user on the target control; and the processor 110 is used for responding to the second input received by the user input unit 107 and executing the control operation corresponding to the second input.
It should be noted that, according to the terminal provided in the embodiment of the present invention, after receiving the first input of the user, the terminal may display the target control in the operation area of the first input on the display screen of the terminal, and may not display the target control in other areas in the display screen, so that the terminal supports the user to operate the target control on the operation area of the first input on the display screen. Namely, the user can operate the target control without adjusting the gesture of holding the terminal. Therefore, the flexibility of the terminal for displaying the target control can be improved, and the convenience of operating the target control (namely the virtual key) by the user is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process. Specifically, downlink data received from a base station is delivered to the processor 110 for processing, and uplink data is transmitted to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 102, such as helping the user send and receive e-mails, browse web pages, access streaming media, and the like.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101.
The terminal 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or a backlight when the terminal 100 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 9, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the terminal, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal 100 or may be used to transmit data between the terminal 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to various components, and preferably, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to manage charging, discharging, and power consumption management functions through the power management system.
In addition, the terminal 100 includes some functional modules that are not shown, and thus, the detailed description thereof is omitted.
Preferably, an embodiment of the present invention further provides a terminal, which includes a processor 110, a memory 109, and a computer program stored in the memory 109 and capable of running on the processor 110, where the computer program, when executed by the processor 110, implements each process of the above control method embodiment, and can achieve the same technical effect, and details are not repeated here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. A control method is applied to a terminal, and is characterized by comprising the following steps:
receiving a first input of a user;
responding to the first input, and displaying a target control in an operation area of the first input, wherein the operation area of the first input is an area of a display screen of the terminal, in which the first input is detected;
receiving a second input of the user on the target control;
responding to the second input, and executing a control operation corresponding to the second input;
before the target control is displayed in the operation area of the first input, the method further includes:
acquiring program attribute information of a currently running target application program;
generating a target control according to the program attribute information;
the target application program is an application program running in a foreground or a background; the program attribute information includes at least one of a program type and a program priority;
the first input comprises N sub-inputs, and one sub-input corresponds to one sub-operation area;
the displaying of the target control in the operation area of the first input comprises:
acquiring N sub-operation areas corresponding to the N sub-inputs;
displaying N sub-controls in the N sub-operation areas;
the target control comprises N sub-controls, one sub-operation area correspondingly displays one sub-control, and N is an integer greater than 1;
the display screen of the terminal comprises M sub-display screens, wherein M is an integer larger than 1;
before the target control is displayed in the operation area of the first input, the method further includes:
acquiring the display attribute of a target sub-display screen where the first input operation area is located;
determining target display parameters of the target control according to the display attributes;
the displaying of the target control in the operation area of the first input comprises:
displaying the target control in the first input operation area according to the target display parameters;
the display attribute comprises at least one of the area of the sub display screen, the shape of the sub display screen and the color of an interface currently displayed by the sub display screen; the target display parameter comprises at least one of an area, a shape, and a color; when the target application program is an application program which runs in a background in the terminal unlocking state, the target control is displayed in a main interface; when the target application program is an application program which runs in a foreground when the terminal is in an unlocked state, the target control is displayed in an interface of the target application program; and under the condition that the target application program is an application program which runs in the foreground before the screen of the terminal is turned off or locked, the target control is displayed in a screen locking interface of the terminal.
2. The method according to claim 1, wherein after the displaying of the N sub-controls in the N sub-operation areas, the method further comprises:
receiving a third input of the user in a case where a first sub-control is displayed in the first sub-operation area and a second sub-control is displayed in the second sub-operation area;
in response to the third input, updating the display content in the first sub-operation area to the second sub-control, and updating the display content in the second sub-operation area to the first sub-control;
wherein the first sub-control and the second sub-control are both included in the N sub-controls, and the first sub-operation area and the second sub-operation area are both included in the N sub-operation areas.
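The sub-control swap in claim 2 amounts to exchanging the mapping from sub-operation areas to sub-controls. A minimal sketch, with illustrative names:

```python
def swap_sub_controls(area_to_control: dict, first_area: str, second_area: str) -> None:
    # Exchange the sub-controls displayed in two sub-operation areas,
    # as done in response to the third input of claim 2.
    area_to_control[first_area], area_to_control[second_area] = (
        area_to_control[second_area],
        area_to_control[first_area],
    )
```

After the swap, the first sub-operation area displays the second sub-control and the second area displays the first, matching the updated display contents the claim requires.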
3. The method according to claim 1, wherein after the displaying of the target control in the operation area of the first input, the method further comprises:
receiving a fourth input of the user sliding from the first operation area to the second operation area;
in response to the fourth input, moving the target control from the operation area of the first input to a third operation area.
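Moving the target control in response to the slide input of claim 3 can be sketched as removing it from its source operation area and displaying it in the destination area (names are illustrative):

```python
def move_control(area_to_control: dict, source_area: str, destination_area: str) -> None:
    # Remove the target control from its current operation area and
    # display it in the destination area (response to the fourth input).
    area_to_control[destination_area] = area_to_control.pop(source_area)
```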
4. The method according to claim 1, wherein after the displaying of the target control in the operation area of the first input, the method further comprises:
canceling the display of the target control in a case where no user input is received within a preset time period.
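The idle-timeout dismissal of claim 4 can be sketched as a control that records the time of the last user input and hides itself once the preset duration elapses. The class name, the injectable clock, and the polling `tick` method are assumptions for testability, not part of the patent:

```python
import time

class TargetControl:
    # Sketch: a displayed control that is dismissed when no user input
    # arrives within a preset duration (claim 4). The clock is injectable
    # so the timeout logic can be tested without sleeping.
    def __init__(self, timeout_s: float, clock=time.monotonic):
        self.timeout_s = timeout_s
        self.clock = clock
        self.visible = True
        self.last_input = clock()

    def on_user_input(self) -> None:
        # Any user input resets the idle timer.
        self.last_input = self.clock()

    def tick(self) -> bool:
        # Called periodically; hides the control after the idle timeout.
        if self.visible and self.clock() - self.last_input >= self.timeout_s:
            self.visible = False
        return self.visible
```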
5. A terminal, comprising: a receiving module, an acquiring module, a generating module, a display module, a determining module, and an execution module;
the receiving module is configured to receive a first input of a user;
the display module is configured to, in response to the first input received by the receiving module, display a target control in an operation area of the first input, wherein the operation area of the first input is an area of the display screen of the terminal in which the first input is detected;
the receiving module is further configured to receive a second input of the user on the target control;
the execution module is configured to, in response to the second input received by the receiving module, perform a control operation corresponding to the second input;
the acquiring module is configured to acquire program attribute information of a currently running target application program before the display module displays the target control in the operation area of the first input;
the generating module is configured to generate the target control according to the program attribute information acquired by the acquiring module;
the target application program is an application program running in a foreground or a background; the program attribute information includes at least one of a program type and a program priority;
the first input comprises N sub-inputs, and one sub-input corresponds to one sub-operation area;
the display module is specifically configured to acquire N sub-operation areas corresponding to the N sub-inputs, and display N sub-controls in the N sub-operation areas;
the target control comprises N sub-controls, one sub-operation area correspondingly displays one sub-control, and N is an integer greater than 1;
the display screen of the terminal comprises M sub-display screens, wherein M is an integer greater than 1;
the determining module is configured to, before the display module displays the target control in the operation area of the first input, acquire a display attribute of a target sub-display screen in which the operation area of the first input is located, and determine target display parameters of the target control according to the display attribute;
the display module is specifically configured to display the target control in the operation area of the first input according to the target display parameters;
the display attribute comprises at least one of an area of the sub-display screen, a shape of the sub-display screen, and a color of an interface currently displayed on the sub-display screen; the target display parameter comprises at least one of an area, a shape, and a color;
in a case where the target application program is an application program running in the background while the terminal is in an unlocked state, the target control is displayed in a main interface;
in a case where the target application program is an application program running in the foreground while the terminal is in an unlocked state, the target control is displayed in an interface of the target application program; and
in a case where the target application program is an application program that was running in the foreground before the screen of the terminal was turned off or locked, the target control is displayed in a lock-screen interface of the terminal.
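The module decomposition of claim 5 (receiving, generating, display, and execution modules cooperating on the same target control) can be sketched as one class whose methods stand in for the modules. Method names, the dictionary-based control, and the return strings are illustrative assumptions:

```python
class Terminal:
    # Sketch of the terminal of claim 5: the first input triggers control
    # generation and display; the second input triggers execution.
    def __init__(self):
        self.displayed = {}  # operation area -> generated target control

    def generate_control(self, app_info: dict) -> dict:
        # Generating module: build the target control from the program
        # attribute information (program type and program priority).
        return {
            "type": app_info.get("program_type"),
            "priority": app_info.get("program_priority"),
        }

    def receive_first_input(self, operation_area: str, app_info: dict) -> None:
        # Receiving module hands off to generating and display modules.
        control = self.generate_control(app_info)
        self.displayed[operation_area] = control

    def receive_second_input(self, operation_area: str, action: str):
        # Execution module: perform the control operation corresponding
        # to the second input on the displayed target control.
        control = self.displayed.get(operation_area)
        if control is None:
            return None
        return f"executed {action} on {control['type']}"
```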
6. The terminal of claim 5,
the receiving module is further configured to receive a third input of the user in a case where, after the display module displays the N sub-controls in the N sub-operation areas, a first sub-control is displayed in the first sub-operation area and a second sub-control is displayed in the second sub-operation area;
the display module is further configured to, in response to the third input received by the receiving module, update the display content in the first sub-operation area to the second sub-control and update the display content in the second sub-operation area to the first sub-control;
wherein the first sub-control and the second sub-control are both included in the N sub-controls, and the first sub-operation area and the second sub-operation area are both included in the N sub-operation areas.
7. The terminal of claim 5,
the receiving module is further configured to receive a fourth input of the user sliding from the first operation area to the second operation area after the display module displays the target control in the operation area of the first input;
the display module is further configured to, in response to the fourth input received by the receiving module, move the target control from the operation area of the first input to a third operation area.
8. The terminal of claim 5,
the display module is further configured to cancel the display of the target control in a case where no user input is received within a preset time period after the target control is displayed in the operation area of the first input.
CN201810464385.5A 2018-05-15 2018-05-15 Control method and terminal Active CN108762634B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810464385.5A CN108762634B (en) 2018-05-15 2018-05-15 Control method and terminal

Publications (2)

Publication Number Publication Date
CN108762634A CN108762634A (en) 2018-11-06
CN108762634B true CN108762634B (en) 2022-04-15

Family

ID=64007917

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810464385.5A Active CN108762634B (en) 2018-05-15 2018-05-15 Control method and terminal

Country Status (1)

Country Link
CN (1) CN108762634B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109766680B (en) * 2018-12-27 2021-01-08 维沃移动通信有限公司 Authority control method and terminal
CN110069178B (en) * 2019-03-14 2021-04-02 维沃移动通信有限公司 Interface control method and terminal equipment
CN110069180A (en) * 2019-03-28 2019-07-30 维沃软件技术有限公司 A kind of function control method and terminal device
CN110898424B (en) * 2019-10-21 2023-10-20 维沃移动通信有限公司 Display control method and electronic equipment
CN113079254A (en) * 2020-01-03 2021-07-06 北京小米移动软件有限公司 Terminal control method, device and medium
CN112311932A (en) * 2020-10-23 2021-02-02 珠海格力电器股份有限公司 Control center gesture control method and device, computer equipment and storage medium
CN114385297A (en) * 2022-01-11 2022-04-22 北京字跳网络技术有限公司 Page display method and device, electronic equipment, storage medium and program product
CN117399933A (en) * 2023-12-14 2024-01-16 湖南凯之成智能装备有限公司 Control method and device for photovoltaic panel paving equipment and computer readable storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101836182A (en) * 2007-09-04 2010-09-15 苹果公司 Editing interface
CN101996019A (en) * 2009-08-17 2011-03-30 龙旗科技(上海)有限公司 Man-machine interaction mode of mobile terminal supporting double-touch-screen multi-point touch
CN102768607A (en) * 2011-11-02 2012-11-07 联想(北京)有限公司 Method and device for realizing touch operation application program
CN103164116A (en) * 2011-12-13 2013-06-19 现代自动车株式会社 Apparatus and method for executing menu provided in vehicle
CN105630377A (en) * 2015-12-17 2016-06-01 中山市读书郎电子有限公司 Natural gesture based information display method
CN106126034A (en) * 2016-06-29 2016-11-16 维沃移动通信有限公司 A kind of keypress function method to set up and mobile terminal
CN106293483A (en) * 2016-09-29 2017-01-04 福州新锐同创电子科技有限公司 Electronic display writing on the blackboard subdispatch display packing
CN106843739A (en) * 2017-02-28 2017-06-13 维沃移动通信有限公司 The display control method and mobile terminal of a kind of mobile terminal
CN107678613A (en) * 2017-09-01 2018-02-09 珠海市魅族科技有限公司 A kind of display control method and device, terminal and readable storage medium storing program for executing

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120060123A1 (en) * 2010-09-03 2012-03-08 Hugh Smith Systems and methods for deterministic control of instant-on mobile devices with touch screens
CN106406656B (en) * 2016-08-30 2019-07-26 维沃移动通信有限公司 A kind of control method and mobile terminal of application tool bar
CN107205081A (en) * 2017-04-27 2017-09-26 北京小米移动软件有限公司 A kind of method and apparatus for showing interactive controls
CN107831989A (en) * 2017-11-28 2018-03-23 维沃移动通信有限公司 A kind of Application Parameters method of adjustment and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant