WO2020155980A1 - Control method and terminal device - Google Patents

Control method and terminal device

Info

Publication number
WO2020155980A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
target
input
terminal device
content
Prior art date
Application number
PCT/CN2019/129064
Other languages
English (en)
French (fr)
Inventor
王程刚 (Wang Chenggang)
杨春 (Yang Chun)
Original Assignee
维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 维沃移动通信有限公司 (Vivo Mobile Communication Co., Ltd.)
Publication of WO2020155980A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser, using a touch-screen or digitiser, e.g. input of commands through traced gestures

Definitions

  • The embodiments of the present invention relate to the technical field of terminals, and in particular to a control method and terminal equipment.
  • Terminal devices can be used for many tasks, such as editing documents, processing pictures, and playing games.
  • For example, users can use terminal devices to edit documents. During editing, a user may need to insert content at a certain position; the user can drag the cursor on the screen to move it to the position where the content is to be inserted, and then input content at that position.
  • The embodiment of the present invention provides a control method and terminal device, to solve the problem that controlling a terminal device is not convenient enough.
  • An embodiment of the present invention provides a control method, which includes: receiving a user's target input on a target screen; and, in response to the target input, executing a target action corresponding to the target input on target content displayed on a first screen. The target screen includes the first screen and a second screen, and the target action is to control a target object in the target content to move on the first screen; or, the target screen is the second screen, and the target action is to control the target object in the target content to move on the first screen, or to superimpose and display special-effect content on the target content.
  • An embodiment of the present invention also provides a terminal device.
  • The terminal device includes a receiving module and an execution module. The receiving module is configured to receive the user's target input on the target screen, and the execution module is configured to, in response to the target input received by the receiving module, execute a target action corresponding to the target input on the target content displayed on the first screen. The target screen includes the first screen and the second screen, and the target action is to control the target object in the target content to move on the first screen; or, the target screen is the second screen, and the target action is to control the target object in the target content to move on the first screen, or to superimpose and display special-effect content on the target content.
  • An embodiment of the present invention further provides a terminal device, including a processor, a memory, and a computer program stored in the memory and capable of running on the processor.
  • When the computer program is executed by the processor, the steps of the control method described in the first aspect are implemented.
  • An embodiment of the present invention provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the steps of the control method described in the first aspect are implemented.
  • In the embodiments of the present invention, the terminal device receives the user's target input on the target screen and then, in response to the target input, executes the target action corresponding to the target input on the target content displayed on the first screen.
  • If the target screen includes the first screen and the second screen, the user can trigger the terminal device, through input on the two screens, to control the movement of the target object in the target content on the first screen.
  • If the target screen is the second screen, the user can trigger the terminal device, through input on the other screen, either to control the movement of the target object in the target content on the first screen, or to superimpose special-effect content on the target content.
  • In this way, the terminal device can control objects displayed on one screen through input on two screens, or can display special-effect content in the target content on the first screen in response to the user's input on the second screen. This makes operating the terminal device more convenient and quick, and improves the user experience.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present invention
  • FIG. 2 is a schematic flowchart of a control method provided by an embodiment of the present invention.
  • FIG. 3 is a first schematic diagram of controlling a terminal device according to an embodiment of the present invention;
  • FIG. 4 is a second schematic diagram of controlling a terminal device according to an embodiment of the present invention;
  • FIG. 5 is a third schematic diagram of controlling a terminal device according to an embodiment of the present invention;
  • FIG. 6 is a first schematic diagram of the distribution of areas on a second screen according to an embodiment of the present invention;
  • FIG. 7 is a second schematic diagram of the distribution of areas on a second screen according to an embodiment of the present invention;
  • FIG. 8 is a first schematic diagram of a possible structure of a terminal device according to an embodiment of the present invention;
  • FIG. 9 is a second schematic diagram of a possible structure of a terminal device according to an embodiment of the present invention;
  • FIG. 10 is a schematic diagram of the hardware structure of a terminal device according to various embodiments of the present invention.
  • In this document, "A/B" can mean A or B.
  • The "and/or" in this document only describes an association relationship between associated objects, indicating that three relationships may exist; for example, "A and/or B" can mean that A exists alone, A and B exist at the same time, or B exists alone.
  • "Multiple" means two or more than two.
  • The terms "first" and "second" in the specification and claims of the present invention are used to distinguish different objects, rather than to describe a specific order of objects.
  • For example, the first screen and the second screen are used to distinguish different screens, rather than to describe a specific order of the screens.
  • the terminal device in the embodiment of the present invention may be a terminal device with an operating system.
  • The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiment of the present invention.
  • the following takes the Android operating system as an example to introduce the software environment to which the control method provided by the embodiment of the present invention is applied.
  • FIG. 1 it is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present invention.
  • the architecture of the Android operating system includes 4 layers, which are: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
  • the system runtime library layer includes a library (also called a system library) and an Android operating system runtime environment.
  • the library mainly provides various resources needed by the Android operating system.
  • the Android operating system operating environment is used to provide a software environment for the Android operating system.
  • the kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software.
  • the kernel layer is based on the Linux kernel to provide core system services and hardware-related drivers for the Android operating system.
  • developers can develop a software program that implements the control method provided in the embodiment of the present invention based on the system architecture of the Android operating system shown in FIG. 1, so that the control method It can run based on the Android operating system as shown in Figure 1. That is, the processor or the terminal device can implement the control method provided by the embodiment of the present invention by running the software program in the Android operating system.
  • FIG. 2 is a schematic flowchart of a control method provided by an embodiment of the present invention. As shown in FIG. 2, the control method includes S201 and S202:
  • S201: The terminal device receives the user's target input on the target screen.
  • the target screen of the terminal device in the embodiment of the present invention is a screen with a touch function.
  • S202: In response to the target input, the terminal device executes a target action corresponding to the target input on the target content displayed on the first screen.
  • The target screen includes a first screen and a second screen, and the target action is to control the target object in the target content to move on the first screen; or, the target screen is the second screen, and the target action is to control the target object in the target content to move on the first screen, or to superimpose and display special-effect content on the target content.
  • the first screen may be the front screen of the double-sided screen or the back screen of the double-sided screen, which is not specifically limited in the embodiment of the present invention.
  • The target object may be an application icon displayed on the terminal device, a cursor in a document, a character or a piece of equipment in a game interface, a selection box used to crop a screenshot more precisely after a screenshot operation, or an image in a beautification interface.
  • The special-effect content may be a skill in a game, or an additional effect added when editing a picture.
  • For example, the special effect may be a black-and-white effect, a light "fresh" filter effect, or an effect that adds rabbit ears to an image.
  • FIG. 3 is a schematic diagram of controlling a terminal device according to an embodiment of the present invention.
  • As shown in FIG. 3, the user can, through input on screen 32 (the second screen), control the movement of the cursor displayed on screen 31 (the first screen). This avoids the defect that, when editing text, the user's finger covers the cursor position so that the cursor cannot be moved accurately to the desired position.
  • Controlling the movement of the cursor on the first screen through the second screen makes it convenient for the user to quickly move the cursor to the position where text is to be inserted.
  • In this way, the terminal device receives the user's target input on the target screen and then, in response to the target input, executes the target action corresponding to the target input on the target content displayed on the first screen.
  • If the target screen includes the first screen and the second screen, the user can trigger the terminal device, through input on the two screens, to control the movement of the target object in the target content on the first screen.
  • If the target screen is the second screen, the user can trigger the terminal device, through input on the other screen, either to control the movement of the target object in the target content on the first screen, or to superimpose special-effect content on the target content.
  • The terminal device can thus control objects displayed on one screen through input on two screens, or can display special-effect content in the target content on the first screen in response to the user's input on the second screen, which makes operating the terminal device more convenient and quick and improves the user experience.
  • S202 in the foregoing embodiment can be specifically executed by S202a:
  • S202a: In response to the target input, the terminal device executes, when a preset condition is satisfied, the target action corresponding to the target input on the target content displayed on the first screen.
  • The preset condition includes any one of the following: the first screen displays content in infinite screen mode; the first screen displays content in projection mode; the first screen displays the interface of an application of a preset application type; the first screen displays a remote control interface.
  • The infinite screen mode means that the terminal device displays, on the screen, content that exceeds the screen size.
  • For example, the user can slide on the page in a first direction, and content that was not displayed in the display area moves into the display area along a second direction, where the second direction is opposite to the first direction.
  • For example, when the desktop of the terminal device displays application icons in infinite screen mode, assuming there are enough application icons, the user can keep sliding the screen in one direction, and icons of applications that were not displayed are shown in the display area of the terminal device according to the user's sliding operation.
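  The infinite screen behavior described above can be sketched as a sliding window over a list of application icons. This is an illustrative sketch, not the patent's implementation; the class and method names (IconGrid, slide, visible) are assumptions.

```python
# Minimal sketch of "infinite screen" icon scrolling: the desktop holds more
# icons than fit on screen, and sliding in one direction moves undisplayed
# icons into the visible window. All names here are illustrative.

class IconGrid:
    def __init__(self, icons, visible_count):
        self.icons = icons              # all application icons, in order
        self.visible_count = visible_count
        self.offset = 0                 # index of the first visible icon

    def slide(self, delta):
        """Slide by `delta` icons; positive = toward later icons."""
        max_offset = max(0, len(self.icons) - self.visible_count)
        self.offset = min(max(self.offset + delta, 0), max_offset)

    def visible(self):
        return self.icons[self.offset:self.offset + self.visible_count]

grid = IconGrid([f"app{i}" for i in range(10)], visible_count=4)
grid.slide(3)                           # user keeps sliding in one direction
print(grid.visible())                   # previously hidden icons enter the view
```

  Clamping the offset means that sliding past either end simply stops at the first or last page of icons, which matches the "always slide in one direction until no more icons remain" behavior described above.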
  • The projection mode means that the terminal device projects the content displayed on its display screen to the display area of a device with a larger display area.
  • the preset application types may include: game applications, image editing applications, e-book applications, document applications, and so on.
  • Optionally, the terminal device controls the target object to move according to the input parameters of the target input; the target special-effect content is the special-effect content corresponding to the target function indicated by a target function identifier, where the target function identifier is the function identifier displayed at the position corresponding to the target input.
  • In this way, the terminal device executes the target action corresponding to the target input on the target content displayed on the first screen only when the preset condition is met, and can map the same target input to different target actions according to different preset conditions. That is, the terminal device can respond to user input flexibly according to the application scenario, and thus responds to user input more intelligently.
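  The condition-gated behavior of S202a can be sketched as a small dispatch function. The condition names and action labels below are illustrative assumptions, not terms from the patent.

```python
# Sketch of S202a: the terminal executes the target action only when a preset
# condition holds, and may map the same input to different actions depending
# on the scenario. Condition and action names are illustrative.

PRESET_CONDITIONS = {"infinite_screen", "projection", "preset_app", "remote_control"}

def handle_target_input(screen_mode, target_input):
    if screen_mode not in PRESET_CONDITIONS:
        return None                     # preset condition not satisfied: no action
    if target_input["screen"] == "second":
        # Input on the second screen may move an object or overlay an effect.
        return "overlay_effect" if target_input.get("effect") else "move_object"
    return "move_object"                # input involving both screens moves the object

print(handle_target_input("projection", {"screen": "second", "effect": "skill1"}))
```

  The point of the gate is that an input which would move a cursor while projecting, for example, can be ignored or remapped on a screen mode that is not in the preset set.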
  • Optionally, the target screen includes the first screen and the second screen, and the target input includes a first input on the first screen and a second input on the second screen.
  • In the control method provided by the embodiment of the present invention, before the target action corresponding to the target input is executed on the target content displayed on the first screen, S203 is further included:
  • the terminal device determines the target input parameter of the target input according to the target input.
  • S203 can be executed through S203a: the terminal device determines the target input parameter as a = b + k × c, where a represents the target input parameter, b represents the input parameter of the first input, c represents the input parameter of the second input, and k is a scaling coefficient.
  • The k value adjusts the step length that the second input contributes to the target input parameter, so that the second input can control the movement of the target object with a smaller step length. That is, the user can move the target object with a larger step through the first input on the first screen and with a smaller step through the second input on the second screen: the first input on the first screen provides coarse control, and the second input on the second screen provides fine control.
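  A minimal sketch of this combination step, assuming the form a = b + k·c implied by the description (the vector representation and the sample k value are illustrative):

```python
# Sketch of S203a: the target input parameter a combines the coarse first
# input b with the fine second input c scaled by k (a = b + k*c, a form
# inferred from the surrounding description, not stated verbatim).

def target_input_parameter(b, c, k=0.25):
    """b, c: (dx, dy) input vectors of the first and second input."""
    return (b[0] + k * c[0], b[1] + k * c[1])

# A coarse 100 px rightward swipe on the first screen, followed by a 30 px
# leftward fine swipe on the second screen, nudges the target back by
# k * 30 = 7.5 px.
a = target_input_parameter((100.0, 0.0), (-30.0, 0.0))
print(a)  # (92.5, 0.0)
```

  With 0 < k < 1 the second input always moves the target by a fraction of the on-screen gesture length, which is exactly the coarse/fine split described above.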
  • S202 can be executed through S202b:
  • the terminal device controls the target object to move on the first screen according to the target input parameters.
  • The first input and the second input may be inputs made by the user on the two screens at the same time, or the user may make one input on one screen and then make the other input on the other screen; this is not specifically limited in the embodiment of the present invention.
  • The target object that the terminal device controls to move according to the target input parameter may be an object in the target content selected by the user's target input; the target object may also be a cursor displayed on the display interface, or the entire displayed page, which is not specifically limited in the embodiment of the present invention.
  • The terminal device may use only the length in the target input parameter to control the movement distance, or may use both the length and the direction of the target input parameter to control the movement distance and direction, which is not specifically limited in the embodiment of the present invention.
  • The starting point of the movement of the target object may be the starting position of whichever of the first input and the second input is received first by the terminal device, or it may be the position of the target object on the first screen before the target input is received; in actual applications this can be set according to the application scenario, and is not specifically limited in the embodiment of the present invention.
  • screen 31 is the first screen and screen 32 is the second screen.
  • For example, the first input is the user swiping to the right on screen 31, and the second input is the user swiping up on screen 32.
  • Assume the second input is only used to adjust the movement distance and direction of the first input.
  • As another example, the first input is the user sliding down on screen 31, the input parameter of the first input is a vector from point O2 to point B2, and the input parameter of the second input is a vector from point O3 to point C2. Assuming that the second input is used only to adjust the moving distance of the first input, and that the directions of the first input and the second input are opposite, the target input parameter is the first vector shortened by the scaled length of the second; the final display position of the cursor is then point A2.
  • In this way, the user can make a rough adjustment of the target object's position on the first screen, and fine-tune it on the second screen. For example, when searching for an application icon in infinite screen mode, the user can enter the first input on the first screen for a rough search and perform a fine search on the second screen; when drawing or retouching, the user can make rough movements on the first screen and precise movements on the second screen to find the desired position.
  • The step length of the second input on the second screen is adjusted by the k value to perform fine control, so that the user can accurately adjust the position of the target object on the first screen, which makes finding or editing easier.
  • Optionally, the target screen includes the first screen and the second screen, the target object includes a first object and a second object, and the target input includes a first input for the first object on the first screen and a second input for the second object on the second screen.
  • the terminal device controls the first object to move on the first screen according to the input parameters of the first input.
  • the first object may be an object selected by the user's first input, or an object controlled by default input on the first screen, which is not specifically limited in the embodiment of the present invention.
  • the terminal device controls the second object to move on the first screen according to the input parameter of the second input.
  • The object selected by the first input may be the first object, with the second object being another object; the second object may also be the object controlled by default by input on the second screen. This is not specifically limited in the embodiment of the present invention.
  • For example, the terminal device may move the first object from a first position (that is, the position on the first screen corresponding to the first input when the first object is selected by the first input) to a second position (the position at the end of the first input) according to the input parameters of the first input.
  • Alternatively, the terminal device may move the first object from its current position to a third position according to the input parameters of the first input (that is, the position in the interface displayed on the first screen that corresponds to the direction and length of the first input).
  • FIG. 5 is a schematic diagram of controlling a terminal device according to an embodiment of the present invention.
  • the game interface includes characters and shooting equipment, wherein the control 301 is used to indicate the position of the character in the interface, and the control 302 indicates the position of the front sight of the shooting equipment.
  • As shown in FIG. 5, the movement of the character is controlled by the first input on screen 31, and the user can control the movement of the front sight by the second input on screen 32.
  • The user may also set the first input on the first screen to control the movement of the front sight, and the second input on the second screen to control the movement of the character.
  • That is, the user can change the bindings according to his or her own habits, which is not specifically limited in the embodiment of the present invention.
  • In this way, the user can trigger the terminal device, through the input on the first screen and the input on the second screen, to control the movement of the first object and the movement of the second object on the first screen respectively. Different inputs thus control the movement of different objects on the first screen, which makes it more convenient for the user to control the terminal device.
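  The two-object control described above can be sketched as routing each input to the object bound to its screen. The bindings, object names, and coordinates below are illustrative assumptions, and, as the text notes, the user could swap the bindings.

```python
# Sketch: the first input (on the first screen) moves one object and the
# second input (on the second screen) moves another, e.g. a game character
# and a front sight. Bindings and positions are illustrative.

positions = {"character": [0, 0], "front_sight": [50, 50]}
bindings = {"first_screen": "character", "second_screen": "front_sight"}

def apply_input(screen, delta):
    """Route an input gesture on `screen` to the object bound to it."""
    obj = bindings[screen]
    positions[obj][0] += delta[0]
    positions[obj][1] += delta[1]
    return obj, tuple(positions[obj])

apply_input("first_screen", (10, 0))    # first input moves the character
apply_input("second_screen", (0, -5))   # second input moves the front sight
print(positions)  # {'character': [10, 0], 'front_sight': [50, 45]}
```

  Swapping the two entries in `bindings` realizes the user-configurable reversal mentioned above without changing the routing logic.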
  • Optionally, the second screen includes M areas; each area is used to control a different object on the first screen, or each area corresponds to different special-effect content, where M is an integer greater than or equal to 2. The target input includes an input in one of the M areas.
  • the user can also set the entire screen of the second screen to correspond to an object or to correspond to a special effect content, which is not specifically limited in the embodiment of the present invention.
  • When each area on the second screen is used to control a different object on the first screen, this can be applied to scenarios in which multiple objects on the first screen need to be operated: the user does not need to switch objects on the first screen, and can quickly control different objects through operations on the different areas.
  • Figure 6 is a schematic diagram of the area distribution on the second screen.
  • As shown in FIG. 6, the second screen includes 4 areas, namely area 30a, area 30b, area 30c, and area 30d, and each area corresponds to one piece of special-effect content.
  • A game interface is displayed on screen 31.
  • special effect 1 corresponds to area 30a
  • special effect 2 corresponds to area 30b
  • special effect 3 corresponds to area 30c
  • special effect 4 corresponds to area 30d.
  • Through input in the 4 areas on the second screen, the user can control the terminal device to display the different special effects.
  • the user can enter the setting interface, where the user can set the corresponding special effect content for each of the M areas.
  • In this way, the control method provided by the embodiment of the present invention can control multiple objects on one screen more quickly and conveniently. If the user wants objects on the first screen to display special effects, the user can operate the M areas on the second screen separately to trigger the terminal device to display different special effects on the first screen; for example, when playing a game, the user can trigger the character to use different skills by operating the M areas on the second screen. This is more convenient and quicker than the way special-effect content is switched in the related technology, and makes the user experience better.
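  The M-area mapping can be sketched as resolving a touch position on the second screen to the special effect configured for that area, loosely following the four-area layout of FIG. 6. The screen size, area bounds, and effect names are illustrative assumptions.

```python
# Sketch: the second screen is split into M areas (here a 2x2 grid), each
# configured with one special effect; a tap is resolved to an area and the
# corresponding effect is overlaid on the first screen's content.
# Area bounds and effect names are illustrative.

SCREEN_W, SCREEN_H = 400, 800
EFFECTS = {(0, 0): "effect1", (1, 0): "effect2",
           (0, 1): "effect3", (1, 1): "effect4"}

def effect_for_touch(x, y):
    """Map a touch at (x, y) on the second screen to its area's effect."""
    col = 0 if x < SCREEN_W / 2 else 1
    row = 0 if y < SCREEN_H / 2 else 1
    return EFFECTS[(col, row)]

print(effect_for_touch(100, 100))   # top-left area -> effect1
print(effect_for_touch(300, 700))   # bottom-right area -> effect4
```

  A settings interface, as described above, would simply rewrite the entries of the `EFFECTS` table for each area.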
  • the second screen includes N first areas and K second areas, each first area is used to control different objects on the first screen, and each second area corresponds to different special effect content.
  • N and K are positive integers; wherein, the target input includes an input in a first area among the N first areas, or an input in a second area among the K second areas.
  • FIG. 7 is a schematic diagram of the distribution of the areas on the second screen. It is assumed that the two first areas are area 30f and area 30h, and the two second areas are area 30e and area 30g. Here, area 30f can control the movement of the front sight of character 1's weapon, area 30h can control the movement of character 1, area 30e can be used to switch weapons, and area 30g can control character 1's jump.
  • FIG. 7 is only an exemplary illustration. In actual applications, users can set different functions in different areas according to different games according to their needs, which is not specifically limited in the embodiment of the present invention.
  • the terminal device can control the target object on the first screen through the first area of the second screen, and display special effects through the second area of the second screen.
  • In this way, the user can control the movement of game characters and trigger in-game skills through the second screen, which is convenient to operate; the user does not need to connect an external gamepad to the terminal, which makes the operating experience better.
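  The mixed layout of first areas (movement) and second areas (special effects) can be sketched as a single region table whose entries either move an object or trigger an effect, loosely following FIG. 7. The rectangles and labels are illustrative assumptions, not taken from the patent figure.

```python
# Sketch: the second screen contains N first areas (each controlling an
# object's movement) and K second areas (each triggering an effect such as
# switching weapons or jumping). Region bounds and labels are illustrative.

REGIONS = [
    # (x0, y0, x1, y1, kind, target)
    (0,   0,   200, 400, "move",   "front_sight"),    # like area 30f
    (0,   400, 200, 800, "move",   "character"),      # like area 30h
    (200, 0,   400, 400, "effect", "switch_weapon"),  # like area 30e
    (200, 400, 400, 800, "effect", "jump"),           # like area 30g
]

def resolve(x, y):
    """Return (kind, target) for a touch at (x, y) on the second screen."""
    for x0, y0, x1, y1, kind, target in REGIONS:
        if x0 <= x < x1 and y0 <= y < y1:
            return kind, target
    return None

print(resolve(100, 500))   # ('move', 'character')
print(resolve(300, 100))   # ('effect', 'switch_weapon')
```

  Because movement areas and effect areas share one table, adding or rebinding areas per game, as the text suggests users may do, only means editing `REGIONS`.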
  • FIG. 8 is a schematic diagram of a possible structure of a terminal device provided by an embodiment of the present invention.
  • The terminal device 800 includes a receiving module 801 and an execution module 802. The receiving module 801 is configured to receive the user's target input on the target screen; the execution module 802 is configured to, in response to the target input received by the receiving module 801, execute the target action corresponding to the target input on the target content displayed on the first screen. The target screen includes the first screen and the second screen, and the target action is to control the target object in the target content to move on the first screen; or, the target screen is the second screen, and the target action is to control the target object in the target content to move on the first screen, or to superimpose and display special-effect content on the target content.
  • Optionally, the execution module 802 is specifically configured to execute the target action corresponding to the target input on the target content displayed on the first screen when a preset condition is met.
  • The preset condition includes any one of the following: the first screen displays content in infinite screen mode; the first screen displays content in projection mode; the first screen displays the interface of an application of a preset application type; the first screen displays a remote control interface.
  • the target screen includes a first screen and a second screen
  • the target input includes a first input on the first screen and a second input on the second screen
  • Optionally, the terminal device 800 further includes a determining module 803. The determining module 803 is configured to determine the target input parameter of the target input according to the target input before the execution module 802 executes the target action corresponding to the target input on the target content displayed on the first screen; the execution module 802 is specifically configured to control the target object to move on the first screen according to the target input parameter determined by the determining module 803.
  • Optionally, the target screen includes the first screen and the second screen, the target object includes the first object and the second object, and the target input includes a first input for the first object on the first screen and a second input for the second object on the second screen. The execution module 802 is specifically configured to: in response to the first input, control the first object to move on the first screen according to the input parameters of the first input; and in response to the second input, control the second object to move on the first screen according to the input parameters of the second input.
  • the second screen includes M areas, each area is used to control different objects on the first screen, or each area corresponds to different special effect content, M is an integer greater than or equal to 2; wherein, the target input includes Input in one of the M areas.
  • the second screen includes N first areas and K second areas, each first area is used to control different objects on the first screen, and each second area corresponds to different special effect content, N and K Both are positive integers; where the target input includes an input in a first area among the N first areas or an input in a second area among the K second areas.
  • the terminal device 800 provided in the embodiment of the present invention can implement each process implemented by the terminal device in the foregoing method embodiment, and in order to avoid repetition, details are not described herein again.
  • the terminal device receives a user's target input on the target screen, and then in response to the target input, the terminal device executes a target action corresponding to the target input on the target content displayed on the first screen.
  • the target screen includes the first screen and the second screen
  • the user can trigger the terminal device to control the movement of the target object in the target content on the first screen through input on the two screens.
  • the target screen is the second screen
  • the user can trigger the terminal device to control the movement of the target object in the target content on the first screen through input on the other screen, or trigger the terminal device to superimpose special effect content on the target content displayed on the first screen through input on the other screen
  • compared with the current control mode, in which a terminal device controls objects on a screen only through input on that same screen, the terminal device can control the objects displayed on one screen through two screens, or display the special effect content in the target content on the first screen in response to the user's input on the second screen, which makes operating the terminal device more convenient and quick and improves the user experience.
  • the terminal device 100 includes but is not limited to: a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, and a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, and a power supply 111 and other components.
  • terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminal devices, wearable devices, and pedometers.
  • the user input unit 107 is configured to receive the user's target input on the target screen; the processor 110 is configured to, in response to the target input, execute the target action corresponding to the target input on the target content displayed on the first screen; wherein the target screen includes a first screen and a second screen
  • the target action is to control the target object in the target content to move on the first screen; or, the target screen is the second screen, and the target action is to control the target object in the target content to move on the first screen, or to superimpose special effect content on the target content.
  • the terminal device receives the user's target input on the target screen, and then in response to the target input, the terminal device executes the target action corresponding to the target input on the target content displayed on the first screen.
  • the target screen includes the first screen and the second screen
  • the user can trigger the terminal device to control the movement of the target object in the target content on the first screen through input on the two screens.
  • the target screen is the second screen
  • the user can trigger the terminal device to control the movement of the target object in the target content on the first screen through input on the other screen, or trigger the terminal device to superimpose special effect content on the target content displayed on the first screen through input on the other screen
  • compared with the current control mode, in which a terminal device controls objects on a screen only through input on that same screen, the terminal device can control the objects displayed on one screen through two screens, or display the special effect content in the target content on the first screen in response to the user's input on the second screen, which makes operating the terminal device more convenient and quick and improves the user experience.
  • the radio frequency unit 101 can be used to receive and send signals during the process of sending and receiving information or during a call; specifically, downlink data from the base station is received and passed to the processor 110 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into audio signals and output them as sounds. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (e.g., call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • the input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes the image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • in the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output.
  • the terminal device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear.
  • the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the terminal device (such as horizontal/vertical screen switching, related games, magnetometer attitude calibration) and for vibration-recognition-related functions (such as pedometer, tapping); the sensor 105 may also include a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer, infrared sensor, etc., which will not be repeated here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 107 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the terminal device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • the touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the position of the user's touch and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110.
  • the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1071 can be overlaid on the display panel 1061.
  • when the touch panel 1071 detects a touch operation on or near it, the operation is transmitted to the processor 110 to determine the type of the touch event
  • the processor 110 then provides corresponding visual output on the display panel 1061 according to the type of the touch event.
  • although the touch panel 1071 and the display panel 1061 are used as two independent components to implement the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to implement the input and output functions of the terminal device; this is not specifically limited here.
  • the interface unit 108 is an interface for connecting an external device with the terminal device 100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • the interface unit 108 can be used to receive input from an external device (for example, data information, power) and transmit the received input to one or more elements within the terminal device 100, or can be used to transfer data between the terminal device 100 and an external device.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area.
  • the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function, an image playback function), etc.; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book), etc.
  • the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
  • the processor 110 is the control center of the terminal device; it connects the various parts of the entire terminal device using various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the terminal device as a whole.
  • the processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, etc., and the modem processor mainly handles wireless communication; it can be understood that the foregoing modem processor may alternatively not be integrated into the processor 110.
  • the terminal device 100 may also include a power source 111 (such as a battery) for supplying power to various components.
  • the power source 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
  • the terminal device 100 includes some functional modules not shown, which will not be repeated here.
  • an embodiment of the present invention further provides a terminal device.
  • with reference to FIG. 10, the terminal device includes a processor 110, a memory 109, and a computer program stored in the memory 109 and runnable on the processor 110; when the computer program is executed by the processor 110, each process of the foregoing control method embodiment is realized and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the terminal device receives the user's target input on the target screen, and then in response to the target input, the terminal device executes the target action corresponding to the target input on the target content displayed on the first screen.
  • the target screen includes the first screen and the second screen
  • the user can trigger the terminal device to control the movement of the target object in the target content on the first screen through input on the two screens.
  • the target screen is the second screen
  • the user can trigger the terminal device to control the movement of the target object in the target content on the first screen through input on the other screen, or trigger the terminal device to superimpose special effect content on the target content displayed on the first screen through input on the other screen
  • compared with the current control mode, in which a terminal device controls objects on a screen only through input on that same screen, the terminal device can control the objects displayed on one screen through two screens, or display the special effect content in the target content on the first screen in response to the user's input on the second screen, which makes operating the terminal device more convenient and quick and improves the user experience.
  • the embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored.
  • when the computer program is executed by a processor, each process of the above-mentioned control method embodiment is realized and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
  • the computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
  • the technical solution of the present invention, in essence, or the part contributing to the related technology, can be embodied in the form of a software product.
  • the computer software product is stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disk) and includes several instructions to make a terminal device (which can be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) execute the method described in each embodiment of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Telephone Function (AREA)

Abstract

A control method and a terminal device. The method includes: the terminal device receives a user's target input on a target screen (S201); in response to the target input, the terminal device executes a target action, corresponding to the target input, on the content displayed on a first screen (S202); wherein the target screen includes the first screen and a second screen, and the target action is to control a target object in the target content to move on the first screen; or, the target screen is the second screen, and the target action is to control a target object in the target content to move on the first screen, or to superimpose special-effect content on the target content.

Description

Control method and terminal device
Cross-reference to related applications
This application claims priority to Chinese Patent Application No. 201910092727.X, filed in China on January 30, 2019, the entire contents of which are incorporated herein by reference.
Technical field
Embodiments of the present invention relate to the field of terminal technology, and in particular to a control method and a terminal device.
Background
With the development of terminal technology, users use terminal devices in an increasing number of scenarios, such as editing documents, processing pictures, and playing games.
Typically, a user can edit a document on a terminal device. During editing, the user may need to insert content at a certain position: the user drags the cursor on the screen to move it to the position where the content is to be inserted, and then enters the content at that position.
However, with this way of controlling the terminal device, the user's finger may block the cursor on the screen while dragging it, so the user may need to move back and forth within a small area several times before the cursor reaches the position where the content is to be inserted, which makes controlling the terminal device inconvenient.
Summary
Embodiments of the present invention provide a control method and a terminal device to solve the problem that controlling a terminal device is inconvenient.
To solve the above technical problem, the embodiments of the present invention are implemented as follows:
In a first aspect, an embodiment of the present invention provides a control method, including: receiving a user's target input on a target screen; and in response to the target input, executing a target action, corresponding to the target input, on the target content displayed on a first screen; wherein the target screen includes the first screen and a second screen, and the target action is to control a target object in the target content to move on the first screen; or, the target screen is the second screen, and the target action is to control a target object in the target content to move on the first screen, or to superimpose special-effect content on the target content.
In a second aspect, an embodiment of the present invention further provides a terminal device, including a receiving module and an execution module. The receiving module is configured to receive a user's target input on a target screen. The execution module is configured to, in response to the target input received by the receiving module, execute a target action, corresponding to the target input, on the target content displayed on a first screen; wherein the target screen includes the first screen and a second screen, and the target action is to control a target object in the target content to move on the first screen; or, the target screen is the second screen, and the target action is to control a target object in the target content to move on the first screen, or to superimpose special-effect content on the target content.
In a third aspect, an embodiment of the present invention provides a terminal device, including a processor, a memory, and a computer program stored in the memory and runnable on the processor; when the computer program is executed by the processor, the steps of the control method according to the first aspect are implemented.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, the steps of the control method according to the first aspect are implemented.
In the embodiments of the present invention, the terminal device first receives a user's target input on the target screen, and then, in response to the target input, executes a target action, corresponding to the target input, on the target content displayed on the first screen. When the target screen includes the first screen and the second screen, the user can trigger the terminal device, through inputs on the two screens, to control the movement of the target object in the target content on the first screen. When the target screen is the second screen, the user can trigger the terminal device, through input on the other screen, to control the movement of the target object in the target content on the first screen, or to superimpose special-effect content on the target content displayed on the first screen. Compared with the current control mode, in which a terminal device controls objects on a screen only through input on that same screen, in the embodiments of the present invention the terminal device can control an object displayed on one screen through two screens, or display special-effect content in the target content on the first screen in response to the user's input on the second screen, which makes operating the terminal device more convenient and quick and improves the user experience.
Brief description of the drawings
FIG. 1 is a schematic architecture diagram of a possible Android operating system according to an embodiment of the present invention;
FIG. 2 is a schematic flowchart of a control method according to an embodiment of the present invention;
FIG. 3 is a first schematic diagram of controlling a terminal device according to an embodiment of the present invention;
FIG. 4 is a second schematic diagram of controlling a terminal device according to an embodiment of the present invention;
FIG. 5 is a third schematic diagram of controlling a terminal device according to an embodiment of the present invention;
FIG. 6 is a first schematic diagram of the distribution of areas on the second screen according to an embodiment of the present invention;
FIG. 7 is a second schematic diagram of the distribution of areas on the second screen according to an embodiment of the present invention;
FIG. 8 is a first schematic structural diagram of a possible terminal device according to an embodiment of the present invention;
FIG. 9 is a second schematic structural diagram of a possible terminal device according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of the hardware structure of a terminal device according to the embodiments of the present invention.
Detailed description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
It should be noted that "/" herein means "or"; for example, A/B can mean A or B. "And/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B can mean: A alone, both A and B, or B alone. "Multiple" means two or more than two.
The terms "first" and "second" in the specification and claims of the present invention are used to distinguish different objects, not to describe a particular order of objects. For example, the first screen and the second screen are used to distinguish different screens, not to describe a particular order of the screens.
It should be noted that, in the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, illustration, or explanation. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention should not be construed as preferable or more advantageous than other embodiments or designs. Rather, words such as "exemplary" or "for example" are intended to present the relevant concept in a concrete manner.
The terminal device in the embodiments of the present invention may be a terminal device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present invention.
Taking the Android operating system as an example, the software environment to which the control method provided by the embodiments of the present invention is applied is introduced below.
FIG. 1 is a schematic architecture diagram of a possible Android operating system according to an embodiment of the present invention. In FIG. 1, the architecture of the Android operating system includes four layers: an application layer, an application framework layer, a system runtime library layer, and a kernel layer (which may specifically be a Linux kernel layer).
The application layer includes the applications in the Android operating system (including system applications and third-party applications).
The application framework layer is the framework of applications; developers can develop applications based on the application framework layer while complying with its development principles.
The system runtime library layer includes libraries (also called system libraries) and the Android operating system runtime environment. The libraries mainly provide the Android operating system with the various resources it needs. The Android operating system runtime environment provides the software environment for the Android operating system.
The kernel layer is the operating system layer of the Android operating system and is the lowest layer of the Android software hierarchy. Based on the Linux kernel, the kernel layer provides core system services and hardware-related drivers for the Android operating system.
Taking the Android operating system as an example, in the embodiments of the present invention, developers can develop, based on the system architecture of the Android operating system shown in FIG. 1, a software program implementing the control method provided by the embodiments of the present invention, so that the control method can run on the Android operating system shown in FIG. 1. That is, a processor or a terminal device can implement the control method provided by the embodiments of the present invention by running the software program in the Android operating system.
The control method of the embodiments of the present invention is described below with reference to FIG. 2. FIG. 2 is a schematic flowchart of a control method according to an embodiment of the present invention; as shown in FIG. 2, the control method includes S201 and S202:
S201: The terminal device receives a user's target input on a target screen.
Optionally, the target screen of the terminal device in the embodiments of the present invention is a screen with a touch function.
S202: In response to the target input, the terminal device executes a target action, corresponding to the target input, on the target content displayed on the first screen.
The target screen includes the first screen and the second screen, and the target action is to control a target object in the target content to move on the first screen.
Alternatively, the target screen is the second screen, and the target action is to control a target object in the target content to move on the first screen, or to superimpose special-effect content on the target content.
It should be noted that the first screen may be the front screen or the back screen of a double-sided screen, which is not specifically limited in the embodiments of the present invention.
Exemplarily, the target object may be an application icon displayed on the terminal device, a cursor in a document, a character or equipment in a game interface, a selection box used to crop a screenshot more precisely after a screenshot operation, or an image in a beautification interface. The special-effect content may be a skill in a game, or an additional effect added when editing a picture; for example, the effect may be a black-and-white effect, a fresh-tone filter, an effect that adds rabbit ears to an image, and so on.
The following description takes the case where the target screen is the second screen and the target action is to control a target object in the target content to move on the first screen as an example. Exemplarily, FIG. 3 is a schematic diagram of controlling a terminal device according to an embodiment of the present invention. Suppose screen 31 (the first screen) displays a text-editing interface; the user can enter input on screen 32 (the second screen) to control the movement of the cursor displayed on screen 31 (the first screen). This avoids the defect that, while editing text, the user cannot precisely move the cursor to the desired position because the user's finger blocks the cursor. In the embodiments of the present invention, by controlling the movement of the cursor on the first screen from the second screen, the user can quickly move the cursor to the position where text is to be inserted.
With the control method provided by the embodiments of the present invention, the terminal device first receives a user's target input on the target screen, and then, in response to the target input, executes a target action, corresponding to the target input, on the target content displayed on the first screen. When the target screen includes the first screen and the second screen, the user can trigger the terminal device, through inputs on the two screens, to control the movement of the target object in the target content on the first screen. When the target screen is the second screen, the user can trigger the terminal device, through input on the other screen, to control the movement of the target object in the target content on the first screen, or to superimpose special-effect content on the target content displayed on the first screen. Compared with the current control mode, in which a terminal device controls objects on a screen only through input on that same screen, in the embodiments of the present invention the terminal device can control an object displayed on one screen through two screens, or display special-effect content in the target content on the first screen in response to the user's input on the second screen, which makes operating the terminal device more convenient and quick and improves the user experience.
In a possible implementation of the control method provided by the embodiments of the present invention, S202 in the above embodiment may specifically be performed through S202a:
S202a: In response to the target input, and when a preset condition is met, the terminal device executes the target action, corresponding to the target input, on the target content displayed on the first screen.
The preset condition includes any one of the following: the first screen displays content in infinite-screen mode, the first screen displays content in projection mode, the first screen displays the interface of an application of a preset application type, or the first screen displays a remote-control interface.
Generally, infinite-screen mode means that the terminal device displays content exceeding the screen size. When a page is displayed in infinite-screen mode, the user can slide on the page in a first direction to bring content lying outside the display area in a second direction into the display area, where the second direction is opposite to the first direction. When the desktop displaying application icons shows the icons in infinite-screen mode and there are enough icons, the user can keep sliding the screen in one direction, and the icons of applications not yet displayed are shown in the display area of the terminal device according to the user's sliding operation.
Generally, projection mode means that the terminal device projects the content displayed on its screen onto the display area of a device with a larger display area.
Optionally, the preset application types may include game applications, photo-editing applications, e-book applications, document applications, and the like.
Optionally, the terminal device controls the movement of the target object according to the input parameters of the target input; the target special-effect content is the special-effect content corresponding to the target function indicated by a target function identifier, and the target function identifier is the function identifier displayed at the position corresponding to the target input.
Based on this solution, the terminal device can, when a preset condition is met, execute the target action corresponding to the target input on the target content displayed on the first screen, and can determine different target actions for the target input according to different preset conditions; that is, the terminal device can respond to the user's input flexibly according to the application scenario, making the response more intelligent.
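The preset-condition gating of S202a can be sketched as a simple membership check before dispatching the target action. The `ScreenMode` names and the `handle_target_input` helper below are illustrative assumptions for this sketch, not part of the embodiment:

```python
from enum import Enum, auto

class ScreenMode(Enum):
    """Illustrative display states of the first screen (names assumed)."""
    INFINITE = auto()        # content displayed in infinite-screen mode
    PROJECTION = auto()      # content displayed in projection mode
    PRESET_APP = auto()      # interface of an app of a preset type (game, photo editor, ...)
    REMOTE_CONTROL = auto()  # remote-control interface
    NORMAL = auto()          # none of the preset conditions holds

# The four preset conditions listed in the embodiment.
PRESET_CONDITIONS = {ScreenMode.INFINITE, ScreenMode.PROJECTION,
                     ScreenMode.PRESET_APP, ScreenMode.REMOTE_CONTROL}

def handle_target_input(mode: ScreenMode, target_input) -> bool:
    """Execute the target action only when a preset condition is met (S202a)."""
    if mode not in PRESET_CONDITIONS:
        return False  # input not acted upon: no preset condition holds
    # ... execute the target action on the content of the first screen ...
    return True

print(handle_target_input(ScreenMode.INFINITE, "swipe"))  # True
print(handle_target_input(ScreenMode.NORMAL, "swipe"))    # False
```

In practice the mode would be queried from the window or display manager at the moment the input arrives, so the same gesture can map to different actions in different scenarios.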
In a possible implementation, the target screen includes the first screen and the second screen, and the target input includes a first input on the first screen and a second input on the second screen. The control method provided by the embodiments of the present invention further includes, before executing the target action corresponding to the target input on the target content displayed on the first screen, S203:
S203: The terminal device determines, according to the target input, the target input parameter of the target input.
Specifically, S203 may be performed through S203a:
S203a: According to the target input, the terminal device determines the target input parameter of the target input using the formula a = b + k × c,
where a denotes the target input parameter, b denotes the input parameter of the first input, c denotes the input parameter of the second input, and 0 < k < 1.
Optionally, the coefficient k may be a system default, for example k = 0.3 or k = 0.8; the coefficient k may also be set manually by the user, which is not specifically limited in the embodiments of the present invention.
It can be understood that the value of k adjusts the step size that the second input contributes to the target input parameter, so that the second input can control the movement of the target object with a smaller step. In other words, the user can make larger-step movements through the first input on the first screen and smaller-step movements through the second input on the second screen; that is, the first input on the first screen is coarse control and the second input on the second screen is fine control.
Then, S202 may be performed through S202b:
S202b: In response to the target input, the terminal device controls the target object to move on the first screen according to the target input parameter.
It should be noted that the first input and the second input may be entered by the user on the two screens simultaneously, or the user may enter one input on one screen first and then the other input on the other screen; this is not specifically limited in the embodiments of the present invention.
Optionally, in the embodiments of the present invention, the target object that the terminal device moves according to the target input parameter may be an object in the target content selected by the user's target input, a cursor displayed in the display interface, or the entire displayed page; this is not specifically limited in the embodiments of the present invention.
It should be noted that the terminal device may use only the length in the target input parameter to control the movement distance, or may use both the length and the direction of the target input parameter to control the movement distance and direction; this is not specifically limited in the embodiments of the present invention.
It should be noted that the starting point of the target object's movement may be the starting position of whichever of the first input and the second input the terminal device receives first, or the position of the target object on the first screen before the target input is received; in practice this can be set according to the application scenario, and is not specifically limited in the embodiments of the present invention.
Exemplarily, as shown in FIG. 4, suppose screen 31 is the first screen and screen 32 is the second screen, the first input is the user sliding right on screen 31, and the second input is the user sliding up on screen 32; suppose the second input is only used to adjust the movement distance and direction of the first input. The input parameter of the first input is the vector O1B1 (sliding from point O1 to point B1), which includes the sliding distance and sliding direction of the first input on screen 31; the input parameter of the second input is the vector O1C1 (sliding from point O1 to point C1). Assuming k = 0.3, the target input parameter is the vector O1A1 = O1B1 + 0.3 × O1C1. If the target object is a cursor displayed on the screen, the cursor's final position is obtained by moving it from its position before the first and second inputs, in the direction indicated by the vector O1A1, by the length indicated by the vector O1A1, that is, from point O1 to point A1.
Suppose instead that the first input is the user sliding down on screen 31, the input parameter of the first input is the vector O2B2 (moving from point O2 to point B2), and the input parameter of the second input is the vector O3C2 (moving from point O3 to point C2). Assuming the second input is only used to adjust the movement distance of the first input and the two inputs are in opposite directions, the target input parameter is the vector O2A2 = O2B2 − 0.3 × O3C2, and the final display position of the cursor is point A2.
With reference to the above examples, the user can coarsely adjust the position of a target object on the first screen and finely adjust it on the second screen. For example, when looking for an application icon in infinite-screen mode, the user can enter the first input on the first screen for a coarse search and search finely on the second screen. When drawing or retouching a picture, the user can move coarsely on the first screen and precisely on the second screen to reach the desired position.
Based on this solution, when the target screen includes the first screen and the second screen and the target input includes a first input on the first screen and a second input on the second screen, the terminal device first determines the target input parameter of the target input according to the target input, for example using the formula a = b + k × c, and then controls the target object to move on the first screen according to the target input parameter. Coarse control is performed through the first input on the first screen, and fine control through the second input on the second screen with its step size adjusted by the value of k, so that the user can precisely adjust the position of the target object on the first screen, which facilitates searching and editing.
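The combination rule a = b + k × c can be applied component-wise to 2-D swipe vectors, as in this minimal sketch; the function name and tuple representation of the vectors are assumptions for illustration:

```python
def combine_inputs(first, second, k=0.3):
    """Target input parameter a = b + k * c (S203a).

    `first` is the coarse swipe vector b from the first screen,
    `second` the fine swipe vector c from the second screen, with 0 < k < 1
    so the second screen contributes with a smaller step size.
    """
    if not 0 < k < 1:
        raise ValueError("k must satisfy 0 < k < 1")
    bx, by = first
    cx, cy = second
    return (bx + k * cx, by + k * cy)

# First input slides right by 100 px; second input slides left by 50 px:
# the cursor moves 100 + 0.3 * (-50) = 85 px to the right.
print(combine_inputs((100, 0), (-50, 0)))  # (85.0, 0.0)
```

Choosing k closer to 0 makes the second screen a finer adjustment knob; the FIG. 4 example with opposite directions corresponds to a negative component in `second`, which shortens the movement, as above.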
In a possible implementation, the target screen includes the first screen and the second screen, the target object includes a first object and a second object, and the target input includes a first input for the first object on the first screen and a second input for the second object on the second screen. In the control method provided by the embodiments of the present invention, S202 may specifically be performed through S202c1 and S202c2:
S202c1: In response to the first input, the terminal device controls the first object to move on the first screen according to the input parameters of the first input.
Optionally, the first object may be the object selected by the user's first input, or the object controlled by default by input on the first screen; this is not specifically limited in the embodiments of the present invention.
S202c2: In response to the second input, the terminal device controls the second object to move on the first screen according to the input parameters of the second input.
Specifically, assuming the first screen includes two movable objects, the object selected by the first input may be the first object and the other object the second object; the second object may also be the object controlled by default by input on the second screen; this is not specifically limited in the embodiments of the present invention.
Exemplarily, if the first object is the object selected by the first input, the terminal device may move the first object from a first position (that is, the position on the first screen corresponding to the first input when it selects the first object) to a second position (the position when the first input ends) according to the input parameters of the first input. If the first object is the object controlled by default by input on the first screen, the terminal device moves the first object from its current position to a third position (that is, the position in the interface displayed on the first screen corresponding to the direction and length of the first input) according to the input parameters of the first input.
Exemplarily, FIG. 5 is a schematic diagram of controlling a terminal device according to an embodiment of the present invention. As shown in FIG. 5, suppose screen 31 displays a game interface including a character and shooting equipment, where control 301 indicates the position of the character in the interface and control 302 indicates the position of the crosshair of the shooting equipment; the user can control the character's movement through the first input on screen 31 and the crosshair's movement through the second input on screen 32.
It should be noted that the above is only an exemplary description; the user may also set the first input on the first screen to control the crosshair and the second input on the second screen to control the character, and may change this according to personal habit; this is not specifically limited in the embodiments of the present invention.
Based on this solution, through inputs on the first screen and the second screen, the user can trigger the terminal device to control the movement of the first object and of the second object on the first screen respectively; that is, the terminal device can control the movement of different objects on the first screen according to the user's inputs on different screens, which makes controlling the terminal device more convenient.
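A minimal dispatch for the two-object scheme above might look like the following; the object names (character, crosshair) follow the game example of FIG. 5, and the class and mapping structure are assumptions of this sketch:

```python
class DualScreenController:
    """Route input: the first screen moves the first object and the
    second screen moves the second object (S202c1/S202c2)."""

    def __init__(self):
        # Both objects live on the first screen; positions are [x, y].
        self.positions = {"character": [0, 0], "crosshair": [0, 0]}
        # Which on-screen object each physical screen controls (user-configurable,
        # per the note that the assignment can be swapped).
        self.mapping = {"first_screen": "character", "second_screen": "crosshair"}

    def on_input(self, screen: str, delta: tuple) -> list:
        """Apply an input's (dx, dy) parameters to the object bound to `screen`."""
        obj = self.mapping[screen]
        pos = self.positions[obj]
        pos[0] += delta[0]
        pos[1] += delta[1]
        return pos

ctrl = DualScreenController()
print(ctrl.on_input("first_screen", (5, 0)))    # [5, 0]  - character moved
print(ctrl.on_input("second_screen", (0, -3)))  # [0, -3] - crosshair moved
```

Swapping the entries of `mapping` implements the user preference mentioned in the text (first screen controls the crosshair, second screen the character) without touching the dispatch logic.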
In a possible implementation, the second screen includes M areas, each area being used to control a different object on the first screen, or each area corresponding to different special-effect content, where M is an integer greater than or equal to 2; the target input includes an input in one of the M areas.
It should be noted that the user may of course also set the entire second screen to correspond to one object or one piece of special-effect content; this is not specifically limited in the embodiments of the present invention.
It can be understood that when each area on the second screen is used to control a different object on the first screen, this can be applied to scenarios where multiple objects on the first screen all need to be operated; the user does not need to switch objects on the first screen and can quickly control different objects through operations in different areas.
Exemplarily, FIG. 6 is a schematic diagram of the distribution of areas on the second screen. Suppose the second screen includes four areas, namely area 30a, area 30b, area 30c, and area 30d, each corresponding to one piece of special-effect content, and suppose screen 31 displays a game interface in which effect 1 corresponds to area 30a, effect 2 to area 30b, effect 3 to area 30c, and effect 4 to area 30d. The user can enter input in the four areas of the second screen to control the terminal device to display different special-effect content.
It should be noted that the above only takes the case where each area of the second screen corresponds to different special-effect content as an example; for areas corresponding to different objects, reference may be made to the above description of areas corresponding to special-effect content, which will not be repeated here.
Specifically, the user can enter a settings interface and set the corresponding special-effect content for each of the M areas.
Based on this solution, if the user needs to control different objects on the first screen, the user can control them through the M areas on the second screen. Compared with the related art, in which the user has to switch back and forth between the controlled objects on the screen, the control mode provided by the embodiments of the present invention can control multiple objects on one screen more quickly and conveniently. If the user needs to display special-effect content on the objects on the first screen, the user can operate in the M areas of the second screen separately and trigger the terminal device to display different special-effect content on the first screen; for example, while playing a game, the user can trigger an object to use different skills by operating the M areas on the second screen. Compared with the related art, in which displaying special-effect content requires switching, this is more convenient and quick, providing a better user experience.
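The M-region scheme of FIG. 6 reduces to a lookup from region to effect, as in this sketch; the region identifiers follow the figure, while the effect names and function are illustrative assumptions:

```python
# Each of the M (= 4 in FIG. 6) areas of the second screen maps to one
# special effect to superimpose on the content of the first screen.
# The user could rebind these in a settings interface.
REGION_EFFECTS = {
    "30a": "effect 1",
    "30b": "effect 2",
    "30c": "effect 3",
    "30d": "effect 4",
}

def trigger_effect(region_id: str) -> str:
    """Return the special-effect content for an input landing in `region_id`."""
    try:
        return REGION_EFFECTS[region_id]
    except KeyError:
        raise ValueError(f"unknown region: {region_id}") from None

print(trigger_effect("30b"))  # effect 2
```

Mapping each region to a different controllable object instead of an effect is the same table with object handles as values.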
In a possible implementation, the second screen includes N first areas and K second areas, each first area being used to control a different object on the first screen and each second area corresponding to different special-effect content, where N and K are both positive integers; the target input includes an input in one of the N first areas or an input in one of the K second areas.
Exemplarily, FIG. 7 is a schematic diagram of the distribution of areas on the second screen. Suppose the two first areas are area 30f and area 30h and the two second areas are area 30e and area 30g, where area 30f can control the movement of the crosshair of character 1's weapon, area 30h can control the movement of character 1, area 30e can be used to switch weapons, and area 30g can make character 1 jump.
It should be noted that FIG. 7 is only an exemplary illustration; in practice, the user can set different functions in different areas according to their own needs for different games, which is not specifically limited in the embodiments of the present invention.
Based on this solution, the terminal device can control the target object on the first screen through the first areas of the second screen and display special effects through the second areas of the second screen. For example, in a game interface, the user can control the movement of a game character through the second screen and also trigger skills in the game through the second screen, which facilitates operation; the user does not need to connect an external gamepad to the terminal, providing a better operating experience.
It should be noted that, in practical applications, the user can configure, in the terminal device, the functions corresponding to the first input on the first screen and the second input on the second screen for different applications; this is not specifically limited in the embodiments of the present invention.
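Classifying a touch point into one of the N first areas or K second areas is a hit test over rectangles. The rectangle coordinates below are made-up values for a FIG. 7-like layout, and all names are assumptions of this sketch:

```python
from typing import Optional, Tuple

Rect = Tuple[int, int, int, int]  # (left, top, right, bottom)

# Hypothetical layout: two first areas (control objects) and two second
# areas (trigger effects), loosely following FIG. 7's area ids.
FIRST_AREAS = {"30f": (0, 0, 50, 50), "30h": (0, 50, 50, 100)}       # crosshair / character
SECOND_AREAS = {"30e": (50, 0, 100, 50), "30g": (50, 50, 100, 100)}  # switch weapon / jump

def hit(rect: Rect, x: int, y: int) -> bool:
    l, t, r, b = rect
    return l <= x < r and t <= y < b

def classify(x: int, y: int) -> Optional[Tuple[str, str]]:
    """Return (kind, area_id): 'object' for first areas, 'effect' for second areas."""
    for area_id, rect in FIRST_AREAS.items():
        if hit(rect, x, y):
            return ("object", area_id)
    for area_id, rect in SECOND_AREAS.items():
        if hit(rect, x, y):
            return ("effect", area_id)
    return None  # touch outside all configured areas

print(classify(10, 60))  # ('object', '30h')
print(classify(60, 10))  # ('effect', '30e')
```

Per-application configurability then amounts to loading a different pair of area tables for each game.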
FIG. 8 is a schematic structural diagram of a possible terminal device according to an embodiment of the present invention. As shown in FIG. 8, the terminal device 800 includes a receiving module 801 and an execution module 802. The receiving module 801 is configured to receive a user's target input on a target screen. The execution module 802 is configured to, in response to the target input received by the receiving module 801, execute a target action, corresponding to the target input, on the target content displayed on the first screen; wherein the target screen includes the first screen and the second screen, and the target action is to control a target object in the target content to move on the first screen; or, the target screen is the second screen, and the target action is to control a target object in the target content to move on the first screen, or to superimpose special-effect content on the target content.
Optionally, the execution module 802 is specifically configured to execute, when a preset condition is met, the target action corresponding to the target input on the target content displayed on the first screen; the preset condition includes any one of the following: the first screen displays content in infinite-screen mode, the first screen displays content in projection mode, the first screen displays the interface of an application of a preset application type, or the first screen displays a remote-control interface.
Optionally, the target screen includes the first screen and the second screen, and the target input includes a first input on the first screen and a second input on the second screen. With reference to FIG. 8, as shown in FIG. 9, the terminal device 800 further includes a determining module 803, configured to determine, according to the target input and before the execution module 802 executes the target action corresponding to the target input on the target content displayed on the first screen, the target input parameter of the target input; the execution module 802 is specifically configured to control the target object to move on the first screen according to the target input parameter determined by the determining module 803.
Optionally, the target screen includes the first screen and the second screen, the target object includes a first object and a second object, and the target input includes a first input for the first object on the first screen and a second input for the second object on the second screen; the execution module 802 is specifically configured to: in response to the first input, control the first object to move on the first screen according to the input parameters of the first input; and in response to the second input, control the second object to move on the first screen according to the input parameters of the second input.
Optionally, the second screen includes M areas, each area being used to control a different object on the first screen, or each area corresponding to different special-effect content, where M is an integer greater than or equal to 2; the target input includes an input in one of the M areas.
Optionally, the second screen includes N first areas and K second areas, each first area being used to control a different object on the first screen and each second area corresponding to different special-effect content, where N and K are both positive integers; the target input includes an input in one of the N first areas or an input in one of the K second areas.
The terminal device 800 provided by the embodiments of the present invention can implement each process implemented by the terminal device in the foregoing method embodiments; to avoid repetition, details are not repeated here.
With the terminal device provided by the embodiments of the present invention, the terminal device first receives a user's target input on the target screen, and then, in response to the target input, executes a target action, corresponding to the target input, on the target content displayed on the first screen. When the target screen includes the first screen and the second screen, the user can trigger the terminal device, through inputs on the two screens, to control the movement of the target object in the target content on the first screen. When the target screen is the second screen, the user can trigger the terminal device, through input on the other screen, to control the movement of the target object in the target content on the first screen, or to superimpose special-effect content on the target content displayed on the first screen. Compared with the current control mode, in which a terminal device controls objects on a screen only through input on that same screen, in the embodiments of the present invention the terminal device can control an object displayed on one screen through two screens, or display special-effect content in the target content on the first screen in response to the user's input on the second screen, which makes operating the terminal device more convenient and quick and improves the user experience.
FIG. 10 is a schematic diagram of the hardware structure of a terminal device implementing the embodiments of the present invention. The terminal device 100 includes, but is not limited to, a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components. Those skilled in the art can understand that the terminal device structure shown in FIG. 10 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than shown, combine certain components, or arrange the components differently. In the embodiments of the present invention, terminal devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminal devices, wearable devices, pedometers, and the like.
The user input unit 107 is configured to receive a user's target input on the target screen; the processor 110 is configured to, in response to the target input, execute a target action, corresponding to the target input, on the target content displayed on the first screen; wherein the target screen includes the first screen and the second screen, and the target action is to control a target object in the target content to move on the first screen; or, the target screen is the second screen, and the target action is to control a target object in the target content to move on the first screen, or to superimpose special-effect content on the target content.
With the terminal device provided by the embodiments of the present invention, the terminal device first receives a user's target input on the target screen, and then, in response to the target input, executes a target action, corresponding to the target input, on the target content displayed on the first screen. When the target screen includes the first screen and the second screen, the user can trigger the terminal device, through inputs on the two screens, to control the movement of the target object in the target content on the first screen. When the target screen is the second screen, the user can trigger the terminal device, through input on the other screen, to control the movement of the target object in the target content on the first screen, or to superimpose special-effect content on the target content displayed on the first screen. Compared with the current control mode, in which a terminal device controls objects on a screen only through input on that same screen, in the embodiments of the present invention the terminal device can control an object displayed on one screen through two screens, or display special-effect content in the target content on the first screen in response to the user's input on the second screen, which makes operating the terminal device more convenient and quick and improves the user experience.
It should be understood that, in the embodiments of the present invention, the radio frequency unit 101 can be used to receive and send signals during the process of sending and receiving information or during a call; specifically, downlink data from the base station is received and passed to the processor 110 for processing, and uplink data is sent to the base station. Generally, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
The terminal device provides users with wireless broadband Internet access through the network module 102, such as helping users send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into audio signals and output them as sound. Moreover, the audio output unit 103 can also provide audio output related to a specific function performed by the terminal device 100 (for example, call signal reception sound, message reception sound). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive audio or video signals. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes the image data of still pictures or video obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode. The processed image frames can be displayed on the display unit 106. The image frames processed by the graphics processor 1041 can be stored in the memory 109 (or another storage medium) or sent via the radio frequency unit 101 or the network module 102. The microphone 1042 can receive sound and process it into audio data. In the case of a telephone call mode, the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 101 for output.
The terminal device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 1061 according to the ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear. As a kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes), and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the terminal device (such as horizontal/vertical screen switching, related games, magnetometer attitude calibration) and for vibration-recognition-related functions (such as pedometer, tapping). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be repeated here.
The display unit 106 is used to display information entered by the user or information provided to the user. The display unit 106 may include a display panel 1061, which may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 107 can be used to receive entered numeric or character information and to generate key signal input related to the user settings and function control of the terminal device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 1071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of the user's touch and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 1071, the user input unit 107 may also include other input devices 1072, which may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which will not be repeated here.
Further, the touch panel 1071 can be overlaid on the display panel 1061; when the touch panel 1071 detects a touch operation on or near it, it transmits the operation to the processor 110 to determine the type of the touch event, and the processor 110 then provides corresponding visual output on the display panel 1061 according to the type of the touch event. Although in FIG. 10 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to implement the input and output functions of the terminal device, which is not specifically limited here.
The interface unit 108 is an interface for connecting an external device with the terminal device 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 can be used to receive input from an external device (for example, data information, power) and transmit the received input to one or more elements within the terminal device 100, or can be used to transfer data between the terminal device 100 and an external device.
The memory 109 can be used to store software programs and various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playback function, an image playback function), etc., and the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book), etc. In addition, the memory 109 may include a high-speed random access memory and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
The processor 110 is the control center of the terminal device. It connects the various parts of the entire terminal device using various interfaces and lines, and performs the various functions of the terminal device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the terminal device as a whole. The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, and application programs, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may alternatively not be integrated into the processor 110.
The terminal device 100 may further include a power supply 111 (such as a battery) for supplying power to the various components; optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as managing charging, discharging, and power consumption through the power management system.
In addition, the terminal device 100 includes some functional modules not shown, which will not be repeated here.
Optionally, an embodiment of the present invention further provides a terminal device; with reference to FIG. 10, it includes a processor 110, a memory 109, and a computer program stored in the memory 109 and runnable on the processor 110; when the computer program is executed by the processor 110, each process of the above control method embodiments is realized and the same technical effect can be achieved; to avoid repetition, details are not repeated here.
With the terminal device provided by the embodiments of the present invention, the terminal device first receives a user's target input on the target screen, and then, in response to the target input, executes a target action, corresponding to the target input, on the target content displayed on the first screen. When the target screen includes the first screen and the second screen, the user can trigger the terminal device, through inputs on the two screens, to control the movement of the target object in the target content on the first screen. When the target screen is the second screen, the user can trigger the terminal device, through input on the other screen, to control the movement of the target object in the target content on the first screen, or to superimpose special-effect content on the target content displayed on the first screen. Compared with the current control mode, in which a terminal device controls objects on a screen only through input on that same screen, in the embodiments of the present invention the terminal device can control an object displayed on one screen through two screens, or display special-effect content in the target content on the first screen in response to the user's input on the second screen, which makes operating the terminal device more convenient and quick and improves the user experience.
An embodiment of the present invention further provides a computer-readable storage medium storing a computer program; when the computer program is executed by a processor, each process of the above control method embodiments is realized and the same technical effect can be achieved; to avoid repetition, details are not repeated here. The computer-readable storage medium is, for example, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, herein, the terms "comprise", "include", or any other variant thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus including a series of elements includes not only those elements but also other elements not explicitly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "including a ..." does not exclude the existence of other identical elements in the process, method, article, or apparatus that includes the element.
Through the description of the above embodiments, those skilled in the art can clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general hardware platform, and of course also by hardware, but in many cases the former is the better implementation. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the related technology, can be embodied in the form of a software product; the computer software product is stored in a storage medium (such as a ROM/RAM, magnetic disk, or optical disk) and includes several instructions to make a terminal device (which can be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) execute the methods described in the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above specific implementations, which are merely illustrative rather than restrictive. Under the inspiration of the present invention, those of ordinary skill in the art can make many other forms without departing from the spirit of the present invention and the scope protected by the claims, all of which fall within the protection of the present invention.

Claims (14)

  1. 一种控制方法,应用于具有第一屏和第二屏的终端设备,其中,所述方法包括:
    接收用户在目标屏上的目标输入;
    响应于所述目标输入,对所述第一屏显示的目标内容执行与所述目标输入对应的目标动作;
    其中,所述目标屏包括所述第一屏和所述第二屏,所述目标动作为控制所述目标内容中的目标对象在所述第一屏上移动;或者,
    所述目标屏为所述第二屏,所述目标动作为控制所述目标内容中的目标对象在所述第一屏上移动,或在所述目标内容上叠加显示特效内容。
  2. 根据权利要求1所述的方法,其中,所述对所述第一屏显示的目标内容执行与所述目标输入对应的目标动作,包括:
    在满足预设条件的情况下,对所述第一屏显示的目标内容执行与所述目标输入对应的目标动作;
    所述预设条件包括以下任意一项:所述第一屏以无限屏模式显示内容、所述第一屏以投屏模式显示内容、所述第一屏显示预设应用类型的应用的界面、所述第一屏显示远程控制界面。
  3. 根据权利要求1或2所述的方法,其中,所述目标屏包括所述第一屏和所述第二屏,所述目标输入包括在所述第一屏上的第一输入和在所述第二屏上的第二输入;
    所述对所述第一屏显示的目标内容执行与所述目标输入对应的目标动作之前,还包括:
    根据所述目标输入,确定所述目标输入的目标输入参数;
    所述对所述第一屏显示的目标内容执行与所述目标输入对应的目标动作,包括:
    按照所述目标输入参数,控制所述目标对象在所述第一屏上移动。
  4. 根据权利要求1或2所述的方法,其中,所述目标屏包括所述第一屏和所述第二屏,所述目标对象包括第一对象和第二对象,所述目标输入包括在所述第一屏上针对所述第一对象的第一输入和在所述第二屏上针对所述第二对象的第二输入;
    所述响应于所述目标输入,对所述第一屏显示的目标内容执行与所述目标输入对应的目标动作,包括:
    响应于所述第一输入,按照所述第一输入的输入参数,控制所述第一对象在所述第一屏上移动;
    响应于所述第二输入,按照所述第二输入的输入参数,控制所述第二对象在所述第一屏上移动。
  5. 根据权利要求1或2所述的方法,其中,所述第二屏包括M个区域,每个区域用于控制所述第一屏上的不同对象,或每个区域对应不同的特效内容,M为大于或等于2的整数;
    其中,所述目标输入包括在所述M个区域中的一个区域中的输入。
  6. 根据权利要求1或2所述的方法,其中,所述第二屏包括N个第一区域和K个第二区域,每个第一区域用于控制所述第一屏上的不同对象,每个第二区域对应不同的特效内容,N和K均为正整数;
    其中,所述目标输入包括在所述N个第一区域中的一个第一区域中的输入,或包括在所述K个第二区域中的一个第二区域中的输入。
  7. A terminal device having a first screen and a second screen, wherein the terminal device comprises a receiving module and an execution module;
    the receiving module is configured to receive a target input performed by a user on a target screen;
    the execution module is configured to perform, in response to the target input received by the receiving module, a target action corresponding to the target input on target content displayed on the first screen;
    wherein the target screen comprises the first screen and the second screen, and the target action is controlling a target object in the target content to move on the first screen; or
    the target screen is the second screen, and the target action is controlling a target object in the target content to move on the first screen, or displaying special-effect content superimposed on the target content.
  8. The terminal device according to claim 7, wherein the execution module is specifically configured to: perform, in a case where a preset condition is met, the target action corresponding to the target input on the target content displayed on the first screen;
    wherein the preset condition comprises any one of the following: the first screen displays content in an infinite-screen mode; the first screen displays content in a screen-casting mode; the first screen displays an interface of an application of a preset application type; or the first screen displays a remote-control interface.
  9. The terminal device according to claim 7 or 8, wherein the target screen comprises the first screen and the second screen, and the target input comprises a first input on the first screen and a second input on the second screen; the terminal device further comprises a determining module;
    the determining module is configured to determine, according to the target input, a target input parameter of the target input before the execution module performs, on the target content displayed on the first screen, the target action corresponding to the target input; and
    the execution module is specifically configured to control, according to the target input parameter determined by the determining module, the target object to move on the first screen.
  10. The terminal device according to claim 7 or 8, wherein the target screen comprises the first screen and the second screen, the target object comprises a first object and a second object, and the target input comprises a first input on the first screen for the first object and a second input on the second screen for the second object;
    the execution module is specifically configured to:
    in response to the first input, control, according to an input parameter of the first input, the first object to move on the first screen; and
    in response to the second input, control, according to an input parameter of the second input, the second object to move on the first screen.
  11. The terminal device according to claim 7 or 8, wherein the second screen comprises M regions, each region being used to control a different object on the first screen, or each region corresponding to different special-effect content, M being an integer greater than or equal to 2; wherein the target input comprises an input in one of the M regions.
  12. The terminal device according to claim 7 or 8, wherein the second screen comprises N first regions and K second regions, each first region being used to control a different object on the first screen, and each second region corresponding to different special-effect content, N and K both being positive integers;
    wherein the target input comprises an input in one of the N first regions, or an input in one of the K second regions.
  13. A terminal device, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein when the computer program is executed by the processor, the steps of the control method according to any one of claims 1 to 6 are implemented.
  14. A computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the control method according to any one of claims 1 to 6 are implemented.
PCT/CN2019/129064 2019-01-30 2019-12-27 Control method and terminal device WO2020155980A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910092727.XA CN109947312A (zh) 2019-01-30 2019-01-30 Control method and terminal device
CN201910092727.X 2019-01-30

Publications (1)

Publication Number Publication Date
WO2020155980A1 (zh)

Family

ID=67007427

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/129064 WO2020155980A1 (zh) 2019-01-30 2019-12-27 Control method and terminal device

Country Status (2)

Country Link
CN (1) CN109947312A (zh)
WO (1) WO2020155980A1 (zh)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109947312A (zh) * 2019-01-30 2019-06-28 维沃移动通信有限公司 Control method and terminal device
CN111522519A (zh) * 2020-04-03 2020-08-11 青岛进化者小胖机器人科技有限公司 Screen-casting method, apparatus, device, system, and storage medium
CN111880845A (zh) * 2020-07-24 2020-11-03 西安万像电子科技有限公司 Method, system, and apparatus for controlling a target application
CN112642150B (zh) * 2020-12-31 2023-01-17 上海米哈游天命科技有限公司 Method, apparatus, device, and storage medium for capturing game footage
CN114708290A (zh) * 2022-03-28 2022-07-05 北京字跳网络技术有限公司 Image processing method and apparatus, electronic device, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101866260A (zh) * 2010-01-29 2010-10-20 宇龙计算机通信科技(深圳)有限公司 Method and system for controlling a first screen from a second screen, and mobile terminal
CN105183198A (zh) * 2015-10-14 2015-12-23 李彦辰 Dual-screen electronic device, and method and apparatus for one-handed control of a mouse pointer of a dual-screen electronic device
US20160018941A1 (en) * 2014-07-17 2016-01-21 Lg Electronics Inc. Mobile terminal and method for controlling the same
CN108628565A (zh) * 2018-05-07 2018-10-09 维沃移动通信有限公司 Mobile terminal operation method and mobile terminal
CN109947312A (zh) * 2019-01-30 2019-06-28 维沃移动通信有限公司 Control method and terminal device

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102681710B (zh) * 2011-03-17 2016-08-10 联芯科技有限公司 Electronic device input method and apparatus, and electronic device based on the apparatus
CN203338321U (zh) * 2013-07-10 2013-12-11 京东方科技集团股份有限公司 Mobile terminal
US9817500B2 (en) * 2013-12-27 2017-11-14 Intel Corporation Mechanism for facilitating flexible wraparound displays for computing devices
CN105824553A (zh) * 2015-08-31 2016-08-03 维沃移动通信有限公司 Touch control method and mobile terminal
CN108205419A (zh) * 2017-12-21 2018-06-26 中兴通讯股份有限公司 Dual-screen control method and apparatus, mobile terminal, and computer-readable storage medium
CN108897478A (zh) * 2018-06-28 2018-11-27 努比亚技术有限公司 Terminal operation method, mobile terminal, and computer-readable storage medium

Also Published As

Publication number Publication date
CN109947312A (zh) 2019-06-28

Similar Documents

Publication Publication Date Title
WO2021104365A1 (zh) Object sharing method and electronic device
WO2021083052A1 (zh) Object sharing method and electronic device
WO2021104195A1 (zh) Image display method and electronic device
WO2020258929A1 (zh) Folder interface switching method and terminal device
WO2020156466A1 (zh) Photographing method and terminal device
WO2020155980A1 (zh) Control method and terminal device
WO2020181955A1 (zh) Interface control method and terminal device
US11658932B2 (en) Message sending method and terminal device
US20220300302A1 (en) Application sharing method and electronic device
WO2020233323A1 (zh) Display control method, terminal device, and computer-readable storage medium
US20220004357A1 (en) Audio signal outputting method and terminal device
CN109032486B (zh) Display control method and terminal device
WO2021129538A1 (zh) Control method and electronic device
WO2021129536A1 (zh) Icon moving method and electronic device
WO2020156123A1 (zh) Information processing method and terminal device
WO2020238497A1 (zh) Icon moving method and terminal device
WO2020215991A1 (zh) Display control method and terminal device
WO2021068885A1 (zh) Control method and electronic device
WO2020215982A1 (zh) Desktop icon management method and terminal device
US20220043564A1 (en) Method for inputting content and terminal device
WO2021057301A1 (zh) File control method and electronic device
WO2021129850A1 (zh) Voice message playing method and electronic device
WO2021115172A1 (zh) Display method and electronic device
WO2020181954A1 (zh) Application control method and terminal device
WO2020220893A1 (zh) Screenshot method and mobile terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19912561

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19912561

Country of ref document: EP

Kind code of ref document: A1