WO2020151490A1 - Application control method and terminal device - Google Patents

Application control method and terminal device

Info

Publication number
WO2020151490A1
Authority
WO
WIPO (PCT)
Prior art keywords
screen
input
terminal device
functional area
target application
Prior art date
Application number
PCT/CN2020/070722
Other languages
English (en)
Chinese (zh)
Inventor
马俊杰
Original Assignee
维沃移动通信有限公司
Application filed by 维沃移动通信有限公司
Publication of WO2020151490A1

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/40 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F 13/42 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/426 - Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/80 - Special adaptations for executing a specific game genre or game mode
    • A63F 13/837 - Shooting of targets
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 - Arrangements for program control, e.g. control units
    • G06F 9/06 - Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 - Arrangements for executing specific programs
    • G06F 9/451 - Execution arrangements for user interfaces

Definitions

  • The embodiments of the present disclosure relate to the field of communication technologies, and in particular to an application control method and a terminal device.
  • In the related art, to trigger a function of an application, the user needs to click the function control corresponding to that function in the application interface displayed on the terminal device.
  • Because this control method requires clicking a function control inside the application interface, it is not flexible enough.
  • An application control method provided by an embodiment of the present disclosure is applied to a terminal device including at least two screens.
  • The method includes: receiving a first input of the user on the second screen while the terminal device runs a target application on the first screen, the first screen and the second screen being different screens among the at least two screens; and, in response to the first input, executing the first function corresponding to the first input on the target application if the first input satisfies a first condition.
  • The first screen includes at least one functional area, and the first condition is either of the following: the input position of the first input is in a functional area on the first screen, or the first input matches a preset input; different functional areas on the first screen correspond to different functions of the target application, and different preset inputs in the terminal device correspond to different functions of the target application.
  • An embodiment of the present disclosure further provides a terminal device including at least two screens, a receiving module and a processing module. The receiving module is configured to receive a first input of the user on the second screen while the target application is running on the first screen, the first screen and the second screen being different screens among the at least two screens. The processing module is configured to execute, in response to the first input, the first function corresponding to the first input on the target application when the first input satisfies the first condition. The first screen includes at least one functional area, and the first condition is either of the following: the input position of the first input is in a functional area on the first screen, or the first input matches a preset input; different functional areas on the first screen correspond to different functions of the target application, and different preset inputs in the terminal device correspond to different functions of the target application.
  • An embodiment of the present disclosure further provides a terminal device.
  • The terminal device includes a processor, a memory, and a computer program stored on the memory and runnable on the processor; when the computer program is executed by the processor, the steps of the application control method in the first aspect are implemented.
  • An embodiment of the present disclosure further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the application control method in the first aspect are implemented.
  • The application control method provided by the embodiments of the present disclosure can be applied to a terminal device including at least two screens.
  • The method includes: receiving a first input of the user on the second screen while the target application is running on the first screen, the first screen and the second screen being different screens among the at least two screens; and, in response to the first input, executing the first function corresponding to the first input on the target application if the first input satisfies the first condition; wherein the first screen includes at least one functional area, and the first condition is either of the following: the input position of the first input is in a functional area on the first screen, or the first input matches a preset input; different functional areas on the first screen correspond to different functions of the target application, and different preset inputs in the terminal device correspond to different functions of the target application.
  • In this way, when the first input meets the first condition, the first input on the second screen triggers the terminal device to execute the first function of the target application running on the first screen; that is, an application running on the first screen can be controlled through input on the second screen, so applications in the terminal device can be controlled more flexibly.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present disclosure
  • FIG. 2 is a schematic diagram of an application control method provided by an embodiment of the disclosure
  • FIG. 3 is a schematic diagram of an interface in a second screen of a terminal device provided by an embodiment of the disclosure.
  • FIG. 4 is a schematic diagram of different areas on the second screen of a terminal device provided by an embodiment of the disclosure.
  • FIG. 5 is a schematic structural diagram of a terminal device provided by an embodiment of the disclosure.
  • FIG. 6 is a schematic diagram of hardware of a terminal device provided by an embodiment of the disclosure.
  • The terms "first" and "second" in the specification and claims of the present disclosure are used to distinguish different objects rather than to describe a specific order of objects.
  • For example, the first screen and the second screen are used to distinguish different screens rather than to describe a specific order of the screens.
  • an application refers to an application program.
  • the terminal device in the embodiment of the present disclosure may be a terminal device with an operating system.
  • The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present disclosure.
  • the following takes the Android operating system as an example to introduce the software environment applied by the application control method provided by the embodiments of the present disclosure.
  • FIG. 1 is a schematic structural diagram of a possible Android operating system provided by an embodiment of the present disclosure.
  • the architecture of the Android operating system includes 4 layers, namely: application layer, application framework layer, system runtime library layer, and kernel layer (specifically, it may be the Linux kernel layer).
  • the application layer includes various applications (including system applications and third-party applications) in the Android operating system.
  • the application framework layer is the framework of the application. Developers can develop some applications based on the application framework layer while complying with the development principles of the application framework.
  • the system runtime library layer includes a library (also called a system library) and an Android operating system runtime environment.
  • the library mainly provides various resources needed by the Android operating system.
  • the Android operating system operating environment is used to provide a software environment for the Android operating system.
  • the kernel layer is the operating system layer of the Android operating system and belongs to the lowest level of the Android operating system software.
  • the kernel layer is based on the Linux kernel to provide core system services and hardware-related drivers for the Android operating system.
  • Developers can develop software programs implementing the application control method provided by the embodiments of the present disclosure based on the system architecture of the Android operating system shown in FIG. 1.
  • That is, the processor or the terminal device can implement the application control method provided by the embodiments of the present disclosure by running such a software program in the Android operating system.
  • The application control method and terminal device provided by the embodiments of the present disclosure can be applied to a terminal device including at least two screens.
  • The method includes: when the terminal device runs the target application on the first screen, receiving a first input of the user on the second screen, the first screen and the second screen being different screens among the at least two screens; and, in response to the first input, executing the first function corresponding to the first input on the target application if the first input satisfies the first condition; wherein the first screen includes at least one functional area, and the first condition is either of the following: the input position of the first input is in a functional area on the first screen, or the first input matches a preset input; different functional areas on the first screen correspond to different functions of the target application, and different preset inputs in the terminal device correspond to different functions of the target application.
  • Thus, when the first input meets the first condition, the first input received on the second screen triggers the terminal device to execute the first function, corresponding to the first input, of the target application running on the first screen.
  • In this way, an application running on the first screen can be controlled through input on the second screen, so applications in the terminal device can be controlled more flexibly.
  • the terminal device in the embodiment of the present disclosure may be a mobile terminal device or a non-mobile terminal device.
  • the mobile terminal device can be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted terminal device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA)
  • the non-mobile terminal device may be a personal computer (PC), a television (television, TV), a teller machine or a self-service machine, etc.; the embodiment of the present disclosure does not specifically limit it.
  • The execution subject of the application control method provided by the embodiments of the present disclosure may be the above-mentioned terminal device (including a mobile terminal device or a non-mobile terminal device), or may be a functional module and/or functional entity in the terminal device that can implement the method; this can be determined according to actual usage requirements and is not limited in the embodiments of the present disclosure.
  • the following uses a terminal device as an example to illustrate the application control method provided by the embodiment of the present disclosure.
  • an embodiment of the present disclosure provides an application control method.
  • The method includes the following S11 and S12.
  • S11: When the terminal device runs the target application on the first screen, the terminal device receives a first input of the user on the second screen.
  • The first screen and the second screen are different screens among the at least two screens.
  • The interface of the target application is displayed on the first screen.
  • S12: In response to the first input, the terminal device performs the first function corresponding to the first input on the target application when the first input meets the first condition.
  • The first condition is either of the following Condition 1 or Condition 2.
  • Condition 1: The input position of the first input is in a functional area on the first screen.
  • Different functional areas on the first screen correspond to different functions of the target application.
  • The first screen may include at least one functional area, and each functional area corresponds to one function of the target application.
  • Condition 2: The first input matches a preset input.
  • Different preset inputs in the terminal device correspond to different functions of the target application.
  • At least one preset input may be pre-stored in the terminal device, and each preset input corresponds to one function of the target application.
  • For example, the terminal device is a double-sided screen terminal device including a front screen and a back screen.
  • Suppose the terminal device is in a call, the phone application is running on the front screen, and it is detected that the user is close to the front screen; if another call comes in at this time, the user can tap twice on the back screen to hang up the call.
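The condition check in S12 admits a straightforward realization. The following is a minimal Kotlin sketch, not the disclosed implementation: the type and function names (ScreenPoint, AreaBounds, FunctionalArea, PresetInput, SecondScreenDispatcher) are illustrative assumptions, and the sketch only shows the two alternative forms of the first condition, namely hit-testing the input position against the functional areas (Condition 1) and matching the input against preset inputs (Condition 2).

```kotlin
// Minimal sketch (an assumption, not the disclosed implementation) of the first-condition
// check: Condition 1 hit-tests the input position against functional areas, Condition 2
// matches the input against preset inputs stored in the terminal device.

data class ScreenPoint(val x: Float, val y: Float)

data class AreaBounds(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(p: ScreenPoint): Boolean = p.x in left..right && p.y in top..bottom
}

// A functional area of a screen mapped to one function of the target application.
data class FunctionalArea(val bounds: AreaBounds, val functionId: String)

// A preset input stored in the terminal device, e.g. "DOUBLE_TAP" mapped to "HANG_UP".
data class PresetInput(val gesture: String, val functionId: String)

class SecondScreenDispatcher(
    private val areas: List<FunctionalArea>,
    private val presets: List<PresetInput>,
    private val executeFunction: (String) -> Unit   // runs a function of the target application
) {
    /** Returns true if the first input met the first condition and a function was executed. */
    fun onFirstInput(position: ScreenPoint, gesture: String): Boolean {
        // Condition 1: the input position lies inside a functional area.
        areas.firstOrNull { it.bounds.contains(position) }
            ?.let { executeFunction(it.functionId); return true }
        // Condition 2: the input matches a preset input.
        presets.firstOrNull { it.gesture == gesture }
            ?.let { executeFunction(it.functionId); return true }
        return false   // neither condition holds, so the input is ignored
    }
}
```

With such a dispatcher, the double-tap example above would amount to registering something like PresetInput("DOUBLE_TAP", "HANG_UP") and routing it to the phone application; again, these identifiers are illustrative only.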
  • The application control method may specifically include the following S1 to S5.
  • S1: The terminal device detects that process W of the target application has been started.
  • The above process W is a process that needs input on the screen on which the target application is not running in order to realize its function.
  • For example, in a shooting game, the two functions of adjusting the field of view and shooting can both be triggered by the user clicking the field-of-view control and the shooting control on the front screen.
  • However, touching the controls for these two functions on the front screen at the same time may not be achievable.
  • In that case, the shooting function can instead be realized by a click input on the back screen.
  • S2: The terminal device determines the functional area of the back screen corresponding to process W (that is, determines the functional area of the back screen corresponding to the function implemented by the running process W).
  • For example, the upper area of the back screen can be divided into a first functional area 01 and a second functional area 02; the first functional area 01 is a functional area for triggering the scope function, the second functional area 02 is a functional area for triggering the shooting function, and the area of the back screen other than the first functional area 01 and the second functional area 02 does not respond to touch input.
  • S3: The terminal device displays the functional areas on the back screen.
  • Specifically, the location of each functional area can be displayed on the back screen, together with an identifier of the function corresponding to that functional area.
  • For example, the first functional area 01 is displayed together with an identifier indicating the scope function corresponding to the first functional area 01.
  • S4: The terminal device receives a click operation in the first functional area on the back screen.
  • The above click operation may be a single-point touch, a single click, or a double click, which may be determined according to the process; this is not limited in the embodiments of the present disclosure.
  • S5: The terminal device responds to the click operation to realize the first function of the target application.
  • For example, if the terminal device receives the user's click operation on the first functional area 01, the scope function is turned off or on (that is, if the scope is currently open, the click operation turns the scope function off; otherwise it turns the scope function on); if the terminal device receives the user's click operation on the second functional area 02, the terminal device shoots in response to the click operation (for example, fires a bullet).
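Continuing the hedged sketch from above and reusing its types, S2 to S5 of the shooting-game example could be wired up roughly as follows; the coordinates, identifiers, and the TargetGame callbacks are illustrative assumptions only.

```kotlin
// Hypothetical wiring of the shooting-game example: the first functional area 01 toggles
// the scope, the second functional area 02 shoots, and the rest of the back screen is ignored.

class TargetGame {
    private var scopeOpen = false
    fun toggleScope() { scopeOpen = !scopeOpen; println("scope ${if (scopeOpen) "on" else "off"}") }
    fun shoot() = println("shot fired")
}

fun buildBackScreenDispatcher(game: TargetGame): SecondScreenDispatcher {
    // Assumed coordinates for the upper part of a 1080 x 2340 back screen (S2).
    val scopeArea = FunctionalArea(AreaBounds(0f, 0f, 540f, 400f), functionId = "SCOPE")    // area 01
    val shootArea = FunctionalArea(AreaBounds(540f, 0f, 1080f, 400f), functionId = "SHOOT") // area 02
    return SecondScreenDispatcher(
        areas = listOf(scopeArea, shootArea),
        presets = emptyList(),                    // this example relies on Condition 1 only
        executeFunction = { id ->                 // S5: realize the function of the target application
            when (id) {
                "SCOPE" -> game.toggleScope()
                "SHOOT" -> game.shoot()
            }
        }
    )
}

fun main() {
    val dispatcher = buildBackScreenDispatcher(TargetGame())
    dispatcher.onFirstInput(ScreenPoint(100f, 200f), gesture = "TAP")    // S4: tap in area 01, toggles scope
    dispatcher.onFirstInput(ScreenPoint(800f, 200f), gesture = "TAP")    // tap in area 02, shoots
    dispatcher.onFirstInput(ScreenPoint(500f, 2000f), gesture = "TAP")   // outside both areas, ignored
}
```

This mirrors S2 to S5 only in outline; a real terminal device would obtain the raw touch events from its back-screen input pipeline and would render the identifiers described in S3, which the sketch omits.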
  • The embodiment of the present disclosure provides an application control method that can be applied to a terminal device including at least two screens.
  • The method includes: receiving a first input of the user on the second screen while the target application is running on the first screen, the first screen and the second screen being different screens among the at least two screens; and, in response to the first input, executing the first function corresponding to the first input on the target application if the first input satisfies the first condition; wherein the first screen includes at least one functional area, and the first condition is either of the following: the input position of the first input is in a functional area on the first screen, or the first input matches a preset input; different functional areas on the first screen correspond to different functions of the target application, and different preset inputs in the terminal device correspond to different functions of the target application.
  • In this way, when the first input meets the first condition, the first input on the second screen triggers the terminal device to execute the first function of the target application running on the first screen, so an application running on the first screen can be controlled through input on the second screen and applications in the terminal device can be controlled more flexibly.
  • Optionally, in the case that the first condition is that the input position of the first input is in a functional area on the first screen, each functional area of the at least one functional area displays an identifier, and the identifier displayed in a functional area is used to indicate the function of the target application corresponding to that functional area.
  • For example, a scope logo is displayed in the first functional area 01 to indicate the scope function corresponding to the first functional area 01, and a bullet logo is displayed in the second functional area 02 to indicate the shooting function corresponding to the second functional area 02.
  • Since each functional area displays an identifier indicating the corresponding function of the target application, the correspondence between the functional areas on the second screen and the functions of the target application can be shown visually, which facilitates user input.
  • Optionally, the second screen is in the off-screen state.
  • In the embodiments of the present disclosure, even when the second screen is in the off-screen state, the terminal device can still respond to the user's input on the second screen (for example, the first input), which saves power otherwise consumed by that screen.
  • Optionally, the above at least one functional area may be set in a preset area of the second screen, and the area of the second screen other than the preset area does not respond to touch input.
  • For example, the at least one functional area can be set in the upper area 03 of the second screen shown in FIG. 4, while the lower area 04 of the second screen does not respond to touch input, so the user can hold the terminal device by the lower area 04 of the second screen.
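As a further hedged sketch (coordinates again assumed rather than taken from the disclosure), the preset responsive area can be modelled as a simple pre-filter in front of the dispatcher, so that touches in the grip region of the second screen never reach the functional-area lookup:

```kotlin
// Illustrative pre-filter: only touches inside the preset responsive region of the second
// screen (the "upper area 03" of the example) are forwarded; touches in the grip region
// ("lower area 04") are dropped before any functional-area lookup takes place.
class PresetRegionFilter(
    private val responsiveRegion: AreaBounds,          // e.g. AreaBounds(0f, 0f, 1080f, 800f)
    private val dispatcher: SecondScreenDispatcher
) {
    fun onRawTouch(position: ScreenPoint, gesture: String): Boolean =
        responsiveRegion.contains(position) && dispatcher.onFirstInput(position, gesture)
}
```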
  • The above at least two screens are the first screen and the second screen, and the first screen and the second screen face opposite directions.
  • For a terminal device with a folding screen, whether it is folded by a hinge or is a foldable flexible screen, it can ultimately be folded into a state in which the two screens face opposite directions.
  • an embodiment of the present disclosure provides a terminal device 130.
  • the terminal device 130 includes at least two screens.
  • the terminal device further includes a receiving module 131 and a processing module 132.
  • The receiving module 131 is configured to receive the first input of the user on the second screen when the target application is running on the first screen, the first screen and the second screen being different screens among the at least two screens.
  • The processing module 132 is configured to perform, in response to the first input, the first function corresponding to the first input on the target application when the first input satisfies the first condition.
  • The first screen includes at least one functional area, and the first condition may be either of the following: the input position of the first input is in a functional area on the first screen, or the first input matches a preset input; different functional areas on the first screen correspond to different functions of the target application, and different preset inputs in the terminal device correspond to different functions of the target application.
  • Optionally, in the case that the first condition is that the input position of the first input is in a functional area on the first screen, an identifier is displayed in each functional area of the at least one functional area, and the identifier displayed in a functional area is used to indicate the function of the target application corresponding to that functional area.
  • the above-mentioned second screen is in the off-screen state.
  • the at least two screens are a first screen and a second screen, the first screen and the second screen are located on two opposite surfaces in the terminal device, and the first screen and the second screen face opposite directions.
  • the aforementioned at least one functional area is in a preset area in the second screen.
  • the terminal device provided in the embodiments of the present disclosure can implement the various processes shown in the foregoing method embodiments, and to avoid repetition, details are not described herein again.
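Purely as a structural sketch of how the receiving module 131 and processing module 132 described above might be separated in code (the interface names and the delegation to the earlier dispatcher are assumptions, not the disclosed design):

```kotlin
// Hypothetical decomposition mirroring the receiving module 131 / processing module 132 split.
interface ReceivingModule {
    /** Receives the user's first input on the second screen while the target app runs on the first screen. */
    fun onSecondScreenInput(position: ScreenPoint, gesture: String)
}

interface ProcessingModule {
    /** Executes the first function on the target application if the first input meets the first condition. */
    fun process(position: ScreenPoint, gesture: String): Boolean
}

class DualScreenTerminal(private val dispatcher: SecondScreenDispatcher) :
    ReceivingModule, ProcessingModule {

    override fun onSecondScreenInput(position: ScreenPoint, gesture: String) {
        process(position, gesture)                  // the receiving module hands the input to processing
    }

    override fun process(position: ScreenPoint, gesture: String): Boolean =
        dispatcher.onFirstInput(position, gesture)  // first-condition check and function dispatch
}
```

The split keeps input reception (module 131) separate from the first-condition check and dispatch (module 132), matching the description above.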
  • The embodiment of the present disclosure provides a terminal device including at least two screens, which can receive a user's first input on the second screen while running a target application on the first screen, the first screen and the second screen being different screens among the at least two screens; in response to the first input, when the first input satisfies the first condition, the terminal device executes the first function corresponding to the first input on the target application; wherein the first screen includes at least one functional area, and the first condition is either of the following: the input position of the first input is in a functional area on the first screen, or the first input matches a preset input; different functional areas on the first screen correspond to different functions of the target application, and different preset inputs in the terminal device correspond to different functions of the target application.
  • In this way, when the first input meets the first condition, the first input on the second screen triggers the terminal device to execute the first function of the target application running on the first screen, so an application running on the first screen can be controlled through input on the second screen and applications in the terminal device can be controlled more flexibly.
  • As shown in FIG. 6, the terminal device 100 includes, but is not limited to, a radio frequency unit 101, a network module 102, an audio output unit 103, an input unit 104, a sensor 105, a display unit 106, a user input unit 107, an interface unit 108, a memory 109, a processor 110, a power supply 111, and other components.
  • The structure of the terminal device shown in FIG. 6 does not constitute a limitation on the terminal device; the terminal device may include more or fewer components than shown in the figure, combine certain components, or use a different component layout.
  • In the embodiments of the present disclosure, terminal devices include but are not limited to mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted terminal devices, wearable devices, pedometers, and the like.
  • the user input unit 107 is configured to receive a user's first input on the second screen when the target application is running on the first screen, and the first screen and the second screen are different screens among at least two screens.
  • The processor 110 is configured to perform, in response to the first input, the first function corresponding to the first input on the target application when the first input satisfies the first condition; wherein the first screen includes at least one functional area.
  • The first condition is either of the following: the input position of the first input is in a functional area on the first screen, or the first input matches a preset input; different functional areas on the first screen correspond to different functions of the target application, and different preset inputs in the terminal device correspond to different functions of the target application.
  • Thus, when the first input on the second screen meets the first condition, the terminal device is triggered to execute the first function, corresponding to the first input, of the target application running on the first screen; in this way, an application running on the first screen can be controlled through input on the second screen, and applications in the terminal device can be controlled more flexibly.
  • The radio frequency unit 101 can be used to receive and send signals in the process of sending and receiving information or during a call; specifically, after downlink data from the base station is received, it is handed to the processor 110 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 101 can also communicate with the network and other devices through a wireless communication system.
  • the terminal device provides users with wireless broadband Internet access through the network module 102, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 103 can convert the audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into audio signals and output them as sounds. Moreover, the audio output unit 103 may also provide audio output related to a specific function performed by the terminal device 100 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 104 is used to receive audio or video signals.
  • The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042; the graphics processor 1041 processes image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 106.
  • the image frame processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or sent via the radio frequency unit 101 or the network module 102.
  • the microphone 1042 can receive sound, and can process such sound into audio data.
  • the processed audio data can be converted into a format that can be sent to the mobile communication base station via the radio frequency unit 101 for output in the case of a telephone call mode.
  • the terminal device 100 further includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of the ambient light.
  • The proximity sensor can turn off the display panel 1061 and/or the backlight when the terminal device 100 is moved to the ear.
  • The accelerometer sensor can detect the magnitude of acceleration in various directions (usually three axes) and can detect the magnitude and direction of gravity when stationary; it can be used to identify the posture of the terminal device (such as horizontal/vertical screen switching, related games, magnetometer attitude calibration) and for vibration-recognition-related functions (such as pedometer, tapping). The sensor 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which will not be repeated here.
  • the display unit 106 is used to display information input by the user or information provided to the user.
  • the display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 107 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the terminal device.
  • the user input unit 107 includes a touch panel 1071 and other input devices 1072.
  • The touch panel 1071, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 1071 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 1071 may include two parts: a touch detection device and a touch controller.
  • The touch detection device detects the position touched by the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 110, and receives and executes the commands sent by the processor 110.
  • the touch panel 1071 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 107 may also include other input devices 1072.
  • other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 1071 can be overlaid on the display panel 1061.
  • When the touch panel 1071 detects a touch operation on or near it, the operation is transmitted to the processor 110 to determine the type of the touch event.
  • The processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event.
  • Although the touch panel 1071 and the display panel 1061 are described here as two independent components implementing the input and output functions of the terminal device, in some embodiments the touch panel 1071 and the display panel 1061 can be integrated to implement the input and output functions of the terminal device; this is not specifically limited here.
  • the interface unit 108 is an interface for connecting an external device with the terminal device 100.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, audio input/output (I/O) port, video I/O port, headphone port, etc.
  • The interface unit 108 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the terminal device 100, or can be used to transfer data between the terminal device 100 and an external device.
  • the memory 109 can be used to store software programs and various data.
  • the memory 109 may mainly include a program storage area and a data storage area.
  • The program storage area may store the operating system and the application programs required by at least one function (such as a sound playback function or an image playback function); the data storage area may store data created according to the use of the mobile phone (such as audio data, a phone book, etc.).
  • In addition, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
  • The processor 110 is the control center of the terminal device; it connects the various parts of the entire terminal device through various interfaces and lines, runs or executes the software programs and/or modules stored in the memory 109, and calls the data stored in the memory 109 to perform the various functions of the terminal device and process data, thereby monitoring the terminal device as a whole.
  • The processor 110 may include one or more processing units; optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, and the like, and the modem processor mainly handles wireless communication.
  • It can be understood that the above modem processor may alternatively not be integrated into the processor 110.
  • The terminal device 100 may also include a power supply 111 (such as a battery) for supplying power to the various components.
  • Optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions such as charging, discharging, and power-consumption management through the power management system.
  • the terminal device 100 includes some functional modules not shown, which will not be repeated here.
  • The embodiments of the present disclosure also provide a terminal device, which may include a processor, a memory, and a computer program stored in the memory and runnable on the processor.
  • When executed by the processor, the computer program implements each process performed by the terminal device in the foregoing method embodiments and can achieve the same technical effect, which will not be repeated here to avoid repetition.
  • The embodiments of the present disclosure further provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, each process executed by the terminal device in the foregoing method embodiments is implemented and the same technical effect can be achieved, which will not be repeated here to avoid repetition.
  • the computer-readable storage medium may be a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disk.
  • The technical solution of the present disclosure, in essence or in the part that contributes over the related technology, can be embodied in the form of a software product.
  • The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes a number of instructions that enable a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to execute the method of each embodiment of the present disclosure.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Telephone Function (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed are an application control method and a terminal device, applied to the field of communication technologies. The method includes: receiving a first input of a user on a second screen while a target application is running on a first screen, the first screen and the second screen being different screens among at least two screens (S11); and performing, in response to the first input, a first function corresponding to the first input on the target application when the first input satisfies a first condition (S12). The first condition is any one of the following: the input position of the first input is located in a functional area on the first screen, or the first input matches a preset input; different functional areas on the first screen correspond to different functions of the target application, and different preset inputs in the terminal device correspond to different functions of the target application. The method is applied to application control scenarios.
PCT/CN2020/070722 2019-01-21 2020-01-07 Application control method and terminal device WO2020151490A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910053288.1A CN109947343A (zh) 2019-01-21 2019-01-21 一种应用的控制方法及终端设备
CN201910053288.1 2019-01-21

Publications (1)

Publication Number Publication Date
WO2020151490A1 true WO2020151490A1 (fr) 2020-07-30

Family

ID=67006698

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/070722 WO2020151490A1 (fr) 2019-01-21 2020-01-07 Procédé de commande d'application et équipement terminal

Country Status (2)

Country Link
CN (1) CN109947343A (fr)
WO (1) WO2020151490A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109947343A (zh) * 2019-01-21 2019-06-28 维沃移动通信有限公司 一种应用的控制方法及终端设备
CN110928508B (zh) * 2019-10-31 2024-03-29 维沃移动通信有限公司 一种控制方法及电子设备
CN111026562B (zh) * 2019-11-18 2023-08-22 维沃移动通信(杭州)有限公司 一种消息发送方法及电子设备

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160143448A (ko) * 2015-06-05 2016-12-14 엘지이노텍 주식회사 단말기 및 이의 동작 방법
JP2018063522A (ja) * 2016-10-12 2018-04-19 アイシン・エィ・ダブリュ株式会社 表示制御システムおよび表示制御プログラム
CN108628565A (zh) * 2018-05-07 2018-10-09 维沃移动通信有限公司 一种移动终端操作方法及移动终端
CN108628521A (zh) * 2018-05-14 2018-10-09 维沃移动通信有限公司 一种屏幕操作方法及移动终端
CN108646959A (zh) * 2018-04-10 2018-10-12 Oppo广东移动通信有限公司 屏幕控制方法、装置以及移动终端
CN108769299A (zh) * 2018-04-10 2018-11-06 Oppo广东移动通信有限公司 屏幕控制方法、装置以及移动终端
CN109947343A (zh) * 2019-01-21 2019-06-28 维沃移动通信有限公司 一种应用的控制方法及终端设备

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140351745A1 (en) * 2013-05-22 2014-11-27 International Business Machines Corporation Content navigation having a selection function and visual indicator thereof
CN106250037A (zh) * 2016-07-25 2016-12-21 珠海市魅族科技有限公司 一种移动终端的控制方法及移动终端
CN106249995B (zh) * 2016-07-25 2020-01-24 珠海市魅族科技有限公司 通知方法和装置
CN108366163B (zh) * 2018-01-15 2020-11-17 Oppo广东移动通信有限公司 相机应用的控制方法、装置、移动终端及计算机可读介质
CN108549521A (zh) * 2018-03-09 2018-09-18 北京珠穆朗玛移动通信有限公司 终端操控方法、移动终端及存储介质
CN108958625A (zh) * 2018-06-26 2018-12-07 努比亚技术有限公司 一种屏幕交互调控方法、设备及计算机可读存储介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160143448A (ko) * 2015-06-05 2016-12-14 엘지이노텍 주식회사 단말기 및 이의 동작 방법
JP2018063522A (ja) * 2016-10-12 2018-04-19 アイシン・エィ・ダブリュ株式会社 表示制御システムおよび表示制御プログラム
CN108646959A (zh) * 2018-04-10 2018-10-12 Oppo广东移动通信有限公司 屏幕控制方法、装置以及移动终端
CN108769299A (zh) * 2018-04-10 2018-11-06 Oppo广东移动通信有限公司 屏幕控制方法、装置以及移动终端
CN108628565A (zh) * 2018-05-07 2018-10-09 维沃移动通信有限公司 一种移动终端操作方法及移动终端
CN108628521A (zh) * 2018-05-14 2018-10-09 维沃移动通信有限公司 一种屏幕操作方法及移动终端
CN109947343A (zh) * 2019-01-21 2019-06-28 维沃移动通信有限公司 一种应用的控制方法及终端设备

Also Published As

Publication number Publication date
CN109947343A (zh) 2019-06-28

Similar Documents

Publication Publication Date Title
US20220276909A1 (en) Screen projection control method and electronic device
WO2020186945A1 (fr) Procédé d'affichage d'interface et dispositif terminal
US20220286504A1 (en) Application sharing method, electronic device and computer readable storage medium
US11604567B2 (en) Information processing method and terminal
WO2020215932A1 (fr) Procédé d'affichage de message non lu et dispositif terminal
WO2020186964A1 (fr) Procédé de transmission de signal audio et borne
WO2020192296A1 (fr) Procédé d'affichage d'interface et dispositif terminal
WO2019196864A1 (fr) Procédé de commande de bouton virtuel et terminal mobile
CN109032486B (zh) 一种显示控制方法及终端设备
WO2020215950A1 (fr) Procédé d'affichage d'interface et dispositif terminal
WO2020192282A1 (fr) Procédé d'affichage de messages de notification et dispositif terminal
WO2021057290A1 (fr) Procédé de commande d'informations et dispositif électronique
WO2019228296A1 (fr) Procédé de traitement d'affichage et dispositif terminal
WO2020199783A1 (fr) Procédé d'affichage d'interface et dispositif terminal
WO2020220893A1 (fr) Procédé de capture d'écran et terminal mobile
WO2020211612A1 (fr) Procédé d'affichage d'informations et dispositif terminal
WO2020192324A1 (fr) Procédé d'affichage d'interface et dispositif terminal
WO2021068885A1 (fr) Procédé de commande et dispositif électronique
WO2021004306A1 (fr) Procédé et terminal de commande d'opérations
WO2020151490A1 (fr) Procédé de commande d'application et équipement terminal
WO2021031717A1 (fr) Procédé de capture d'écran et dispositif terminal
WO2020192297A1 (fr) Procédé de commutation d'interface d'écran et dispositif terminal
WO2020181954A1 (fr) Procédé de commande de programme d'application et dispositif terminal
WO2021164716A1 (fr) Procédé d'affichage et dispositif électronique
WO2020192322A1 (fr) Procédé d'affichage et dispositif terminal

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20745553

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20745553

Country of ref document: EP

Kind code of ref document: A1