CN108471549B - Remote control method and terminal

Info

Publication number
CN108471549B
Authority
CN
China
Prior art keywords
terminal
target
picture
input
remotely controlled
Prior art date
Legal status
Active
Application number
CN201810204287.8A
Other languages
Chinese (zh)
Other versions
CN108471549A (en)
Inventor
刘秋菊
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201810204287.8A
Publication of CN108471549A
Application granted
Publication of CN108471549B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41 Structure of client; Structure of client peripherals
    • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor
    • H04N21/42206 User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor characterized by hardware details
    • H04N21/42224 Touch pad or touch panel provided on the remote control
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Abstract

Embodiments of the invention provide a remote control method and a terminal, relate to the field of communications technology, and solve the problem of cumbersome remote control operation. The method comprises the following steps: acquiring a picture to be remotely controlled; displaying a target object on a touch screen of a first terminal according to the picture to be remotely controlled, wherein the target object comprises a touch area or an input keyboard; receiving a target input of a user for a target sub-object in the target object, wherein the target sub-object comprises a target sub-touch area of the touch area or a target input key in the input keyboard; and obtaining a target input instruction according to the target input, and sending the target input instruction to a second terminal. Embodiments of the invention can simplify the user's operation steps.

Description

Remote control method and terminal
Technical Field
The present invention relates to the field of communications technologies, and in particular, to a remote control method and a terminal.
Background
Electronic technology now pervades users' daily lives, and most smart terminals (such as smart televisions) provide a remote control function, so that they can be operated remotely.
Currently, a mobile terminal (e.g., a mobile phone) can remotely control a smart terminal by installing a simulated remote controller. However, existing simulated remote controllers generally just duplicate the functions of the physical remote controller. As a result, when a user wants to select a target object (e.g., a movie or a channel) on the main display interface of the smart terminal, or wants to input characters, the user has to repeatedly touch the up, down, left and right direction keys of the simulated remote controller, which makes the operation cumbersome.
Disclosure of Invention
Embodiments of the invention provide a remote control method and a terminal, aiming to solve the prior-art problem that control is cumbersome because the up, down, left and right direction keys of a simulated remote controller must be touched to select a target object on the main display interface of a smart terminal (such as a smart television) or to input characters.
In a first aspect, an embodiment of the present invention provides a remote control method, which is applied to a first terminal, and the method includes:
acquiring a picture to be remotely controlled;
displaying a target object on a touch screen of the first terminal according to the picture to be remotely controlled, wherein the target object comprises a touch area or an input keyboard;
receiving a target input of a user for a target sub-object in the target object, wherein the target sub-object comprises a target sub-touch area of the touch area or a target input key in the input keyboard;
and obtaining a target input instruction according to the target input, and sending the target input instruction to a second terminal.
In a second aspect, an embodiment of the present invention further provides a first terminal, where the first terminal includes:
the acquisition module is used for acquiring a picture to be remotely controlled;
the display module is used for displaying a target object on a touch screen of the first terminal according to the picture to be remotely controlled, wherein the target object comprises a touch area or an input keyboard;
a receiving module, configured to receive a target input of a user for a target sub-object in the target object, where the target sub-object includes a target sub-touch area of the touch area or a target input key in the input keyboard;
and the sending module is used for obtaining a target input instruction according to the target input and sending the target input instruction to a second terminal.
In a third aspect, an embodiment of the present invention further provides a first terminal, where the first terminal includes a processor, a memory, and a computer program stored in the memory and being executable on the processor, and the computer program, when executed by the processor, implements the steps of the remote control method described above.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps of the remote control method described above.
In the embodiments of the invention, the first terminal displays, on its touch screen, a target object matched with the picture to be remotely controlled. In this way, upon receiving a target input of the user for the target object, the first terminal can send a target input instruction obtained according to the target input to the second terminal, so that the second terminal can determine the sub-object selected by the user according to the received target input instruction. Therefore, compared with the prior art, in which the user touches the direction keys of a simulated remote controller to move a cursor on the display interface of the second terminal in order to generate the target input instruction, the embodiments of the invention can simplify the user's operation steps.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a remote control method according to an embodiment of the present invention;
Fig. 2 is a flowchart of a remote control method according to another embodiment of the present invention;
Fig. 3 is a flowchart of a remote control method according to another embodiment of the present invention;
Fig. 4 is a diagram illustrating a picture to be remotely controlled according to an embodiment of the present invention;
Fig. 5 is a block diagram of a terminal according to an embodiment of the present invention;
Fig. 6 is a block diagram of a terminal according to still another embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The remote control method of the embodiments of the invention is mainly applied to a first terminal and is used for remotely controlling the display interface of a second terminal, so that a user can select or search for information such as videos.
In the embodiments of the invention, the first terminal is in communication connection with the second terminal, so that a user can remotely control the second terminal through the first terminal. Specifically, the first terminal may remotely control the second terminal by using infrared remote control technology, that is, an input instruction determined by the first terminal may be sent to the second terminal by infrared; of course, the first terminal may also remotely control the second terminal by using radio technology or the like. It can be seen that the first terminal serves as an input device that sends input instructions to the second terminal, and the second terminal serves as an output device that responds to the received input instructions.
The first terminal can be a mobile phone, a tablet personal computer, a laptop computer, a personal digital assistant (PDA), a mobile Internet device (MID), a wearable device, or the like.
The remote control method of the embodiment of the present invention is explained below.
Referring to fig. 1, fig. 1 is a flowchart of a remote control method according to an embodiment of the present invention, and as shown in fig. 1, the remote control method according to the embodiment includes the following steps:
Step 101: acquiring a picture to be remotely controlled.
In this embodiment, the first terminal is configured to remotely control the display interface of the second terminal; therefore, the picture to be remotely controlled may be understood as the display interface of the second terminal.
The first terminal may acquire the picture to be remotely controlled in a plurality of ways, which are described in detail below.
In the first mode, the first terminal can acquire the picture to be remotely controlled through its camera. In practical applications, the first terminal may start the camera to capture the picture to be remotely controlled when the display interface of a remote controller application (app) receives a wrist-lifting operation of the user, but is not limited thereto.
In the second mode, the first terminal can receive the picture to be remotely controlled sent by the second terminal. In practical applications, the second terminal may send the picture to be remotely controlled to the first terminal once the communication connection with the first terminal succeeds. For example, the second terminal may pair with the first terminal through a wireless communication manner such as Bluetooth, and after the pairing succeeds, the second terminal may directly send the picture to be remotely controlled to the first terminal, but is not limited thereto.
In the third mode, the first terminal may obtain the picture to be remotely controlled by scanning an image identification code, such as a two-dimensional code, displayed on the second terminal, where the image identification code links to the picture to be remotely controlled.
Of course, the first terminal may also obtain the picture to be remotely controlled through other manners, which may be determined according to actual needs, and this is not limited in the embodiment of the present invention.
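For ease of understanding, the three modes can be summarized as a simple dispatch on the acquisition channel. The following Python sketch is illustrative only: the AcquisitionMode names and the camera, connection and scanner helper objects are hypothetical stand-ins, not part of the embodiment.

```python
from enum import Enum, auto

class AcquisitionMode(Enum):
    CAMERA = auto()               # mode 1: capture the second terminal's display with the camera
    PUSH = auto()                 # mode 2: receive the picture pushed by the second terminal
    IDENTIFICATION_CODE = auto()  # mode 3: scan an image identification code linking to the picture

def acquire_remote_picture(mode, camera=None, connection=None, scanner=None):
    """Return the picture to be remotely controlled (a snapshot of the second
    terminal's display interface), using the selected acquisition mode.
    All three helper objects are hypothetical."""
    if mode is AcquisitionMode.CAMERA:
        return camera.capture()            # user points the first terminal at the screen
    if mode is AcquisitionMode.PUSH:
        return connection.receive_frame()  # sent by the second terminal after pairing
    if mode is AcquisitionMode.IDENTIFICATION_CODE:
        url = scanner.scan_code()          # e.g. a two-dimensional code on the screen
        return connection.fetch(url)       # the code links to the picture
    raise ValueError(f"unsupported acquisition mode: {mode}")
```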
Step 102: displaying a target object on the touch screen of the first terminal according to the picture to be remotely controlled.
In this embodiment, after acquiring the picture to be remotely controlled, the first terminal may perform image recognition on it, so as to display on its touch screen a target object matched with the picture to be remotely controlled, where the target object includes, but is not limited to, a touch area or an input keyboard.
Specifically, if the first terminal recognizes that the picture to be remotely controlled includes a focus input frame, that is, an input frame in which the cursor is activated and characters are to be entered, or recognizes that the picture includes a selected search control, this indicates that the user's goal is to search for information such as a movie by inputting characters. The first terminal may then display an input keyboard on the touch screen, so that the user can control the second terminal to complete the character input through the input keyboard and search for the information he or she requires.
If the first terminal recognizes that the picture to be remotely controlled contains neither a focus input frame nor a selected search control, this indicates that the user's goal is to select information such as a movie directly from the picture. The first terminal may then display, on the touch screen, a touch area matched with the content of the picture to be remotely controlled, so that the user can control the second terminal to complete the selection by touching the touch area displayed by the first terminal.
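For ease of understanding, the following sketch illustrates this decision; the recognizer arguments (detect_focus_input_frame, detect_selected_search_control, segment_operable_regions) are hypothetical helpers, not part of the embodiment:

```python
def choose_target_object(picture, detect_focus_input_frame,
                         detect_selected_search_control, segment_operable_regions):
    """Decide what to display on the first terminal's touch screen.
    The three recognizer arguments are hypothetical image-recognition
    helpers, injected so that the sketch stays self-contained."""
    if detect_focus_input_frame(picture) or detect_selected_search_control(picture):
        # The user's goal is character input, e.g. searching for a movie.
        return ("input_keyboard", None)
    # Otherwise the user's goal is selecting an item shown in the picture.
    regions = segment_operable_regions(picture)  # cf. step 1021 below
    return ("touch_area", regions)
```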
Step 103: receiving a target input of a user for a target sub-object in the target object.
In this step, the target sub-object includes, but is not limited to, a target sub-touch area of the touch area or a target input key in the input keyboard. The form of the target sub-object matches the form of the target object. Specifically, when the target object is a touch area, the target sub-object is a target sub-touch area of that touch area, that is, one or more of the sub-touch areas included in the touch area; when the target object is an input keyboard, the target sub-object is a target input key in that keyboard, that is, one or more of the input keys included in the keyboard, but is not limited thereto.
The target input is used to identify the target sub-object among the sub-objects of the target object. The target input may be a touch input; taking a click input as an example, the first terminal may determine the sub-object clicked by the user as the target sub-object. In some embodiments, the target input may also be a voice input. In this application scenario, the first terminal may perform voice recognition upon receiving the voice input and determine the sub-object extracted from the voice as the target sub-object. For example, on receiving a voice input such as "select the first sub-object in the target object", the first terminal may determine the first sub-object as the target sub-object.
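For ease of understanding, a sketch of this resolution step follows; hit_test, speech_to_text and find_by_description are hypothetical helpers, and the input event is modelled as a plain dictionary:

```python
def resolve_target_sub_object(target_object, event, hit_test, speech_to_text):
    """Map a user input event to the target sub-object it identifies.
    `hit_test` and `speech_to_text` are hypothetical helpers; the voice
    branch follows the "select the first sub-object" example above."""
    if event["kind"] == "touch":
        # Click input: the sub-object under the touch point is the target.
        return hit_test(target_object, event["x"], event["y"])
    if event["kind"] == "voice":
        text = speech_to_text(event["audio"])  # e.g. "select the first sub-object"
        return target_object.find_by_description(text)  # hypothetical lookup
    raise ValueError(f"unsupported input kind: {event['kind']}")
```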
Step 104: obtaining a target input instruction according to the target input, and sending the target input instruction to the second terminal.
In this step, the first terminal may obtain the target input instruction according to the target sub-object at which the target input is directed; for example, it may determine, by looking up a preset mapping relationship between sub-objects and first input instructions, that the first input instruction mapped by the target sub-object is the target input instruction, but is not limited thereto.
After determining the target input instruction, the first terminal sends it to the second terminal. In this way, after receiving the target input instruction, the second terminal may, in response to it, determine the information (such as a movie) selected by the user, or determine the characters input by the user so as to search, according to those characters, for the information the user requires.
The first terminal may send the target input instruction to the second terminal through infrared or through a wireless communication manner such as Bluetooth, which may be determined according to actual needs.
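Steps 103 and 104 combine into a short send routine, sketched below; instruction_map stands for the preset sub-object-to-instruction mapping and transport for the infrared or Bluetooth channel, both hypothetical names:

```python
def send_target_input_instruction(sub_object_id, instruction_map, transport):
    """Look up the first input instruction mapped by the target sub-object
    and send it to the second terminal.
    `instruction_map`: preset sub-object id -> input instruction mapping.
    `transport`: hypothetical infrared or Bluetooth sender."""
    target_instruction = instruction_map[sub_object_id]  # preset mapping lookup
    transport.send(target_instruction)                   # infrared or e.g. Bluetooth
    return target_instruction
```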
It should be understood that, if the display interface of the second terminal changes, the first terminal may reacquire the changed picture to be remotely controlled and perform the above steps again, thereby effectively remotely controlling the second terminal and improving the user experience.
In the remote control method of this embodiment, the first terminal displays, on its touch screen, the target object matched with the picture to be remotely controlled. In this way, upon receiving a target input of the user for the target object, the first terminal can send a target input instruction obtained according to the target input to the second terminal, so that the second terminal can determine the sub-object selected by the user according to the received target input instruction. Therefore, compared with the prior art, in which the user touches the direction keys of a simulated remote controller to move a cursor on the display interface of the second terminal in order to generate the target input instruction, this embodiment can simplify the user's operation steps.
In the embodiments of the invention, after acquiring the picture to be remotely controlled, the first terminal displays the target object matched with the picture on its touch screen. In some embodiments, the first terminal may determine whether the picture to be remotely controlled includes a focus input frame, so as to determine the specific form of the target object according to the determination result.
In an application scenario in which the picture to be remotely controlled does not include a focus input frame, as shown in fig. 2, step 102 may include:
and 1021, under the condition that the picture to be remotely controlled does not contain a focus input frame, performing picture segmentation on the picture to be remotely controlled to obtain n picture segmentation areas, and identifying m operable target picture segmentation areas in the n picture segmentation areas.
In this application scenario, the first terminal determines that the to-be-remotely-controlled operation picture does not include the focus input frame, which indicates that the user intends to select information such as a movie and a television from the remote control operation interface, and then the first terminal may display a touch area matched with the to-be-remotely-controlled operation picture on the touch screen.
In order to reduce interference of an inoperable picture segmentation region in a picture to be remotely controlled to an operation step of a user, in the step, the first terminal can perform picture segmentation on the picture to be remotely controlled to obtain n picture segmentation regions. Specifically, the first terminal may perform screen segmentation on the picture to be remotely controlled based on the screen composition of the picture to be remotely controlled, such as dividing a menu bar and a poster into different screen segmentation areas. Of course, the first terminal may also perform screen segmentation on the to-be-remotely-operated screen based on a threshold segmentation method, an area segmentation method, an edge segmentation method, and the like, which may be specifically determined according to actual needs, and is not limited in this embodiment of the present invention.
After the n screen division regions are obtained by division, the first terminal may identify m operable target screen division regions from the n screen division regions by combining an OCR (Optical character recognition) technique and an image content recognition technique. Specifically, for each of the n screen division areas, if the first terminal can recognize a character matching a television menu name or the like by using an OCR technology or recognize a movie poster, a control, or the like by using an image, the screen division area can be determined as an operable target screen division area, but is not limited thereto.
The operable target picture segmentation area can be understood as an area which can respond to user operation; n and m are positive integers, it being understood that n is greater than or equal to m.
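For ease of understanding, a minimal sketch of step 1021 follows; split_by_layout, ocr and classifier are hypothetical stand-ins for the segmentation, OCR and image-content-recognition techniques described above:

```python
def identify_operable_regions(picture, split_by_layout, ocr, classifier):
    """Split the picture into n segmentation areas and keep the m operable
    ones, i.e. the areas that can respond to a user operation."""
    regions = split_by_layout(picture)   # the n picture segmentation areas
    operable = []
    for region in regions:
        text = ocr.read(region)          # e.g. a menu name such as "TV series"
        if text or classifier.looks_operable(region):  # e.g. a poster or a control
            operable.append(region)      # one of the m operable target areas
    return operable                      # m <= n
```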
Step 1022: in the case that m is greater than 1, displaying, on the touch screen of the first terminal, m touch areas matched with the m operable target picture segmentation areas.
In this step, when the first terminal determines that the number of operable target picture segmentation areas in the acquired picture to be remotely controlled is greater than 1, it displays, on its touch screen, m touch areas matched with the m operable target picture segmentation areas.
Specifically, the first terminal may display the m touch areas on the touch screen according to the layout of the m operable target picture segmentation areas, where the operable target picture segmentation areas and the touch areas are mapped one to one.
Further, when displaying the m touch areas matched with the m operable target picture segmentation areas, the first terminal can eliminate the gaps between the operable target picture segmentation areas (i.e., the inoperable areas among the n picture segmentation areas), thereby improving the utilization of the touch screen display area.
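The sketch below shows one simplified way to realize this one-to-one, gap-free mapping; it assumes, purely for illustration, that the regions can be laid out as a single column, whereas a real implementation would preserve the two-dimensional arrangement of the picture:

```python
def layout_touch_areas(operable_regions, screen_w, screen_h):
    """Map the m operable regions onto the touch screen one to one,
    stretching them to fill the display so that the gaps (inoperable
    areas) are eliminated. The single-column layout is an assumption
    of this sketch, not part of the embodiment."""
    row_h = screen_h / max(len(operable_regions), 1)
    areas = []
    for i, region in enumerate(operable_regions):
        # Each touch area remembers which picture segmentation area it maps to.
        areas.append({"region": region, "x": 0, "y": i * row_h,
                      "w": screen_w, "h": row_h})
    return areas
```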
Therefore, compared with directly displaying a copy of the picture to be remotely controlled on the touch screen, this approach effectively reduces the interference of inoperable areas with the user's operation and improves the user experience.
In this application scenario, the user can indicate the content (such as a movie) he or she selects by touching the target touch area, which simplifies the user's operation steps and improves the user experience.
It should be understood that in this application scenario, the target object is a touch area and the target sub-object is a target sub-touch area of that touch area.
In addition, if the first terminal determines that the number of operable target picture segmentation areas in the acquired picture to be remotely controlled is less than or equal to 1, it may directly display a simulated physical remote controller interface on the touch screen and send input instructions to the second terminal through its infrared unit in the manner of a simulated physical remote controller.
In an application scenario in which the picture to be remotely controlled includes a focus input frame, as shown in fig. 3, step 102 may include:
and 1023, displaying an input keyboard on a touch screen of the first terminal under the condition that the picture to be remotely controlled contains a focus input frame.
In this application scenario, the first terminal determines that the to-be-remotely-controlled operation picture contains a focus input frame, which indicates that the user aims to select information such as movies and the like from the remote control operation interface, and then the first terminal can display an input keyboard on a touch screen of the first terminal, so that the user can conveniently control the second terminal to complete selection of information such as movies and the like by touching a touch area displayed by the first terminal.
The input keyboard may be a key layout form of 26 english letters and 10 numeric full keyboards, or a T9 input method keyboard, which may be determined according to actual needs, and is not limited in this embodiment of the present invention.
In some embodiments, the type of the input keyboard may be determined according to the keyboard display type in the picture to be remotely controlled. Optionally, step 1023 includes:
in the case that the picture to be remotely controlled contains a focus input frame, identifying the keyboard display type in the picture to be remotely controlled;
and displaying, on the touch screen of the first terminal, an input keyboard matched with the keyboard display type.
In this embodiment, the first terminal identifies the keyboard display type in the picture to be remotely controlled and displays a matching input keyboard on its touch screen. For example, if the keyboard displayed in the picture uses the key layout of a full keyboard, the first terminal displays a full-keyboard input keyboard on the touch screen; if the keyboard displayed in the picture is a T9 input-method keyboard, the first terminal displays a T9 input-method keyboard on the touch screen.
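A minimal sketch of this matching step, where recognize_keyboard_type is a hypothetical classifier returning "full" or "t9":

```python
def choose_input_keyboard(picture, recognize_keyboard_type):
    """Pick the input keyboard matched with the keyboard display type
    recognized in the picture to be remotely controlled.
    `recognize_keyboard_type` is a hypothetical classifier."""
    kb_type = recognize_keyboard_type(picture)
    if kb_type == "t9":
        return "t9_keyboard"    # T9 input-method key layout
    return "full_keyboard"      # 26 English letters + 10 digits
```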
In this way, compared with displaying an input keyboard that does not match the keyboard display type, this approach simplifies determining the correspondence between the keyboard displayed on the display interface of the second terminal and the keyboard displayed on the first terminal, which can improve the speed with which the second terminal responds to the target input instruction and improve the user experience.
In this application scenario, the user can touch a target input key of the keyboard, so that the second terminal determines the content (such as a movie) the user searches for and selects, which simplifies the user's operation steps and improves the user experience.
It should be understood that in this application scenario, the target object is an input keyboard and the target sub-object is a target input key in that keyboard.
In addition, if the first terminal determines that the acquired picture to be remotely controlled does not contain a focus input frame, it may directly display a simulated physical remote controller interface on the touch screen and send input instructions to the second terminal through its infrared unit in the manner of a simulated physical remote controller.
In the embodiments of the invention, upon receiving a target input of the user for a target sub-object in the target object, the first terminal may obtain a target input instruction according to the target input. Optionally, determining the target input instruction according to the target input includes:
determining, according to a preset mapping relationship between sub-objects and first input instructions, that the first input instruction mapped by the target sub-object is the target input instruction.
In this embodiment, the mapping relationship between sub-objects and first input instructions may be pre-established by the first terminal, or pre-established by another terminal and sent to the first terminal.
Optionally, before determining, according to the mapping relationship between sub-objects and first input instructions, that the first input instruction corresponding to the target sub-object is the target input instruction, the method further includes:
establishing the mapping relationship between sub-objects and first input instructions.
It should be noted that, in this embodiment, an input instruction mapped by a sub-object may be understood as: the input instruction generated by a cursor-jump path along which, in the picture to be remotely controlled, the cursor jumps from the currently selected area (assumed to be area A) to the area mapped by the sub-object (assumed to be area B). Therefore, if the cursor can jump from area A to area B via i cursor-jump paths, and the sub-object mapped to area B is sub-object B, then sub-object B maps i input instructions, one per cursor-jump path.
In addition, it can be understood that different cursor-jump paths correspond to different numbers of jumps, that is, different numbers of input steps.
For ease of understanding, please refer to fig. 4. In fig. 4, the small rectangles identify areas in the picture 400 to be remotely controlled, and the number in each rectangle identifies the label of the area; for example, the rectangle marked 1 identifies the first area, the rectangle marked 4 identifies the fourth area, and so on.
As shown in fig. 4, assume that, among the target object displayed on the touch screen of the first terminal, the first sub-object maps to the first area 402 of the picture 400 to be remotely controlled, and the area currently selected by the cursor in the picture 400 is the eighth area 401. In this case, when the first terminal detects the target input of the user for the first sub-object, the cursor in the picture 400 must jump from the eighth area 401 to the first area 402.
As can be seen from fig. 4, the cursor-jump paths from the eighth area 401 to the first area 402 include:
Cursor-jump path 1: eighth area 401 → fourth area → third area → second area → first area 402. On this path the cursor jumps 4 times, so the number of input steps for controlling the cursor jump is 4.
Cursor-jump path 2: eighth area 401 → seventh area → sixth area → fifth area → first area 402. On this path the cursor jumps 4 times, so the number of input steps is 4.
Cursor-jump path 3: eighth area 401 → twelfth area → eleventh area → seventh area → sixth area → fifth area → first area 402. On this path the cursor jumps 6 times, so the number of input steps is 6.
Thus, input instruction a, generated when the cursor jumps from the eighth area 401 to the first area 402 via cursor-jump path 1, corresponds to 4 input steps; input instruction b, generated via cursor-jump path 2, corresponds to 4 input steps; and input instruction c, generated via cursor-jump path 3, corresponds to 6 input steps.
In some embodiments, after acquiring the input instructions mapped by each sub-object, the first terminal may determine any one of the input instructions mapped by a sub-object as its first input instruction, and establish the mapping relationship between the sub-object and that first input instruction.
For example, for the first sub-object in the target object displayed on the touch screen of the first terminal, when the area currently selected by the cursor in the picture to be remotely controlled is the eighth area, the first terminal, after acquiring the input instructions a, b and c mapped by the first sub-object, may determine any one of them as the first input instruction and establish the mapping relationship between that first input instruction and the first sub-object.
Of course, after acquiring the input instructions mapped by each sub-object, the first terminal may instead determine the input instruction with the fewest input steps among them as the first input instruction of that sub-object, and establish the mapping relationship accordingly. For this application scenario, optionally, establishing the mapping relationship between a sub-object and its first input instruction includes the following steps (see the sketch after this list):
acquiring the input instructions corresponding to the sub-object;
determining the number of input steps for generating each input instruction, and determining the input instruction with the fewest input steps among the input instructions mapped by the sub-object as the first input instruction of the sub-object;
and establishing the mapping relationship between the sub-object and the first input instruction.
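Finding the instruction with the fewest input steps amounts to a shortest-path search over the areas of the picture. The sketch below uses breadth-first search; the adjacency table is a hypothetical reading of fig. 4, not taken from the patent:

```python
from collections import deque

def fewest_step_path(adjacency, current_area, mapped_area):
    """Breadth-first search over the cursor-jump graph: returns the shortest
    jump path from the currently selected area to the area mapped by a
    sub-object, i.e. the path of the input instruction with the fewest
    input steps (len(path) - 1 jumps)."""
    queue = deque([[current_area]])
    visited = {current_area}
    while queue:
        path = queue.popleft()
        if path[-1] == mapped_area:
            return path
        for nxt in adjacency[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # mapped area unreachable

# Hypothetical adjacency between the areas labelled 1..12 in fig. 4.
adjacency = {
    1: [2, 5], 2: [1, 3], 3: [2, 4], 4: [3, 8], 5: [1, 6],
    6: [5, 7], 7: [6, 8, 11], 8: [4, 7, 12], 9: [10],
    10: [9, 11], 11: [7, 10, 12], 12: [8, 11],
}
path = fewest_step_path(adjacency, 8, 1)
print(path, "input steps:", len(path) - 1)  # e.g. [8, 4, 3, 2, 1] -> 4 steps
```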
Referring again to fig. 4, it can be seen from the above that, among the input instructions a, b and c mapped by the first sub-object, instructions a and b correspond to the fewest input steps.
Therefore, in the mapping relationship between sub-objects and first input instructions, for the first sub-object in the target object displayed on the touch screen of the first terminal, when the area currently selected by the cursor in the picture to be remotely controlled is the eighth area, the mapped first input instruction is input instruction a or input instruction b.
In this application scenario, if the first terminal receives the target input of the user for the first sub-object, it may determine, by looking up the mapping relationship between sub-objects and first input instructions, that the first input instruction corresponding to the first sub-object, namely input instruction a or input instruction b, is the target input instruction, and send the target input instruction to the second terminal.
In this way, the second terminal can control the cursor to jump from the eighth area to the first area mapped by the first sub-object along the cursor-jump path corresponding to input instruction a or input instruction b. Compared with the cursor-jump path corresponding to input instruction c, this reduces the number of cursor jumps and thus the operation load of the second terminal.
It should be noted that, various optional implementations described in the embodiments of the present invention may be implemented in combination with each other or implemented separately, and the embodiments of the present invention are not limited thereto.
Referring to fig. 5, fig. 5 is a structural diagram of a terminal according to an embodiment of the present invention, and as shown in fig. 5, a terminal 500 includes:
an obtaining module 501, configured to obtain a picture to be remotely controlled;
a display module 502, configured to display a target object on a touch screen of the terminal according to the to-be-remotely-controlled operation screen, where the target object includes a touch area or an input keyboard;
a receiving module 503, configured to receive a target input of a user for a target sub-object in the target object, where the target sub-object includes a target sub-touch area of the touch area or a target input key in the input keyboard;
a sending module 504, configured to obtain a target input instruction according to the target input, and send the target input instruction to a receiving terminal.
With continued reference to fig. 5, the modules included in terminal 500 and the units included in each module are described below.
Optionally, the display module 502 includes:
the device comprises a dividing unit, a processing unit and a display unit, wherein the dividing unit is used for carrying out picture division on a picture to be remotely controlled under the condition that the picture to be remotely controlled does not contain a focus input frame to obtain n picture dividing regions, and identifying m operable target picture dividing regions in the n picture dividing regions, wherein n and m are positive integers;
and the first display unit is used for displaying m touch areas matched with the m operable target picture segmentation areas on a touch screen of the terminal under the condition that m is larger than 1.
Optionally, the display module 502 is specifically configured to: and displaying an input keyboard on a touch screen of the terminal under the condition that the picture to be remotely controlled contains a focus input frame.
Optionally, the display module 502 includes:
the identification unit is used for identifying the keyboard display type in the picture to be remotely controlled under the condition that the picture to be remotely controlled contains the focus input frame;
and the second display unit is used for displaying an input keyboard matched with the keyboard display type on the touch screen of the terminal.
Optionally, the sending module 504 includes:
the first determining unit is used for determining a first input instruction mapped by the target sub-object as a target input instruction according to the mapping relation between a preset sub-object and the first input instruction;
and the sending unit is used for sending the target input instruction to the receiving terminal.
Terminal 500 can implement each process performed by the first terminal in the method embodiments of the present invention and achieve the same beneficial effects; to avoid repetition, details are not repeated here. Likewise, the receiving terminal can implement each process performed by the second terminal in the method embodiments and achieve the same beneficial effects; to avoid repetition, details are not repeated here.
Referring to fig. 6, fig. 6 is a schematic diagram of the hardware structure of a terminal implementing various embodiments of the present invention. As shown in fig. 6, terminal 600 includes, but is not limited to: a radio frequency unit 601, a network module 602, an audio output unit 603, an input unit 604, a sensor 605, a display unit 606, a user input unit 607, an interface unit 608, a memory 609, a processor 610, and a power supply 611. Those skilled in the art will appreciate that the terminal configuration shown in fig. 6 is not limiting; the terminal may include more or fewer components than shown, combine some components, or arrange components differently. In the embodiments of the present invention, the terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Wherein, the radio frequency unit 601 is configured to:
receiving a target input of a user for a target sub-object in the target object, wherein the target sub-object comprises a target sub-touch area of the touch area or a target input key in the input keyboard;
and sending the target input instruction to a receiving terminal.
A processor 610 configured to:
acquiring a picture to be remotely controlled;
displaying a target object on a touch screen of the terminal according to the picture to be remotely controlled, wherein the target object comprises a touch area or an input keyboard;
and obtaining a target input instruction according to the target input, and sending the target input instruction to a second terminal.
Optionally, the processor 610 is further configured to:
under the condition that the picture to be remotely controlled does not contain a focus input frame, picture segmentation is carried out on the picture to be remotely controlled to obtain n picture segmentation areas, m operable target picture segmentation areas in the n picture segmentation areas are identified, wherein n and m are positive integers;
and under the condition that m is larger than 1, displaying m touch areas matched with the m operable target picture segmentation areas on a touch screen of the terminal.
Optionally, the processor 610 is further configured to:
and displaying an input keyboard on a touch screen of the terminal under the condition that the picture to be remotely controlled contains a focus input frame.
Optionally, the processor 610 is further configured to:
under the condition that the picture to be remotely controlled comprises a focus input frame, identifying a keyboard display type in the picture to be remotely controlled;
and displaying an input keyboard matched with the keyboard display type on a touch screen of the terminal.
Optionally, the processor 610 is further configured to:
and determining that the first input instruction mapped by the target sub-object is a target input instruction according to the mapping relation between the preset sub-object and the first input instruction.
It should be noted that, in this embodiment, the terminal 600 may implement each process of the first terminal in the method embodiment of the present invention, and achieve the same beneficial effects, and for avoiding repetition, details are not described here; the receiving terminal can implement each process of the second terminal in the method embodiment of the present invention, and achieve the same beneficial effects, and for avoiding repetition, details are not described here.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 601 may be used for receiving and sending signals during message transmission and reception or during a call. Specifically, it receives downlink data from a base station and forwards it to the processor 610 for processing, and it sends uplink data to the base station. In general, the radio frequency unit 601 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Furthermore, the radio frequency unit 601 may also communicate with a network and other devices through a wireless communication system.
The terminal provides wireless broadband internet access to the user through the network module 602, such as helping the user send and receive e-mails, browse web pages, and access streaming media.
The audio output unit 603 may convert audio data received by the radio frequency unit 601 or the network module 602 or stored in the memory 609 into an audio signal and output as sound. Also, the audio output unit 603 can also provide audio output related to a specific function performed by the terminal 600 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 603 includes a speaker, a buzzer, a receiver, and the like.
The input unit 604 is used to receive audio or video signals. The input unit 604 may include a graphics processing unit (GPU) 6041 and a microphone 6042; the graphics processor 6041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in video capture mode or image capture mode. The processed image frames may be displayed on the display unit 606. The image frames processed by the graphics processor 6041 may be stored in the memory 609 (or another storage medium) or transmitted via the radio frequency unit 601 or the network module 602. The microphone 6042 can receive sound and process it into audio data. In phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 601, and output.
The terminal 600 also includes at least one sensor 605, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 6061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 6061 and/or the backlight when the terminal 600 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the terminal posture (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), vibration identification related functions (such as pedometer, tapping), and the like; the sensors 605 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like.
The user input unit 607 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the terminal. Specifically, the user input unit 607 includes a touch panel 6071 and other input devices 6072. Touch panel 6071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near touch panel 6071 using a finger, stylus, or any suitable object or accessory). The touch panel 6071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 610, receives a command from the processor 610, and executes the command. In addition, the touch panel 6071 can be implemented by various types such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. The user input unit 607 may include other input devices 6072 in addition to the touch panel 6071. Specifically, the other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, and a joystick, which are not described herein again.
Further, the touch panel 6071 can be overlaid on the display panel 6061, and when the touch panel 6071 detects a touch operation on or near the touch panel 6071, the touch operation is transmitted to the processor 610 to determine the type of the touch event, and then the processor 610 provides a corresponding visual output on the display panel 6061 according to the type of the touch event. Although in fig. 6, the touch panel 6071 and the display panel 6061 are two independent components to realize the input and output functions of the terminal, in some embodiments, the touch panel 6071 and the display panel 6061 may be integrated to realize the input and output functions of the terminal, and this is not limited here.
The interface unit 608 is an interface for connecting an external device to the terminal 600. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 608 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the terminal 600 or may be used to transmit data between the terminal 600 and an external device.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 609 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 610 is a control center of the terminal, connects various parts of the entire terminal using various interfaces and lines, and performs various functions of the terminal and processes data by operating or executing software programs and/or modules stored in the memory 609 and calling data stored in the memory 609, thereby performing overall monitoring of the terminal. Processor 610 may include one or more processing units; preferably, the processor 610 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 610.
The terminal 600 may further include a power supply 611 (e.g., a battery) for supplying power to the various components, and preferably, the power supply 611 is logically connected to the processor 610 via a power management system, so that functions of managing charging, discharging, and power consumption are performed via the power management system.
In addition, the terminal 600 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides a terminal, including a processor 610, a memory 609, and a computer program stored in the memory 609 and capable of running on the processor 610, where the computer program, when executed by the processor 610, implements each process of the above remote control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the above-mentioned remote control method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, and certainly can also be implemented by hardware, although in many cases the former is the better implementation. Based on such an understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disk) and includes several instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to these embodiments, which are illustrative rather than restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (11)

1. A remote control method, applied to a first terminal, characterized by comprising the following steps:
acquiring a picture to be remotely controlled, wherein the picture to be remotely controlled is a display interface of a second terminal;
detecting whether the picture to be remotely controlled contains a focus input box or a selected search control to obtain a detection result;
displaying a target object corresponding to the detection result on a touch screen of the first terminal, wherein the target object is a touch area matched with the picture to be remotely controlled under the condition that the detection result indicates that the picture to be remotely controlled contains neither the focus input box nor the search control, and the target object is an input keyboard under the condition that the detection result indicates that the picture to be remotely controlled contains the focus input box or the search control;
receiving a target input of a user for a target sub-object in the target object, wherein the target sub-object comprises a target sub-touch area of the touch area or a target input key in the input keyboard;
and obtaining a target input instruction according to the target input, and sending the target input instruction to the second terminal.
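By way of illustration only, the following minimal Java sketch models the claim-1 flow: detect whether the picture calls for text entry, pick the matching target object, and map a target input to an instruction for the second terminal. Every name in it (RemoteScreen, TargetObject, chooseTargetObject, the string-based instruction encoding, the stubbed send) is a hypothetical stand-in, not the claimed implementation or any prescribed API.

    // Hypothetical sketch of the claim-1 flow; not the patented implementation.
    final class RemoteScreen {
        final boolean hasFocusInputBox;         // picture contains a focus input box
        final boolean hasSelectedSearchControl; // picture contains a selected search control
        RemoteScreen(boolean box, boolean search) {
            this.hasFocusInputBox = box;
            this.hasSelectedSearchControl = search;
        }
    }

    enum TargetObject { TOUCH_AREA, INPUT_KEYBOARD }

    public final class RemoteControlFlow {
        // The detection result decides which target object the first terminal shows.
        static TargetObject chooseTargetObject(RemoteScreen picture) {
            if (picture.hasFocusInputBox || picture.hasSelectedSearchControl) {
                return TargetObject.INPUT_KEYBOARD; // text entry is expected
            }
            return TargetObject.TOUCH_AREA;         // mirror operable regions instead
        }

        // Turn the user's target input into an instruction for the second terminal;
        // the network send is stubbed out as a print.
        static void handleTargetInput(String targetSubObject) {
            String instruction = "INSTR:" + targetSubObject; // placeholder encoding
            System.out.println("send to second terminal -> " + instruction);
        }

        public static void main(String[] args) {
            RemoteScreen picture = new RemoteScreen(true, false);
            System.out.println(chooseTargetObject(picture)); // prints INPUT_KEYBOARD
            handleTargetInput("key:A");
        }
    }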
2. The remote control method according to claim 1, wherein the displaying a target object corresponding to the detection result on a touch screen of the first terminal includes:
under the condition that the picture to be remotely controlled does not contain a focus input box, performing picture segmentation on the picture to be remotely controlled to obtain n picture segmentation regions, and identifying m operable target picture segmentation regions in the n picture segmentation regions, wherein n and m are positive integers;
and under the condition that m is greater than 1, displaying m touch areas matched with the m operable target picture segmentation regions on the touch screen of the first terminal.
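As an illustration of this branch, the sketch below filters the m operable regions out of n segmentation regions and lays out touch areas only when m is greater than 1. Region and its operable flag are assumed stand-ins; the claim does not fix a particular segmentation or operability test.

    // Hypothetical sketch: keep the operable regions and show matching touch areas.
    import java.util.ArrayList;
    import java.util.List;

    public final class Segmentation {
        static final class Region {
            final int id;
            final boolean operable; // e.g. the region contains a clickable control
            Region(int id, boolean operable) { this.id = id; this.operable = operable; }
        }

        static List<Region> operableRegions(List<Region> all) {
            List<Region> out = new ArrayList<>();
            for (Region r : all) if (r.operable) out.add(r);
            return out;
        }

        public static void main(String[] args) {
            List<Region> n = List.of(new Region(0, false), new Region(1, true), new Region(2, true));
            List<Region> m = operableRegions(n);
            if (m.size() > 1) { // touch areas are displayed only when m > 1
                m.forEach(r -> System.out.println("show touch area for region " + r.id));
            }
        }
    }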
3. The remote control method according to claim 1, wherein the displaying a target object corresponding to the detection result on a touch screen of the first terminal includes:
displaying an input keyboard on the touch screen of the first terminal under the condition that the picture to be remotely controlled contains a focus input box.
4. The remote control method according to claim 3, wherein the displaying an input keyboard on the touch screen of the first terminal under the condition that the picture to be remotely controlled contains a focus input box includes:
under the condition that the picture to be remotely controlled contains a focus input box, identifying a keyboard display type in the picture to be remotely controlled;
and displaying an input keyboard matched with the keyboard display type on the touch screen of the first terminal.
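The matching step of claim 4 amounts to a lookup from the recognized keyboard display type to a keyboard layout. The type names below (NUMERIC, SEARCH, ALPHANUMERIC) are assumptions for illustration; the claim only requires that the displayed keyboard match the recognized type.

    // Hypothetical keyboard-type-to-layout mapping; all names are illustrative.
    public final class KeyboardMatcher {
        enum KeyboardType { NUMERIC, ALPHANUMERIC, SEARCH }

        static String keyboardFor(KeyboardType type) {
            switch (type) {
                case NUMERIC: return "digit pad";
                case SEARCH:  return "QWERTY with a search key";
                default:      return "full QWERTY";
            }
        }

        public static void main(String[] args) {
            System.out.println(keyboardFor(KeyboardType.NUMERIC)); // prints "digit pad"
        }
    }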
5. The remote control method according to any one of claims 1 to 4, wherein the obtaining a target input instruction according to the target input comprises:
determining, according to a preset mapping relation between a sub-object and a first input instruction, the first input instruction mapped by the target sub-object as the target input instruction.
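The preset mapping of claim 5 can be read as a lookup table that binds each sub-object (a sub-touch area or an input key) to a first input instruction ahead of time, so obtaining the target input instruction is a single table lookup. The keys and the Android-style keycode strings below are placeholders, not values prescribed by the claim.

    // Hypothetical preset sub-object -> instruction table; entries are placeholders.
    import java.util.Map;

    public final class InstructionMapping {
        static final Map<String, String> PRESET = Map.of(
                "touch-area-1", "KEYCODE_DPAD_UP",
                "touch-area-2", "KEYCODE_DPAD_DOWN",
                "key-ENTER",    "KEYCODE_ENTER");

        static String targetInstruction(String targetSubObject) {
            return PRESET.getOrDefault(targetSubObject, "KEYCODE_UNKNOWN");
        }

        public static void main(String[] args) {
            System.out.println(targetInstruction("touch-area-1")); // prints KEYCODE_DPAD_UP
        }
    }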
6. A terminal, comprising:
the acquisition module is used for acquiring a picture to be remotely controlled, wherein the picture to be remotely controlled is a display interface of a second terminal;
the display module is used for detecting whether the picture to be remotely controlled contains a focus input box or a selected search control to obtain a detection result, and displaying a target object corresponding to the detection result on a touch screen of the terminal, wherein the target object is a touch area matched with the picture to be remotely controlled under the condition that the detection result indicates that the picture to be remotely controlled contains neither the focus input box nor the search control, and the target object is an input keyboard under the condition that the detection result indicates that the picture to be remotely controlled contains the focus input box or the search control;
a receiving module, configured to receive a target input of a user for a target sub-object in the target object, where the target sub-object includes a target sub-touch area of the touch area or a target input key of the input keyboard;
and the sending module is used for obtaining a target input instruction according to the target input and sending the target input instruction to the second terminal.
7. The terminal of claim 6, wherein the display module comprises:
the device comprises a dividing unit, a processing unit and a display unit, wherein the dividing unit is used for carrying out picture division on a picture to be remotely controlled under the condition that the picture to be remotely controlled does not contain a focus input frame to obtain n picture dividing regions, and identifying m operable target picture dividing regions in the n picture dividing regions, wherein n and m are positive integers;
and the first display unit is used for displaying m touch areas matched with the m operable target picture segmentation areas on a touch screen of the terminal under the condition that m is larger than 1.
8. The terminal according to claim 6, wherein the display module is specifically configured to display an input keyboard on the touch screen of the terminal under the condition that the picture to be remotely controlled contains a focus input box.
9. The terminal of claim 8, wherein the display module comprises:
the identification unit is used for identifying a keyboard display type in the picture to be remotely controlled under the condition that the picture to be remotely controlled contains the focus input box;
and the second display unit is used for displaying an input keyboard matched with the keyboard display type on the touch screen of the terminal.
10. The terminal according to any of claims 6 to 9, wherein the sending module comprises:
the first determining unit is used for determining, according to a preset mapping relation between a sub-object and a first input instruction, the first input instruction mapped by the target sub-object as the target input instruction;
and the sending unit is used for sending the target input instruction to the second terminal.
11. A terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the remote control method according to any one of claims 1 to 5.
CN201810204287.8A 2018-03-13 2018-03-13 Remote control method and terminal Active CN108471549B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810204287.8A CN108471549B (en) 2018-03-13 2018-03-13 Remote control method and terminal

Publications (2)

Publication Number Publication Date
CN108471549A CN108471549A (en) 2018-08-31
CN108471549B (en) 2020-07-28

Family

ID=63265236

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810204287.8A Active CN108471549B (en) 2018-03-13 2018-03-13 Remote control method and terminal

Country Status (1)

Country Link
CN (1) CN108471549B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111736750B (en) * 2020-06-19 2022-08-19 联想(北京)有限公司 Control method and electronic equipment
CN114679616A (en) * 2022-03-28 2022-06-28 京东方科技集团股份有限公司 Method for controlling display device and related device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20030033354A (en) * 2001-10-22 2003-05-01 (주)씨앤에스 테크놀로지 Wireless handset adding TV remute control
CN103428550A (en) * 2013-08-09 2013-12-04 华为终端有限公司 Object selecting method and terminal
CN103945251A (en) * 2014-04-03 2014-07-23 上海斐讯数据通信技术有限公司 Remote control system and mobile terminal
CN104202643A (en) * 2014-09-16 2014-12-10 李琢 Intelligent television touch remote-control terminal screen mapping method and control method and control system of touch remote-control terminal
CN105933746A (en) * 2016-06-20 2016-09-07 北京小米移动软件有限公司 Method and device for controlling playing device
CN106453872A (en) * 2016-09-28 2017-02-22 北京小米移动软件有限公司 Display control method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant