CN113497851B - Control display method and electronic equipment - Google Patents

Control display method and electronic equipment

Info

Publication number: CN113497851B
Authority: CN (China)
Prior art keywords: interface, focus, type, electronic device, display
Legal status: Active
Application number: CN202010826249.3A
Other languages: Chinese (zh)
Other versions: CN113497851A
Inventors: 谢姝, 汪碧海, 唐茂开
Current Assignee: Honor Device Co Ltd
Original Assignee: Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Publication of application CN113497851A; application granted; publication of grant CN113497851B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42204User interfaces specially adapted for controlling a client device through a remote control device; Remote control devices therefor

Abstract

An embodiment of the application provides a control display method and an electronic device. In the method, an operation event is received, the operation event including an operation event type, and the type of the display interface is determined. If the display interface is of a first type, a focus is displayed. If the display interface is of a second type, whether to display the focus is determined according to the operation event type: if the input source corresponding to the operation event type is a remote controller, the focus is displayed; if the input source corresponding to the operation event type is another electronic device, the focus is not displayed. In this way, the television side can decide whether to display the focus according to the input source, and keeping the display interfaces of the television side and the mobile phone side synchronized gives the user a good operating experience.

Description

Control display method and electronic equipment
Technical Field
The application relates to the technical field of intelligent terminals, in particular to a control display method and electronic equipment.
Background
With the continuous development of information technology, televisions are becoming increasingly intelligent. Television control has evolved from the earliest manual operation on the television itself, to remote operation through a remote controller, and now to remote operation through mobile devices such as mobile phones, tablets, and wearable devices, bringing great convenience to users.
Disclosure of Invention
The application provides a control display method and an electronic device, so that the television side can decide whether to display a focus according to different input sources, the display content on the television side can be kept consistent with that on the mobile phone side, and the user is given a good operating experience.
In a first aspect, the present application provides a control display method applied to an electronic device, including:
receiving an operation event, where the operation event includes an operation event type; specifically, the operation event type may correspond to a remote controller, or to an electronic device such as a mobile phone or a tablet computer; that is, the television side may determine the input source (e.g., whether it is the remote controller or a mobile phone) according to the operation event type.
Judging the type of a display interface; specifically, the display interface is an interface displayed after the television side receives the operation event.
If the type of the display interface is judged to be the first type, displaying a focus; in particular, the first type of display interface may be a home page.
If the type of the display interface is judged to be a second type, determining whether a focus is displayed or not according to the type of the operation event; in particular, the second type of display interface may be a non-home page.
If the input source corresponding to the operation event type is a remote controller, displaying a focus;
and if the input source corresponding to the operation event type is other electronic equipment, not displaying the focus.
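The decision flow of the first aspect can be sketched as follows. This is a minimal illustration of the steps above, not the actual implementation; the type labels and input-source strings are assumptions:

```python
FIRST_TYPE = "first"    # e.g. a home page: the focus is always displayed
SECOND_TYPE = "second"  # e.g. a non-home page: depends on the input source

def should_display_focus(interface_type: str, input_source: str) -> bool:
    """Decide whether the television side displays the focus."""
    if interface_type == FIRST_TYPE:
        # First type: display the focus without distinguishing the input source.
        return True
    # Second type: display the focus only when the input source is the remote
    # controller; hide it when the source is another electronic device.
    return input_source == "remote_controller"
```

For example, `should_display_focus(SECOND_TYPE, "mobile_phone")` returns `False`, matching the behavior described above.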
In one possible implementation manner, the display interface includes a current interface or a post-jump interface. Specifically, the current interface is a display interface that does not jump after the television side receives the operation event; for example, if the interface displayed before the operation event is received is the interface 100, and the television still stays on the interface 100 after receiving the operation event, the interface 100 may be regarded as the current interface. The post-jump interface is an interface reached by a jump after the television side receives the operation event; for example, if the interface displayed before the operation event is received is the interface 100, and after receiving the operation event the television jumps to the interface 200, the interface 200 may be regarded as the post-jump interface.
In one possible implementation manner, if the display interface is a post-jump interface, the displaying the focus includes:
acquiring a preset position of the focus, and displaying the focus at the preset position. Specifically, the preset position may be a preset default position, or the position of an arbitrarily selected control in the post-jump interface.
In one possible implementation manner, if the display interface is a current interface, the displaying the focus includes:
determining the position of the focus according to key information or operation position information, and displaying the focus at the determined position. Specifically, the key information may include information corresponding to the up, down, left, and right keys on the remote controller, and the operation position information may include the coordinates of the user's operation on the display interface.
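Determining the new focus position from the key information can be sketched on a simple grid of controls. This is an illustrative assumption — the application does not prescribe a grid layout:

```python
def move_focus(pos, key, rows, cols):
    """Move the focus one step according to an up/down/left/right key,
    clamping to the bounds of a rows x cols grid of controls."""
    r, c = pos
    if key == "up":
        r = max(r - 1, 0)
    elif key == "down":
        r = min(r + 1, rows - 1)
    elif key == "left":
        c = max(c - 1, 0)
    elif key == "right":
        c = min(c + 1, cols - 1)
    return (r, c)
```

For example, with the focus on the third control of a single-row menu, a "left" key press moves it one control to the left, and a key press toward a grid edge leaves the focus in place.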
In one possible implementation manner, the other electronic device includes a mobile phone, a watch, and/or a tablet computer.
In one possible implementation manner, the method further includes:
and sending the display interface to other electronic equipment.
In a second aspect, the present application provides a control display device comprising:
a receiving module, configured to receive an operation event, where the operation event includes an operation event type;
the judging module is used for judging the type of the display interface;
the display module is used for displaying the focus if the type of the display interface is judged to be the first type; if the type of the display interface is judged to be the second type, determining whether to display the focus according to the type of the operation event; if the input source corresponding to the operation event type is a remote controller, displaying a focus; and if the input source corresponding to the operation event type is other electronic equipment, not displaying the focus.
In one possible implementation manner, the display interface includes a current interface or a post-jump interface.
In one possible implementation manner, the display interface is a post-jump interface, and the display module is further configured to obtain a preset position of the focus and display the focus at the preset position.
In one possible implementation manner, the display interface is a current interface, and the display module is further configured to determine a position of the focus according to the key information or the operation position information, and display the focus at the determined position.
In one possible implementation manner, the other electronic device includes a mobile phone, a watch and/or a tablet computer.
In one possible implementation manner, the apparatus further includes:
and the sending module is used for sending the display interface to other electronic equipment.
In a third aspect, the present application provides an electronic device, comprising: a display screen; one or more processors; a memory; a plurality of application programs; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the electronic device, cause the electronic device to perform the steps of:
Receiving an operation event, wherein the operation event comprises an operation event type;
judging the type of a display interface;
if the type of the display interface is judged to be the first type, displaying the focus;
if the type of the display interface is judged to be the second type, determining whether to display the focus according to the type of the operation event;
if the input source corresponding to the operation event type is a remote controller, displaying a focus;
and if the input source corresponding to the operation event type is other electronic equipment, not displaying the focus.
In one possible implementation manner, the display interface includes a current interface or a post-jump interface.
In one possible implementation manner, the display interface is a post-jump interface, and when the instructions are executed by the electronic device, the step of displaying the focus performed by the electronic device includes:
and acquiring a preset position of the focus, and displaying the focus at the preset position.
In one possible implementation manner, the display interface is a current interface, and when the instructions are executed by the electronic device, the step of displaying the focus performed by the electronic device includes:
and determining the position of the focus according to the key information or the operation position information, and displaying the focus at the determined position.
In one possible implementation manner, the other electronic device includes a mobile phone, a watch, and/or a tablet computer.
In one possible implementation manner, when the instruction is executed by the electronic device, the electronic device further performs the following steps:
and sending the display interface to other electronic equipment.
It should be understood that the second and third aspects of the present application are consistent with the technical solution of the first aspect, and the beneficial effects obtained by these aspects and their corresponding possible implementations are similar and are not described again.
In a fourth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when run on a computer, causes the computer to perform the method according to the first aspect.
In a fifth aspect, the present application provides a computer program for performing the method of the first aspect when the computer program is executed by a computer.
In a possible design, the program of the fifth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
Fig. 1 is a schematic flowchart of a display control method according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an embodiment of an application scenario provided herein;
FIG. 3 is a schematic diagram illustrating an embodiment of remote control manipulation in a first type of display interface provided herein;
FIG. 4 is a schematic diagram illustrating another embodiment of remote control manipulation in a first type of display interface provided herein;
FIG. 5 is a schematic diagram illustrating one embodiment of a cell phone manipulation in a first type of display interface provided by the present application;
FIG. 6 is a schematic diagram illustrating another embodiment of cell phone manipulation in a first type of display interface provided herein;
FIG. 7 is a schematic diagram illustrating an embodiment of remote control manipulation in a second type of display interface provided herein;
FIG. 8 is a schematic diagram illustrating one embodiment of cell phone manipulation in a second type of display interface provided herein;
FIG. 9 is a schematic diagram illustrating another embodiment of cell phone manipulation in a second type of display interface provided by the present application;
fig. 10 is a schematic diagram illustrating an embodiment of switching from remote control operation to mobile phone operation according to the present application;
fig. 11 is a schematic diagram of another embodiment of switching from remote control operation to mobile phone operation provided by the present application;
Fig. 12 is a schematic diagram illustrating a remote control being switched to a mobile phone according to still another embodiment of the present disclosure;
FIG. 13 is a diagram illustrating an embodiment of switching from handset operation to remote control operation provided herein;
FIG. 14 is a schematic structural diagram of an embodiment of a control display device according to the present application;
fig. 15 is a schematic structural diagram of an embodiment of an electronic device of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments herein, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, both A and B exist, or B exists alone.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, "a plurality" means two or more unless otherwise specified.
In the prior art, when a mobile phone is used to control a television, both the mobile phone side and the television side present a focus. However, according to common usage habits, when a user controls a television with a mobile phone, neither the mobile phone side nor the television side should present a focus. The prior art therefore conflicts with users' long-standing habits and easily confuses them. The display control method provided by the embodiments of the application determines whether to display the focus according to the input source, so that the display interface of the television and the display interface of the mobile phone remain consistent, bringing the user a good operating experience.
Fig. 1 is a flowchart illustrating an embodiment of a display control method according to the present application, which may include:
in step 101, a first electronic device receives an operation event.
Specifically, the first electronic device may be an electronic device having a display screen, such as a television or a computer.
For example, the user may press or release different keys on the remote controller to operate the first electronic device. In response to the user's operation, the remote controller generates a corresponding operation event and sends it to the first electronic device. The type of such an operation event corresponds to the remote controller; for example, the event type may be KeyEvent. The operation event may include a key type, which identifies a key on the remote controller, such as a direction key, a return key, or a confirm key, where the direction keys may include the up, down, left, and right keys.
Alternatively, the user may tap, long-press, or slide on the touch screen of a second electronic device to control the first electronic device. The second electronic device may be an electronic device with a touch screen, such as a mobile phone, a tablet, or a smart watch. In response to the user's operation, the second electronic device generates a corresponding operation event and sends it to the first electronic device. The type of such an operation event corresponds to the second electronic device; for example, the event type may be TouchEvent. The operation event may include an operation position, which identifies where the user operated on the display interface of the second electronic device; for example, the operation position may be expressed as coordinates.
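The two event shapes described above can be sketched as simple records. The names KeyEvent and TouchEvent follow the text; the field names are assumptions for illustration, not the actual event format:

```python
from dataclasses import dataclass

@dataclass
class KeyEvent:
    """Operation event whose type corresponds to the remote controller."""
    key_type: str  # e.g. "up", "down", "left", "right", "back", "confirm"

@dataclass
class TouchEvent:
    """Operation event whose type corresponds to the second electronic device."""
    x: int  # operation position: coordinates on the phone's display interface
    y: int

def input_source(event) -> str:
    """Determine the input source from the operation event type."""
    return "remote_controller" if isinstance(event, KeyEvent) else "other_device"
```

This is how the first electronic device can tell a remote-controller operation from a phone operation by looking only at the event type.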
And 102, judging the type of the display interface.
Specifically, after the first electronic device receives the operation event, the type of the display interface may be determined. The display interface may include a current interface or a post-jump interface. For example, if the first electronic device displays the interface 100 both before and after receiving the operation event, the interface 100 is called the current interface; if the first electronic device displays the interface 100 before receiving the operation event, and after receiving the operation event the interface jumps so that the interface 200 is displayed, the interface 200 is called the post-jump interface.
The display interface type may include a first type and a second type. If the display interface is of the first type, the first electronic device may directly display the focus without distinguishing the input source; for example, a first-type display interface may include a home page. If the display interface is of the second type, the first electronic device displays the focus according to the input source; for example, a second-type interface may include a non-home page. It should be noted that in the embodiments of the present application, the first type and the second type may be distinguished by home page versus non-home page, or in other ways. For example, if the content of a preview screen in the display interface changes as the focus changes, the display interface may be of the first type; if the content of the preview screen does not change as the focus changes, the display interface may be of the second type. This is not limited in the embodiments of the present application.
And 103, if the type of the display interface is judged to be the first type, displaying the focus.
That is, if the first electronic device determines that the type of the display interface is the first type, it directly displays the focus without distinguishing the input source. If the display interface is the current interface, the page has not jumped, so the focus can be changed on the current interface according to the operation. For example, when the user operates the first electronic device through the direction keys on the remote controller, the focus position changes according to the key operation: if the focus is on control A and the user presses the left key of the remote controller, the focus moves to control B, which is located to the left of control A. If the display interface is a post-jump interface, the page has jumped, so the first electronic device needs to acquire a focus position on the post-jump interface before it can display the focus. The first electronic device may take the position of any control in the display interface; for example, if the display interface includes at least one control, it may arbitrarily select one of them as the focus position. Preferably, the first electronic device may take the position of a preset control, for example the first control: it may scan the display interface from top to bottom and from left to right, and the first control detected is the first control. Detection may also be performed in other orders, which is not limited in the embodiments of the present application.
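The "first control" detection described above — scanning the post-jump interface from top to bottom, then from left to right — can be sketched as follows. The control record and its fields are assumptions for illustration:

```python
def first_control(controls):
    """Return the control detected first when scanning the post-jump interface
    from top to bottom, then from left to right.

    Each control is a dict holding the top-left coordinates of the control,
    e.g. {"name": "settings", "x": 40, "y": 10}.
    """
    # Sorting key: vertical position first, then horizontal position.
    return min(controls, key=lambda ctrl: (ctrl["y"], ctrl["x"]))
```

The control returned by this scan is a natural default position for the focus after a page jump.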
And 104, if the type of the display interface is judged to be the second type, judging the type of the operation event, and determining whether to display the focus according to the type of the operation event.
Specifically, if the first electronic device determines that the type of the display interface is the second type, the type of the operation event is further determined, and whether to display the focus is determined according to the type of the operation event (the type of the input source).
If the type of the operation event corresponds to the remote controller, for example, the type of the operation event is KeyEvent, the first electronic device acquires a key type corresponding to the operation event of the remote controller, and displays a focus according to the key type.
If the type of the operation event corresponds to the mobile phone, for example, the type of the operation event is TouchEvent, the first electronic device does not display the focus.
Step 105, sending the image or display element to the second electronic device.
Optionally, the first electronic device may send an image or the display elements of the display interface to the second electronic device, so as to complete synchronization of display content between the first electronic device and the second electronic device. For example, the image of the display interface may be a screenshot of the display interface; after receiving the screenshot, the second electronic device may display it directly on its touch screen. The display elements of the display interface may include information about each element used for display in the display interface, for example, the position of a control, the size of a control, and the color of the background; after receiving the display elements, the second electronic device may generate the corresponding display interface from them and display it.
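Synchronization via display elements (rather than a screenshot) can be sketched as serializing the interface description that the first electronic device sends to the second. This is a minimal illustration; the payload field names are assumptions:

```python
import json

def build_display_elements(controls, background_color, focus_pos=None):
    """Package the display elements the first device sends to the second.

    The focus position is included only when the focus is displayed, so the
    second device renders (or omits) the focus consistently with the first.
    """
    payload = {
        "background": background_color,
        "controls": controls,  # each entry: control name, position, size
    }
    if focus_pos is not None:
        payload["focus"] = focus_pos
    return json.dumps(payload)
```

With this shape, a remote-controller operation would produce a payload containing a "focus" entry, while a phone operation on a second-type interface would omit it.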
The control display method will be described below with reference to fig. 2 to 13 as an example.
Fig. 2 is a schematic diagram of an application scenario of the control display method according to an embodiment of the present application. Referring to fig. 2, a user can operate a television 13 (the first electronic device) through a remote controller 11 or a mobile phone 12 (the second electronic device). The remote controller 11 may be an infrared remote controller or a Bluetooth remote controller, and may communicate with the television 13 through an infrared channel or a Bluetooth channel. The mobile phone 12 may communicate with the television 13 through a Wi-Fi channel. The embodiments of the present application do not limit the communication methods between the mobile phone 12 and the television 13 or between the remote controller 11 and the television 13.
As shown in fig. 3, when the television 13 displays the interface 310, the user may operate the remote controller 11; for example, the user may press a direction key on the remote controller 11 to change the position of the focus in the interface 310. The interface 310 includes a preview area 3101, control options 3102, and a focus 3103, where the preview area 3101 displays the preview screen corresponding to the focused control, the control options 3102 provide different controls for the user to select, and the focus 3103 identifies the current focus position. Before the operation event sent by the remote controller 11 is received, the focus 3103 is on the "live" control, and the preview area 3101 displays the preview screen corresponding to the "live" control. After receiving the operation event sent by the remote controller 11, the television 13 determines that the type of the display interface (e.g., the interface 310) is the first type, so the focus 3103 needs to be displayed. The television 13 determines the position after the focus change based on the received operation event; for example, it may determine the new position based on the key type in the operation event and display the focus there. The focus 3103 moves from the "live" control to the "set" control, and the preview area 3101 displays the preview screen corresponding to the "set" control. After changing the focus position in the interface 310, the television 13 completes synchronization of display content with the mobile phone 12, that is, the focus is also displayed on the mobile phone 12.
As shown in fig. 4, when the television 13 displays the interface 410, the user may operate the remote control 11, for example, the user may press a return key in the remote control 11 to jump to a page. Before receiving an operation event transmitted from the remote controller 11, the television 13 displays a focus 3103. After receiving the operation event sent by the remote controller 11, the interface jumps (for example, jumps from the interface 410 to the interface 411), and at this time, the type of the interface after jumping may be determined. If the type of the interface after the jump is the first type, the television 13 displays the focus 3103, and at this time, the focus 3103 may be located at the position of any one control or a preset control (e.g., the position of the first control in the interface 411) in the interface after the jump (e.g., the interface 411).
As can be seen from the above embodiments of fig. 3 and 4, the display interface may be a current interface or a jump interface. The display interface is not limited in this embodiment.
As shown in fig. 5, when the television 13 displays the interface 510, the user may operate the mobile phone 12; for example, the user may tap a control option 3102 that is not at the focus on the touch screen of the mobile phone 12 to change the position of the focus 3103 in the interface 510. Before the operation event sent by the mobile phone 12 is received, the focus 3103 is on the "live" control, and the preview area 3101 displays the preview screen corresponding to the "live" control. After receiving the operation event sent by the mobile phone 12, the television 13 determines that the type of the display interface (e.g., the interface 510) is the first type, so the focus 3103 needs to be displayed. The television 13 determines the position after the focus change from the received operation event; for example, it may determine the new position from the operation position in the operation event and display the focus there. The focus 3103 moves from the "live" control to the "set" control, and the preview area 3101 displays the preview screen corresponding to the "set" control.
As shown in fig. 6, when the television 13 displays the interface 610, the user may operate the mobile phone 12, for example, the user may click on a control option 3102 on a touch screen of the mobile phone 12 to jump to a page. The television 13 does not display the focus until receiving the operation event transmitted by the handset 12. After receiving the operation event sent by the mobile phone 12, the interface jumps (for example, jumps from the interface 610 to the interface 611), and at this time, the television 13 may determine the type of the interface after jumping. If the type of the interface after the jump is the first type, the focus 3103 is displayed.
The above-mentioned fig. 3 to 6 describe the scenes in which the display interface is of the first type, and the following describes the scenes in which the display interface is of the second type.
As shown in fig. 7, when the television 13 displays the interface 710, the user may operate the remote controller 11; for example, the user may press the confirm key on the remote controller 11 to jump to a page. Before the operation event sent by the remote controller 11 is received, the television 13 displays the focus 3103, and the focus 3103 is on the "set" control. After the operation event sent by the remote controller 11 is received, the interface 710 jumps to the page corresponding to the "set" control (e.g., the interface 711). The post-jump interface is the interface 711, and the interface 711 is of the second type, so the television 13 determines the type of the input source; the current operation event is KeyEvent, so the input source is known to be the remote controller 11, and the focus 3103 needs to be displayed.
As shown in fig. 8, when the television 13 displays the interface 810, the user may operate the mobile phone 12; for example, the user may tap the focus 3103 on the touch screen of the mobile phone 12 to jump to a page. Before the operation event sent by the mobile phone 12 is received, the television 13 displays the focus 3103, and the focus 3103 is on the "set" control. After the operation event sent by the mobile phone 12 is received, the interface 810 jumps to the page corresponding to the "set" control (e.g., the interface 811). The post-jump interface is the interface 811, and the interface 811 is of the second type, so the television 13 determines the type of the input source; the current operation event is TouchEvent, so the input source is known to be the mobile phone 12, and the focus 3103 does not need to be displayed.
As shown in fig. 9, when the television 13 displays the interface 910, the user may operate the mobile phone 12, for example, by clicking any of the control options 3102 on the touch screen of the mobile phone 12 to jump to a page. The embodiment of fig. 9 differs from the embodiment of fig. 8 in that the television 13 does not display the focus 3103 before receiving the operation event sent by the mobile phone 12. After receiving the operation event sent by the mobile phone 12, the interface 910 jumps to the interface 911, and the television 13 still does not display the focus 3103. That is, the embodiment shown in fig. 8 goes from a display with focus to a display without focus, while the embodiment shown in fig. 9 goes from a display without focus to a display without focus. For the specific steps after the operation event sent by the mobile phone 12 is received, reference may be made to the embodiment in fig. 8, which is not described again here.
Figs. 7 to 9 describe scenarios in which the display interface is of the second type; the following describes scenarios in which the input source is switched.
As shown in fig. 10, when the television 13 displays the interface 1010, the mobile phone 12 also displays the interface 1010 through synchronization with the television 13; at this time, the user can switch from operating the remote control 11 to operating the mobile phone 12. For example, the user may perform a gesture slide or a click on the touch screen of the mobile phone 12 to complete the switch from the remote control 11 to the mobile phone 12. The display interface 1010 includes a control option 3102, the focus 3103, and a non-control area 10101, where the non-control area 10101 identifies an area where no control is located. The user can click on the non-control area 10101 through the mobile phone 12, or perform a gesture slide on the non-control area 10101 or the control option 3102 through the mobile phone 12, to complete the switch from the remote control 11 to the mobile phone 12. The gesture slide may include an up-down slide, a left-right slide, a multi-finger slide, and the like, and the click may include a single click or a double click, and the like, which are not limited in this embodiment of the present application. Before the operation event sent by the mobile phone 12 is received, the television 13 is operated by the remote control 11, so both the mobile phone 12 and the television 13 display the focus 3103. After receiving the operation event sent by the mobile phone 12, the television 13 determines that the type of the display interface (e.g., the interface 1010) is the second type, so the television 13 determines the type of the input source; since the current operation event is a TouchEvent, the television 13 knows that the input source is the mobile phone 12 and that the focus 3103 does not need to be displayed.
As shown in fig. 11, when the television 13 displays the interface 1110, the mobile phone 12 also displays the interface 1110 through synchronization with the television 13; at this time, the user can switch from operating the remote control 11 to operating the mobile phone 12. For example, the user may click any of the control options 3102 to complete the switch from the remote control 11 to the mobile phone 12. Before the operation event sent by the mobile phone 12 is received, the television 13 is operated by the remote control 11, so both the mobile phone 12 and the television 13 display the focus 3103. After receiving the operation event sent by the mobile phone 12, the interface 1110 jumps to the interface 1111, and the television 13 determines that the type of the display interface (e.g., the interface 1111) is the second type, so the television 13 determines the type of the input source; since the current operation event is a TouchEvent, the television 13 knows that the input source is the mobile phone 12 and that the focus 3103 does not need to be displayed.
As shown in fig. 12, when the television 13 displays the interface 1210, the mobile phone 12 also displays the interface 1210 through synchronization with the television 13; at this time, the user can switch from operating the remote control 11 to operating the mobile phone 12. For example, the user may click on the focus 3103 to complete the switch from remote control 11 operation to mobile phone 12 operation. Before the operation event sent by the mobile phone 12 is received, the television 13 is operated by the remote control 11, so both the mobile phone 12 and the television 13 display the focus 3103. After receiving the operation event sent by the mobile phone 12, the interface 1210 jumps to the interface 1211, and the television 13 determines that the type of the display interface (e.g., the interface 1211) is the first type, so the focus 3103 needs to be displayed.
As shown in fig. 13, when the television 13 displays the interface 1310, the user can switch from operating the mobile phone 12 to operating the remote control 11. For example, the user may operate any key (e.g., a direction key, a return key, a confirmation key, or the like) of the remote control 11 to complete the switch from mobile phone 12 operation to remote control 11 operation. Before the operation event sent by the remote control 11 is received, the television 13 is operated by the mobile phone 12, so neither the mobile phone 12 nor the television 13 displays the focus 3103. After receiving the operation event of the remote control 11, the television 13 determines that the type of the display interface (e.g., the interface 1310) is the second type, so the television 13 determines the type of the input source; since the current operation event is a KeyEvent, the television 13 knows that the input source is the remote control 11 and that the focus 3103 needs to be displayed.
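The focus-display decision that runs through the scenarios above can be summarized in a short sketch. The Python below is illustrative only and is not the patent's implementation; the string labels for the interface types and the function name are assumptions made for this example, while the event names mirror the KeyEvent and TouchEvent mentioned in the text.

```python
# Illustrative sketch of the focus-display decision described above.
# "first" type: the focus is always displayed after the operation event.
# "second" type: whether the focus is displayed depends on the input
# source, identified by the operation event type - a KeyEvent comes from
# the remote control, a TouchEvent from a mobile phone or tablet.

def should_display_focus(interface_type: str, event_type: str) -> bool:
    """Return True if the focus should be displayed after an operation event."""
    if interface_type == "first":
        # First-type interface: display the focus regardless of input source.
        return True
    # Second-type interface: decide according to the input source.
    if event_type == "KeyEvent":      # input source is the remote control
        return True
    if event_type == "TouchEvent":    # input source is another electronic device
        return False
    raise ValueError(f"unknown event type: {event_type}")

# Example: the fig. 13 scenario - second-type interface, remote-control key.
print(should_display_focus("second", "KeyEvent"))    # True
print(should_display_focus("second", "TouchEvent"))  # False
```

Each scenario in figs. 6 to 13 is one evaluation of this function: the interface type is checked first, and the event type is consulted only for second-type interfaces.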
Fig. 14 is a schematic structural diagram of an embodiment of a control display device of the present application. As shown in fig. 14, the control display device 1400 may include: a receiving module 1410, a determining module 1420, and a display module 1430;
a receiving module 1410, configured to receive an operation event, where the operation event includes an operation event type;
the determining module 1420 is configured to determine the type of the display interface;
the display module 1430 is configured to display the focus if it is determined that the type of the display interface is the first type, and, if it is determined that the type of the display interface is the second type, to determine whether to display the focus according to the type of the operation event: if the input source corresponding to the operation event type is a remote control, the focus is displayed; if the input source corresponding to the operation event type is another electronic device, the focus is not displayed.
In one possible implementation manner, the display interface includes a current interface or a post-jump interface.
In one possible implementation manner, the display interface is a post-jump interface, and the display module 1430 is further configured to obtain a preset position of the focus, and display the focus at the preset position.
In one possible implementation manner, the display interface is a current interface, and the display module 1430 is further configured to determine a position of the focus according to the key information or the operation position information, and display the focus at the determined position.
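The two positioning rules above can be sketched together. This is a hypothetical illustration under assumed data shapes: the grid-cell representation of focus positions, the preset position value, and all names in the sketch are assumptions for the example, not the patent's API.

```python
# Illustrative sketch of the two focus-positioning rules described above.
# For a post-jump interface, the focus is displayed at a preset position;
# for the current interface, the position is determined from key
# information (remote-control direction keys) or from operation position
# information (the touch point). Positions are modeled as (row, col) cells.

PRESET_POSITION = (0, 0)  # e.g., the first control of the jumped-to page

def focus_position(display_interface, key=None, touch_point=None,
                   current_focus=(0, 0)):
    """Return the (row, col) cell at which the focus should be displayed."""
    if display_interface == "post_jump":
        # Rule 1: a jumped-to interface uses its preset focus position.
        return PRESET_POSITION
    # Rule 2: on the current interface, move the focus by the input.
    row, col = current_focus
    if key is not None:                        # remote-control direction key
        moves = {"up": (-1, 0), "down": (1, 0),
                 "left": (0, -1), "right": (0, 1)}
        dr, dc = moves[key]
        return (row + dr, col + dc)
    if touch_point is not None:                # touch coordinate -> control cell
        return touch_point
    return current_focus

print(focus_position("post_jump"))                            # (0, 0)
print(focus_position("current", key="right",
                     current_focus=(1, 1)))                   # (1, 2)
```

In a real implementation the touch coordinate would be hit-tested against the control layout rather than used directly; the sketch only shows which of the two rules applies in each case.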
In one possible implementation manner, the other electronic device includes a mobile phone, a watch, and/or a tablet computer.
In one possible implementation manner, the apparatus 1400 further includes: a transmitting module 1440;
the sending module 1440 is configured to send a display interface to another electronic device.
The control display device provided in the embodiment shown in fig. 14 may be used to implement the technical solutions of the method embodiments shown in fig. 1 to fig. 13 of the present application, and the implementation principles and technical effects may further refer to the related descriptions in the method embodiments.
It should be understood that the division of the control display device shown in fig. 14 into modules is only a logical division; in actual implementation, the modules may be wholly or partially integrated into one physical entity, or may be physically separated. These modules may all be implemented in the form of software invoked by a processing element, may all be implemented in hardware, or some may be implemented in software invoked by a processing element and others in hardware. For example, a module may be a separately disposed processing element, or may be integrated into a chip of the electronic device; the other modules are implemented similarly. In addition, all or some of the modules may be integrated together, or each may be implemented independently. In implementation, each step of the above method, or each of the above modules, may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the above modules may be one or more integrated circuits configured to implement the above methods, such as one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), or one or more field-programmable gate arrays (FPGAs). For another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SoC).
Fig. 15 is a schematic structural diagram of an embodiment of an electronic device 1500 according to the present application; the first electronic device may be the electronic device 1500. As shown in fig. 15, the electronic device 1500 may include a processor 1510, an external memory interface 1520, an internal memory 1521, a universal serial bus (USB) interface 1530, a charging management module 1540, a power management module 1541, a battery 1542, an antenna 1, a wireless communication module 1560, an audio module 1570, a speaker 1570A, a microphone 1570C, a headset interface 1570D, a sensor module 1580, keys 1590, a motor 1591, an indicator 1592, a camera 1593, a display screen 1594, and the like. The sensor module 1580 may include a temperature sensor 1580J, a touch sensor 1580K, an ambient light sensor 1580L, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation to the electronic device 1500. In other embodiments of the present application, electronic device 1500 may include more or fewer components than illustrated, or some components may be combined, or some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 1510 may include one or more processing units. For example, the processor 1510 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), among others. The different processing units may be independent devices or may be integrated into one or more processors.
A memory may also be provided in the processor 1510 for storing instructions and data. In some embodiments, the memory in the processor 1510 is a cache. The memory may hold instructions or data that the processor 1510 has just used or uses cyclically. If the processor 1510 needs to use the instructions or data again, it can call them directly from this memory, which avoids repeated accesses, reduces the waiting time of the processor 1510, and thus improves system efficiency.
In some embodiments, processor 1510 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 1510 may include multiple sets of I2C buses. The processor 1510 may be respectively coupled with the touch sensor 1580K, the charger, the flash, the camera 1593, and the like through different I2C bus interfaces. For example: the processor 1510 may be coupled to the touch sensor 1580K through an I2C interface, so that the processor 1510 and the touch sensor 1580K communicate through an I2C bus interface, thereby implementing a touch function of the electronic device 1500.
The I2S interface may be used for audio communication. In some embodiments, the processor 1510 may include multiple sets of I2S buses. The processor 1510 may be coupled to the audio module 1570 through an I2S bus to implement communication between the processor 1510 and the audio module 1570. In some embodiments, the audio module 1570 may transmit an audio signal to the wireless communication module 1560 through the I2S interface, to implement the function of answering a call while sharing the screen.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, audio module 1570 and wireless communication module 1560 may be coupled by a PCM bus interface. In some embodiments, the audio module 1570 may also transmit an audio signal to the wireless communication module 1560 through the PCM interface, so as to implement the function of keeping a call while sharing a screen. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 1510 with the wireless communication module 1560. For example: the processor 1510 communicates with the wireless communication module 1560 through a UART interface to implement the function of video coding data transmission.
The MIPI interface may be used to connect the processor 1510 to peripheral devices such as a display screen 1594 and a camera 1593. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, the processor 1510 and the camera 1593 communicate over a CSI interface to enable the capture functionality of the electronic device 1500. The processor 1510 and the display screen 1594 communicate via the DSI interface to implement the display functions of the electronic device 1500.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, GPIO interfaces may be used to connect the processor 1510 with the camera 1593, the display 1594, the wireless communication module 1560, the audio module 1570, the sensor module 1580, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 1530 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 1530 may be used to connect a charger to charge the electronic device 1500, and may also be used to transfer data between the electronic device 1500 and a peripheral device. It may also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as augmented reality (AR) devices.
It should be understood that the interface connection relationships between the modules illustrated in this embodiment of the present application are merely illustrative and do not constitute a limitation on the structure of the electronic device 1500. In other embodiments of the present application, the electronic device 1500 may also use an interface connection manner different from those in the above embodiments, or a combination of multiple interface connection manners.
The charge management module 1540 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 1540 may receive charging input for a wired charger through the USB interface 1530. In some wireless charging embodiments, the charging management module 1540 can receive wireless charging input through a wireless charging coil of the electronic device 1500. While the charging management module 1540 charges the battery 1542, the power management module 1541 may also be used to supply power to the electronic device.
The power management module 1541 is configured to connect the battery 1542, the charging management module 1540, and the processor 1510. The power management module 1541 receives input from the battery 1542 and/or the charge management module 1540 and provides power to the processor 1510, the internal memory 1521, the display 1594, the camera 1593, the wireless communication module 1560, and the like. Power management module 1541 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery state of health (leakage, impedance). In some other embodiments, the power management module 1541 may also be disposed in the processor 1510. In other embodiments, the power management module 1541 and the charging management module 1540 may be disposed in the same device.
The wireless communication function of the electronic apparatus 1500 may be realized by the antenna 1, the wireless communication module 1560, and the like.
The antenna 1 is used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 1500 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The wireless communication module 1560 may provide a solution for wireless communication applied to the electronic device 1500, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and so on. The wireless communication module 1560 may be one or more devices that integrate at least one communication processing module. The wireless communication module 1560 receives electromagnetic waves via the antenna 1, performs frequency modulation and filtering on the electromagnetic wave signal, and transmits the processed signal to the processor 1510. The wireless communication module 1560 may also receive signals to be transmitted from the processor 1510, modulate the frequency of the signals, amplify the signals, and convert the signals into electromagnetic waves via the antenna 1 for radiation.
In some embodiments, antenna 1 of electronic device 1500 and wireless communication module 1560 are coupled such that electronic device 1500 can communicate with networks and other devices over wireless communication technologies. The wireless communication technology may include BT, GNSS, WLAN, NFC, FM, and/or IR technology, among others. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 1500 implements display functionality via the GPU, the display screen 1594, and the application processor, among other things. The GPU is a microprocessor for image processing and is connected with a display screen 1594 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 1510 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 1594 is used to display images, videos, and the like. The display screen 1594 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 1500 may include 1 or N display screens 1594, where N is a positive integer greater than 1.
The electronic device 1500 may implement a capture function via the ISP, the camera 1593, the video codec, the GPU, the display 1594, the application processor, and the like.
The ISP is used to process the data fed back by the camera 1593. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 1593.
The camera 1593 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to be converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 1500 can include 1 or N cameras 1593, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 1500 is in frequency bin selection, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 1500 may support one or more video codecs, so that the electronic device 1500 can play or record videos in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, and MPEG-4.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 1500 can be implemented by the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 1520 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capability of the electronic device 1500. The external memory card communicates with the processor 1510 through the external memory interface 1520 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 1521 may be used to store computer-executable program code, which includes instructions. The internal memory 1521 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, phone book, etc.) created during use of the electronic device 1500, and the like. In addition, the internal memory 1521 may include a high-speed random access memory, and may also include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 1510 executes various functional applications and data processing of the electronic device 1500 by executing instructions stored in the internal memory 1521 and/or instructions stored in a memory provided in the processor.
Electronic device 1500 may implement audio functionality via audio module 1570, speaker 1570A, microphone 1570C, headset interface 1570D, and an application processor, among others. Such as music playing, recording, etc.
Audio module 1570 is configured to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. Audio module 1570 may also be used to encode and decode audio signals. In some embodiments, audio module 1570 may be disposed within processor 1510, or some functional modules of audio module 1570 may be disposed within processor 1510.
The speaker 1570A, also known as a "horn," is used to convert electrical audio signals into sound signals. The electronic apparatus 1500 can listen to music or listen to a hands-free call through the speaker 1570A.
The microphone 1570C, also called a "mic," is used to convert a sound signal into an electrical signal. When making a call or sending voice information, the user can input a sound signal into the microphone 1570C by speaking close to it. The electronic device 1500 may be provided with at least one microphone 1570C. In other embodiments, the electronic device 1500 may be provided with two microphones 1570C, which can implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the electronic device 1500 may be provided with three, four, or more microphones 1570C to collect sound signals, reduce noise, identify sound sources, implement a directional recording function, and the like.
The headset interface 1570D is used to connect a wired headset. The headset interface 1570D may be the USB interface 1530, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
The ambient light sensor 1580L is used to sense ambient light brightness. The electronic device 1500 may adaptively adjust the brightness of the display 1594 based on the perceived ambient light brightness. The ambient light sensor 1580L can also be used to automatically adjust white balance at the time of shooting.
The temperature sensor 1580J is used for detecting temperature. In some embodiments, the electronic device 1500 executes a temperature processing strategy using the temperature detected by the temperature sensor 1580J. For example, when the temperature reported by the temperature sensor 1580J exceeds a threshold, the electronic device 1500 performs a performance reduction on a processor located near the temperature sensor 1580J, so as to reduce power consumption and implement thermal protection. In other embodiments, electronic device 1500 heats battery 1542 when the temperature is below another threshold to avoid a low temperature causing abnormal shutdown of electronic device 1500. In other embodiments, when the temperature is below yet another threshold, electronic device 1500 performs a boost on the output voltage of battery 1542 to avoid an abnormal shutdown due to low temperatures.
The touch sensor 1580K is also referred to as a "touch panel". The touch sensor 1580K may be disposed on the display screen 1594, and the touch sensor 1580K and the display screen 1594 form a touch screen, also referred to as a "touchscreen". The touch sensor 1580K is used to detect a touch operation acting on or near it. The touch sensor may pass the detected touch operation to the application processor to determine the type of the touch event. Visual output related to the touch operation may be provided through the display screen 1594. In other embodiments, the touch sensor 1580K may be disposed on the surface of the electronic device 1500 at a position different from that of the display screen 1594.
The keys 1590 include a power key, a volume key, and the like. The keys 1590 may be mechanical keys or touch keys. The electronic device 1500 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 1500.
The motor 1591 may generate a vibration prompt. The motor 1591 may be used for incoming-call vibration prompts as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 1591 may also produce different vibration feedback effects for touch operations applied to different areas of the display screen 1594. Different application scenarios (such as time reminders, receiving messages, alarm clocks, games, etc.) may also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 1592 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The electronic device 1500 shown in fig. 15 may be an intelligent electronic device such as a television, an intelligent screen, a tablet computer, a notebook computer, or a PC, and the embodiment does not limit the form of the electronic device 1500. The electronic device 1500 may be configured to perform functions/steps in the methods provided in the embodiments of the present application, and specific reference may be made to the description in the embodiments of the methods of the present application, so that detailed description is appropriately omitted here to avoid redundancy.
Embodiments of the present application further provide a computer-readable storage medium, in which a computer program is stored, and when the computer program runs on a computer, the computer is caused to execute the method provided by the embodiments of the present application.
Embodiments of the present application also provide a computer program product, which includes a computer program, when the computer program runs on a computer, the computer is caused to execute the method provided by the embodiments of the present application.
The electronic device, the computer storage medium, or the computer program product provided in the embodiments of the present application are all configured to execute the corresponding method provided above, and therefore, the beneficial effects achieved by the electronic device, the computer storage medium, or the computer program product may refer to the beneficial effects in the corresponding method provided above, and are not described herein again.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes the association relationship between associated objects and indicates that three relationships may exist. For example, "A and/or B" may indicate three cases: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following" and similar expressions refer to any combination of the listed items, including any combination of single or plural items. For example, "at least one of a, b, and c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c, where a, b, and c may each be single or multiple.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of the two. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, any function, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer-readable storage medium. Based on such an understanding, the portion of the technical solution of the present application that substantially contributes to the prior art may be embodied in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above description covers only embodiments of the present application. Any changes or substitutions that a person skilled in the art could readily conceive within the technical scope of the present disclosure shall fall within the protection scope of the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (13)

1. A control display method, applied to an electronic device, comprising:
receiving an operation event, wherein the operation event comprises an operation event type;
determining a type of a display interface;
if the type of the display interface is a first type, displaying a focus; and
if the type of the display interface is a second type, determining whether to display the focus according to the operation event type, wherein:
if an input source corresponding to the operation event type is a remote controller, the focus is displayed; and
if the input source corresponding to the operation event type is another electronic device, the focus is not displayed.
2. The method of claim 1, wherein the display interface comprises a current interface or a post-jump interface.
3. The method of claim 2, wherein the display interface is a post-jump interface, and displaying the focus comprises:
acquiring a preset position of the focus, and displaying the focus at the preset position.
4. The method of claim 2, wherein the display interface is a current interface, and displaying the focus comprises:
determining a position of the focus according to key information or operation position information, and displaying the focus at the determined position.
5. The method according to any one of claims 1-4, wherein the other electronic device comprises a mobile phone, a watch, and/or a tablet computer.
6. The method according to any one of claims 1-4, further comprising:
sending the display interface to the other electronic device.
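The focus-display decision recited in method claims 1-6 can be sketched as follows. This is a hypothetical illustration rather than code from the patent: the names `InterfaceType`, `InputSource`, and `should_display_focus` are invented for the sketch, and which concrete interfaces count as the "first type" versus the "second type" is an assumption.

```python
from enum import Enum


class InterfaceType(Enum):
    FIRST = 1   # assumption: e.g. a menu/grid interface that always shows a focus
    SECOND = 2  # assumption: e.g. an interface where focus display is conditional


class InputSource(Enum):
    REMOTE = "remote_controller"
    OTHER_DEVICE = "other_device"  # e.g. a mobile phone, watch, or tablet


def should_display_focus(interface_type: InterfaceType,
                         input_source: InputSource) -> bool:
    """Decide whether the focus control should be drawn.

    First-type interfaces always display the focus; second-type interfaces
    display it only when the operation event originated from a remote controller.
    """
    if interface_type is InterfaceType.FIRST:
        return True
    # Second type: decide by the input source of the operation event.
    return input_source is InputSource.REMOTE
```

Under this sketch, a tap relayed from a phone onto a second-type interface would leave the focus hidden, while a remote-controller key press on the same interface would display it.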
7. An electronic device, comprising:
a display screen; one or more processors; a memory; a plurality of application programs; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the electronic device, cause the electronic device to perform the steps of:
receiving an operation event, wherein the operation event comprises an operation event type;
determining a type of a display interface;
if the type of the display interface is a first type, displaying a focus; and
if the type of the display interface is a second type, determining whether to display the focus according to the operation event type, wherein:
if an input source corresponding to the operation event type is a remote controller, the focus is displayed; and
if the input source corresponding to the operation event type is another electronic device, the focus is not displayed.
8. The electronic device of claim 7, wherein the display interface comprises a current interface or a post-jump interface.
9. The electronic device of claim 8, wherein the display interface is a post-jump interface, and wherein the instructions, when executed by the electronic device, cause the electronic device to perform the step of displaying the focus by:
acquiring a preset position of the focus, and displaying the focus at the preset position.
10. The electronic device of claim 8, wherein the display interface is a current interface, and wherein the instructions, when executed by the electronic device, cause the electronic device to perform the step of displaying the focus by:
determining a position of the focus according to key information or operation position information, and displaying the focus at the determined position.
11. The electronic device according to any one of claims 7-10, wherein the other electronic device comprises a mobile phone, a watch, and/or a tablet computer.
12. The electronic device of any one of claims 7-10, wherein the instructions, when executed by the electronic device, cause the electronic device to further perform the step of:
sending the display interface to the other electronic device.
13. A computer-readable storage medium, in which a computer program is stored which, when run on a computer, causes the computer to carry out the method according to any one of claims 1 to 6.
CN202010826249.3A 2020-04-07 2020-08-17 Control display method and electronic equipment Active CN113497851B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010262715 2020-04-07
CN202010262715X 2020-04-07

Publications (2)

Publication Number Publication Date
CN113497851A CN113497851A (en) 2021-10-12
CN113497851B true CN113497851B (en) 2022-07-19

Family

ID=77994962

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010826249.3A Active CN113497851B (en) 2020-04-07 2020-08-17 Control display method and electronic equipment

Country Status (1)

Country Link
CN (1) CN113497851B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114089906A (en) * 2021-11-08 2022-02-25 百度在线网络技术(北京)有限公司 Intelligent mirror control method, device, equipment, storage medium and intelligent mirror
CN116266871A (en) * 2021-12-17 2023-06-20 广州迈聆信息科技有限公司 Control mode switching method and device of Android device, storage medium and electronic device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102595224A (en) * 2011-11-01 2012-07-18 华为技术有限公司 Remote control method, remote controller, remote control response method and set top box
CN108600796A (en) * 2018-03-09 2018-09-28 百度在线网络技术(北京)有限公司 Control mode switch method, equipment and the computer-readable medium of smart television
CN110704146A (en) * 2019-08-30 2020-01-17 华为技术有限公司 Focus management method applied to electronic equipment and electronic equipment


Also Published As

Publication number Publication date
CN113497851A (en) 2021-10-12

Similar Documents

Publication Publication Date Title
CN110072070B (en) Multi-channel video recording method, equipment and medium
CN113810601B (en) Terminal image processing method and device and terminal equipment
CN111526407B (en) Screen content display method and device
CN114489533A (en) Screen projection method and device, electronic equipment and computer readable storage medium
CN112954251B (en) Video processing method, video processing device, storage medium and electronic equipment
CN113810600A (en) Terminal image processing method and device and terminal equipment
CN113497851B (en) Control display method and electronic equipment
CN114205336A (en) Cross-device audio playing method, mobile terminal, electronic device and storage medium
CN113593567B (en) Method for converting video and sound into text and related equipment
CN111492678B (en) File transmission method and electronic equipment
CN114500901A (en) Double-scene video recording method and device and electronic equipment
CN114257920A (en) Audio playing method and system and electronic equipment
CN114339429A (en) Audio and video playing control method, electronic equipment and storage medium
CN109285563B (en) Voice data processing method and device in online translation process
WO2022095752A1 (en) Frame demultiplexing method, electronic device and storage medium
CN112637481B (en) Image scaling method and device
CN113596320B (en) Video shooting variable speed recording method, device and storage medium
CN111885768B (en) Method, electronic device and system for adjusting light source
CN113963732B (en) Audio playing method and terminal equipment
CN113923528B (en) Screen sharing method, terminal and storage medium
CN111294905B (en) Image processing method, image processing device, storage medium and electronic apparatus
CN115696067B (en) Image processing method for terminal, terminal device and computer readable storage medium
CN113364067B (en) Charging precision calibration method and electronic equipment
WO2023020420A1 (en) Volume display method, electronic device, and storage medium
WO2022105670A1 (en) Display method and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant