CN111221602A - Interface display method and electronic equipment

Info

Publication number
CN111221602A
Authority
CN
China
Prior art keywords: information, scene, electronic device, sub, object corresponding
Legal status: Pending
Application number
CN201911039607.XA
Other languages
Chinese (zh)
Inventor
陈曦 (Chen Xi)
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201911039607.XA
Publication of CN111221602A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00Arrangements for program control, e.g. control units
    • G06F9/06Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44Arrangements for executing specific programs
    • G06F9/451Execution arrangements for user interfaces

Abstract

The embodiment of the invention discloses an interface display method and an electronic device, relates to the field of communication technology, and can solve the problem that user operation is cumbersome and time-consuming when information related to an external device is displayed. The method comprises the following steps: acquiring first scene information while connected to a target external device, wherein the first scene information comprises feature information indicating a current usage scene of the electronic device; and displaying a first object corresponding to the target external device based on the first scene information. The embodiment of the invention is applied to the process in which the electronic device displays a corresponding object according to the current application scene.

Description

Interface display method and electronic equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to an interface display method and electronic equipment.
Background
Generally, after a user connects an external device (e.g., a Bluetooth sports bracelet or a smart watch) to an electronic device, the user must search for the application icon corresponding to the external device among the many application icons displayed on the desktop of the electronic device, select that icon to trigger the electronic device to display the interface of the application it indicates, and then perform further input on that interface before the electronic device displays information related to the external device.
However, in the above method, the user needs to search through the application icons one by one to find the icon corresponding to the external device, and then perform input multiple times, before the electronic device can display the information related to the external device. As a result, the user operation is cumbersome and time-consuming in the process of displaying information related to the external device.
Disclosure of Invention
The embodiment of the invention provides an interface display method and an electronic device, which can solve the problem that user operation is cumbersome and time-consuming in the process of displaying information related to an external device.
In order to solve the technical problem, the embodiment of the invention adopts the following technical scheme:
In a first aspect of the embodiments of the present invention, an interface display method applied to an electronic device is provided, including: acquiring first scene information while connected to a target external device, wherein the first scene information comprises feature information indicating a current usage scene of the electronic device; and displaying a first object corresponding to the target external device based on the first scene information.
In a second aspect of the embodiments of the present invention, an electronic device is provided, including an acquisition module and a display module. The acquisition module is used for acquiring first scene information while the electronic device is connected to a target external device, wherein the first scene information comprises feature information indicating a current usage scene of the electronic device. The display module is used for displaying a first object corresponding to the target external device based on the first scene information acquired by the acquisition module.
In a third aspect of the embodiments of the present invention, an electronic device is provided, where the electronic device includes a processor, a memory, and a computer program stored in the memory and being executable on the processor, and the computer program, when executed by the processor, implements the steps of the interface display method according to the first aspect.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the interface display method according to the first aspect.
In the embodiment of the invention, when the electronic device is connected to the target external device, the electronic device can display the first object corresponding to the target external device according to the acquired first scene information. The electronic device can determine its current usage scene based on the connected target external device and adaptively display the first object corresponding to that scene without requiring multiple user inputs, thereby simplifying the user's operation and saving time.
Drawings
Fig. 1 is a schematic structural diagram of an android operating system according to an embodiment of the present invention;
fig. 2 is a schematic diagram of an interface display method according to an embodiment of the present invention;
fig. 3 is a schematic diagram of an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 4 is a second schematic diagram of an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 5 is a third schematic diagram of an example of an interface of a mobile phone according to the embodiment of the present invention;
fig. 6 is a second schematic diagram of an interface display method according to an embodiment of the present invention;
fig. 7 is a fourth schematic view of an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 8 is a fifth schematic view of an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 9 is a sixth schematic view of an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 10 is a seventh schematic diagram of an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 11 is a third schematic view illustrating an interface display method according to an embodiment of the present invention;
fig. 12 is an eighth schematic diagram of an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 13 is a ninth schematic diagram illustrating an example of an interface of a mobile phone according to an embodiment of the present invention;
fig. 14 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 15 is a second schematic structural diagram of an electronic device according to an embodiment of the invention;
fig. 16 is a third schematic structural diagram of an electronic device according to an embodiment of the invention;
fig. 17 is a hardware schematic diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first" and "second," and the like, in the description and in the claims of embodiments of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first object and the second object, etc. are for distinguishing different objects, not for describing a particular order of the objects.
In the description of the embodiments of the present invention, the meaning of "a plurality" means two or more unless otherwise specified. For example, a plurality of elements refers to two elements or more.
The term "and/or" herein is an association relationship describing an associated object, and means that there may be three relationships, for example, a display panel and/or a backlight, which may mean: there are three cases of a display panel alone, a display panel and a backlight at the same time, and a backlight alone. The symbol "/" herein denotes a relationship in which the associated object is or, for example, input/output denotes input or output.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to mean serving as examples, illustrations or descriptions. Any embodiment or design described as "exemplary" or "e.g.," an embodiment of the present invention is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
The embodiment of the invention provides an interface display method and an electronic device. The electronic device can determine its current usage scene based on the connected target external device and adaptively display the first object corresponding to that scene without requiring multiple user inputs, thereby simplifying the user's operation and saving time.
The interface display method and the electronic device provided by the embodiments of the present invention can be applied to the process in which the electronic device displays an interface adapted to its usage scene; specifically, to the process in which the electronic device displays a corresponding interface according to the current application scene.
The electronic device in the embodiment of the present invention may be an electronic device having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present invention.
The following describes a software environment to which the interface display method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the interface display method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the interface display method may operate based on the android operating system shown in fig. 1. Namely, the processor or the electronic device can implement the interface display method provided by the embodiment of the invention by running the software program in the android operating system.
The electronic device in the embodiment of the invention can be a mobile terminal device and can also be a non-mobile terminal device. For example, the mobile terminal device may be a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile terminal device may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, and the like, and the embodiment of the present invention is not particularly limited.
An interface display method and an electronic device provided by the embodiments of the present invention are described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Fig. 2 shows a flowchart of an interface display method provided in an embodiment of the present invention, and the method can be applied to an electronic device having an android operating system shown in fig. 1. As shown in fig. 2, the interface display method provided by the embodiment of the present invention may include steps 201 and 202 described below.
Step 201, in a state that the electronic device is connected with the target external device, the electronic device acquires first scene information.
In an embodiment of the present invention, the first scenario information includes feature information indicating a current usage scenario of the electronic device.
Optionally, in the embodiment of the present invention, when the electronic device is connected to the target external device, the electronic device may obtain type information of the target external device, and determine a current usage scenario of the electronic device according to the type information.
Optionally, in this embodiment of the present invention, the usage scenario may include at least one of the following: an outdoor scene, an indoor scene, and a scene in which at least two external devices are connected.
Optionally, in this embodiment of the present invention, the feature information may include at least one of the following: the type information of the target external equipment, the position information of the electronic equipment and the system time information of the electronic equipment.
It should be noted that, the usage scenario of the electronic device may be understood as: the environment in which the electronic device is located, an application program run by the electronic device, or a function performed by the electronic device.
Optionally, in the embodiment of the present invention, the electronic device may pre-store an association relationship between the identifier of at least one external device and the type information of that external device, that is, the identifier of one external device corresponds to the type information of one external device. When the electronic device is connected to the target external device, the electronic device can acquire the identifier of the target external device, look up the type information corresponding to that identifier among the type information of the at least one external device, and determine the current usage scene of the electronic device according to the type information found.
Optionally, in the embodiment of the present invention, the association relationship may be set in the electronic device by the user in advance, or may be obtained by the electronic device according to a historical connection record.
Optionally, in the embodiment of the present invention, the identifier of the external device may be a name of the external device.
Optionally, in the embodiment of the present invention, the target external device may be one external device or multiple external devices.
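As an illustration of the pre-stored association described above, the following Kotlin sketch maps device identifiers (here, device names) to type information and looks up the type of a newly connected device; the identifiers, type names, and method names are assumptions used only for illustration, not taken from the patent, and the mapping could equally be learned from the historical connection record.

    // Hypothetical sketch of the pre-stored association between external-device identifiers
    // and type information; all names here are illustrative assumptions.
    enum class DeviceType { SMART_BAND, SMART_WATCH, SMART_LAMP, BLUETOOTH_HEADSET, SMART_SPEAKER, SMART_TV, SMART_PHONE, ROUTER }

    class DeviceTypeRegistry {
        // One external-device identifier corresponds to the type information of one external device.
        private val typeByIdentifier = mutableMapOf(
            "Bluetooth Sports Band" to DeviceType.SMART_BAND,
            "Living Room TV" to DeviceType.SMART_TV
        )

        // The association may also be learned from the historical connection record.
        fun recordConnection(identifier: String, type: DeviceType) {
            typeByIdentifier[identifier] = type
        }

        // On connection, look up the type corresponding to the connected device's identifier.
        fun lookUp(identifier: String): DeviceType? = typeByIdentifier[identifier]
    }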
Optionally, in the embodiment of the present invention, the usage scenario may be an outdoor usage scenario, an indoor usage scenario, or an external device control usage scenario.
Optionally, in this embodiment of the present invention, the first scenario information may include first sub information, where the first sub information is used to indicate a type of the target external device.
Optionally, in this embodiment of the present invention, the type of the external device may include: smart bracelet type, smart watch type, smart light type, bluetooth headset type, smart speaker type, smart television type, smart phone type, router type, and the like.
Optionally, in this embodiment of the present invention, the first scene information may include first sub information and second sub information, where the first sub information is used for indicating the type of the target external device, and the second sub information includes at least one of the following items: first position information and first time information. The first position information is used for indicating the current position of the electronic device, and the first time information is used for indicating the current system time of the electronic device.
Optionally, in the embodiment of the present invention, the electronic device may first acquire the first sub information, and determine the second sub information according to the first sub information.
Optionally, in the embodiment of the present invention, if the electronic device determines that the type of the target external device (for example, a bluetooth sports bracelet) is the smart bracelet type according to the acquired first sub-information, the electronic device may acquire the first location information, that is, the first scenario information includes the target type information and the first location information.
Exemplarily, if the type of the target external device is an intelligent bracelet type and the first location information is an outdoor location, the first scene information is used for indicating that a current usage scene of the electronic device is an outdoor sports scene; if the type of the target external device is an intelligent bracelet type and the first position information is an indoor position, the first scene information is used for indicating that the current use scene of the electronic device is an indoor motion scene.
It should be noted that, the method for determining the first position information for the electronic device will be described in the following embodiments, which are not described herein again.
Optionally, in the embodiment of the present invention, if the electronic device determines that the type of the target external device is the type of the intelligent watch or the type of the intelligent lamp according to the obtained first sub-information, the electronic device may obtain the first time information, that is, the first scene information includes the first type information and the first time information.
Illustratively, if the type of the target external device is an intelligent watch type and the first time information is sleep time, the first scene information is used for indicating that the current use scene of the electronic device is a night sleep scene; if the type of the target external device is the intelligent watch type and the first time information is non-sleep time, the first scene information is used for indicating that the current use scene of the electronic device is a daytime daily scene.
It should be noted that, the method for determining the first time information for the electronic device will be described in the following embodiments, which are not described herein again.
Optionally, in this embodiment of the present invention, the target external device includes at least one external device, the first scene information includes at least one first type information, and each first type information is used to indicate a type of one external device.
Optionally, in this embodiment of the present invention, if the target external device includes two external devices, and at least one piece of first type information includes two pieces of first type information, the electronic device determines, according to the one piece of first type information, that the type of one external device is a smart television type or a smart phone type, and determines, according to the other piece of first type information, that the type of the other external device is a router type.
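A condensed Kotlin sketch of this determination, following the examples above, might look as follows; the scene names, the rule for the two-device case, and the boolean inputs are assumptions for illustration rather than the patent's exact logic.

    // Hypothetical scene determination from the first sub information (device types) and the
    // second sub information (location or system time); names and rules are assumptions.
    enum class DeviceType { SMART_BAND, SMART_WATCH, SMART_LAMP, SMART_TV, ROUTER }  // trimmed version of the earlier sketch
    enum class UsageScene { OUTDOOR_SPORTS, INDOOR_SPORTS, NIGHT_SLEEP, DAYTIME_DAILY, DEVICE_CONTROL }

    fun determineScene(types: Set<DeviceType>, isOutdoor: Boolean?, isSleepTime: Boolean?): UsageScene = when {
        DeviceType.SMART_TV in types && DeviceType.ROUTER in types -> UsageScene.DEVICE_CONTROL
        DeviceType.SMART_BAND in types && isOutdoor == true -> UsageScene.OUTDOOR_SPORTS
        DeviceType.SMART_BAND in types -> UsageScene.INDOOR_SPORTS
        (DeviceType.SMART_WATCH in types || DeviceType.SMART_LAMP in types) && isSleepTime == true -> UsageScene.NIGHT_SLEEP
        else -> UsageScene.DAYTIME_DAILY
    }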
Optionally, in the embodiment of the present invention, the electronic device may establish a connection with the target external device in a bluetooth mode, a Wireless-Fidelity (WiFi) mode, a wired connection mode, or the like.
Step 202, the electronic device displays a first object corresponding to the target external device based on the first scene information.
Optionally, in the embodiment of the present invention, the electronic device may display a target interface according to the first scene information, where the target interface includes a first object corresponding to the target external device.
It should be noted that the target interface may be a current interface of the electronic device.
Optionally, in the embodiment of the present invention, the electronic device may display a first interface according to the first scene information, where the first interface includes a first object corresponding to the target external device, and the first interface may be different from the target interface.
It is understood that the electronic device may generate a first object corresponding to the first scene information according to the first scene information and display the first object.
Optionally, in an embodiment of the present invention, the first object includes at least one of: textual content, controls, identification of the application, and linking information.
Optionally, in the embodiment of the present invention, the text content may be text content generated by the electronic device according to information acquired from the target external device.
Optionally, in the embodiment of the present invention, the control may be a control used for a user to trigger the electronic device to control the target external device to execute a corresponding action, or a text control used for displaying text content.
Optionally, in this embodiment of the present invention, the identifier of the application program may be an icon of the application program.
Optionally, in this embodiment of the present invention, the link information may be used for a user to trigger the electronic device to display a corresponding interface.
Optionally, in this embodiment of the present invention, if the first scene information includes first sub information and first location information, the electronic device may generate a first object corresponding to the first sub information and the first location information, where the first object may include text content, a control, an identifier of an application, and link information.
The electronic device is taken as a mobile phone for illustration. As shown in fig. 3, if the mobile phone determines that the type of the target external device is an intelligent bracelet type and the acquired first location information is an outdoor location, the mobile phone may display a target interface 10, where the target interface 10 includes a first object (e.g., text content, a control, an identifier of an application program, and link information) corresponding to a bluetooth motion bracelet; the text content is generated by the mobile phone according to the information acquired from the bluetooth sports bracelet (for example, "current temperature: 25 ℃", "air quality: good", "weather condition: clear", and the like), the control is a text control (for example, a step counting control, a weather control, a heart rate control, and the like), the identifier of the application is an icon of the application corresponding to the bluetooth sports bracelet (for example, an icon of the step counting application, an icon of the heart rate monitoring application, and the like), and the link information is link information corresponding to the bluetooth sports bracelet (for example, "running path planning" link information).
It can be understood that the electronic device may determine a current usage scenario of the user according to different target external devices connected to the electronic device, and display different objects according to the usage scenario, so that the user may quickly use a function that the user wants to use when in different usage scenarios.
Optionally, in this embodiment of the present invention, if the first scene information includes first sub information and first time information, the electronic device may generate a first object corresponding to the first sub information and the first time information, where the first object may include text content, a control, and an identifier of an application program.
For example, as shown in fig. 4, if the mobile phone determines that the type of the target external device is an intelligent watch type and an intelligent lamp type, and the obtained first time information is sleep time, the mobile phone may display a target interface 11, where the target interface 11 includes first objects (such as text, controls, and identifiers of application programs) corresponding to an intelligent watch and an intelligent desk lamp; the text content is generated by the mobile phone according to information acquired from the smart watch and the smart desk lamp (such as 'yesterday sleep condition', 'deep sleep: 4 h', 'light sleep: 3 h' and the like), the controls are text controls (such as a sleep condition control, a desk lamp control, an alarm clock setting control and the like) and control controls (such as a 'setting timing off' control, a 'setting timing on' control and the like), and the identification of the application program is an icon of the application program corresponding to the smart watch and the smart desk lamp (such as an icon of a sleep-aid application program, an icon of a sleep-monitor application program and the like).
It can be understood that, by introducing time as an additional determination dimension, the current usage scenario of the user can be determined more accurately.
Optionally, in this embodiment of the present invention, if the target external device includes two external devices, and at least one piece of first type information includes two pieces of first type information, the electronic device may generate a first object corresponding to the two pieces of first type information, where the first object may include text content, a control, an identifier of an application, and link information.
For example, as shown in fig. 5, if the mobile phone determines that the type of the target external device is a smart television type and a router type, the mobile phone may display a target interface 12, where the target interface 12 includes a first object (for example, text content, a control, and an identifier of an application) corresponding to the smart television and the router; the text content is generated by the mobile phone according to information obtained from the smart television and the router (for example, "recent hot video recommendation:" text content, "recently watched by the television:" text content and the like), the controls are text controls and control controls (for example, "volume" controls, "channel" controls, "set migration" controls and the like), the application program identifier is an application program icon corresponding to the smart television and the router (for example, an icon of an A video application program, an icon of an A live application program, a "set" application program icon and the like), and the link information is link information corresponding to the smart television and the router (for example, "A hot video" link information, "B hot video" link information and the like).
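The examples above all assemble the same four kinds of content. A minimal Kotlin data structure for such a first object might look as follows; the field names and example values are assumptions chosen only to mirror the Bluetooth sports bracelet example of fig. 3.

    // Hypothetical structure of a "first object": text content, controls, application
    // identifiers (icons), and link information gathered for the target interface.
    data class FirstObject(
        val textContent: List<String>,
        val controls: List<String>,
        val applicationIcons: List<String>,
        val linkInformation: List<String>
    )

    // Example values echoing the outdoor sports scene of fig. 3 (illustrative only).
    val outdoorSportsObject = FirstObject(
        textContent = listOf("Current temperature: 25 ℃", "Air quality: good", "Weather condition: clear"),
        controls = listOf("Step counting control", "Weather control", "Heart rate control"),
        applicationIcons = listOf("Step counting application", "Heart rate monitoring application"),
        linkInformation = listOf("Running path planning")
    )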
The embodiment of the invention provides an interface display method. When an electronic device is connected to a target external device, the electronic device can display a first object corresponding to the target external device according to the acquired first scene information. The electronic device can determine its current usage scene based on the connected target external device and adaptively display the first object corresponding to that scene without requiring multiple user inputs, thereby simplifying the user's operation and saving time.
Optionally, in this embodiment of the present invention, the first scene information includes first sub information and second sub information. Referring to fig. 2, as shown in fig. 6, the step 202 can be implemented by the following steps 202a to 202c.
Step 202a, the electronic device determines a current first usage scenario of the electronic device according to the first sub information and the second sub information.
Optionally, in this embodiment of the present invention, the first usage scenario may include any one of the following: an outdoor scene, an indoor scene, and a scene in which at least two external devices are connected.
Step 202b, the electronic device determines a first object corresponding to the first usage scenario and the target external device.
Optionally, in this embodiment of the present invention, the electronic device may determine the first sub-object according to the first sub-information, then determine the second sub-object according to the second sub-information, and determine the first object corresponding to the first usage scenario and the target external device based on the first sub-object and the second sub-object.
It is to be understood that the first object includes a first sub-object and a second sub-object.
Optionally, the step 202b may be specifically realized by the following steps 202b1, 202b2, or 202b3.
In step 202b1, in the case that the first usage scenario is an outdoor scenario, the electronic device determines an object corresponding to a preset outdoor event as the first object.
Optionally, in an embodiment of the present invention, the preset outdoor event may be an outdoor sport event.
Optionally, in this embodiment of the present invention, when the first usage scene is an outdoor scene, the electronic device may generate a first sub-object corresponding to the first sub-information, then generate a second sub-object corresponding to the second sub-information, and then determine the first object based on the first sub-object and the second sub-object.
Optionally, in this embodiment of the present invention, if the electronic device determines that the type of the target external device is the smart band type according to the acquired first sub-information, the electronic device generates a first sub-object corresponding to the first sub-information, where the first sub-object may include at least one of the following: textual content, controls, identification of the application, and linking information.
It is understood that the first sub-object and the second sub-object are objects corresponding to a preset outdoor event.
Optionally, in an embodiment of the present invention, the second sub-object includes at least one of: textual content, controls, identification of the application, and linking information.
Optionally, in the embodiment of the present invention, if the second sub-information matches with the first preset position information in the electronic device, the electronic device determines the third sub-object as the second sub-object.
In an embodiment of the present invention, the third sub-object is an object corresponding to the first sub-information and the first preset position information.
Optionally, in an embodiment of the present invention, the third sub-object includes at least one of the following items: textual content, controls, identification of the application, and linking information.
For example, as shown in fig. 7, if the mobile phone determines that the type of the target external device is a smart band type, the mobile phone determines a first sub-object corresponding to the Bluetooth sports band, where the first sub-object may include text content (e.g., "number of steps today: 11860 steps", "distance: 5 km", "consumption: 125 kcal", etc.), a text control (e.g., a step counting control), an identifier of an application corresponding to the Bluetooth sports band (e.g., an icon of a step counting application, an icon of sports application A, etc.), and link information corresponding to the Bluetooth sports band (e.g., "view number of steps nearby" link information).
Optionally, in an embodiment of the present invention, the second sub information includes first position information, and if the first position information matches the first preset position information, the electronic device determines a third sub object corresponding to the first sub information and the first preset position information as the second sub object.
It should be noted that, the above "the first position information matches the first preset position information" may be understood as: the difference between the first position information and the first preset position information is less than or equal to a preset threshold.
Optionally, in the embodiment of the present invention, the electronic device may obtain the first location information of the electronic device through a Global Positioning System (GPS), or obtain the first location information of the electronic device according to cell information of a network in which the electronic device is registered. The method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
For example, as shown in fig. 8, if the second sub-information (e.g., the first location information) matches the first preset location information in the mobile phone (i.e., a difference between the first location information and the first preset location information is less than or equal to a preset threshold), the third sub-object (e.g., the text content, the control, the identifier of the application, and the link information) is determined as the second sub-object (i.e., the object corresponding to the preset outdoor event (outdoor sport event)); the text content is text content (for example, "current temperature: 25 ℃", "air quality: good", "weather condition: fine") generated by the mobile phone according to information acquired from the bluetooth motion bracelet, the control is a control corresponding to the bluetooth motion bracelet (for example, a text control for displaying the text content), the identifier of the application is an identifier of an application corresponding to the bluetooth motion bracelet (for example, an icon of a weather application, an icon of an a motion application, and the like), and the link information is link information corresponding to the bluetooth motion bracelet (for example, "running path planning" link information).
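A small Kotlin sketch of this position match might look as follows; the distance approximation and the threshold value are assumptions, since the patent only requires that the difference between the two positions be less than or equal to a preset threshold.

    import kotlin.math.PI
    import kotlin.math.cos
    import kotlin.math.sqrt

    // Hypothetical position match: the first position information matches the preset position
    // information when their distance is at most a preset threshold.
    data class Position(val latitude: Double, val longitude: Double)

    fun matchesPresetPosition(current: Position, preset: Position, thresholdMeters: Double = 200.0): Boolean {
        val metersPerDegree = 111_320.0                      // rough equirectangular approximation
        val dLat = (current.latitude - preset.latitude) * metersPerDegree
        val dLon = (current.longitude - preset.longitude) * metersPerDegree * cos(preset.latitude * PI / 180)
        return sqrt(dLat * dLat + dLon * dLon) <= thresholdMeters
    }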
Optionally, in the embodiment of the present invention, if the second sub information matches with second preset position information in the electronic device, the electronic device determines the fourth sub object as the second sub object.
In an embodiment of the present invention, the fourth sub-object is an object corresponding to the first sub-information and the second preset position information.
Optionally, in an embodiment of the present invention, the fourth sub-object includes at least one of the following items: textual content, controls, identification of the application, and linking information.
Optionally, in this embodiment of the present invention, the second sub-information includes first position information, and if the first position information matches with the second preset position information, the electronic device determines the fourth sub-object as the second sub-object.
It should be noted that, the above "the first position information matches the second preset position information" may be understood as: the difference between the first location information and the second predetermined location information is less than or equal to a predetermined threshold, and the second predetermined location information may be location information other than the location of the first predetermined information.
Optionally, in the embodiment of the present invention, if the electronic device determines that the type of the target external device is the smart band type according to the acquired first sub-information, and determines that the first location information is matched with the second preset location information according to the second sub-information, the electronic device may determine an object corresponding to a preset outdoor event (for example, an outdoor sport event) as the first object.
Step 202b2, in the case that the first usage scenario is an indoor scenario, the electronic device determines an object corresponding to a preset indoor event as the first object.
Optionally, in this embodiment of the present invention, when the first usage scene is an indoor scene, the electronic device may generate a first sub-object corresponding to the first sub-information, then generate a second sub-object corresponding to the second sub-information, and then determine the first object based on the first sub-object and the second sub-object.
Optionally, in an embodiment of the present invention, the preset indoor event may include at least one of the following: indoor sporting events, indoor sleep events, and other events indoors, among others.
It is understood that the first sub-object and the second sub-object are objects corresponding to a preset indoor event.
Optionally, in this embodiment of the present invention, if the electronic device determines that the type of the target external device is an intelligent watch type or an intelligent lamp type, the electronic device generates a first sub-object corresponding to the target type information, where the first sub-object may include at least one of the following: identification of controls and applications.
For example, as shown in fig. 9, if the mobile phone determines that the type of the target external device is an intelligent watch type and an intelligent lamp type, the mobile phone generates a first sub-object corresponding to the intelligent watch and the intelligent table lamp, where the first sub-object may include a control (e.g., "set timing off" control, "set timing on" control, etc.), and the identifier of the application is an icon of an application corresponding to the intelligent watch and the intelligent table lamp (e.g., an icon of a light music application, an indoor environment application icon, etc.).
Optionally, in an embodiment of the present invention, the second sub information includes first time information, and if the first time information matches the first preset time period information, the electronic device determines a third sub object, which is associated with the first sub information and the first preset time period information, as the second sub object.
It should be noted that, the above "matching the first time information with the first preset time period information" may be understood as: the time indicated by the first time information is within the time period indicated by the first preset time period information.
Optionally, in the embodiment of the present invention, if the electronic device determines, according to the obtained first sub-information, that the type of the target external device is an intelligent watch type and an intelligent lamp type, and determines, according to the second sub-information, that the first time information matches the first preset time period information, the electronic device may determine, as the first object, an object corresponding to a preset indoor event (for example, an indoor sleep event).
For example, as shown in fig. 10, if the second sub-information (e.g., the first time information, such as 23:00) matches the first preset time period information in the mobile phone (e.g., 22:00-8:00), that is, the time indicated by the first time information is within the time period indicated by the first preset time period information, the mobile phone determines a third sub-object as the second sub-object (i.e., the object corresponding to the preset indoor event, here an indoor sleep event). The text content of the third sub-object is generated by the mobile phone according to information acquired from the smart watch and the smart desk lamp (e.g., "yesterday sleep condition", "deep sleep: 4 h", "light sleep: 3 h", etc.), the control is a text control, and the identifier of the application is an identifier of an application corresponding to the smart watch and the smart desk lamp (e.g., an icon of a sleep-aid application, an icon of a sleep monitoring application, etc.).
Optionally, in the embodiment of the present invention, if the first time information matches the second preset time period information, the electronic device determines the fourth sub-object as the second sub-object.
It should be noted that, the above "matching the first time information with the second preset time period information" may be understood as: the time indicated by the first time information is within the time period indicated by the second preset time period information, and the second preset time period information and the first preset time period information may be different time period information.
Step 202b3, in the case that the first usage scenario is a scenario in which at least two external devices are connected, the electronic device determines an object corresponding to the external device control event as the first object.
Optionally, in this embodiment of the present invention, when the first usage scenario is a scenario in which at least two external devices are connected, the electronic device may first generate a fifth sub-object corresponding to at least one piece of first type information, and then determine the first object based on the fifth sub-object.
Optionally, in the embodiment of the present invention, the external device control event may be an intelligent home control event.
Optionally, in the embodiment of the present invention, if the electronic device determines that the type of the target external device is at least two external devices according to the obtained first sub-information, the electronic device may determine an object corresponding to an external device control event (for example, an intelligent home control event) as the first object.
It is to be understood that the first object includes a fifth sub-object, which is an object corresponding to an external device control event.
Optionally, in this embodiment of the present invention, if the target external device includes two external devices, and at least one piece of first type information includes two pieces of first type information, the electronic device may generate a fifth sub-object corresponding to the two pieces of first type information, where the fifth sub-object may include text content, a control, an identifier of an application, and link information.
It can be understood that, in the embodiment of the present invention, the electronic device may determine the user's current scene from the combination of target external device types and provide different desktop contents for the user. As the variety of peripherals connected to the electronic device increases, this method can locate the user's current usage scene more accurately and give intelligent recommendations.
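Steps 202b1 to 202b3 amount to a simple dispatch on the determined usage scene. A condensed Kotlin sketch is shown below; the scene names and the returned descriptions are placeholders standing in for the preset outdoor event, the preset indoor event, and the external device control event, not the patent's concrete objects.

    // Hypothetical dispatch from the first usage scene to the object chosen as the first object.
    enum class FirstUsageScene { OUTDOOR, INDOOR, AT_LEAST_TWO_DEVICES }

    fun selectFirstObject(scene: FirstUsageScene): String = when (scene) {
        FirstUsageScene.OUTDOOR -> "object corresponding to the preset outdoor event (e.g. outdoor sports)"       // step 202b1
        FirstUsageScene.INDOOR -> "object corresponding to the preset indoor event (e.g. indoor sleep)"            // step 202b2
        FirstUsageScene.AT_LEAST_TWO_DEVICES -> "object corresponding to the external device control event"        // step 202b3
    }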
In the embodiment of the invention, the electronic device can determine the first object to be displayed according to the target type information and the target information of the external device, and then display the first object corresponding to the target external device without requiring multiple user inputs, thereby simplifying the user's operation and saving time.
Step 202c, the electronic device displays the first object.
Illustratively, the mobile phone may determine the first object based on the first sub-object (text content such as "number of steps today: 11860 steps", "distance: 5 km", "consumption: 125 kcal", a control such as the step counting control, application icons such as the icon of the step counting application and the icon of the sports application, and link information such as the "view number of steps nearby" link information) and the second sub-object (text content such as "current temperature: 25 ℃", "air quality: good", "weather condition: clear", a text control for displaying the text content, application icons such as the icon of the weather application and the icon of the sports application, and link information such as the "running path planning" link information), and display the target interface 10 shown in fig. 3, where the target interface 10 includes the first object.
As another example, the mobile phone may determine the first object according to a first sub-object (controls such as the "set timing off" control and the "set timing on" control, and application icons such as the light music application icon and the indoor environment application icon) and a second sub-object (text content such as "yesterday sleep condition", "deep sleep: 4 h", "light sleep: 3 h", a text control, and icons of the applications corresponding to the smart watch and the smart desk lamp, such as the sleep-aid application icon and the sleep monitoring application icon), and display the target interface 11 shown in fig. 4, where the target interface 11 includes the first object.
As another example, the mobile phone may display the target interface 12 shown in fig. 5 according to a fifth sub-object (text content such as the "recent hot video recommendation:" text and the "recently watched on the television:" text, controls such as the "volume" control, the "channel" control and the "set migration" control, application icons such as the icon of the A video application, the icon of the A live application and the "set" application icon, and link information such as the "A hot video" link information and the "B hot video" link information), where the target interface 12 includes the first object.
In the embodiment of the invention, the electronic device can determine the first object corresponding to the first usage scene and the target external device according to the target type information and the target information of the external device, and display the first object without requiring multiple user inputs, thereby simplifying the user's operation and saving time.
Optionally, in the embodiment of the present invention, after the step 202, the interface display method provided in the embodiment of the present invention may further include the following steps 301 and 302.
Step 301, the electronic device detects whether the first scene information changes.
Step 302, in the case that it is detected that the first scene information is changed into the second scene information, the electronic device updates the first object to a second object corresponding to the second scene information and the target external device.
Optionally, in this embodiment of the present invention, the second scene information is different from the first scene information.
It is understood that, if the first scene information is detected to be changed, the electronic device may perform step 201 and step 202 again.
In the embodiment of the invention, when the first scene information changes, the electronic device can display the object corresponding to the target external device according to the changed scene information without requiring multiple user inputs, thereby simplifying the user's operation and improving the user experience.
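A minimal Kotlin sketch of steps 301 and 302, assuming a simple callback style rather than any particular Android mechanism, is given below; all names are illustrative assumptions.

    // Hypothetical watcher: re-reads the scene information and refreshes the displayed object
    // when the first scene information has become second scene information.
    class SceneWatcher(
        private val readSceneInfo: () -> String,          // obtains the current scene information
        private val showObjectFor: (String) -> Unit       // displays the object for the given scene information
    ) {
        private var lastSceneInfo: String? = null

        fun check() {
            val sceneInfo = readSceneInfo()
            if (sceneInfo != lastSceneInfo) {             // scene information changed
                lastSceneInfo = sceneInfo
                showObjectFor(sceneInfo)                  // update the first object to the second object
            }
        }
    }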
Optionally, in the embodiment of the present invention, with reference to fig. 2, as shown in fig. 11, after step 202, the interface display method provided in the embodiment of the present invention may further include step 401 and step 402 described below.
Step 401, the electronic device receives a first input of a user.
In an embodiment of the present invention, the first input is an input of a target object in the first object by a user.
In an embodiment of the present invention, the first input is used to trigger the electronic device to execute an action corresponding to the target object.
Optionally, in the embodiment of the present invention, the target object may be a control, or may be link information.
Optionally, in this embodiment of the present invention, the first input may specifically be a selection input of a target object by a user.
Optionally, in this embodiment of the present invention, the target object may be one object or multiple objects in the first object.
Step 402, the electronic device executes a first operation in response to a first input.
In an embodiment of the present invention, the first operation is an operation corresponding to a target object.
Illustratively, in conjunction with fig. 3, as shown in fig. 12, after the user makes a first input on the target interface 10 for a target object (e.g., "running route planning" link information) in the first object, the mobile phone performs a first operation (e.g., displaying map information of the user's current location) in response to the first input; the user can make an input (e.g., a slide input) on the map information to cause the cellular phone to display navigation information (e.g., the route 13), and the cellular phone can output the navigation information in the form of voice broadcast.
Further exemplarily, referring to fig. 5, as shown in fig. 13, after the user makes a first input on the target interface 12 for a target object (e.g., a "set migration" control) in the first object, the mobile phone performs a first operation (e.g., displays a "set migration function" interface) in response to the first input; the user can make an input (e.g., a capture input) on the interface to cause the cell phone to migrate the settings of the living room smart television to other smart televisions (e.g., a smart television being captured).
In the embodiment of the invention, a user can directly perform the first input on the electronic device for the first object to trigger the electronic device to execute the first operation (namely, the operation corresponding to the target object), so that the electronic device can rapidly control the target external device to execute the operation corresponding to the first object, and the use experience of the user is improved.
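Steps 401 and 402 correspond to ordinary input handling on the displayed object. A brief Android/Kotlin sketch follows; the view type and the action passed in are assumptions, with the "running path planning" link of fig. 12 used only as an example.

    import android.view.View
    import android.widget.TextView

    // Hypothetical binding of a target object (here, link information shown as a TextView)
    // to its corresponding first operation.
    fun bindTargetObject(linkView: TextView, performFirstOperation: () -> Unit) {
        linkView.setOnClickListener { _: View ->
            performFirstOperation()   // e.g. display map information of the current location
        }
    }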
Fig. 14 shows a schematic diagram of a possible structure of an electronic device involved in the embodiment of the present invention. As shown in fig. 14, the electronic device 90 may include: an acquisition module 91 and a display module 92.
The obtaining module 91 is configured to obtain first scene information in a state of being connected to a target external device, where the first scene information includes feature information indicating a current usage scene of an electronic device. And a display module 92, configured to display a first object corresponding to the target external device based on the first scene information acquired by the acquisition module 91.
In a possible implementation manner, the first scene information includes first sub information and second sub information. The first sub information is used for indicating the type of the target external device. The second sub information includes at least one of: first location information and first time information. The first position information is used for indicating the current position of the electronic equipment, and the first time information is used for indicating the current system time of the electronic equipment.
In a possible implementation manner, the display module 92 is specifically configured to determine a current first usage scenario of the electronic device according to the first sub-information and the second sub-information; determining a first object corresponding to the first use scene and the target external device; and displaying the first object.
In a possible implementation manner, the display module 92 is specifically configured to determine, as the first object, an object corresponding to a preset outdoor event when the first usage scene is an outdoor scene; or, under the condition that the first usage scene is an indoor scene, determining an object corresponding to a preset indoor event as a first object; or, when the first usage scenario is a scenario in which at least two external devices are connected, determining an object corresponding to the external device control event as the first object.
In a possible implementation manner, referring to fig. 14, as shown in fig. 15, an electronic device 90 provided in an embodiment of the present invention further includes: a detection module 93 and an update module 94. The detecting module 93 is configured to detect whether the first scene information changes after the display module 92 displays the first object corresponding to the target external device. An updating module 94, configured to update the first object to a second object corresponding to the second scene information and the target external device when the detecting module 93 detects that the first scene information becomes the second scene information.
In a possible implementation manner, referring to fig. 14, as shown in fig. 16, an electronic device 90 provided in an embodiment of the present invention further includes: a receiving module 95 and an executing module 96. The receiving module 95 is configured to receive a first input of a user, where the first input is an input of the first object by the user. An executing module 96, configured to execute a first operation in response to the first input received by the receiving module 95, where the first operation is an operation corresponding to the first object.
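For orientation, the modules of figs. 14 to 16 can be summarised as the following Kotlin interface skeleton; the method names and signatures are assumptions chosen to mirror the module descriptions above, not declarations taken from the patent.

    // Hypothetical skeleton of the module structure described above.
    interface AcquisitionModule { fun acquireFirstSceneInfo(): String }       // obtains the first scene information
    interface DisplayModule { fun displayFirstObject(sceneInfo: String) }     // displays the first object
    interface DetectionModule { fun sceneInfoChanged(): Boolean }             // detects whether the scene information changed
    interface UpdateModule { fun updateToSecondObject(sceneInfo: String) }    // updates to the second object
    interface ReceivingModule { fun receiveFirstInput(): String }             // receives the first input
    interface ExecutionModule { fun executeFirstOperation(input: String) }    // performs the first operation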
The electronic device provided by the embodiment of the present invention can implement each process implemented by the electronic device in the above method embodiments; to avoid repetition, details are not described here again.
The embodiment of the invention provides an electronic device, which can determine the current usage scene of the electronic device based on the connected target external device and adaptively display the first object corresponding to that usage scene, without requiring the user to perform multiple inputs, so that the user's operations are simplified and time is saved.
Fig. 17 is a hardware schematic diagram of an electronic device implementing various embodiments of the invention. As shown in fig. 17, electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111.
It should be noted that, as will be understood by those skilled in the art, the structure shown in fig. 17 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown in fig. 17, combine some components, or have a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 110 is configured to acquire first scene information in a state of being connected with a target external device, where the first scene information includes feature information indicating a current usage scene of an electronic device.
The display unit 106 is configured to display a first object corresponding to the target external device based on the first scene information acquired by the processor 110.
The embodiment of the invention provides an electronic device, which can determine the current usage scene of the electronic device based on the connected target external device and adaptively display the first object corresponding to that usage scene, without requiring the user to perform multiple inputs, so that the user's operations are simplified and time is saved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process; specifically, downlink data received from a base station is forwarded to the processor 110 for processing, and uplink data is sent to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 102, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic apparatus 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In the case of a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101 and output.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor may include an ambient light sensor and a proximity sensor: the ambient light sensor can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes) and the magnitude and direction of gravity when stationary, and can be used for recognizing the posture of the electronic device (such as switching between horizontal and vertical screens, related games, and magnetometer posture calibration) and for vibration-recognition-related functions (such as a pedometer and tapping). The sensors 105 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which are not described in detail herein.
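Purely as an aside and not as part of the disclosure, a reading from the ambient light sensor is one plausible extra signal for the indoor/outdoor classification sketched earlier. The following Android Kotlin snippet shows how such a reading could be obtained; the lux threshold is an arbitrary assumption.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Illustrative only: report whether the ambient light level suggests daylight,
// which an implementation might treat as a weak hint that the device is outdoors.
class LightSceneHint(context: Context, private val onHint: (Boolean) -> Unit) : SensorEventListener {
    private val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val lightSensor: Sensor? = sensorManager.getDefaultSensor(Sensor.TYPE_LIGHT)

    fun start() {
        lightSensor?.let { sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL) }
    }

    fun stop() {
        sensorManager.unregisterListener(this)
    }

    override fun onSensorChanged(event: SensorEvent?) {
        val lux = event?.values?.get(0) ?: return
        onHint(lux > 10_000f)   // assumed daylight threshold; indoor lighting is usually far below this
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {
        // Accuracy changes are not relevant for this sketch.
    }
}
```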
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061. When the touch panel 1071 detects a touch operation on or near it, the touch operation is transmitted to the processor 110 to determine the type of the touch event, and the processor 110 then provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 17 the touch panel 1071 and the display panel 1061 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device, which is not limited herein.
The interface unit 108 is an interface for connecting an external device to the electronic apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the electronic device (such as audio data and a phonebook), and the like. Further, the memory 109 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 109 and invoking data stored in the memory 109, thereby monitoring the electronic device as a whole. The processor 110 may include one or more processing units. Optionally, the processor 110 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, the user interface, application programs, and the like, and the modem processor mainly handles wireless communication. It can be appreciated that the modem processor may alternatively not be integrated into the processor 110.
The electronic device 100 may further include a power supply 111 (e.g., a battery) for supplying power to each component, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, and are not described in detail herein.
Optionally, an embodiment of the present invention further provides an electronic device, which includes the processor 110 shown in fig. 17, the memory 109, and a computer program stored in the memory 109 and capable of running on the processor 110, where the computer program, when executed by the processor 110, implements the processes of the foregoing method embodiment, and can achieve the same technical effects, and details are not described here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the processes of the method embodiments, and can achieve the same technical effects, and in order to avoid repetition, the details are not repeated here. The computer-readable storage medium may be, for example, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the invention is not limited to these embodiments, which are illustrative rather than restrictive. It will be apparent to those skilled in the art that various changes and modifications can be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (13)

1. An interface display method is applied to electronic equipment, and is characterized by comprising the following steps:
acquiring first scene information under the state of being connected with target external equipment, wherein the first scene information comprises characteristic information used for indicating the current use scene of the electronic equipment;
and displaying a first object corresponding to the target external equipment based on the first scene information.
2. The method according to claim 1, wherein the first scene information includes first sub information and second sub information;
the first sub information is used for indicating the type of the target external device;
the second sub information includes at least one of: first location information and first time information;
the first position information is used for indicating the current position of the electronic equipment, and the first time information is used for indicating the current system time of the electronic equipment.
3. The method of claim 2, wherein displaying the first object corresponding to the target peripheral based on the first scene information comprises:
determining a current first use scene of the electronic equipment according to the first sub information and the second sub information;
determining a first object corresponding to the first usage scene and the target external device;
and displaying the first object.
4. The method of claim 3, wherein determining the first object corresponding to the first usage scenario and the target peripheral device comprises:
determining an object corresponding to a preset outdoor event as the first object when the first usage scene is an outdoor scene;
determining an object corresponding to a preset indoor event as the first object when the first usage scene is an indoor scene;
and under the condition that the first use scene is a scene for connecting at least two external devices, determining an object corresponding to the external device control event as the first object.
5. The method of claim 1, wherein after displaying the first object corresponding to the target peripheral, the method further comprises:
detecting whether the first scene information changes;
and updating the first object to a second object corresponding to the second scene information and the target external device when the first scene information is detected to be changed into the second scene information.
6. The method of claim 1, wherein after displaying the first object corresponding to the target peripheral, the method further comprises:
receiving a first input of a user, wherein the first input is input of the first object by the user;
in response to the first input, performing a first operation, the first operation being an operation corresponding to the first object.
7. An electronic device, characterized in that the electronic device comprises: the device comprises an acquisition module and a display module;
the acquisition module is used for acquiring first scene information under the state of being connected with target external equipment, wherein the first scene information comprises characteristic information used for indicating the current use scene of the electronic equipment;
the display module is configured to display a first object corresponding to the target external device based on the first scene information acquired by the acquisition module.
8. The electronic device according to claim 7, wherein the first scene information includes first sub information and second sub information;
the first sub information is used for indicating the type of the target external device;
the second sub information includes at least one of: first location information and first time information;
the first position information is used for indicating the current position of the electronic equipment, and the first time information is used for indicating the current system time of the electronic equipment.
9. The electronic device according to claim 8, wherein the display module is specifically configured to determine a current first usage scenario of the electronic device according to the first sub-information and the second sub-information; determining a first object corresponding to the first use scene and the target external device; and displaying the first object.
10. The electronic device according to claim 9, wherein the display module is specifically configured to determine, as the first object, an object corresponding to a preset outdoor event when the first usage scene is an outdoor scene; or, when the first usage scene is an indoor scene, determining an object corresponding to a preset indoor event as the first object; or, when the first usage scenario is a scenario in which at least two external devices are connected, determining an object corresponding to an external device control event as the first object.
11. The electronic device of claim 7, further comprising: the device comprises a detection module and an updating module;
the detection module is used for detecting whether the first scene information changes after the display module displays the first object corresponding to the target external device;
the updating module is configured to update the first object to a second object corresponding to the second scene information and the target external device when the detecting module detects that the first scene information is changed into the second scene information.
12. The electronic device of claim 7, further comprising: a receiving module and an executing module;
the receiving module is used for receiving a first input of a user, wherein the first input is input of the first object by the user;
the executing module is configured to execute a first operation in response to the first input received by the receiving module, where the first operation is an operation corresponding to the first object.
13. An electronic device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the interface display method according to any one of claims 1 to 6.
CN201911039607.XA 2019-10-29 2019-10-29 Interface display method and electronic equipment Pending CN111221602A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911039607.XA CN111221602A (en) 2019-10-29 2019-10-29 Interface display method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911039607.XA CN111221602A (en) 2019-10-29 2019-10-29 Interface display method and electronic equipment

Publications (1)

Publication Number Publication Date
CN111221602A true CN111221602A (en) 2020-06-02

Family

ID=70810121

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911039607.XA Pending CN111221602A (en) 2019-10-29 2019-10-29 Interface display method and electronic equipment

Country Status (1)

Country Link
CN (1) CN111221602A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102622159A (en) * 2011-01-28 2012-08-01 炬力集成电路设计有限公司 Portable equipment as well as realizing method and system of user interface of same
CN102308272A (en) * 2011-07-07 2012-01-04 华为终端有限公司 Method and device for automatic display of applications on home screen
CN104298505A (en) * 2014-09-23 2015-01-21 深圳市金立通信设备有限公司 Operation method for application program
CN105045469A (en) * 2015-09-11 2015-11-11 北京金山安全软件有限公司 Display method and display device for shortcut entrance of application program and electronic equipment
CN106547533A (en) * 2016-07-15 2017-03-29 乐视控股(北京)有限公司 A kind of display packing and device
CN106484392A (en) * 2016-09-08 2017-03-08 北京小米移动软件有限公司 icon display method and device
CN110221737A (en) * 2019-04-29 2019-09-10 东莞市步步高通信软件有限公司 A kind of icon display method and terminal device

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113301546A (en) * 2021-04-25 2021-08-24 荣耀终端有限公司 Method and device for searching wearable device
CN113301546B (en) * 2021-04-25 2022-07-01 荣耀终端有限公司 Method and device for searching wearable device
CN113407076A (en) * 2021-06-04 2021-09-17 荣耀终端有限公司 Method for starting application and electronic equipment
CN114003325A (en) * 2021-10-21 2022-02-01 深圳市欧瑞博科技股份有限公司 Method and device for intelligently displaying content

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination