CN110688179A - Display method and terminal equipment - Google Patents

Display method and terminal equipment

Info

Publication number: CN110688179A (application number CN201910818118.8A)
Authority: CN (China)
Prior art keywords: interface, screen, display, intelligent household, functional component
Legal status: Granted
Other languages: Chinese (zh)
Other versions: CN110688179B
Inventors: 刘德, 杨嘉辰, 许天亮
Current Assignee: Huawei Technologies Co Ltd
Original Assignee: Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN201910818118.8A
Publication of CN110688179A; application granted and published as CN110688179B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/451: Execution arrangements for user interfaces
    • G06F 3/04817: Interaction techniques based on graphical user interfaces [GUI] using icons
    • G06F 3/04886: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06F 9/44526: Plug-ins; Add-ons

Abstract

The application provides a display method and a terminal device. The method can be applied to a terminal device in a smart home control system that includes a server, the terminal device, and at least one smart home device. The method includes the following steps: the terminal device displays a first interface that includes icons of a plurality of different smart home devices. When the terminal device receives an opening operation performed by a user on the icons of at least two smart home devices, it responds by displaying a second interface in a first display area of its display screen and a third interface in a second display area of the display screen. The terminal device then receives a configuration operation from the user, and in response generates control information relating to the first smart home device and the second smart home device and sends the control information to the server. The method reduces the complexity of configuring smart home devices through the terminal device and thereby improves the user experience.

Description

Display method and terminal equipment
Technical Field
The present application relates to the field of terminal technologies, and in particular, to a display method and a terminal device.
Background
More and more intelligent devices communicate over wireless-fidelity (Wi-Fi) networks. In the smart home field, a large number of smart home devices can access a Wi-Fi network and be controlled through a terminal device (for example, a smartphone). For example, a user may control a smart home device by operating an application (APP) installed on the terminal device.
A smart home system contains more and more smart home devices, and their degree of intelligence keeps increasing. Linkage between smart home devices is now supported, which noticeably improves the user's smart experience. For example, when a temperature sensor detects that the temperature is above a certain value, the air conditioner is automatically turned on for cooling; that is, linkage between the temperature sensor and the air conditioner is achieved.
However, managing such linkage through the smart home APP is cumbersome for the user. For example, a linkage rule for several smart home devices may be: if condition 1 of smart home device A is met, execute action 2 of smart home device B. To configure this rule in the smart home APP, the user has to jump across several levels of human-computer interaction interfaces, which is complicated to operate and degrades the user experience.
Disclosure of Invention
The application provides a display method and a terminal device, which simplify the user's control operations on smart home devices and improve the user experience.
In a first aspect, an embodiment of the present application provides a display method that can be applied to a terminal device of a smart home control system. The smart home control system includes a server, the terminal device, and at least two smart home devices; the terminal device and the at least two smart home devices establish network connections with the server, and the at least two smart home devices include a first smart home device and a second smart home device. The method includes the following steps: the terminal device displays a first interface that includes icons of the at least two smart home devices, including the icon of the first smart home device and the icon of the second smart home device; the terminal device then receives an opening operation performed by a user on the icon of the first smart home device and the icon of the second smart home device; in response to the opening operation, a first display area of the display screen displays a second interface and a second display area of the display screen displays a third interface, where the second interface includes functional components of the first smart home device and the third interface includes functional components of the second smart home device. The terminal device further receives a configuration operation from the user, where the configuration operation is used to establish an association between a functional component of the second interface and a functional component of the third interface; in response to the configuration operation, the terminal device generates control information relating to the first smart home device and the second smart home device, and sends the control information to the server.
In the embodiment of the application, the terminal device controls the display screen to show the control interfaces of multiple smart home devices on the same screen, so the user can complete the association setting with a simple configuration operation and without jumping between interface levels. This simplifies the user's control operations on the smart home devices and improves the user experience.
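By way of a minimal, hypothetical sketch (not the patent's implementation), such a same-screen display of two control interfaces could be achieved on Android by launching two device-control activities with explicit launch bounds, relying on the freeform multi-window capability mentioned later in the detailed description; the activity class names and the half-screen split below are illustrative assumptions.

```java
// Hypothetical sketch: show two device-control interfaces side by side.
// DeskLampControlActivity and BodySensorControlActivity are assumed placeholder activities.
import android.app.Activity;
import android.app.ActivityOptions;
import android.content.Intent;
import android.graphics.Rect;
import android.util.DisplayMetrics;

public class SameScreenLauncher {
    public static void openSideBySide(Activity host) {
        DisplayMetrics dm = host.getResources().getDisplayMetrics();
        int halfWidth = dm.widthPixels / 2;

        // First display area: left half of the screen shows the second interface.
        ActivityOptions leftOpts = ActivityOptions.makeBasic();
        leftOpts.setLaunchBounds(new Rect(0, 0, halfWidth, dm.heightPixels));
        host.startActivity(new Intent(host, DeskLampControlActivity.class)
                .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK), leftOpts.toBundle());

        // Second display area: right half of the screen shows the third interface.
        ActivityOptions rightOpts = ActivityOptions.makeBasic();
        rightOpts.setLaunchBounds(new Rect(halfWidth, 0, dm.widthPixels, dm.heightPixels));
        host.startActivity(new Intent(host, BodySensorControlActivity.class)
                .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK), rightOpts.toBundle());
    }
}
```

Launch bounds only take effect when the device supports freeform multi-window mode, which is why this sketch assumes that capability.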
In one possible design, the configuration operation received by the terminal device includes the following: the terminal device may receive a confirmation operation performed by the user on a first functional component of the second interface, where the first functional component corresponds to a first state supported by the first smart home device; in response to the confirmation operation, the third interface displayed on the display screen is switched to a fourth interface. The fourth interface includes at least one second functional component of the second smart home device, where a second functional component corresponds to a control function supported by the second smart home device in the first state; functional components corresponding to control functions that the second smart home device does not support in the first state are hidden or set to an inoperable state. The terminal device then receives a confirmation operation performed by the user on the at least one second functional component.
In the embodiment of the application, the user can complete the setting of the linkage rule without jumping between interface levels.
In one possible design, the terminal device may first obtain configuration information from the server, where the configuration information includes association rules between functional components of different smart home devices; and then determining at least one second functional component of second intelligent household equipment corresponding to the first functional component according to the configuration information.
In the embodiment of the application, because the configuration information obtained in advance from the server contains the association rules, the terminal device can automatically recommend relevant components to the user, further simplifying the user's configuration operation.
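A minimal sketch of how such server-provided association rules might be represented and queried on the terminal is shown below; the class, key format, and component names are assumptions for illustration, not the patent's data format.

```java
// Hypothetical sketch: association rules between functional components,
// as they might be delivered by the server in the configuration information.
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class AssociationRules {
    // Key: "<deviceType>:<state>", e.g. "bodySensor:motionDetected".
    // Value: functional components of other devices that can be linked to that state.
    private final Map<String, List<String>> rules = new HashMap<>();

    public void addRule(String triggerKey, String linkableComponent) {
        rules.computeIfAbsent(triggerKey, k -> new ArrayList<>()).add(linkableComponent);
    }

    // Returns the second functional components to show in the fourth interface;
    // components not in this list would be hidden or set to an inoperable state.
    public List<String> componentsFor(String deviceType, String firstState) {
        return rules.getOrDefault(deviceType + ":" + firstState, new ArrayList<>());
    }

    public static void main(String[] args) {
        AssociationRules config = new AssociationRules();
        config.addRule("bodySensor:motionDetected", "deskLamp.switch");
        config.addRule("bodySensor:motionDetected", "deskLamp.readingMode");

        // After the user confirms the "motion detected" component of the body sensor,
        // only these desk-lamp components remain operable.
        System.out.println(config.componentsFor("bodySensor", "motionDetected"));
    }
}
```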
In one possible design, when the display screen is a foldable screen, the foldable screen can be folded to form at least two screens, including a first screen and a second screen. When the foldable screen is in the unfolded state, in response to the opening operation, the terminal device controls the first screen of the foldable screen to display the second interface and controls the second screen to display the third interface.
In the embodiment of the application, this makes full use of the display area of the fully unfolded foldable screen and improves the utilization of the screen.
In one possible design, when the display screen of the terminal device is a foldable screen, the foldable screen can be folded to form at least two screens, and the at least two screens include a first screen and a second screen. After the terminal device sends the control information to the server, the method further comprises the following steps:
the terminal device receives an operation of the user, closes the second interface and the third interface, and displays the first interface in the folded state; it then receives a second operation of the user on the icon of a third smart home device in the first interface.
In response to the second operation, the terminal device controls the first screen of the foldable screen to display a fifth interface of the third smart home device, where the fifth interface includes the displayed functional components and a control for triggering the display of the hidden functional components;
in response to the folding screen being converted from the folding state to the unfolding state, controlling a first screen of the folding screen to display the displayed functional components and controlling a second screen of the folding screen to display the hidden functional components;
the folding state is a state that an included angle between the first screen and the second screen is smaller than or equal to a preset angle threshold value, and the unfolding state is a state that the included angle between the first screen and the second screen is larger than the preset angle threshold value.
In the above embodiment, when the foldable screen is fully unfolded, all functional components of a single smart home device can be displayed, which facilitates user operation, makes full use of the display area of the foldable screen, and improves the user experience to a certain extent.
In one possible embodiment, the first display area and the second display area are pre-configured in the terminal device; alternatively, the first display area and the second display area are set by the user in the terminal device.
In the embodiment of the application, this makes the display mode of the display screen more flexible.
In a second aspect, an embodiment of the present application provides a terminal device, including a sensor, a touch screen, a processor, and a memory, where the memory is used to store one or more computer programs; the one or more computer programs stored in the memory, when executed by the processor, enable the terminal device to implement any of the possible design methods of any of the aspects described above.
In a third aspect, the present application further provides an apparatus including a module/unit for performing the method of any one of the possible designs of any one of the above aspects. These modules/units may be implemented by hardware, or by hardware executing corresponding software.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium, which includes a computer program; when the computer program runs on an electronic device, the electronic device is caused to execute any one of the possible design methods of any one of the above aspects.
In a fifth aspect, the present application further provides a computer program product, which, when run on a terminal, causes the terminal to execute any one of the possible design methods of any one of the above aspects.
In a sixth aspect, the present application further provides a chip, coupled with a memory, for executing a computer program stored in the memory to perform any one of the possible design methods of the foregoing aspects.
In a seventh aspect, an embodiment of the present application further provides a display method, which is applied to a terminal device in an intelligent home control system, where the intelligent home control system includes the terminal device and at least two intelligent home devices, and the at least two intelligent home devices include a first intelligent home device and a second intelligent home device, and the display method includes: displaying a first interface, wherein the first interface comprises icons of the at least two pieces of intelligent household equipment, and the icons of the at least two pieces of intelligent household equipment comprise an icon of the first intelligent household equipment and an icon of the second intelligent household equipment; receiving opening operation of a user on the icon of the first intelligent household equipment and the icon of the second intelligent household equipment; responding to the opening operation, a first display area of a display screen of the terminal equipment displays a second interface, a second display area of the display screen displays a third interface, the second interface comprises a functional component of the first intelligent household equipment, and the third interface comprises a functional component of the second intelligent household equipment; receiving configuration operation of a user, wherein the configuration operation is used for establishing an association relationship between the functional components of the second interface and the functional components of the third interface; and responding to the configuration operation, and the terminal equipment generates control information related to the first intelligent household equipment and the second intelligent household equipment.
In an eighth aspect, an embodiment of the present application further provides a terminal device, where the terminal device is in an intelligent home control system, the intelligent home control system includes at least two pieces of intelligent home equipment, the at least two pieces of intelligent home equipment establish network connection, the at least two pieces of intelligent home equipment include a first piece of intelligent home equipment and a second piece of intelligent home equipment, and the terminal device includes a display screen, a processor, and a memory; the memory stores program instructions; the processor is configured to execute the program instructions stored in the memory, so as to enable the terminal device to execute the method of the seventh aspect.
In a ninth aspect, embodiments of the present application further provide a computer-readable storage medium, where the computer-readable storage medium includes program instructions, and when the program instructions are executed on a terminal device, the terminal device is caused to execute the method of the seventh aspect.
Drawings
Fig. 1A to fig. 1C are schematic diagrams of a group of terminal devices displaying a main interface according to an embodiment of the present application;
fig. 2 is a schematic diagram of an intelligent home control system architecture provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of a mobile phone according to an embodiment of the present application;
fig. 4 is a block diagram of a software structure of a terminal device according to an embodiment of the present disclosure;
fig. 5A and 5B are schematic diagrams of a wide screen device according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a display method according to an embodiment of the present application;
fig. 7A to 7F are schematic diagrams of a set of terminal device interfaces provided in the embodiment of the present application;
fig. 8A to 8J are schematic diagrams of another set of interfaces of terminal devices according to an embodiment of the present disclosure;
fig. 9A to 9B are schematic diagrams of another terminal device interface provided in the embodiment of the present application;
fig. 10A and 10D are schematic flow charts of another display method provided in the embodiments of the present application;
fig. 10B and fig. 10C are schematic diagrams of another terminal device interface provided in the embodiment of the present application;
fig. 11 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments of the present application will be described in detail below with reference to the drawings and specific embodiments of the specification.
In the following, some terms in the embodiments of the present application are explained to facilitate understanding by those skilled in the art.
1) The APP related to the embodiment of the present application, referred to as an application for short, is a software program capable of implementing a certain specific function or functions. Generally, a plurality of applications can be installed in a terminal device. For example, a camera application, a mailbox application, an intelligent home control application, and the like. The application mentioned below may be a system application installed when the terminal device leaves a factory, or may be a third-party application downloaded from a network or acquired from another terminal device by a user during the process of using the terminal device.
2) The Service (Service) according to the embodiment of the present application refers to a function or capability that the smart home device has. For example, if the smart home device is a temperature detector, then Service may be temperature detection and display, etc., and if the smart home device is a lamp, then Service may be adjusting brightness, color, or color temperature, etc. For another example, if the smart home device is a socket, the Service may be a switch, etc.
3) A functional component refers to a control on an interface that provides a Service, where the correspondence between a Service and functional components may be one-to-many. For example, if the Service is brightness, color, or color temperature, the functional components may be the switch functional component, the brightness functional component, the color functional component, and the timing functional component shown in (h) of fig. 1B; or, if the Service is temperature, the functional component may be the temperature state functional component shown in fig. 8B (a data-model sketch follows these definitions).
4) A smart home uses the home as a platform and integrates the facilities related to home life by means of integrated wiring, network communication, security precaution, automatic control, and audio/video technologies, so as to build an efficient management system for home facilities and household schedules, improve the safety, convenience, comfort, and artistry of the home, and provide an environmentally friendly and energy-saving living environment.
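As a minimal, hypothetical sketch of the one-to-many Service/functional-component relationship described in definitions 2) and 3) above (the class and field names are assumptions, not part of the patent):

```java
// Hypothetical sketch of the one-to-many relationship between a Service
// offered by a smart home device and the functional components that expose it.
import java.util.Arrays;
import java.util.List;

public class ServiceModel {
    static class FunctionalComponent {
        final String name;          // e.g. "switch", "brightness", "timer"
        FunctionalComponent(String name) { this.name = name; }
    }

    static class Service {
        final String capability;    // e.g. "light control" or "temperature detection"
        final List<FunctionalComponent> components;  // controls shown on the interface
        Service(String capability, List<FunctionalComponent> components) {
            this.capability = capability;
            this.components = components;
        }
    }

    public static void main(String[] args) {
        // A lamp's light-control Service is exposed through several interface components.
        Service lampService = new Service("light control", Arrays.asList(
                new FunctionalComponent("switch"),
                new FunctionalComponent("brightness"),
                new FunctionalComponent("color"),
                new FunctionalComponent("timer")));
        System.out.println(lampService.capability + " -> "
                + lampService.components.size() + " functional components");
    }
}
```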
In the prior art, if, for example, a human body sensor and a writing desk lamp need to be linked, so that the desk lamp is automatically turned on when the sensor detects that someone is moving, the user usually completes the setting of the linkage rule in the manner shown in fig. 1A to 1C. When the user opens the smart home APP on the mobile phone, the phone displays the interface 10 shown in (a) in fig. 1A; when the phone detects the user's opening operation on the new control 11, it displays the interface 20 shown in (b) in fig. 1A. When the phone detects the user's opening operation on the "create smart" control 21, it displays the interface 30 shown in (c) in fig. 1A. When the phone detects the user's opening operation on the smart orchestration control 31, it displays the interface 40 shown in (d) in fig. 1A. On the one hand, when the phone detects the user's opening operation on the condition control 41, it displays the interface 50 shown in (e) in fig. 1B; when the phone detects the user's opening operation on the human body sensor control 51, it displays the interface 60 shown in (f) in fig. 1B; and when the phone detects that the user selects the switch control 61, the human body sensor is turned on. On the other hand, when the phone detects the user's opening operation on the add-task control 42, it displays the interface 70 shown in (g) in fig. 1B; when the phone detects the user's opening operation on the desk lamp control 71, it displays the interface 80 shown in (h) in fig. 1B. When the phone detects the user's selection of the reading mode control 81, the display automatically jumps to the interface 90 shown in (i) in fig. 1C, and when the phone detects the user's confirmation on the completion control 91, the linkage setting between the human body sensor and the writing desk lamp is completed; that is, when the human body sensor detects that someone is moving, the writing desk lamp is automatically turned on, with the reading mode as its default working mode. Therefore, in the prior art, when the user controls linkage between smart home devices through a mobile phone, the user has to switch between the control interfaces of different smart home devices multiple times to complete the setting, which is very cumbersome.
Based on this, an embodiment of the application provides a display method that simplifies the operation steps for setting up linkage between smart home devices and can be applied to a terminal device that controls smart home devices. The method includes the following: the display screen of the terminal device can display the control interfaces of multiple smart home devices at the same time and receive the user's configuration operations on each control interface, where a configuration operation is used to establish an association between the functional components of the smart home devices. Finally, the terminal device generates control information relating to the smart home devices and sends it to the server, and the server sends instructions corresponding to the control information to the smart home devices over the network, so that multiple smart home devices can be controlled at the same time. This avoids the cumbersome steps of switching back and forth between multiple control interfaces and improves the user experience.
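As a minimal, hypothetical sketch (not the patent's actual protocol), the control information could be encoded as JSON and posted to the server; the endpoint URL and field names below are illustrative assumptions, and org.json is used because it is bundled with Android.

```java
// Hypothetical sketch: encode a linkage rule as control information and send it to the server.
// The endpoint URL and JSON field names are illustrative assumptions.
import org.json.JSONObject;

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class ControlInfoSender {
    public static void sendLinkageRule() throws Exception {
        JSONObject controlInfo = new JSONObject()
                .put("trigger", new JSONObject()
                        .put("deviceId", "bodySensor-01")
                        .put("condition", "motionDetected"))
                .put("action", new JSONObject()
                        .put("deviceId", "deskLamp-02")
                        .put("command", "turnOn")
                        .put("mode", "reading"));

        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://smart-home-server.example.com/rules").openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json");
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(controlInfo.toString().getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Server responded: " + conn.getResponseCode());
        conn.disconnect();
    }
}
```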
Fig. 2 schematically illustrates a system architecture diagram applicable to the embodiment of the present application.
As shown in fig. 2, the system architecture includes: server 200, smart home devices 201 and terminal devices 202.
The server 200 can perform information interaction with the smart home devices 201 and the terminal devices 202. In some scenarios, the server 200 may be connected to the smart home devices 201 and the terminal device 202 through a network. The server 200 may also be a cloud server located on the network side, or an intelligent gateway provided in the home network, or a router in the home network, or the like.
There may be one or more smart home devices 201, and they may be of multiple types, for example: a smart LED lamp, a smart speaker, a smart air conditioner, an assistant robot, various sensors (such as a temperature sensor, a humidity sensor, a light sensor, and a human body sensor), a smart humidifier, and the like.
The terminal device 202 may communicate with the server 200 through a local area network, and may also communicate with the server 200 through a cellular network, which is not limited in this embodiment of the application. Specifically, after accessing the network, the terminal device 202 sends an instruction for controlling the smart home device 201 to the server 200 in response to the operation of the user. After receiving the instruction of the terminal device 202, the server 200 sends the instruction to the smart home devices 201 which are registered in the server 200 and bound with the terminal device 202.
In the system architecture, the server 200 may be an application server corresponding to an intelligent home application program, and is configured to implement an intelligent home control function, where the intelligent home application program may be a system application or a third-party application.
Optionally, the terminal device 202 may integrate an intelligent home application program, which is used to collect user information and user instructions related to personalized control of the intelligent home device, and upload the user information and the user instructions to the server 200.
The server 200 may further store user information and control strategies corresponding to various control scenarios of the smart home. Of course, the user information or the control policy may also be stored in other devices, and the server 200 may obtain the information from other devices.
The terminal device in the embodiments of the present application may also be referred to as a terminal, user equipment (UE), and the like. For example, the terminal device in the embodiments of the present application may be a portable electronic device on which the APP can be installed, such as a mobile phone, a tablet computer, or a wearable device with a wireless communication function (for example, a smart watch). Exemplary embodiments of the portable electronic device include, but are not limited to, portable electronic devices carrying various operating systems.
Taking the terminal device 202 as a mobile phone as an example, fig. 3 shows a schematic structural diagram of the mobile phone 300.
The mobile phone 300 may include a processor 310, an external memory interface 320, an internal memory 321, a USB interface 330, a charging management module 340, a power management module 341, a battery 342, an antenna 1, an antenna 2, a mobile communication module 351, a wireless communication module 352, an audio module 370, a speaker 370A, a receiver 370B, a microphone 370C, an earphone interface 370D, a sensor module 380, keys 390, a motor 391, an indicator 392, a camera 393, a display 394, a SIM card interface 395, and the like. The sensor module 380 may include a gyroscope sensor 380A, an acceleration sensor 380B, a proximity light sensor 380G, a fingerprint sensor 380H, and a touch sensor 380K (of course, the mobile phone 300 may further include other sensors, such as a temperature sensor, a pressure sensor, a distance sensor, a magnetic sensor, an ambient light sensor, an air pressure sensor, a bone conduction sensor, and the like, which are not shown in the figure).
It is to be understood that the illustrated structure of the embodiment of the present invention is not intended to limit the mobile phone 300 specifically. In other embodiments of the present application, the handset 300 may include more or fewer components than shown, or combine certain components, or split certain components, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 310 may include one or more processing units, such as: the processor 310 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a memory, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a Neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. The controller can be the neural center and the command center of the mobile phone 300. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 310 for storing instructions and data. In some embodiments, the memory in the processor 310 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 310. If the processor 310 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 310, thereby increasing the efficiency of the system.
The processor 310 may operate the display method provided by the embodiment of the present application, so as to simplify the control operation of the user on the smart home device and improve the user experience. When the processor 310 may include different devices, such as an integrated CPU and a GPU, the CPU and the GPU may cooperate to execute the display method provided in the embodiment of the present application, for example, part of the algorithm in the display method is executed by the CPU, and another part of the algorithm is executed by the GPU, so as to obtain faster processing efficiency.
The display screen 394 is used to display images, video, and the like. The display screen 394 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the mobile phone 300 may include 1 or N display screens 394, where N is a positive integer greater than 1.
In this embodiment, the display screen 394 may be an integrated flexible display screen, or a spliced display screen formed by two rigid screens and a flexible screen located between the two rigid screens may be used. After the processor 310 executes the display method provided by the embodiment of the present application, the processor 310 can control the window sizes of different interfaces of the same application on the display 394.
The camera 393 (a front camera or a rear camera, or a camera that can serve as both a front and a rear camera) is used to capture still images or video. In general, the camera 393 may include a lens group and an image sensor, where the lens group includes a plurality of lenses (convex or concave lenses) for collecting the light signals reflected by the object to be photographed and transferring the collected light signals to the image sensor, and the image sensor generates an original image of the object to be photographed according to the optical signals.
The internal memory 321 may be used to store computer-executable program code, which includes instructions. The processor 310 executes various functional applications of the cellular phone 300 and data processing by executing instructions stored in the internal memory 321. The internal memory 321 may include a program storage area and a data storage area. Wherein the storage program area may store an operating system, codes of application programs (such as a camera application, a WeChat application, etc.), and the like. The data storage area can store data (such as images, videos and the like acquired by a camera application) and the like created during the use of the mobile phone 300.
The internal memory 321 may also store codes of the display area adjustment algorithm provided by the embodiment of the present application. When the code of the display region adjustment algorithm stored in the internal memory 321 is executed by the processor 310, the processor 310 may control the display position on the display 394 of the message in the notification bar.
In addition, the internal memory 321 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
Of course, the code of the display area adjustment algorithm provided in the embodiment of the present application may also be stored in the external memory. In this case, the processor 310 may run the code of the display region adjustment algorithm stored in the external memory through the external memory interface 320, and the processor 310 may control the window sizes of different interfaces of the same application on the display 394.
The function of the sensor module 380 is described below.
The gyro sensor 380A may be used to determine the motion attitude of the handset 300. In some embodiments, the angular velocity of the cell phone 300 about three axes (i.e., x, y, and z axes) may be determined by the gyro sensor 380A. I.e., the gyro sensor 380A may be used to detect the current state of motion of the handset 300, such as shaking or standing still.
The gyro sensor 380A in the embodiment of the present application may be used to detect a folding or unfolding operation acting on the display screen 394. The gyro sensor 380A may report the detected folding operation or unfolding operation as an event to the processor 310 to determine the folded state or unfolded state of the display screen 394.
The acceleration sensor 380B can detect the magnitude of the acceleration of the mobile phone 300 in various directions (typically along three axes), and may likewise be used to detect the current motion state of the phone, such as shaking or standing still. The acceleration sensor 380B in this embodiment may be used to detect a folding or unfolding operation applied to the display screen 394, and may report the detected folding or unfolding operation as an event to the processor 310 to determine the folded or unfolded state of the display screen 394.
The proximity light sensor 380G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The mobile phone emits infrared light outwards through the light emitting diode. The handset uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the handset. When insufficient reflected light is detected, the handset can determine that there are no objects near the handset. The proximity optical sensor 380G may be disposed on a first screen of the foldable display screen 394, and the proximity optical sensor 380G may detect a folding angle or an unfolding angle of the first screen and the second screen according to an optical path difference of the infrared signal.
The sensors may be used in combination to detect the folding or unfolding operation of the foldable display 394.
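A minimal, hypothetical sketch of how the angle reported by these sensors could be classified into a folded or unfolded state is shown below; the listener interface and the 90-degree threshold are assumptions, since the patent only requires comparing the angle between the two screens against a preset threshold chosen from practical experience.

```java
// Hypothetical sketch: classify the angle between the first and second screen
// (reported by the gyroscope/acceleration/proximity sensors) into a fold state.
public class FoldStateDetector {
    public enum FoldState { FOLDED, UNFOLDED }

    // Preset angle threshold; the actual value would be chosen from practical experience.
    private static final float ANGLE_THRESHOLD_DEGREES = 90f;

    public interface FoldStateListener {
        void onFoldStateChanged(FoldState newState);
    }

    private FoldState current = FoldState.FOLDED;
    private final FoldStateListener listener;

    public FoldStateDetector(FoldStateListener listener) {
        this.listener = listener;
    }

    // Called whenever the sensors report a new angle between the two screens.
    public void onAngleChanged(float angleDegrees) {
        FoldState next = angleDegrees > ANGLE_THRESHOLD_DEGREES
                ? FoldState.UNFOLDED : FoldState.FOLDED;
        if (next != current) {
            current = next;
            listener.onFoldStateChanged(next);  // e.g. move hidden components to the second screen
        }
    }
}
```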
The gyro sensor 380A (or the acceleration sensor 380B) may transmit the detected motion state information (such as the angular velocity) to the processor 310. The processor 310 determines whether the mobile phone 300 is in the handheld state or the tripod state (for example, when the angular velocity is not 0, it indicates that the mobile phone 300 is in the handheld state) based on the motion state information.
The fingerprint sensor 380H is used to capture a fingerprint. The mobile phone 300 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access to an application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The touch sensor 380K is also referred to as a "touch panel". The touch sensor 380K may be disposed on the display screen 394, and the touch sensor 380K and the display screen 394 form a touch screen, which is also referred to as a "touch screen". The touch sensor 380K is used to detect a touch operation applied thereto or thereabout. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided via the display 394. In other embodiments, the touch sensor 380K may be disposed on the surface of the mobile phone 300 at a different location than the display 394.
Illustratively, the display 394 of the cell phone 300 displays a main interface including icons for a plurality of applications (e.g., a camera application, a WeChat application, etc.). The user clicks the icon of the camera application in the home interface through the touch sensor 380K, which triggers the processor 310 to start the camera application and open the camera 393. The display screen 394 displays an interface, such as a viewfinder interface, for a camera application.
The wireless communication function of the mobile phone 300 can be implemented by the antenna 1, the antenna 2, the mobile communication module 351, the wireless communication module 352, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 300 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 351 can provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the handset 300. The mobile communication module 351 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 351 may receive electromagnetic waves from the antenna 1, filter, amplify, etc. the received electromagnetic waves, and transmit the electromagnetic waves to the modem processor for demodulation. The mobile communication module 351 can also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least part of the functional modules of the mobile communication module 351 may be provided in the processor 310. In some embodiments, at least some of the functional modules of the mobile communication module 351 may be provided in the same device as at least some of the modules of the processor 310.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 370A, the receiver 370B, etc.) or displays images or video through the display 394. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 310 and may be disposed in the same device as the mobile communication module 351 or other functional modules.
The wireless communication module 352 may provide a solution for wireless communication applied to the mobile phone 300, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 352 may be one or more devices that integrate at least one communication processing module. The wireless communication module 352 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 310. The wireless communication module 352 may also receive a signal to be transmitted from the processor 310, frequency-modulate and amplify the signal, and convert the signal into electromagnetic waves via the antenna 2 to radiate the electromagnetic waves.
In addition, the mobile phone 300 can implement an audio function through the audio module 370, the speaker 370A, the receiver 370B, the microphone 370C, the earphone interface 370D, and the application processor. Such as music playing, recording, etc. The handset 300 may receive key 390 inputs to generate key signal inputs relating to user settings and function controls of the handset 300. The cell phone 300 may generate a vibration alert (such as an incoming call vibration alert) using the motor 391. The indicator 392 in the mobile phone 300 may be an indicator light, which may be used to indicate the charging status, the change of the electric quantity, or may be used to indicate a message, a missed call, a notification, etc. The SIM card interface 395 in the handset 300 is used to connect a SIM card. The SIM card can be attached to and detached from the cellular phone 300 by being inserted into or pulled out of the SIM card interface 395.
It should be understood that in practical applications, the mobile phone 300 may include more or less components than those shown in fig. 3, and the embodiment of the present application is not limited thereto.
Fig. 4 is a block diagram of a software structure of a terminal device according to an embodiment of the present application. The layered architecture can divide the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into three layers, which are an application layer (referred to as an application layer), an application framework layer (referred to as a framework layer), and a kernel layer (also referred to as a driver layer) from top to bottom.
Wherein the application layer may comprise a series of application packages. As shown in fig. 4, the application layer may include a plurality of application packages such as application 1 and application 2. For example, the application package may be a camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, and desktop Launcher (Launcher) application.
The framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions. As shown in fig. 4, the framework layer may include a Window Manager (WMS), an Activity Manager (AMS), and the like. Optionally, the framework layer may further include a content provider, a view system, a telephony manager, an explorer, a notification manager, etc. (not shown in the drawings).
Among them, the window manager WMS is used to manage the window program. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like. The Activity manager AMS is used for managing Activity and is used for starting, switching and scheduling each component in the system, managing and scheduling application programs and the like.
The kernel layer is a layer between hardware and software. The kernel layer contains at least display drivers, camera drivers, audio drivers, sensor drivers, input/output device drivers (e.g., keyboard, touch screen, headphones, speakers, microphones, etc.), and the like.
The terminal device receives an input operation (such as a split screen operation) acted on the display screen by a user, and the kernel layer can generate a corresponding input event according to the input operation and report the event to the application framework layer. A window mode (e.g., a multi-window mode) corresponding to the input operation, a window position and size, and the like are set by the activity management server AMS of the application framework layer. And the window management server WMS of the application framework layer draws a window according to the setting of the AMS, then sends the drawn window data to the display driver of the kernel layer, and the display driver displays the corresponding application interface in different display areas of the display screen.
The display method provided by the embodiment of the application can be implemented based on Google's freeform window feature and the multi-window, multi-task infrastructure. The display method provided by the embodiment of the present application can be seen in fig. 6 described below. As shown in fig. 4, in an embodiment of the present application, the Activity manager AMS may include an Activity native management module and an Activity extension module. The Activity native management module is used to manage Activities and is responsible for starting, switching, and scheduling each component in the system, managing and scheduling application programs, and the like. The Activity extension module is used to set the window mode and the window properties according to the folded or unfolded state of the foldable screen.
The properties of a window may include the position and size of the Activity window, and the visible property of the Activity window (that is, the state of the Activity window). The position of the Activity window is its position on the foldable screen when the foldable screen displays the window, and the size of the Activity window may come from the width and height information in the application launch configuration (config). The visible property of the Activity window may be true or false. When the visible property is true, the Activity window is visible to the user, that is, the display driver displays the content of the Activity window; when it is false, the Activity window is invisible to the user, that is, the display driver does not display its content.
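Below is a minimal, hypothetical sketch of the per-window attributes the Activity extension module might maintain; the class and field names are illustrative assumptions, not the AOSP API.

```java
// Hypothetical sketch of per-window attributes managed by the Activity extension module.
import android.graphics.Rect;

public class ActivityWindowAttributes {
    public final String activityName;
    public Rect bounds;        // position and size of the Activity window on the foldable screen
    public boolean visible;    // true: the display driver shows the window's content

    public ActivityWindowAttributes(String activityName, Rect bounds, boolean visible) {
        this.activityName = activityName;
        this.bounds = bounds;
        this.visible = visible;
    }

    // When the foldable screen is folded, a window assigned to the second screen
    // could simply be marked invisible instead of being destroyed.
    public void onFoldStateChanged(boolean unfolded, boolean assignedToSecondScreen) {
        if (assignedToSecondScreen) {
            visible = unfolded;
        }
    }
}
```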
Illustratively, an application (e.g., application 1) may invoke a launch Activity interface to launch a corresponding Activity. The Activity manager AMS may request the window manager WMS to draw a window corresponding to the Activity in response to the application call, and call a display driver to implement display of the interface.
It can be understood that, while the terminal device displays an application interface, the input/output device driver of the driver layer may detect a user input event (for example, the input event corresponding to the drag operation shown in fig. 7B). The input/output device driver reports the input event to the window manager WMS of the framework layer (that is, the application framework layer). After the window manager WMS monitors the input event, it sends a display-area change event to the activity manager AMS; that is, the operation coordinates in the input event are sent from the application process to the AMS of the Android Open Source Project (AOSP) process through an AIDL interface (Android Interface Definition Language), and the AMS adjusts the multi-window mode and the window attributes according to the operation coordinates, for example, determines the size of the display area (display). After the activity manager AMS sets the Activity window mode and properties, it can request the window manager WMS to redraw the windows, and then the display driver is called to display the redrawn window content, so that different interfaces of the application are presented to the user in different, new display areas.
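The following is a minimal, hypothetical sketch of the kind of bounds computation the AMS side might perform when such a display-area change event arrives; it is not AOSP code, and the class and method names are assumptions.

```java
// Hypothetical sketch: recompute the two display areas when a display-area change
// event (e.g. from a drag at operation coordinate x) is received.
import android.graphics.Rect;

public class DisplayAreaPolicy {
    /**
     * Splits the screen vertically at the given x coordinate and returns the
     * bounds of the first and second display areas.
     */
    public static Rect[] splitAt(int screenWidth, int screenHeight, int dividerX) {
        // Clamp so that neither display area collapses completely.
        int x = Math.max(1, Math.min(dividerX, screenWidth - 1));
        Rect firstArea = new Rect(0, 0, x, screenHeight);            // e.g. the second interface
        Rect secondArea = new Rect(x, 0, screenWidth, screenHeight); // e.g. the third interface
        return new Rect[] { firstArea, secondArea };
    }
}
```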
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the embodiments of the present application, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, "a plurality" means two or more unless otherwise specified.
The display method provided by the embodiment of the application can be applied to the wide-screen device shown in fig. 5A and 5B. Fig. 5A shows a tablet device (pad) whose touch screen is not foldable. Fig. 5B shows a foldable handset.
As shown in fig. 5b (a), the foldable cell phone may include a first screen 51, a second screen 52, and a bendable region 53. When the bendable region is bent or deformed, an included angle between the first screen 51 and the second screen 52 (hereinafter, may be simply referred to as an unfolding angle) may be changed. In the folded state of the folding screen, the first screen 51 and the second screen 52 may be back-to-back or face-to-face, i.e. the folding screen may be folded outwards or inwards.
In some embodiments, the display of the foldable mobile phone may be an integrated flexible display, and the first screen, the second screen and the bendable region may be different regions of the flexible display. In other embodiments, the display screen of the foldable mobile phone may also be a spliced display screen formed by two rigid screens and one flexible screen located between the two rigid screens.
In some embodiments, when the angle between the first screen 51 and the second screen 52 is different, the foldable mobile phone can be formed into different physical forms, such as a folded form, a half-folded form, an unfolded form, and the like.
Illustratively, when the foldable mobile phone is in the unfolded state, the included angle between the first screen and the second screen is a first angle; for example, the first angle is 150 degrees or 180 degrees, as shown in fig. 5b (a) or as shown in fig. 5b (b).
Illustratively, when the foldable mobile phone is in a half-folded configuration, the included angle between the first screen and the second screen is a second angle α; α may be 60 degrees as shown in fig. 5b (c).
Illustratively, when the foldable mobile phone is in the folded configuration, the angle between the first screen and the second screen is a third angle β; for example, β may be 0 degrees, 5 degrees, etc., as may be shown in fig. 5b (d).
In the embodiment of the application, for convenience of description, the state that the included angle between the first screen and the second screen is smaller than or equal to the preset angle threshold is defined as the folded state, the state that the included angle between the first screen and the second screen is larger than the preset angle threshold is defined as the unfolded state, and the preset angle threshold can be determined according to practical experience. It should be noted that the specific value ranges mentioned in the present application are only examples and are not limiting. It should be understood that the present application is not limited to the division of the physical form of the display of the foldable mobile phone and the definition of each physical form.
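As a minimal sketch of the folded/unfolded definition above, the following example maps an included angle to a physical form; the 90-degree threshold is an assumed value, since the preset angle threshold is left to practical experience.

```kotlin
enum class FoldState { FOLDED, UNFOLDED }

// Assumed threshold; the embodiment only says it is determined by practical experience.
const val PRESET_ANGLE_THRESHOLD = 90.0

fun foldState(includedAngle: Double): FoldState =
    if (includedAngle <= PRESET_ANGLE_THRESHOLD) FoldState.FOLDED else FoldState.UNFOLDED

fun main() {
    println(foldState(5.0))    // FOLDED   (third angle β of the folded form)
    println(foldState(150.0))  // UNFOLDED (first angle of the unfolded form)
}
```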
Based on the terminal structures shown in fig. 3, fig. 5A, and fig. 5B, an embodiment of the present application provides a display method, as shown in fig. 6, which may be applied to the terminal device 202 in the smart home control system shown in fig. 2.
Step 601, the terminal device displays a first interface. The first interface comprises icons of a plurality of different smart home devices.
Illustratively, as shown in fig. 7A, when the foldable mobile phone or tablet device detects an opening operation by a user on a smart home APP icon 701 on the interface 700, the mobile phone displays an interface 710 as shown in fig. 7B, where the interface 710 includes icons of 6 smart home devices connected by the user, respectively: a router in the master bedroom, a writing desk lamp in the study, a human body sensor in the study, a constant temperature and humidity device in the baby room, an air conditioner in the baby room, and a humidifier in the baby room.
In step 602, the processor 310 of the terminal device receives an opening operation of a user on an icon of the first smart home device and an icon of the second smart home device.
The at least two pieces of smart home equipment may include the first smart home device and the second smart home device. For example, the opening operation of the icons of the at least two smart home devices by the user may be a dragging operation. As shown in fig. 7B, the user drags the "writing desk lamp" icon 711 to the position of the "human body sensor" icon 712. After detecting the dragging operation, the touch sensor 380K in the terminal device reports dragging information of the user on the display screen 394 to the processor 310 (for example, the dragging information includes the touch position when the dragging starts and the touch position when the dragging stops), and the processor 310 determines, according to the dragging information, that the smart home devices to be opened are the writing desk lamp and the human body sensor.
For another example, in some embodiments, when the user performs a pressing operation on the interface 710, the terminal device displays the interface 720. As shown in fig. 7C, the interface 720 of the terminal device may include an opening control 721 for implementing the opening operation, and the icon of each smart home device in the interface 720 includes a control 722 for prompting the user to make a selection. When the user clicks the control 722, the control 722 may display a "√" symbol to indicate a selection.
As shown in fig. 7D, when the user has selected both the controls 722 on the "writing desk lamp" icon 711 and the "human body sensor" icon 712, the terminal device displays an interface 730. When the terminal device receives a click operation performed by the user on the opening control 721, in response to the click operation, the terminal device opens the control interface corresponding to the "writing desk lamp" icon 711 and the control interface corresponding to the "human body sensor" icon 712.
In a possible embodiment, the opening operation may also be another gesture or a voice instruction, and the like, which is not limited in this application embodiment.
Step 603, in response to the opening operation, the processor 310 of the terminal device controls the first display area of the display screen 394 of the terminal device to display the second interface, and controls the second display area of the display screen 394 to display the third interface.
The second interface comprises a functional component of the first intelligent household equipment, and the third interface comprises a functional component of the second intelligent household equipment.
It should be noted that the display screen (including the folding screen or the non-folding screen) of the terminal device may include a plurality of display areas that are preset in the terminal device, or may be manually set by the user. That is, the sizes (including width and height) of the first display region and the second display region may be previously configured in the foldable cellular phone. Alternatively, the widths and heights of the first display region and the second display region may be manually set by a user in the foldable cellular phone. In this embodiment, the size of the first display region and the size of the second display region may be the same or different.
Illustratively, in response to the opening operation, the terminal device displays an interface 740 as shown in fig. 7E. The interface 740 includes a control interface 741 of the smart home device, i.e., a human body sensor, and a control interface 742 of the smart home device, i.e., a writing desk lamp.
The control interface 741 of the smart home device, i.e. the human body sensor, includes a plurality of functional components, and illustratively, the functional component 7411 is used to display a detection state of the human body sensor, for example, the detection state is a movement of a person. The control interface 742 of the smart home device, i.e., the writing desk lamp, includes a plurality of functional components, for example, the functional component 7421 is used to control whether the writing desk lamp is turned on, the functional component 7422 is a control corresponding to a reading mode, the functional component 7423 is used to control a color of light, and the functional component 7424 is used to control a brightness of light.
In step 604, the processor 310 of the terminal device receives a configuration operation of the user, where the configuration operation is used to establish an association relationship between the functional component of the second interface and the functional component of the third interface.
Illustratively, the configuration operation may be a drag operation; for example, the user drags the functional component 7411 in the control interface 741 to the position of the functional component 7421 in the control interface 742, as shown in fig. 7F. In response to the dragging operation, the terminal device obtains that the attribute information of the functional component 7411 is a text control and the attribute information of the functional component 7421 is a switch control. The terminal device therefore takes "a person moves", the state corresponding to the functional component 7411, as the condition of an intelligent linkage scene, takes turning on the switch corresponding to the functional component 7421 as the task to be executed, and automatically creates a linkage rule between the human body sensor and the writing desk lamp, where the linkage rule is: when the human body sensor detects that a person moves, the writing desk lamp is automatically turned on.
It should be noted that, when a user operates a functional component in an interface, the terminal device may determine whether the functional component is a state functional component or a control functional component according to attribute information of the functional component operated by the user. The attribute information of a functional component may include a control type and control description content. In general, control types may include a text class (e.g., TextView), a switch class (e.g., Switch), a button class (e.g., Button), and a selection class (e.g., CheckBox). Generally, a functional component of the text class is a state functional component, and a functional component of the switch class, the button class, or the selection class is a control functional component. The control description content is a textual description of the control information, such as "low" 811 in the interface 810 in fig. 8B, or "turned on", "temperature" 822, and "mode" 823 in the interface 820 in fig. 8B. The terminal device can take a state functional component as the state or condition of a linkage rule, and a control functional component as the task of the linkage rule, to create the linkage rule.
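A rough sketch of the classification and rule-creation logic described above is given below; the control-type strings and the LinkageRule structure are illustrative assumptions rather than the actual data model of the smart home application.

```kotlin
// Illustrative data model; field names are assumptions for this sketch only.
data class FunctionalComponent(val controlType: String, val description: String)
data class LinkageRule(val condition: String, val task: String)

// Text-class components describe a state; switch/button/selection-class
// components control the device, as explained above.
fun isStateComponent(c: FunctionalComponent) = c.controlType == "TextView"
fun isControlComponent(c: FunctionalComponent) =
    c.controlType in setOf("Switch", "Button", "CheckBox")

// Dragging a state component onto a control component creates a linkage rule:
// the state becomes the condition, the control becomes the task to be executed.
fun createRule(dragged: FunctionalComponent, target: FunctionalComponent): LinkageRule? =
    if (isStateComponent(dragged) && isControlComponent(target))
        LinkageRule(condition = dragged.description, task = target.description)
    else null

fun main() {
    val detectionState = FunctionalComponent("TextView", "human body sensor: a person moves")
    val lampSwitch = FunctionalComponent("Switch", "writing desk lamp: turn on")
    println(createRule(detectionState, lampSwitch))
    // LinkageRule(condition=human body sensor: a person moves, task=writing desk lamp: turn on)
}
```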
Step 605, in response to the configuration operation, the terminal device generates control information related to the first smart home device and the second smart home device, and sends the control information to the server.
Illustratively, the control information generated by the terminal device may be: an indication that the writing desk lamp is turned on when a set condition is met, where the set condition is: the detection result of the human body sensor is that a person moves. The terminal device may send the control information to the server 200 shown in fig. 2, and after receiving the control information, the server 200 sends control instructions to the human body sensor and the writing desk lamp. In a case where the smart home control system is not provided with a server, the terminal device may send control instructions to the smart home devices directly according to the control information.
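The control information itself might be serialized along the following lines; the field names and the JSON layout are assumptions for illustration only, since the embodiment does not specify a wire format.

```kotlin
// Hypothetical wire format for the control information; field names are assumed.
data class ControlInfo(
    val conditionDevice: String,   // device whose state is the trigger
    val condition: String,         // set condition
    val taskDevice: String,        // device that executes the task
    val task: String               // task to be executed
)

fun toJson(info: ControlInfo): String =
    """{"conditionDevice":"${info.conditionDevice}","condition":"${info.condition}","taskDevice":"${info.taskDevice}","task":"${info.task}"}"""

fun main() {
    val info = ControlInfo(
        conditionDevice = "human body sensor",
        condition = "a person moves",
        taskDevice = "writing desk lamp",
        task = "turn on"
    )
    // In the embodiment this payload would be sent to the server 200; here it is only printed.
    println(toJson(info))
}
```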
The display method provided in the embodiments of the present application is described below with reference to the following application scenarios.
Suppose that the current season is winter and that the baby room of a user is provided with a temperature and humidity sensor, a constant temperature and humidity device, an air conditioner, a humidifier, and other devices. Since the constant temperature and humidity device can collect data such as temperature and humidity from the temperature and humidity sensor, the constant temperature and humidity device displays the current temperature and the desired temperature, as well as the current humidity and the desired humidity, so that the user can compare them and obtain a temperature state and a humidity state.
Intelligent linkage scene one
As shown in fig. 8A, the terminal device detects a drag operation by the user on the constant temperature and humidity device icon 801 and the air conditioner icon 802. In response to the drag operation, the processor 310 of the terminal device controls the first display area of the display screen 394 of the terminal device to display the control interface 810 of the constant temperature and humidity device, and controls the second display area of the display screen 394 to display the control interface 820 of the air conditioner, as shown in fig. 8B. Next, in a first step the terminal device detects a selection operation by the user for the "low" state in the temperature state functional component 811; in a second step the terminal device detects an opening operation by the user on the air conditioner switch functional component 821, a selection operation for the heating mode in the mode functional component 822, a drag operation adjusting the temperature functional component 823 to 20 degrees, a selection operation setting the wind speed functional component 824 to a low speed, and so on.
Further, when the user clicks the completion control 825, the terminal device may display an interface 830 as shown in fig. 8C, where the interface 830 is used to request the user to confirm whether to create the intelligent linkage scene. The condition of the intelligent linkage scene is: the temperature state of the constant temperature and humidity device is low; the task to be executed is: the air conditioner starts heating, with a heating temperature of 20 degrees and a low wind speed. When the user clicks the confirmation control 831, the creation of the intelligent linkage scene is completed and the scene is saved; otherwise, the intelligent linkage scene is not created.
Finally, when the user clicks the confirmation control 831, the terminal device may send the control information including the above intelligent linkage scene to the server 200 shown in fig. 2, and after receiving the control information, the server 200 sends a control instruction corresponding to the control information to the air conditioner. For example, when the temperature reported to the server by the constant temperature and humidity device is in the low state, the server 200 sends a heating instruction to the air conditioner, and when the temperature reported by the constant temperature and humidity device reaches 20 degrees, the server 200 controls the air conditioner to stop heating.
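On the server side, the linkage rule of scene one could be evaluated roughly as follows; the reporting interface and command names are assumptions of this sketch, while the "low" condition and the 20-degree stop point follow the paragraph above.

```kotlin
// Sketch of how a server might act on the scene-one rule; all names are assumed.
enum class AirConditionerCommand { START_HEATING, STOP_HEATING }

fun onTemperatureReport(temperatureState: String, currentTemperature: Double): AirConditionerCommand? =
    when {
        temperatureState == "low" -> AirConditionerCommand.START_HEATING  // condition of the scene
        currentTemperature >= 20.0 -> AirConditionerCommand.STOP_HEATING  // target temperature reached
        else -> null                                                      // no action needed
    }

fun main() {
    println(onTemperatureReport("low", 16.0))     // START_HEATING
    println(onTemperatureReport("normal", 20.5))  // STOP_HEATING
}
```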
Intelligent linkage scene two
As shown in fig. 8D, the terminal device detects a pinch gesture by the user on the constant temperature and humidity device icon 801 and the humidifier icon 803. In response to the gesture, the processor 310 of the terminal device controls the first display area of the display screen 394 of the terminal device to display the control interface 810 of the constant temperature and humidity device, and controls the second display area of the display screen 394 to display the control interface 840 of the humidifier, as shown in fig. 8E. Next, in a first step the terminal device detects a selection operation by the user for the "low" state in the humidity state functional component 812; in a second step the terminal device detects an opening operation by the user on the humidifier switch functional component 841, a selection operation for the standard mode in the mode functional component 842, an operation adjusting the timing functional component 823 to 30 minutes, and so on.
Further, when the user clicks the completion control 845, the terminal device may display an interface 850 as shown in fig. 8F, where the interface 850 is used to request the user to confirm whether to create the intelligent linkage scene. The condition of the intelligent linkage scene is: the humidity state of the constant temperature and humidity device is low; the task to be executed is: the humidifier starts working in standard mode with a target humidity of 51%, and is automatically turned off after 30 minutes. When the user clicks the confirmation control 851, the creation of the intelligent linkage scene is completed and the scene is saved; otherwise, the intelligent linkage scene is not created.
Finally, when the user clicks the confirmation control 851, the terminal device may send the control information including the above-mentioned intelligent linkage scene information to the server 200 shown in fig. 2, and after receiving the control information, the server 200 sends a control instruction corresponding to the control information to the humidifier.
Intelligent linkage scene three
As shown in fig. 8G, when the user performs a pressing operation on the interface 800, the terminal device displays an opening control 804, and each smart home device icon in the interface 800 includes a control 805 for prompting the user to make a selection. After the user selects the controls 805 on the icons of the constant temperature and humidity device, the air conditioner, and the humidifier, and the terminal device receives a click operation performed by the user on the opening control 804, the terminal device displays, in response to the click operation, the control interfaces corresponding to the three pieces of smart home equipment, as shown in fig. 8H. After the user performs the configuration operations shown in fig. 8B and fig. 8E on the interfaces shown in fig. 8H, the interface shown in fig. 8I is displayed. When the user clicks the completion control 845 of the control interface 840 in fig. 8I, the terminal device may display a prompt box 860 shown in fig. 8J, where the prompt box 860 is used to request the user to confirm whether to create an intelligent linkage scene. The conditions of the intelligent linkage scene are: the temperature state of the constant temperature and humidity device is low, and the humidity state of the constant temperature and humidity device is low; the tasks to be executed are: the air conditioner starts heating with a heating temperature of 20 degrees and a low wind speed, and the humidifier starts working in standard mode with a target humidity of 51% and is automatically turned off after 30 minutes. When the user clicks the confirmation control 861, the creation of the intelligent linkage scene is completed and the scene is saved; otherwise, the intelligent linkage scene is not created.
Finally, when the user clicks the confirmation control 861, the terminal device may send the control information including the above-described intelligent linkage scene information to the server 200 shown in fig. 2, and after receiving the control information, the server 200 sends control instructions corresponding to the control information to the air conditioner and the humidifier, respectively.
In a possible embodiment, the terminal device 202 may first obtain, from the server 200, configuration information including association rules between functional components of different smart home devices. The terminal device receives a confirmation operation by the user on a first functional component of the second interface, where the first functional component corresponds to a first state supported by the first smart home device. Then, the terminal device determines, according to the configuration information, at least one second functional component of the second smart home device corresponding to the first functional component, and switches from displaying the third interface to displaying a fourth interface, where the fourth interface includes the at least one second functional component of the second smart home device, and the second functional component corresponds to a control function supported by the second smart home device in the first state.
Exemplarily, as shown in (a) in fig. 9A, it is assumed that the content displayed on the display screen of the terminal device includes a control interface 901 of the first smart home device and a control interface 902 of the second smart home device, where the control interface 901 includes the following functional components: state 1, state 2, control 1, control 2, control 5, and control 6, and the control interface 902 includes the following functional components: state 3, state 4, control 3, control 4, control 7, and control 8. States 1 to 4 are state functional components, and controls 1 to 8 are control functional components. When the user selects a state functional component of the first smart home device as a condition, the terminal device automatically recommends, according to the configuration information acquired from the server, at least one control functional component of the second smart home device as an action to be executed. For example, the configuration information acquired by the terminal device from the server may be as shown in Table 1.
TABLE 1
Condition | Task to be executed
State 1   | Control 3, control 4, and control 8
State 2   | Control 3 and control 4
As can be seen from Table 1, when the user selects the state functional component corresponding to state 1 of the first smart home device as the condition, the terminal device automatically recommends the control functional components corresponding to control 3, control 4, and control 8 of the second smart home device as the actions to be executed. As shown in (b) of fig. 9A, when the user selects state 1 in the interface 901 corresponding to the first smart home device as the condition, the second smart home device switches from the interface 902 to the interface 903, and the interface 903 includes control 3, control 4, and control 8 corresponding to state 1. In (b) of fig. 9A, the terminal device grays out (i.e., makes non-editable) the functional components of the interface 902 other than control 3, control 4, and control 8. In another possible implementation, the terminal device may instead set the functional components of the interface 902 other than control 3, control 4, and control 8 to an invisible state (e.g., hide them). Of course, other processing modes are also possible, and the embodiments of the present application are not limited thereto.
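The recommendation step can be read as a simple lookup in the configuration of Table 1; the sketch below mirrors that table and treats the greying-out of the remaining controls as a filter, which is an assumption of this example.

```kotlin
// Association rules mirroring Table 1; keys are the selected states of the first
// device, values are the recommended controls of the second device.
val configuration = mapOf(
    "State 1" to setOf("Control 3", "Control 4", "Control 8"),
    "State 2" to setOf("Control 3", "Control 4")
)

// All control functional components shown on the second device's interface 902.
val allControls = listOf("Control 3", "Control 4", "Control 7", "Control 8")

fun recommendedControls(selectedState: String): Set<String> =
    configuration[selectedState] ?: emptySet()

fun main() {
    val recommended = recommendedControls("State 1")
    val greyedOut = allControls.filterNot { it in recommended }  // non-editable or hidden
    println("recommended: $recommended")   // [Control 3, Control 4, Control 8]
    println("greyed out:  $greyedOut")     // [Control 7]
}
```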
Further, as shown in (c) in fig. 9B, when the user selects control 3, control 4, and control 8 in the interface 903 as the tasks to be executed under the condition, the terminal device may generate an interface 804 as shown in (d) in fig. 9B. The interface 804 is used to request the user to confirm whether to create the intelligent linkage scene, where the condition of the intelligent linkage scene is: the state of the first smart home device is state 1; and the task to be executed is: the second smart home device executes control 3, control 4, and control 8. If the user clicks confirm, the creation of the intelligent linkage scene is completed; otherwise, the scene is not saved.
For another example, the terminal device may obtain, from the server 200 in advance, configuration information including a linkage rule between the constant temperature and humidity device and the air conditioner; for example, the configuration information may be as shown in Table 2.
TABLE 2
Condition | Task to be executed
The temperature is low, i.e. lower than the desired temperature of 20 degrees | The air conditioner is turned on and is in the heating mode
The temperature is high, i.e. higher than the desired temperature of 28 degrees | The air conditioner is turned on and is in the cooling mode
As can be seen from Table 2, when the temperature is lower than the desired temperature of 20 degrees, the air conditioner is turned on and performs the heating mode. The terminal device may determine, based on the configuration information, the functional components of the air conditioner corresponding to the "low" state selected in fig. 8B; therefore, in an alternative embodiment, the cooling mode shown in fig. 8B may be grayed out or set to an invisible state.
In a possible embodiment, if the display screen of the terminal device is a foldable screen, then when the foldable screen is completely unfolded, the terminal device may control the functional components that were hidden from the control interface in the folded state to be displayed on the second screen of the foldable screen.
Specifically, as shown in fig. 10A: Step 1001, assume that, in the folded state, the terminal device receives an unfolding operation by the user on the folding screen. Step 1002, the terminal device obtains display information of the current control interface of the third smart home device. Step 1003, the terminal device determines whether the control interface of the third smart home device includes hidden functional components; if so, step 1004 is executed, otherwise step 1005 is executed. Step 1004, the terminal device controls the first display area of the foldable screen to display the control interface and controls the second display area of the foldable screen to display the hidden functional components. Step 1005, the terminal device controls the foldable screen to display the control interface in full screen.
For example, as shown in (a) in fig. 10B, after the user confirms the creation of the intelligent linkage scene, the user closes the control interface of the constant temperature and humidity device and the control interface of the humidifier in the interface 1100, and then performs a folding operation on the folding screen of the terminal device to fold it to the fully folded state. In the fully folded state, the display screen of the terminal device displays the first interface of the smart home application, where the first interface includes icons of a plurality of different smart home devices. When the terminal device detects a click operation by the user on the router icon control 1201, in response to the operation the terminal device displays, in the folded state, the control interface 1300 shown in (b) of fig. 10B, where a "view all" control 1301 is displayed in the interface 1300 (the user needs to click the "view all" control 1301 to view the hidden functional components). When the user unfolds the foldable screen, the terminal device controls the first screen of the foldable screen to display the displayed functional components of the control interface 1300 (except the "view all" control 1301), and controls the second screen of the foldable screen to display the hidden functional components, as shown in fig. 10C; the user then does not need to click the "view all" control 1301 to display the hidden functional components.
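Steps 1001 to 1005 amount to the following branch; the ControlInterface structure and the example component names are illustrative assumptions.

```kotlin
// Illustrative model of a control interface with possibly hidden components.
data class ControlInterface(
    val visibleComponents: List<String>,
    val hiddenComponents: List<String>
)

// Steps 1001-1005: on unfolding, hidden components (if any) move to the second screen;
// otherwise the interface is shown in full screen.
fun onUnfold(ui: ControlInterface): Pair<List<String>, List<String>> =
    if (ui.hiddenComponents.isNotEmpty())
        ui.visibleComponents to ui.hiddenComponents             // first screen / second screen
    else
        (ui.visibleComponents + ui.hiddenComponents) to emptyList()  // full screen

fun main() {
    val router = ControlInterface(
        visibleComponents = listOf("switch", "network speed"),
        hiddenComponents = listOf("guest Wi-Fi", "parental control")
    )
    val (firstScreen, secondScreen) = onUnfold(router)
    println("first screen: $firstScreen")
    println("second screen: $secondScreen")
}
```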
In another possible embodiment, if the display screen of the terminal device is a foldable screen, then when the terminal device is converted from the fully unfolded state to the folded state, the terminal device may control the functional components that were displayed in the second display area of the foldable screen to be hidden when the foldable screen is in the folded state.
Specifically, as shown in fig. 10D: Step 1401, assume that, in the unfolded state, the terminal device receives a folding operation by the user on the folding screen. Step 1402, the terminal device obtains the current interface display information. Step 1403, the terminal device determines whether the control interface of the third smart home device on the second screen is a supplementary interface of the control interface on the first screen; if so, step 1404 is executed, otherwise step 1405 is executed. Step 1404, the terminal device controls the foldable screen to display, in the folded state, a control interface of the third smart home device in which some functional components are hidden and which includes a control that can trigger the display of the hidden functional components (i.e., the "view all" control 1301). Step 1405, the terminal device controls the foldable screen to display, in the folded state, the control interface of the third smart home device.
Exemplarily, assuming that the terminal device is in the fully unfolded state shown in fig. 10C and the functional components displayed on the second screen form a supplementary interface of the first screen, then when the folding screen of the terminal device is folded, the interface shown in (c) of fig. 10B is displayed.
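The reverse path of steps 1401 to 1405 can be sketched in the same way: if the second screen was showing a supplementary interface, its components are hidden behind the "view all" control when the screen is folded; the FoldedInterface structure below is an assumption of this example.

```kotlin
// Sketch of steps 1401-1405; the FoldedInterface shape is an assumption of this example.
data class FoldedInterface(
    val visibleComponents: List<String>,
    val hiddenComponents: List<String>,
    val showsViewAllControl: Boolean
)

fun onFold(firstScreen: List<String>, secondScreen: List<String>, isSupplementary: Boolean): FoldedInterface =
    if (isSupplementary)
        // Hide the supplementary components; a "view all" control can reveal them again.
        FoldedInterface(firstScreen, secondScreen, showsViewAllControl = true)
    else
        FoldedInterface(firstScreen, emptyList(), showsViewAllControl = false)

fun main() {
    println(onFold(listOf("switch", "network speed"), listOf("guest Wi-Fi"), isSupplementary = true))
}
```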
In the above embodiments, when the folding screen is completely unfolded, all functional components of a single smart home device can be displayed, which facilitates user operation, makes full use of the screen display area of the folding screen, and can improve the user experience to a certain extent.
In other embodiments of the present application, a terminal device is disclosed; as shown in fig. 11, the terminal device may include: a touch screen 1101, where the touch screen 1101 includes a touch panel 1107 and a display screen 1108; one or more processors 1102; a memory 1103; one or more application programs (not shown); one or more computer programs 1104; and a sensor 1105, where the above devices may be connected by one or more communication buses 1106. The one or more computer programs 1104 are stored in the memory 1103 and configured to be executed by the one or more processors 1102, and the one or more computer programs 1104 include instructions that may be used to perform the steps in the embodiment corresponding to fig. 6.
The embodiment of the present application further provides a computer storage medium, where computer instructions are stored in the computer storage medium, and when the computer instructions run on the terminal device, the terminal device is enabled to execute the relevant method steps to implement the display method in the foregoing embodiments.
The embodiment of the present application further provides a computer program product, which, when run on a computer, causes the computer to execute the above related steps, so as to implement the display method in the foregoing embodiments.
In addition, embodiments of the present application also provide an apparatus, which may specifically be a chip, a component, or a module, and may include a processor and a memory connected to each other; the memory is used for storing computer-executable instructions, and when the apparatus runs, the processor can execute the computer-executable instructions stored in the memory, so that the chip executes the display method in the above method embodiments.
The terminal device, the computer storage medium, the computer program product, or the chip provided in the embodiments of the present application are all configured to execute the corresponding method provided above; therefore, for the beneficial effects achieved, reference may be made to the beneficial effects of the corresponding method provided above, which are not described herein again.
Through the description of the above embodiments, those skilled in the art will understand that, for convenience and simplicity of description, only the division of the above functional modules is used as an example, and in practical applications, the above function distribution may be completed by different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a module or a unit is merely one type of logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another apparatus, or some features may be discarded or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, essentially or in the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product, where the software product is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (20)

1. A display method is applied to terminal equipment in an intelligent home control system, the intelligent home control system comprises a server, the terminal equipment and at least two pieces of intelligent home equipment, wherein the terminal equipment and the at least two pieces of intelligent home equipment are in network connection with the server, the at least two pieces of intelligent home equipment comprise first intelligent home equipment and second intelligent home equipment, and the display method is characterized by comprising the following steps:
displaying a first interface, wherein the first interface comprises icons of the at least two pieces of intelligent household equipment, and the icons of the at least two pieces of intelligent household equipment comprise an icon of the first intelligent household equipment and an icon of the second intelligent household equipment;
receiving opening operation of a user on the icon of the first intelligent household equipment and the icon of the second intelligent household equipment;
responding to the opening operation, a first display area of a display screen of the terminal equipment displays a second interface, a second display area of the display screen displays a third interface, the second interface comprises a functional component of the first intelligent household equipment, and the third interface comprises a functional component of the second intelligent household equipment;
receiving configuration operation of a user, wherein the configuration operation is used for establishing an association relationship between the functional components of the second interface and the functional components of the third interface;
responding to the configuration operation, the terminal equipment generates control information related to the first intelligent household equipment and the second intelligent household equipment, and sends the control information to the server.
2. The method of claim 1, wherein the receiving a configuration operation of a user comprises:
receiving a confirmation operation of a user on a first functional component of the second interface, wherein the first functional component corresponds to a first state supported by the first intelligent household equipment;
the method further comprises the following steps:
responding to the confirmation operation, and switching the third interface displayed by the display screen to a fourth interface; the fourth interface includes at least one second functional component of the second smart home device, where the second functional component corresponds to a control function supported by the second smart home device in the first state, and a functional component corresponding to the control function that is not supported by the second smart home device in the first state is hidden or set to an inoperable state;
and receiving confirmation operation of the user on the at least one second functional component.
3. The method of claim 2, further comprising: acquiring configuration information from the server, wherein the configuration information comprises association rules among functional components of different intelligent household equipment;
after receiving the confirmation operation of the user on the first functional component of the second interface, the method further comprises the following steps:
and determining at least one second functional component of the second intelligent household equipment corresponding to the first functional component according to the configuration information.
4. The method of claim 1, wherein the display screen is a folded screen that is foldable to form at least two screens, the at least two screens including a first screen and a second screen;
the responding to the opening operation, the first display area of the display screen of the terminal equipment displays a second interface, and the second display area of the display screen displays a third interface, and the method comprises the following steps:
and when the folding screen is in the unfolding state, responding to the opening operation, controlling the first screen of the folding screen of the terminal equipment to display the second interface, and controlling the second screen of the folding screen to display the third interface.
5. The method of claim 4, wherein after the terminal device sends the control information to the server, the method further comprises:
closing the second interface and the third interface, and displaying the first interface when the folding screen is in a folding state;
receiving a second operation of the user on an icon of a third smart home device in the first interface;
responding to the second operation, controlling the first screen of the folding screen to display a fifth interface of the third smart home device, wherein the fifth interface comprises displayed functional components and a control used for triggering display of hidden functional components;
in response to the foldable screen being converted from the folded state to an unfolded state, controlling a first screen of the foldable screen to display the displayed functional component and controlling a second screen of the foldable screen to display the hidden functional component;
wherein the folded state is a state in which an included angle between the first screen and the second screen is less than or equal to a preset angle threshold, and the unfolded state is a state in which the included angle between the first screen and the second screen is greater than the preset angle threshold.
6. The method according to any one of claims 1 to 5, characterized in that the first display area and the second display area are pre-configured in the terminal device;
or the first display area and the second display area are set in the terminal device by a user.
7. A terminal device, wherein the terminal device is connected with a server and at least two intelligent household devices in an intelligent household control system through a network, the at least two intelligent household devices comprise a first intelligent household device and a second intelligent household device, and the terminal device is characterized by comprising a display screen, a processor, and a memory;
the memory stores program instructions;
the processor is configured to execute the program instructions stored in the memory, so that the terminal device executes:
displaying a first interface, wherein the first interface comprises icons of the at least two pieces of intelligent household equipment, and the icons of the at least two pieces of intelligent household equipment comprise an icon of the first intelligent household equipment and an icon of the second intelligent household equipment;
receiving an opening operation of a user on an icon of the first intelligent household equipment and an icon of the second intelligent household equipment;
responding to the opening operation, a first display area of a display screen of the terminal equipment displays a second interface, a second display area of the display screen displays a third interface, the second interface comprises a functional component of the first intelligent household equipment, and the third interface comprises a functional component of the second intelligent household equipment;
receiving configuration operation of a user, wherein the configuration operation is used for establishing an association relationship between the functional components of the second interface and the functional components of the third interface;
responding to the configuration operation, the terminal equipment generates control information related to the first intelligent household equipment and the second intelligent household equipment, and sends the control information to the server.
8. The terminal device of claim 7, wherein the processor is configured to execute the program instructions stored in the memory, so that the terminal device specifically performs:
receiving a confirmation operation of a user on a first functional component of the second interface, wherein the first functional component corresponds to a first state supported by the first intelligent household equipment;
responding to the confirmation operation, and switching the third interface displayed by the display screen to a fourth interface; the fourth interface includes at least one second functional component of the second smart home device, where the second functional component corresponds to a control function supported by the second smart home device in the first state, and a functional component corresponding to the control function that is not supported by the second smart home device in the first state is hidden or set to an inoperable state;
and receiving confirmation operation of the user on the at least one second functional component.
9. The terminal device of claim 8, wherein the processor is configured to execute the program instructions stored in the memory, so that the terminal device specifically performs:
acquiring configuration information from the server, wherein the configuration information comprises association rules among functional components of different intelligent household equipment;
after the terminal device receives the confirmation operation of the user on the first functional component of the second interface, the terminal device further specifically executes:
and determining at least one second functional component of the second intelligent household equipment corresponding to the first functional component according to the configuration information.
10. The terminal device of claim 7, wherein the display screen is a foldable screen, the foldable screen being foldable to form at least two screens, the at least two screens including a first screen and a second screen;
the processor is configured to run the program instructions stored in the memory, so that the terminal device specifically executes:
and when the folding screen is in a fully unfolded state, responding to the opening operation, controlling the first screen of the folding screen of the terminal equipment to display the second interface, and controlling the second screen of the folding screen to display the third interface.
11. The terminal device of claim 10, wherein the processor is configured to execute the program instructions stored in the memory and further cause the terminal device to perform:
closing the second interface and the third interface, and displaying the first interface when the screen is in a folded state;
receiving a second operation of the user on an icon of a third smart home device in the first interface;
responding to the second operation, controlling the first screen of the folding screen to display a fifth interface of the third smart home device, wherein the fifth interface comprises displayed functional components and a control used for triggering display of hidden functional components;
when the folding screen is converted from the folding state to the unfolding state, controlling a first screen of the folding screen to display the displayed functional components and controlling a second screen of the folding screen to display the hidden functional components;
wherein the folded state is a state in which an included angle between the first screen and the second screen is less than or equal to a preset angle threshold, and the unfolded state is a state in which the included angle between the first screen and the second screen is greater than the preset angle threshold.
12. The terminal device according to any one of claims 7 to 11, wherein the first display region and the second display region are pre-configured in the terminal device;
or the first display area and the second display area are set in the terminal device by a user.
13. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises program instructions which, when run on a terminal device, cause the terminal device to carry out the method according to any one of claims 1 to 6.
14. A chip, wherein the chip is coupled to a memory for executing a computer program stored in the memory to perform the method of any of claims 1 to 6.
15. A display method is applied to terminal equipment in an intelligent household control system, the intelligent household control system comprises the terminal equipment and at least two pieces of intelligent household equipment, and the at least two pieces of intelligent household equipment comprise first intelligent household equipment and second intelligent household equipment, and is characterized by comprising the following steps:
displaying a first interface, wherein the first interface comprises icons of the at least two pieces of intelligent household equipment, and the icons of the at least two pieces of intelligent household equipment comprise an icon of the first intelligent household equipment and an icon of the second intelligent household equipment;
receiving opening operation of a user on the icon of the first intelligent household equipment and the icon of the second intelligent household equipment;
responding to the opening operation, a first display area of a display screen of the terminal equipment displays a second interface, a second display area of the display screen displays a third interface, the second interface comprises a functional component of the first intelligent household equipment, and the third interface comprises a functional component of the second intelligent household equipment;
receiving configuration operation of a user, wherein the configuration operation is used for establishing an association relationship between the functional components of the second interface and the functional components of the third interface;
and responding to the configuration operation, and the terminal equipment generates control information related to the first intelligent household equipment and the second intelligent household equipment.
16. The method of claim 15, wherein receiving the configuration operation of the user comprises:
receiving a confirmation operation of a user on a first functional component of the second interface, wherein the first functional component corresponds to a first state supported by the first intelligent household equipment;
the method further comprises the following steps:
responding to the confirmation operation, and switching the third interface displayed by the display screen to a fourth interface; the fourth interface includes at least one second functional component of the second smart home device, where the second functional component corresponds to a control function supported by the second smart home device in the first state, and a functional component corresponding to the control function that is not supported by the second smart home device in the first state is hidden or set to an inoperable state;
and receiving confirmation operation of the user on the at least one second functional component.
17. The method of claim 16, further comprising: acquiring configuration information from the server, wherein the configuration information comprises association rules among functional components of different intelligent household equipment;
after receiving the confirmation operation of the user on the first functional component of the second interface, the method further comprises the following steps:
and determining at least one second functional component of the second intelligent household equipment corresponding to the first functional component according to the configuration information.
18. The method of claim 15, wherein the display screen is a folded screen that is foldable to form at least two screens, the at least two screens including a first screen and a second screen;
the responding to the opening operation, the first display area of the display screen of the terminal equipment displays a second interface, and the second display area of the display screen displays a third interface, and the method comprises the following steps:
and when the folding screen is in the unfolding state, responding to the opening operation, controlling the first screen of the folding screen of the terminal equipment to display the second interface, and controlling the second screen of the folding screen to display the third interface.
19. A terminal device is arranged in an intelligent home control system, the intelligent home control system comprises at least two intelligent home devices for establishing network connection, the at least two intelligent home devices comprise a first intelligent home device and a second intelligent home device, and the terminal device is characterized by comprising a display screen, a processor and a memory;
the memory stores program instructions;
the processor is configured to execute the program instructions stored by the memory to cause the terminal device to perform the method of any one of claims 15-18.
20. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises program instructions which, when run on a terminal device, cause the terminal device to carry out the method according to any one of claims 15 to 18.
CN201910818118.8A 2019-08-30 2019-08-30 Display method and terminal equipment Active CN110688179B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910818118.8A CN110688179B (en) 2019-08-30 2019-08-30 Display method and terminal equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910818118.8A CN110688179B (en) 2019-08-30 2019-08-30 Display method and terminal equipment

Publications (2)

Publication Number Publication Date
CN110688179A true CN110688179A (en) 2020-01-14
CN110688179B CN110688179B (en) 2021-02-12

Family

ID=69108691

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910818118.8A Active CN110688179B (en) 2019-08-30 2019-08-30 Display method and terminal equipment

Country Status (1)

Country Link
CN (1) CN110688179B (en)

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111399789A (en) * 2020-02-20 2020-07-10 华为技术有限公司 Interface layout method, device and system
CN111599148A (en) * 2020-03-06 2020-08-28 维沃移动通信有限公司 Electronic equipment and connection method thereof
CN111880421A (en) * 2020-07-10 2020-11-03 珠海格力电器股份有限公司 Linkage control method and system of household electrical appliance, storage medium and electronic equipment
CN111880653A (en) * 2020-07-21 2020-11-03 珠海格力电器股份有限公司 Equipment linkage scene establishing method and device, electronic equipment and storage medium
CN112214126A (en) * 2020-09-23 2021-01-12 杭州鸿雁电器有限公司 Operation panel and display method and device thereof
CN112306364A (en) * 2020-11-19 2021-02-02 Oppo广东移动通信有限公司 IoT (Internet of things) equipment control method and device, terminal and storage medium
CN113625908A (en) * 2021-07-26 2021-11-09 珠海格力电器股份有限公司 Interface display method, device and equipment of application program and storage medium
CN113791546A (en) * 2021-09-01 2021-12-14 珠海格力电器股份有限公司 Intelligent device control method, device, equipment and storage medium
CN113992725A (en) * 2021-09-13 2022-01-28 珠海格力电器股份有限公司 Equipment control method, device, storage medium and equipment
CN113992791A (en) * 2021-09-17 2022-01-28 珠海格力电器股份有限公司 Operation processing method and device, storage medium and electronic equipment
CN114398016A (en) * 2022-01-12 2022-04-26 金华鸿正科技有限公司 Interface display method and device
WO2022250309A1 (en) * 2021-05-26 2022-12-01 삼성전자 주식회사 Electronic device comprising expandable display, and content provision method
WO2023036082A1 (en) * 2021-09-09 2023-03-16 华为技术有限公司 System and method for displaying and controlling remote device task
CN116719494A (en) * 2022-09-27 2023-09-08 荣耀终端有限公司 Multi-service display method, electronic device and storage medium
CN116743908A (en) * 2022-09-13 2023-09-12 荣耀终端有限公司 Wallpaper display method and related device
WO2024012413A1 (en) * 2022-07-11 2024-01-18 华为技术有限公司 Display control method, electronic device and computer-readable storage medium
US11922842B2 (en) 2021-05-26 2024-03-05 Samsung Electronics Co., Ltd. Electronic device having extendable display and method for providing content thereof

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1852195A (en) * 2005-10-28 2006-10-25 华为技术有限公司 Method for remote control of domestic network apparatus
CN104503688A (en) * 2014-12-31 2015-04-08 小米科技有限责任公司 Intelligent hardware device control achieving method and device
CN105785779A (en) * 2016-03-17 2016-07-20 珠海格力电器股份有限公司 Smart home control method and device
WO2016179129A1 (en) * 2015-05-07 2016-11-10 Microsoft Technology Licensing, Llc Linking screens and content in a user interface
CN106936675A (en) * 2017-03-27 2017-07-07 欧普照明股份有限公司 A kind of apparatus bound system
CN107682236A (en) * 2017-08-28 2018-02-09 深圳广田智能科技有限公司 Smart home interactive system and method based on computer picture recognition
CN107705171A (en) * 2017-08-31 2018-02-16 北京小米移动软件有限公司 Method for information display, device and terminal
CN107733750A (en) * 2017-09-25 2018-02-23 珠海市领创智能物联网研究院有限公司 A kind of method for triggering smart home person's electric shaft
CN108241988A (en) * 2016-12-26 2018-07-03 北京奇虎科技有限公司 Multi-page linkage media display methods, device and intelligent terminal
CN108319151A (en) * 2018-02-09 2018-07-24 广东美的制冷设备有限公司 Control method, device, system, mobile terminal and the storage medium of household appliance
US10055094B2 (en) * 2014-10-29 2018-08-21 Xiaomi Inc. Method and apparatus for dynamically displaying device list
KR20180096242A (en) * 2017-02-21 2018-08-29 삼성전자주식회사 Mehtod and apparatus for indicaiting direction of best beam
CN109212995A (en) * 2018-11-02 2019-01-15 珠海格力电器股份有限公司 A kind of quick control method of smart home system, system and electric appliance

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1852195A (en) * 2005-10-28 2006-10-25 华为技术有限公司 Method for remote control of domestic network apparatus
US10055094B2 (en) * 2014-10-29 2018-08-21 Xiaomi Inc. Method and apparatus for dynamically displaying device list
CN104503688A (en) * 2014-12-31 2015-04-08 小米科技有限责任公司 Intelligent hardware device control achieving method and device
WO2016179129A1 (en) * 2015-05-07 2016-11-10 Microsoft Technology Licensing, Llc Linking screens and content in a user interface
CN105785779A (en) * 2016-03-17 2016-07-20 珠海格力电器股份有限公司 Smart home control method and device
CN108241988A (en) * 2016-12-26 2018-07-03 北京奇虎科技有限公司 Multi-page linkage media display methods, device and intelligent terminal
KR20180096242A (en) * 2017-02-21 2018-08-29 삼성전자주식회사 Mehtod and apparatus for indicaiting direction of best beam
CN106936675A (en) * 2017-03-27 2017-07-07 欧普照明股份有限公司 A kind of apparatus bound system
CN107682236A (en) * 2017-08-28 2018-02-09 深圳广田智能科技有限公司 Smart home interactive system and method based on computer picture recognition
CN107705171A (en) * 2017-08-31 2018-02-16 北京小米移动软件有限公司 Method for information display, device and terminal
CN107733750A (en) * 2017-09-25 2018-02-23 珠海市领创智能物联网研究院有限公司 A kind of method for triggering smart home person's electric shaft
CN108319151A (en) * 2018-02-09 2018-07-24 广东美的制冷设备有限公司 Control method, device, system, mobile terminal and the storage medium of household appliance
CN109212995A (en) * 2018-11-02 2019-01-15 珠海格力电器股份有限公司 A kind of quick control method of smart home system, system and electric appliance

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111399789A (en) * 2020-02-20 2020-07-10 华为技术有限公司 Interface layout method, device and system
CN111599148A (en) * 2020-03-06 2020-08-28 维沃移动通信有限公司 Electronic equipment and connection method thereof
CN111880421A (en) * 2020-07-10 2020-11-03 珠海格力电器股份有限公司 Linkage control method and system of household electrical appliance, storage medium and electronic equipment
CN111880653B (en) * 2020-07-21 2021-09-14 珠海格力电器股份有限公司 Equipment linkage scene establishing method and device, electronic equipment and storage medium
WO2022017066A1 (en) * 2020-07-21 2022-01-27 格力电器(武汉)有限公司 Device linkage scene establishment method and apparatus, electronic device, and storage medium
CN111880653A (en) * 2020-07-21 2020-11-03 珠海格力电器股份有限公司 Equipment linkage scene establishing method and device, electronic equipment and storage medium
CN112214126A (en) * 2020-09-23 2021-01-12 杭州鸿雁电器有限公司 Operation panel and display method and device thereof
CN112306364A (en) * 2020-11-19 2021-02-02 Oppo广东移动通信有限公司 IoT (Internet of things) equipment control method and device, terminal and storage medium
WO2022250309A1 (en) * 2021-05-26 2022-12-01 삼성전자 주식회사 Electronic device comprising expandable display, and content provision method
US11922842B2 (en) 2021-05-26 2024-03-05 Samsung Electronics Co., Ltd. Electronic device having extendable display and method for providing content thereof
CN113625908A (en) * 2021-07-26 2021-11-09 珠海格力电器股份有限公司 Interface display method, device and equipment of application program and storage medium
CN113791546A (en) * 2021-09-01 2021-12-14 珠海格力电器股份有限公司 Intelligent device control method, device, equipment and storage medium
WO2023036082A1 (en) * 2021-09-09 2023-03-16 华为技术有限公司 System and method for displaying and controlling remote device task
CN113992725A (en) * 2021-09-13 2022-01-28 珠海格力电器股份有限公司 Equipment control method, device, storage medium and equipment
CN113992791A (en) * 2021-09-17 2022-01-28 珠海格力电器股份有限公司 Operation processing method and device, storage medium and electronic equipment
CN114398016A (en) * 2022-01-12 2022-04-26 金华鸿正科技有限公司 Interface display method and device
WO2024012413A1 (en) * 2022-07-11 2024-01-18 华为技术有限公司 Display control method, electronic device and computer-readable storage medium
CN116743908A (en) * 2022-09-13 2023-09-12 荣耀终端有限公司 Wallpaper display method and related device
CN116743908B (en) * 2022-09-13 2024-03-26 荣耀终端有限公司 Wallpaper display method and related device
CN116719494A (en) * 2022-09-27 2023-09-08 荣耀终端有限公司 Multi-service display method, electronic device and storage medium

Also Published As

Publication number Publication date
CN110688179B (en) 2021-02-12

Similar Documents

Publication Title
CN110688179B (en) Display method and terminal equipment
CN110661917B (en) Display method and electronic equipment
CN111949345B (en) Application display method and electronic equipment
CN110839096B (en) Touch method of equipment with folding screen and folding screen equipment
CN110727382A (en) Split-screen display method and electronic equipment
US20220360654A1 (en) Touchscreen Display Method and Electronic Device
WO2021052279A1 (en) Foldable screen display method and electronic device
CN108089891B (en) Application program starting method and mobile terminal
AU2017291584B2 (en) Method for recognizing iris based on user intention and electronic device for the same
WO2021169399A1 (en) Method for caching application interface, and electronic apparatus
CN112286618A (en) Device cooperation method, device, system, electronic device and storage medium
WO2019149028A1 (en) Application download method and terminal
CN109819168B (en) Camera starting method and mobile terminal
CN110673783B (en) Touch control method and electronic equipment
CN110427165B (en) Icon display method and mobile terminal
CN108287655A (en) A kind of interface display method, interface display apparatus and mobile terminal
US20220327190A1 (en) Screen Display Control Method and Electronic Device
CN109688341A (en) A kind of method for polishing and terminal device
CN109918014B (en) Page display method, wearable device and computer-readable storage medium
CN110071866A (en) A kind of instant messaging application control method, wearable device and storage medium
CN109933400A (en) Display interface layout method, wearable device and computer readable storage medium
CN114500732B (en) Interface display method, electronic equipment and storage medium
WO2020228735A1 (en) Method for displaying application, and electronic device
CN110740263B (en) Image processing method and terminal equipment
CN114449686A (en) Wireless network access method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant