CN114885031A - Interactive processing method and device based on equipment control and electronic equipment - Google Patents


Info

Publication number
CN114885031A
CN114885031A
Authority
CN
China
Prior art keywords
target
equipment
task
page
displaying
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210278189.5A
Other languages
Chinese (zh)
Inventor
汤孝义
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lumi United Technology Co Ltd
Original Assignee
Lumi United Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lumi United Technology Co Ltd filed Critical Lumi United Technology Co Ltd
Priority to CN202210278189.5A priority Critical patent/CN114885031A/en
Publication of CN114885031A publication Critical patent/CN114885031A/en
Pending legal-status Critical Current

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a device-control-based interaction processing method and apparatus, and an electronic device. The method displays, in response to a control operation directed at a target device, a device page corresponding to the target device, and displays in the device page, based on device state information, a task execution picture of the target device executing a target task, where the control operation is used to instruct the target device to execute the target task and to feed back the device state information while the target task is being executed. In this way, the user can intuitively and effectively learn the execution state of the target device from the device page.

Description

Interactive processing method and device based on equipment control and electronic equipment
Technical Field
The application relates to the technical field of smart home, in particular to an interactive processing method and device based on equipment control and electronic equipment.
Background
With the continuous development of Internet of Things technology, it has become increasingly common to deploy intelligent devices in homes, offices and other places, and the popularization of intelligent devices has greatly improved the convenience of people's life and work.
However, at present, after a target device is controlled, it is difficult for the user to effectively learn the execution state of the device.
Disclosure of Invention
In view of the above, it is desirable to provide an interactive processing method and apparatus based on device control, and an electronic device, which can effectively feed back the execution state of the target device.
In order to achieve the technical purpose, the embodiment of the application provides the following technical scheme:
according to a first aspect of an embodiment of the present application, an interactive processing method based on device control is provided, including:
in response to a control operation directed at a target device, displaying a device page corresponding to the target device, where the control operation is used to instruct the target device to execute a target task and to feed back device state information during execution of the target task; and
displaying, in the device page and based on the device state information, a task execution picture of the target device executing the target task.
According to a second aspect of the embodiments of the present application, there is provided an electronic device, including a memory storing a computer program and a processor which, when executing the computer program, implements the steps of the device control-based interaction processing method according to any one of the above.
According to a third aspect of the embodiments of the present application, there is provided an interaction processing apparatus based on device control, including:
the operation response module is used for responding to control operation aiming at target equipment and displaying an equipment page corresponding to the target equipment, wherein the control operation is used for indicating the target equipment to execute a target task and feeding back equipment state information in the process of executing the target task;
and the picture display module is used for displaying a task execution picture of the target task executed by the target equipment in the equipment page based on the equipment state information.
Optionally, if the target task is a continuous execution task, the device state information includes multiple pieces of device state information fed back by the target device in real time in the process of executing the target task;
the image display module displays a dynamic task execution image in the device page based on the pieces of device state information to display a state change process when the target device executes the target task, and is specifically configured to display the dynamic task execution image in the device page based on the pieces of device state information to display the state change process when the target device executes the target task.
Optionally, the picture display module is specifically configured to obtain device parameters corresponding to the device state information, and to continuously display a dynamic task execution picture in the device page based on the device state pictures corresponding to the respective device parameters.
Optionally, if the target task is a non-continuous execution task, the picture display module is specifically configured to display, in the device page and based on the device state information of the target device, a task execution picture corresponding to the change from the initial state of the target device to the state after the target task has been executed.
Optionally, the picture display module is specifically configured to display, on the device page, a virtual device corresponding to the target device, and to display, on the device page, a task execution picture of the virtual device executing the target task based on the device state information.
Optionally, if control operations for a plurality of target devices are included, the picture display module is specifically configured to display, in the corresponding device pages and based on the device state information of each target device, the task execution picture of each target device executing its corresponding target task.
Optionally, the device control-based interaction processing apparatus further includes: and the picture switching module is used for displaying the equipment page of the target equipment corresponding to the switching display instruction and the task execution picture of the target equipment corresponding to the switching display instruction for executing the corresponding target task according to the switching display instruction if the switching display instruction is acquired.
Optionally, the picture display module is specifically configured to display, in the device page and according to the device state information, each task execution picture of the target device executing the target task in a gradual manner based on a target interval duration.
Optionally, the device control-based interaction processing apparatus further includes: and the information acquisition module is used for responding to the information acquisition instruction, displaying the target feedback information acquired based on the information acquisition instruction on the information display page, and broadcasting the target feedback information through voice.
According to a fourth aspect of embodiments of the present application, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the above-described device control-based interaction processing method.
A computer program product or computer program comprising computer instructions stored in a computer readable storage medium; the processor of the computer device reads the computer instructions from the computer readable storage medium, and when the processor executes the computer instructions, the steps of the interaction processing method based on device control are realized.
The embodiments of the application provide a device-control-based interaction processing method and apparatus, and an electronic device. The interaction processing method displays, in response to a control operation directed at a target device, a device page corresponding to the target device, and displays in the device page, based on device state information, a task execution picture of the target device executing a target task, where the control operation is used to instruct the target device to execute the target task and to feed back the device state information while the target task is being executed. In this way, the user can intuitively and effectively learn the execution state of the target device from the device page.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained from the provided drawings without creative effort.
FIG. 1 is a diagram of an application environment of an interactive processing method based on device control in one embodiment;
FIG. 2 is a block diagram of a hardware configuration of a gateway in one embodiment;
FIG. 3 is a flowchart illustrating an interaction processing method based on device control according to an embodiment;
FIG. 4 is a schematic view of a device page of a colored lamp in one embodiment;
FIG. 5 is a schematic view of a device page of a curtain motor in one embodiment;
FIG. 6 is a schematic view of a task execution screen of a curtain motor executing a curtain-closing task in one embodiment;
FIG. 7 is a schematic view of a task execution screen of a smart air conditioner executing a power-on task in one embodiment;
FIG. 8 is a signaling flow diagram of a method for device control based interaction processing in one embodiment;
FIG. 9 is a signaling flow diagram of an interaction processing method based on device control in another embodiment;
FIG. 10 is a schematic structural diagram of an electronic device in one embodiment.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without making any creative effort belong to the protection scope of the present application.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
Exemplary implementation Environment
Referring to fig. 1, fig. 1 is a schematic diagram of an implementation environment that may be involved in the present invention. The implementation environment is an Internet of Things platform, which includes a terminal device 100, a gateway 200, smart home devices 300 deployed under the gateway 200, a cloud server 400, a router 500 and a central control device 600.
The terminal device 100 may be any device having communication and storage functions, such as: a desktop computer, a notebook computer, a tablet computer, a smart phone or other intelligent communication devices capable of implementing network connection, which is not limited herein.
The smart home device 300 may be a smart lamp, a smart printer, a smart fax machine, a smart camera, a smart air conditioner, a smart television, a smart refrigerator, or a human body sensor, a door/window sensor, a temperature/humidity sensor, a water sensor, a natural gas alarm, a smoke alarm, a wall switch, a wall socket, a wireless switch, a wireless wall switch, a magic cube controller, a curtain motor, and the like, which are configured with a communication module (e.g., a ZIGBEE module, a WIFI module, a bluetooth communication module, and the like).
The central control device 600 may be a device capable of performing centralized management and control on various devices, for example, a device such as an intelligent control panel may be specifically used.
The terminal device 100 and the gateway 200 establish a network connection, and in one embodiment, the terminal device 100 and the gateway 200 establish a network connection through 2G/3G/4G/5G, Wi-Fi or the like. Through the network connection, the terminal device 100 interacts with the gateway 200, so that the user controls the internet of things device accessing the gateway 200 to execute corresponding actions.
Optionally, a client capable of managing the smart home is installed in the terminal device 100, where the client may be an application client (such as a mobile phone APP), or may be a web page client or an applet, and the like, which is not limited herein.
The smart home devices 300 are bound with the central control device 600, and are uniformly accessed to the gateway 200 in the internet of things platform through the central control device 600. The smart home device 300 communicates with the central control device 600 through a communication module configured by itself, and is further controlled by the central control device 600. The central control device 600 communicates with the gateway 200 through its own configured communication module, and is further controlled by the gateway 200. In one embodiment, the central control device 600 accesses the gateway 200 through a local area network, and is thus deployed in the gateway 200. The process of accessing the gateway 200 by the central control device 600 through the local area network includes that the gateway 200 firstly establishes a local area network, and the central control device 600 accesses the local area network established by the gateway 200 by connecting with the gateway 200. The local area network includes: ZigBee or bluetooth. Accordingly, the central control device 600 has a built-in communication module (e.g., a ZIGBEE module, a Wi-Fi module, a bluetooth communication module, etc.) to implement a function of communicating with the gateway 200 and the terminal device. Correspondingly, the smart home device 300 is provided with a communication module (e.g., a ZIGBEE module, a Wi-Fi module, a bluetooth communication module, etc.) to implement a function of communicating with the central control device 600.
Both the gateway 200 and the terminal device 100 may be connected to the router 500 and access the network through the router 500, and the router 500 may access the cloud server 400 through a wired or wireless communication connection. For example, the gateway 200 and the terminal device 100 may store acquired information in the cloud server 400. Optionally, the terminal device 100 may further establish a network connection with the cloud server 400 through 2G/3G/4G/5G, Wi-Fi and the like, so as to obtain data sent by the cloud server 400.
Alternatively, as shown in fig. 1, the terminal device 100, the router 500, and the gateway 200 are in the same local area network, and a path established in the local area network may be referred to as a local area network path. When the terminal device 100, the router 500 and the gateway 200 are in the same local area network, the terminal device 100 may interact with the gateway 200 and the smart home devices 300 connected to the gateway 200 through a local area network path. When the terminal device 100 is not located in the same local area network as the gateway 200 and the central control device 600 and the smart home devices 300 connected to the gateway 200, interaction may also be performed through a wide area network path. The wan path is a path formed by connecting the terminal device to the cloud server 400 through a mobile network such as 2G/3G/4G/5G, and the central control device 600 is connected to the cloud server 400 through the gateway 200 and the router 500.
Fig. 2 is a block diagram illustrating a hardware architecture of a gateway according to an example embodiment. This gateway is suitable for use in the implementation environment shown in fig. 1.
It should be noted that the gateway is only an example adapted to the present invention and should not be considered as limiting the scope of the present invention in any way. Nor should the gateway be understood as needing to rely on, or to include, one or more components of the exemplary gateway 200 shown in fig. 2.
The hardware structure of the gateway 200 may vary greatly with its configuration or performance. As shown in fig. 2, the gateway 200 includes: a power supply 210, an interface 230, at least one memory 250, and at least one central processing unit (CPU) 270.
The power supply 210 is used to provide operating voltage for each hardware device on the gateway 200.
The interface 230 includes at least one wired or wireless network interface 231, at least one serial-to-parallel conversion interface 233, at least one input/output interface 235, and at least one USB interface 237, etc. for communicating with external devices.
The memory 250 serves as a carrier for resource storage, and may be a read-only memory, a random access memory, a magnetic disk or an optical disk; the resources stored thereon include an operating system 251, an application 253 and data 255, and the storage manner may be transient or permanent. The operating system 251 is used to manage and control the hardware devices and the application 253 on the gateway 200, so as to implement the computation and processing of the mass data 255 by the central processing unit 270; it may be Windows Server, Mac OS X, Unix, Linux, FreeBSD, FreeRTOS, and the like. The application 253 is a computer program that performs at least one specific task on top of the operating system 251 and may include at least one module (not shown in fig. 2), each of which may contain a series of computer-readable instructions for the gateway 200. The data 255 may be photographs, pictures and the like stored on a disk.
The central processor 270 may include one or more processors and is arranged to communicate with the memory 250 via a bus for computing and processing the mass data 255 in the memory 250.
As described in detail above, the gateway 200 to which the present invention is applied completes the device control-based interaction processing method by the central processing unit 270 reading a series of computer-readable instructions stored in the memory 250.
Furthermore, the present invention can be implemented by hardware circuitry or by a combination of hardware circuitry and software instructions, and thus, implementation of the present invention is not limited to any specific hardware circuitry, software, or combination of both.
Exemplary method
Referring to fig. 3, in an exemplary embodiment, an interaction processing method based on device control is provided. The method is described here as applied to an electronic device, which may specifically be the terminal, a central control device with a display function, a gateway with a display function, or the like in fig. 1. The interaction processing method based on device control may include the following steps:
step S101: and responding to a control operation aiming at the target equipment, displaying an equipment page corresponding to the target equipment, wherein the control operation is used for indicating the target equipment to execute a target task and feeding back equipment state information in the process of executing the target task.
The target device is an intelligent device with a communication function, the target device can respond to a control operation to execute a corresponding target task, and the types of the target device include, but are not limited to, a mobile intelligent terminal and an intelligent home device (an intelligent curtain, an intelligent air conditioner, a central control device, and the like).
The control operation refers to an operation of adjusting the device state of the target device, and the control operation may trigger a corresponding control instruction, so that the target device may execute a corresponding task according to the control instruction triggered by the control operation, thereby enabling the target device to meet the use requirement of a user or a specific scene. Depending on the type of target device, the control operations include, but are not limited to, turning on, turning off, adjusting the percentage of on state, and adjusting the device parameters.
Specifically, the control instruction may be triggered by the control operation in several ways: it may be triggered indirectly by the user through the central control device or another intelligent device, triggered directly by the user on the target device, or triggered automatically, where the user sets for a certain target device an automation task triggered by a certain condition, so that the cloud server or another control device storing the automation task triggers it automatically when the condition is met. The present application does not limit this.
The electronic device may acquire a control operation for the target device triggered on the electronic device itself, or may receive a control operation for the target device captured by another device. The electronic device then controls the target device according to the control operation.
The device page refers to an information presentation page set for a target device corresponding to the device page, and the device page may include a display area or a display function of a picture or a video, and may further include a display area or a display function of a text, and the device page may be used to display information related to the target device, where the information may include at least one of a parameter, an appearance, device state information, a location (room), a task execution process, and the like. The device status information includes, but is not limited to, the on-off status and operating parameters of the target device.
The device page may correspond to different target devices one to one, and the device page may also correspond to a plurality of target devices. When the device page corresponds to a plurality of target devices, the device page may divide corresponding information display areas for the plurality of target devices corresponding thereto.
For example, referring to fig. 4, fig. 4 shows a schematic diagram of a device page corresponding to a target device when the target device is a colored lamp. The device page displays information such as the on/off state (on) of the colored lamp, its position (living room), the current brightness percentage (49%), and its color.
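As a concrete illustration of the device page model described above, the following is a minimal TypeScript sketch. It only shows the one-page-per-device and one-page-per-several-devices mapping with a display area reserved for each device; all type and field names are illustrative assumptions, not the patent's actual data structures.

```typescript
// Hypothetical model of a device page and its display areas.
interface TargetDevice {
  id: string;
  name: string;
  room: string; // e.g. "living room"
}

interface DevicePage {
  // A page may correspond to one device or to several devices; in the latter
  // case each device gets its own display area on the page.
  devices: TargetDevice[];
  displayAreas: Map<string, { pictureUrl?: string; text?: string }>; // keyed by device id
}

function createDevicePage(devices: TargetDevice[]): DevicePage {
  const displayAreas = new Map<string, { pictureUrl?: string; text?: string }>();
  for (const device of devices) {
    displayAreas.set(device.id, {}); // reserve a display area per device
  }
  return { devices, displayAreas };
}
```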
S102: and displaying a task execution picture of the target device for executing the target task in the device page based on the device state information.
The task execution picture is a picture reflecting the state change condition of the target device in the process of executing the target task, and the picture can enable a user to intuitively and effectively acquire the execution state of the target device, and solve the problem that the user is difficult to effectively know the execution state of the device after controlling the target device.
After the electronic equipment controls the target equipment to execute the target task corresponding to the control operation according to the control operation, the target equipment immediately feeds back equipment state information in the process of executing the target task to the electronic equipment. And then, the electronic equipment dynamically displays a task execution picture corresponding to the target task executed by the target equipment in real time in a corresponding equipment page according to the received equipment state information.
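A minimal TypeScript sketch of the flow just described (step S101 followed by step S102): react to the control operation, show the device page, send the command, then keep updating the task execution picture from each piece of device state information fed back during execution. The Renderer interface, the sendCommand callback and the event method names are assumptions made for illustration only.

```typescript
// Hypothetical device state payload fed back by the target device.
type DeviceState = { taskId: string; progress: number; parameters: Record<string, number> };

interface Renderer {
  showDevicePage(deviceId: string): void;
  showTaskExecutionPicture(deviceId: string, state: DeviceState): void;
}

class InteractionController {
  constructor(
    private renderer: Renderer,
    private sendCommand: (deviceId: string, task: string) => void,
  ) {}

  // Called when a control operation for the target device is detected.
  onControlOperation(deviceId: string, task: string): void {
    this.renderer.showDevicePage(deviceId); // step S101: display the device page
    this.sendCommand(deviceId, task);       // instruct the device to execute the target task
  }

  // Called each time the target device feeds back device state information.
  onDeviceStateInfo(deviceId: string, state: DeviceState): void {
    this.renderer.showTaskExecutionPicture(deviceId, state); // step S102
  }
}
```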
In an exemplary embodiment, if the target task is a continuous execution task, the device state information includes multiple pieces of device state information fed back by the target device in real time during the execution of the target task.
Wherein the target task may comprise a continuous execution task. The continuous execution task refers to a task that needs a certain time to be executed, and in the execution process, the device state information of the target device is usually in a continuous change process. Taking the opening and closing curtain equipped with the curtain motor as an example, the task of opening or closing the opening and closing curtain is a continuous execution task. Taking colored lamps as an example, adjusting the brightness of the colored lamps is also a continuous task.
Accordingly, for the execution tasks of continuity: the displaying, in the device page, a task execution screen on which the target device executes the target task based on the device state information includes: and displaying a dynamic task execution picture in the equipment page based on the plurality of pieces of equipment state information so as to display a state change process when the target equipment executes the target task.
The multiple pieces of device state information refer to at least two different pieces of device state information of the target device during execution of the target task. The more pieces of device state information there are, the richer the information contained in the task execution picture displayed in the device page, and the more finely the state change process of the target device executing the target task is presented. The number of pieces of device state information is not limited here and is determined by the actual situation.
In this embodiment, a mode of feeding back the task execution state by using a dynamic task execution picture is set for continuously executing the task, so that the user can almost acquire the state change of the target device in the execution process in real time, the information content that the user can acquire is enriched, and the user experience is optimized.
In an exemplary embodiment of the application, the presenting a dynamic task execution screen based on the pieces of device status information to display a status change process when the target device executes a target task includes:
and acquiring equipment parameters corresponding to the equipment state information, and continuously displaying dynamic task execution pictures in an equipment page based on equipment state pictures corresponding to the equipment parameters respectively.
In this embodiment, a corresponding device state picture is set for a device parameter of the target device in the process of executing the target task, so that in the process of executing the target task by the target device, the electronic device can further obtain the corresponding device state picture by obtaining the device parameter corresponding to each device state information of the target device in the task executing process, and finally continuously display the dynamic task executing picture in the device page, which is beneficial for a user to intuitively know the target task executing process and obtain effective state feedback information.
Specifically, the manner in which the dynamic task execution screen is continuously presented in the device page may be to present an animation of the target device in executing the target task in the device page.
Taking the retractable curtain as an example, referring to fig. 5, fig. 5 shows a schematic view of a device state picture in which the open state of the curtain is 100%. Different open states (device parameters) of the curtain correspond to different device state pictures. The electronic device can obtain a plurality of corresponding device parameters based on the pieces of device state information fed back while the curtain executes a target task (such as closing), then obtain the corresponding device state pictures based on these device parameters, and generate the corresponding animation of the curtain executing the curtain-closing task, as shown in fig. 6. It can be seen from fig. 6 that, through this task execution picture, the user obtains not only the states of the curtain before and after the target task is executed, but also the states of the curtain while the target task is being executed. This provides the user with more state information and enables the user to effectively follow the state change process of the target device during execution of the target task.
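The following TypeScript sketch illustrates this parameter-to-picture mapping for the curtain example: each reported open percentage is snapped to a pre-drawn device state picture, and playing the pictures for the successive fed-back parameters yields the dynamic task execution picture. The picture file names, frame pacing and function names are assumptions for illustration.

```typescript
// Hypothetical mapping from the curtain open percentage (device parameter)
// to a device state picture.
const curtainStatePictures: Record<number, string> = {
  100: "curtain_open_100.png",
  75: "curtain_open_75.png",
  50: "curtain_open_50.png",
  25: "curtain_open_25.png",
  0: "curtain_open_0.png",
};

function pictureForOpenPercentage(openPercentage: number): string {
  // Snap the reported parameter to the nearest picture available.
  const keys = Object.keys(curtainStatePictures).map(Number);
  const nearest = keys.reduce((a, b) =>
    Math.abs(b - openPercentage) < Math.abs(a - openPercentage) ? b : a,
  );
  return curtainStatePictures[nearest];
}

async function playTaskExecutionAnimation(
  reportedPercentages: number[],    // device parameters from successive state reports
  show: (picture: string) => void,  // renders one frame in the device page
  frameIntervalMs = 200,
): Promise<void> {
  for (const percentage of reportedPercentages) {
    show(pictureForOpenPercentage(percentage));
    await new Promise((resolve) => setTimeout(resolve, frameIntervalMs));
  }
}

// e.g. closing the curtain: playTaskExecutionAnimation([100, 75, 50, 25, 0], console.log);
```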
In an exemplary embodiment of the application, if the target task is a non-continuous execution task, the displaying, in the device page, a task execution screen on which the target device executes the target task based on the device state information includes:
and displaying a task execution picture corresponding to a state after the initial state of the target equipment is changed to the target task execution state in the equipment page based on the equipment state information of the target equipment.
The target task may also include a non-continuous execution task. A non-continuous execution task is a task in which the target device has no intermediate state during execution, for example a task completed instantaneously, such as turning the target device on, turning it off, or pausing it. In this embodiment, a corresponding visual display mode is also configured for this kind of target task, which helps the user obtain effective state feedback from the target device.
Taking the power-on task of a smart air conditioner as an example, referring to fig. 7, fig. 7 shows the task execution screen of the smart air conditioner changing from the off state to the on state; the on state is represented by visualizing the air blown out by the air conditioner. Of course, in other exemplary embodiments, the task execution screen of the target device executing the target task may also be represented by the changed state of a status indicator lamp on the target device, which is not limited in this application.
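A short sketch of the non-continuous case: there is no intermediate state, so the page simply switches from the picture of the initial state to the picture of the state after the task has been executed (for example, air conditioner off to on). The picture names are assumptions for illustration.

```typescript
type PowerState = "off" | "on";

// Returns the pictures shown before and after a non-continuous task such as power-on.
function taskExecutionPictures(initial: PowerState, final: PowerState): [string, string] {
  const picture = (state: PowerState) =>
    state === "on" ? "aircon_on_blowing.png" : "aircon_off.png";
  return [picture(initial), picture(final)];
}

const [before, after] = taskExecutionPictures("off", "on");
console.log(`device page shows ${before}, then ${after} once the task completes`);
```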
In an exemplary embodiment of the application, the displaying, in the device page, a task execution screen on which the target device executes the target task based on the device state information includes:
displaying virtual equipment corresponding to the target equipment on the equipment page; and displaying a task execution picture of the virtual equipment for executing the target task on the equipment page based on the equipment state information.
The virtual device refers to an image used for representing state information of an actual target device in a device page, and the image may be a two-dimensional plane image or a three-dimensional stereo image or a three-dimensional model, which is not limited in the present application. In fig. 5 and 6, the actual open state of the opening/closing curtain is represented by a virtual device of the opening/closing curtain, and in fig. 7, the actual open/close state of the air conditioner is represented by a virtual device of the air conditioner. The task execution picture for displaying the virtual equipment to execute the target task in the equipment page can intuitively feed back the state change process of the target equipment to the user, and the user can effectively learn the state feedback of the target equipment.
In other exemplary embodiments of the present application, the electronic device may further display, in a device page, a task execution screen of the target device for executing the target task in a text or other manner, which is not limited in this application.
In an exemplary embodiment of the application, if the control operation for the plurality of target devices is included, the displaying, based on the device state information, a task execution screen of the target device for executing the target task in the device page includes:
and respectively displaying a task execution picture of the target equipment for executing the corresponding target task in the corresponding equipment page based on the equipment state information of each target equipment.
When a user controls a plurality of target devices at the same time, corresponding device pages can be set for the different target devices, and the electronic device then displays, in each corresponding device page, the task execution picture of that target device executing its target task, so that the user can effectively learn the execution state feedback of the different target devices. For example, when the control operation targets a plurality of target devices, such as turning on the hall lamps, and there are three lamps in the hall, a diversified page presentation can be used: the pages corresponding to all the lamps are displayed simultaneously, so that the user can freely check the state information of any one of them, which makes the control quick and convenient.
Specifically, multiple device pages may be placed in one display page at the same time, with different device pages showing the task execution pictures of different target devices executing their corresponding target tasks; alternatively, the task execution picture of each target device may be displayed in one display page in turn according to a certain order. The order may be the order in which the user issued the control operations, or the order in which the target devices complete their target tasks.
The present application does not limit the display mode of the specific task execution picture. For example, if a user successively issues operations to turn on the colored lamp, open the curtain and turn on the smart television, the device pages and task execution pictures corresponding to the colored lamp, the curtain motor and the smart television are normally displayed in that same order, which matches the user's expectation that the instruction issued first receives feedback first and helps improve user experience.
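The following sketch illustrates one of these orderings, showing the per-device pages in the order the control operations were issued. The field names and the ordering rule chosen here are illustrative assumptions.

```typescript
interface ControlledDevice {
  deviceId: string;
  task: string;
  issuedAt: number; // timestamp of the control operation
}

// Returns the device ids in the order their pages should be shown:
// first-commanded, first-fed-back.
function displayOrder(devices: ControlledDevice[]): string[] {
  return [...devices]
    .sort((a, b) => a.issuedAt - b.issuedAt)
    .map((d) => d.deviceId);
}

// e.g. colored lamp, curtain motor, smart TV commanded in that order:
console.log(
  displayOrder([
    { deviceId: "tv", task: "turn_on", issuedAt: 3 },
    { deviceId: "lamp", task: "turn_on", issuedAt: 1 },
    { deviceId: "curtain", task: "open", issuedAt: 2 },
  ]),
); // ["lamp", "curtain", "tv"]
```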
In an exemplary embodiment of the present application, after the presenting a device page corresponding to a target device in response to a control operation for the target device, the method further includes:
and if a switching display instruction is acquired, displaying an equipment page of the target equipment corresponding to the switching display instruction and a task execution picture of the target equipment corresponding to the switching display instruction for executing the corresponding target task according to the switching display instruction.
When a plurality of device pages and corresponding task execution pictures are waiting to be displayed, the switching display instruction can switch between them, so that the user can more quickly obtain the state feedback of the target device he or she cares about. The switching display instruction includes, but is not limited to, displaying the next target device and its task execution screen, displaying the previous target device and its task execution screen, and displaying a specific target device and its task execution screen.
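A minimal sketch of the switching display instruction, covering the three cases just listed (next, previous, specific device). The instruction shape and class name are assumptions.

```typescript
type SwitchDisplayInstruction =
  | { kind: "next" }
  | { kind: "previous" }
  | { kind: "device"; deviceId: string };

class PageCarousel {
  private index = 0;
  constructor(private pageOrder: string[]) {} // device ids of the pending device pages

  // Applies the instruction and returns the device whose page should be shown now.
  apply(instruction: SwitchDisplayInstruction): string {
    if (instruction.kind === "next") {
      this.index = (this.index + 1) % this.pageOrder.length;
    } else if (instruction.kind === "previous") {
      this.index = (this.index - 1 + this.pageOrder.length) % this.pageOrder.length;
    } else {
      const i = this.pageOrder.indexOf(instruction.deviceId);
      if (i >= 0) this.index = i;
    }
    return this.pageOrder[this.index];
  }
}
```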
In an exemplary embodiment, the displaying a task execution screen of the target device for executing the target task in the device page based on the device state information includes: and according to the equipment state information, gradually displaying each task execution picture of the target equipment for executing the target task based on the target interval duration in the equipment page.
The target interval duration refers to the time interval at which the electronic device displays successive display results, and can be understood as the dwell duration of each page. A preset dwell duration can be set for each result displayed on the interface of the electronic device, so that the next page content to be displayed is switched in only after that dwell duration has elapsed.
Specifically, after receiving the device state information fed back by the target device, the electronic device gradually displays each task execution picture of the target device for executing the target task in the corresponding device page according to the target interval duration, so that the content of two adjacent pages can be switched to present a gradually appearing visual effect.
For example, when a control result, such as a task execution screen or target feedback information, is displayed on the screen of the electronic device, it stays on the page for a preset duration, which may be 3s, 2s, 1s, or the like. Taking 2s as an example, the interface does not jump to the home page or to the status detail page of the corresponding controlled device until that dwell duration has elapsed. While staying on the current page (for example, before jumping after 2s), if the target device needs to keep running under the control instruction, the page may present an animation indicating that the target device is running.
For example, taking the target device as a curtain, suppose the curtain runs from 10% to 80% open and then stops. A preset interval duration, for example 5s, is waited in between, during which the interface of the electronic device shows that the target device is running; then, once feedback information from the target device is received, the dynamic display effect is stopped and the current actual position of the curtain is displayed (that is, when the curtain is controlled, the curtain actually opens or closes during execution, and the curtain state shown on the screen is basically synchronized with the actual opening and closing process).
The delay processing in this embodiment mainly optimizes the visual experience of content changes on the screen: the screen is showing the control result of the current device, and if the interface jumped away immediately, the user might feel that content was missed or that there was not enough time to read the text information. By optimizing the switching time of the displayed content, this embodiment effectively avoids the poor experience caused by page content being retained for too short a time.
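A sketch of this dwell-time behaviour: each task execution picture stays on screen for a target interval duration before the next one is shown, so content is never switched away too quickly. The 2-second default follows the example in the text; the function signature is an assumption.

```typescript
async function showWithDwell(
  pictures: string[],
  show: (picture: string) => void,
  targetIntervalMs = 2000,
): Promise<void> {
  for (const picture of pictures) {
    show(picture);
    // Keep the current content on screen for the target interval duration
    // before switching to the next one.
    await new Promise((resolve) => setTimeout(resolve, targetIntervalMs));
  }
}
```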
In an exemplary embodiment of the present application, the switching manner of the device page and the task execution screen includes:
hiding the currently displayed equipment page of the target equipment and the task execution picture displayed in the equipment page by using a fading effect;
and displaying the next device page of the target device to be displayed and the corresponding task execution picture by using the fade-in effect.
When the device pages and corresponding task execution pictures of multiple target devices need to be displayed, the electronic device may display them in a certain order, and the display duration of each target device's page and task execution picture may be preset. After the display duration ends, the currently displayed device page and task execution picture are hidden with a fade-out effect, and the next device page and its task execution picture are faded in slowly, so as to avoid the negative experience caused by abrupt switching between adjacent device pages.
Of course, the transitional effect of the fade-in and fade-out can also be applied to the device page switching when the switching display instruction is received.
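A minimal sketch of the fade-out / fade-in page switch, assuming abstract opacity callbacks (in a real interface these would set a CSS opacity or a canvas alpha). The durations and step counts are illustrative.

```typescript
async function crossFadePages(
  hideCurrent: (opacity: number) => void, // currently displayed device page and picture
  showNext: (opacity: number) => void,    // next device page and picture to display
  durationMs = 400,
  steps = 10,
): Promise<void> {
  const stepMs = durationMs / steps;
  // Fade the currently displayed content out.
  for (let i = steps; i >= 0; i--) {
    hideCurrent(i / steps);
    await new Promise((resolve) => setTimeout(resolve, stepMs));
  }
  // Fade the next content in.
  for (let i = 0; i <= steps; i++) {
    showNext(i / steps);
    await new Promise((resolve) => setTimeout(resolve, stepMs));
  }
}
```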
In an exemplary embodiment, before the presenting the device page corresponding to the target device in response to the control operation for the target device, the method further includes:
and acquiring voice information, and analyzing the voice information to obtain control operation aiming at the target equipment.
The voice information is information sent by a user in a voice form, and control operation aiming at the target equipment contained in the voice information can be obtained through analysis of the voice information, so that the target equipment can have a voice control function, and the use convenience of the user is improved.
In an exemplary embodiment, the interactive processing method based on device control further includes:
and responding to the information acquisition instruction, displaying target feedback information acquired based on the information acquisition instruction on an information display page, and broadcasting the target feedback information through voice.
In this embodiment, the information obtaining instruction refers to an instruction for obtaining information such as information, and the information obtaining instruction may be sent by a user in a voice form or sent by a touch manner, which is not limited in this application. The information includes but is not limited to securities information, weather information, encyclopedia information, etc.
In some exemplary embodiments, the target feedback information and the information can be pushed in two modes, namely voice broadcasting and displaying, so that timely and effective feedback of the target feedback information is realized, and the situation that a user cannot effectively learn the target feedback information after sending an information acquisition instruction is avoided.
In some usage scenarios, for example when the user inputs a control operation or an information acquisition instruction by touch at night, it may be undesirable to feed back the target feedback information by voice broadcast. Therefore, in one embodiment, when the control operation is issued by sending voice information or the information acquisition instruction is a voice instruction, the electronic device may push the feedback information of the target device in both display and voice-broadcast modes; when the control operation is input by touch or the information acquisition instruction is acquired by touch, the target feedback information is fed back in display mode only.
In addition, in another embodiment of the present application, when the control operation is input by touch or the information acquisition instruction is acquired by touch, an available time period for pushing feedback information in both display and voice-broadcast modes may be set. Within the available time period, the target feedback information may be pushed in both display and voice-broadcast modes; outside it, the target feedback information is pushed in display mode only.
For example, with 6:00-21:00 every day as the available period, the target feedback information may be fed back in both display and voice-broadcast modes when the control operation is input by touch or the information acquisition instruction is acquired by touch; in the unavailable period of 21:01-5:59 every day, the target feedback information is only fed back by display, so that voice broadcast does not disturb the user's rest. The present application does not limit this; it is determined by the actual situation.
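A sketch of this feedback-mode decision: voice-triggered requests always get both display and voice broadcast, while touch-triggered requests only add voice broadcast inside the available time period. The 6:00-21:00 window follows the example above (approximated at whole-hour boundaries); everything else is an illustrative assumption.

```typescript
type InputMode = "voice" | "touch";
type FeedbackMode = "display" | "display_and_voice";

function chooseFeedbackMode(
  input: InputMode,
  now: Date,
  startHour = 6,
  endHour = 21,
): FeedbackMode {
  if (input === "voice") return "display_and_voice";
  const hour = now.getHours();
  const inAvailablePeriod = hour >= startHour && hour < endHour;
  return inAvailablePeriod ? "display_and_voice" : "display";
}

// A touch-triggered request late at night falls back to display-only feedback.
console.log(chooseFeedbackMode("touch", new Date("2022-03-20T22:30:00"))); // "display"
```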
A specific way of feeding back the target feedback information by display is to first show it in text form and then, after a preset time, show it in graphic form. This enriches the display-based feedback and improves the sensory experience of the user learning the target feedback information through the display.
When switching between the text display mode and the graphic display mode, a fade-in/fade-out transition effect can be used, so that the user's visual experience is not degraded by abrupt switching.
The preset time may be set in advance or adjusted according to the user's habits, and may be 1 second, 2 seconds, 3 seconds, 5 seconds, and the like, which is not limited in this application. Taking 2 seconds as an example, after the target feedback information has been displayed in text form for 2 seconds, the interface jumps to the corresponding page and displays it in graphic form. Because text occupies little memory and is transmitted quickly, displaying the target feedback information in text form first improves the timeliness of the feedback and lets the user obtain it immediately; meanwhile, the preset time reserves time for transferring content such as pictures, and avoids the poor experience caused by the voice broadcast and the text page staying for too short a time.
In this embodiment, a step of determining a control instruction according to the control operation or the voice information and transmitting the control instruction to the target device is further provided. The control instruction can be determined according to the detected control operation, when the control operation triggers the touch instruction, the control instruction of the target device can be determined according to the target device and the target task corresponding to the touch instruction, and the control instruction of the target device is transmitted to the target device, so that the target device executes the target task.
When the user inputs voice information, determining a control instruction of the target device comprises:
acquiring voice information, and preprocessing the voice information, wherein the preprocessing comprises noise reduction processing.
Performing noise reduction on the voice information can, to a certain extent, eliminate the interference of environmental noise with the user's speech, which helps improve the signal-to-noise ratio of the voice information and the accuracy of the device control instruction subsequently determined from it. Besides noise reduction, preprocessing of the voice information may also include filtering, to filter out sound outside the frequency band of human speech, reduce the amount of voice data transmitted to the cloud server, and reduce the computing resources consumed when determining the device control instruction from the voice information.
If a communicable path exists between a local device (i.e., a device implementing the device control-based interaction processing method, such as a central control device or a control panel, which is described below by taking the central control device as an example) and a cloud server, sending the preprocessed user instruction to the cloud server, and obtaining a control instruction of the device determined by the cloud server.
After the voice information is preprocessed (for example, by noise reduction), it is necessary to determine whether a communicable path exists between the central control device and the cloud server, that is, whether the gateway and router to which the central control device is connected can communicate with the external network. When the central control device and the cloud server can communicate, the preprocessed voice information is transmitted to the cloud server over the wide area network path, so that the cloud server determines the device control instruction and sends it to the central control device.
And if a communication path does not exist between the cloud server and the device, inquiring a preset database according to the preprocessed voice information, and determining a control instruction of the device corresponding to the preprocessed voice information.
When the central control device and the cloud server cannot communicate with each other, the central control device directly queries a preset database built in the central control device according to the preprocessed voice information after preprocessing the voice information, and determines a device control instruction corresponding to the preprocessed voice information. The preset database comprises the corresponding relation between the control instruction of the equipment and the voice information.
This embodiment provides both an "online" and an "offline" way of determining the device control instruction, which reduces the probability that a user instruction cannot be executed after it is issued, optimizes the user experience, and improves the applicability of the method.
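A hedged sketch of this online/offline decision: if a communicable path to the cloud server exists, the preprocessed voice information is sent there and the returned device control instruction is used; otherwise a local preset database (voice text mapped to instruction) built into the central control device is queried. The CloudClient interface and database shape are assumptions.

```typescript
interface CloudClient {
  reachable(): Promise<boolean>; // can the central control device reach the cloud server?
  resolveInstruction(preprocessedVoice: string): Promise<string>;
}

type PresetDatabase = Map<string, string>; // preprocessed voice text -> device control instruction

async function determineControlInstruction(
  preprocessedVoice: string,
  cloud: CloudClient,
  presetDb: PresetDatabase,
): Promise<string | undefined> {
  if (await cloud.reachable()) {
    // "Online": let the cloud server determine the device control instruction.
    return cloud.resolveInstruction(preprocessedVoice);
  }
  // "Offline": fall back to the preset database built into the central control device.
  return presetDb.get(preprocessedVoice);
}
```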
Sending a control instruction of the equipment to the target equipment; or
And sending the control instruction of the equipment to other central control equipment, and sending the control instruction to the target equipment through other central control equipment.
After the central control device receives the control instruction of the device, the control instruction of the device may be directly sent to the target device through a relay device such as a gateway, or the control instruction of the device may be sent to other central control devices and sent to the target device through other central control devices under some conditions (for example, when the central control device receiving the control instruction of the device occupies more operating resources or is limited in transmission with the target device or is not directly bound with the target device).
The embodiment enriches the specific mode of transmitting the control instruction of the equipment to the target equipment, is beneficial to enhancing the robustness of the method and ensures that the control instruction of the equipment is smoothly transmitted to the target equipment.
It should be understood that, although the steps in the flowchart of fig. 3 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the execution of these steps is not strictly limited to the order shown, and they may be performed in other orders. Moreover, at least some of the steps in fig. 3 may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and their order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
The application also provides an application scene, and the application scene applies the interactive processing method based on the equipment control. Specifically, referring to fig. 8, the application of the interactive processing method based on device control in the application scenario is as follows:
the user sends voice information including closing the opening and closing curtain and opening the colored lamp to the intelligent control panel A, and the intelligent control panel A performs noise reduction processing on the voice information to obtain corresponding control operation and sends the control operation to the cloud server. The intelligent control panel a may be a central control device or a control panel with a screen.
The cloud server analyzes the corresponding control operation, obtains corresponding equipment control instructions for closing the retractable curtain and opening the colored lamp, and sends the control instructions to the intelligent control panel A.
The intelligent control panel A sends a device control instruction for closing the retractable curtain to the retractable curtain motor of the target device, and sends a device control instruction for opening the colored lamp to the intelligent control panel B, so that the intelligent control panel B forwards the device control instruction for opening the colored lamp to the colored lamp of the target device. The intelligent control panel B may also be an intelligent control panel with a display function.
And after the curtain opening and closing motor executes a corresponding equipment control instruction, the equipment state information of the curtain opening and closing is fed back to the intelligent control panel A through the gateway and the cloud server.
And after the colored lamp executes the corresponding equipment control instruction, the equipment state information of the colored lamp is fed back to the intelligent control panel A through the gateway and the cloud server.
And the gateway and the cloud server feed back the received equipment state information of each target equipment to the intelligent control panel A.
The intelligent control panel A takes, as the device state information of each target device, whichever copy arrives first of the device state information sent by the cloud server and by the gateway, displays the device page corresponding to each target device in the order in which the device state information is obtained, and displays in that device page the task execution picture of the target device executing the target task. While the intelligent control panel A is pushing the feedback information of a target device, if a switching display instruction is obtained, the device page and task execution picture of the target device corresponding to the switching display instruction are displayed according to that instruction.
In a specific case, taking the target device as a curtain motor as an example: when the user wants to control the curtain, the user can send a voice command for adjusting the curtain to the intelligent control panel, such as a window-closing command. After receiving the voice command, the intelligent control panel parses it to obtain a window-closing control instruction and sends it to the corresponding curtain motor. The curtain motor executes the window-closing control instruction and feeds back its execution state to the intelligent control panel in real time, for example the current closing percentage. The intelligent control panel displays the window-closing animation in the device page corresponding to the curtain motor, and in particular can display the dynamic change of the curtain-closing process in real time according to the state information fed back by the curtain motor.
Fig. 6 illustrates the closing process of the curtain; it is understood that the reverse direction is the opening process. Referring to fig. 6, the initial state of the curtain motor may be the fully open state. After receiving the window-closing instruction sent by the intelligent control panel, the curtain motor executes it and feeds back device state information of the curtain's closing proportion step by step, for example device state information at a 50% closing proportion and at a 100% closing proportion. The intelligent control panel displays, step by step, the page corresponding to the fully open state of the curtain, then the page corresponding to the 50% closing proportion, and then the page corresponding to the 100% closing proportion. It can be understood that the curtain-closing process is not limited to the task execution pictures corresponding to these three states and may include task execution pictures for more closing proportions.
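As a rough illustration of how the fed-back closing proportion could be mapped to a task execution picture, the following sketch picks the nearest prepared frame for each reported proportion. The frame names and thresholds are assumptions, not anything specified by this application; a real panel could hold and interpolate many more intermediate frames.

# Hypothetical frame set keyed by closing proportion.
FRAMES = {0: "curtain_open.png", 50: "curtain_half_closed.png", 100: "curtain_closed.png"}

def frame_for(closed_pct: int) -> str:
    # Choose the frame whose threshold is closest to the reported closing proportion.
    nearest = min(FRAMES, key=lambda threshold: abs(threshold - closed_pct))
    return FRAMES[nearest]

for pct in (0, 50, 100):  # state information fed back step by step by the curtain motor
    print(f"closing proportion {pct}% -> display {frame_for(pct)}")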
In this implementation, the combination of voice and visual interaction gives the user a "what you say is what you see" experience: the real-time control result of the target device and its dynamic display are presented almost synchronously in the interface. This avoids the situation in which the user adjusts a target device such as an air conditioner, for example its temperature, but cannot immediately perceive whether the adjustment has taken effect, thereby improving the use experience.
The present application further provides another application scenario to which the interactive processing method based on device control is applied. Specifically, referring to fig. 9, the method is applied in this scenario as follows:
The user sends an information acquisition instruction for obtaining today's weather to intelligent control panel A.
Intelligent control panel A performs noise reduction on the information acquisition instruction and then sends it to the cloud server.
The cloud server parses the information acquisition instruction, obtains today's weather, and sends it as target feedback information to intelligent control panel A.
Intelligent control panel A responds to the information acquisition instruction by displaying the target feedback information obtained based on the instruction on an information display page and broadcasting it by voice, so that the user obtains today's weather information.
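A minimal sketch of this information-acquisition branch follows, assuming a generic speak() stand-in for whatever text-to-speech interface the panel actually exposes; the weather string and lookup function are illustrative only.

def speak(text: str) -> None:
    # Placeholder for the panel's voice broadcast; prints instead of synthesizing speech.
    print(f"[voice broadcast] {text}")

def handle_info_request(query: str, cloud_lookup) -> None:
    feedback = cloud_lookup(query)       # target feedback information returned by the cloud
    print(f"[info display page] {feedback}")
    speak(feedback)                      # broadcast the same feedback by voice

# Example with a stubbed cloud lookup.
handle_info_request("today's weather", lambda query: "Cloudy, 18-24 degrees")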
Exemplary devices
Correspondingly, an embodiment of the present application further provides an interactive processing apparatus based on device control, including:
an operation response module, configured to respond to a control operation for a target device by displaying a device page corresponding to the target device, wherein the control operation is used to instruct the target device to execute a target task and to feed back device state information in the process of executing the target task; and
a picture display module, configured to display, in the device page and based on the device state information, a task execution picture of the target device executing the target task.
Optionally, if the target task is a continuous execution task, the device state information includes multiple pieces of device state information fed back by the target device in real time in the process of executing the target task; in this case, the picture display module is specifically configured to display a dynamic task execution picture in the device page based on the pieces of device state information, so as to display the state change process when the target device executes the target task.
Optionally, the picture display module is specifically configured to obtain device parameters corresponding to the pieces of device state information, and to continuously display the dynamic task execution picture in the device page based on the device state pictures respectively corresponding to the device parameters.
Optionally, if the target task is a non-continuous execution task, the picture display module is specifically configured to display, in the device page and based on the device state information of the target device, a task execution picture corresponding to the change from the initial state of the target device to its state after the target task is executed.
Optionally, the picture display module is specifically configured to display, on the device page, a virtual device corresponding to the target device, and to display, on the device page, a task execution picture of the virtual device executing the target task based on the device state information.
Optionally, if control operations for a plurality of target devices are included, the picture display module is specifically configured to display, in the corresponding device pages and based on the device state information of each target device, the task execution pictures of the target devices executing their corresponding target tasks.
Optionally, the interaction processing apparatus based on device control further includes: a picture switching module, configured to, if a switching display instruction is obtained, display, according to the switching display instruction, the device page of the target device corresponding to the instruction and the task execution picture of that target device executing its corresponding target task.
Optionally, the picture display module is specifically configured to display, in the device page and according to the device state information, each task execution picture of the target device executing the target task in a gradually changing manner based on a target interval duration.
Optionally, the interaction processing apparatus based on device control further includes: an information acquisition module, configured to respond to an information acquisition instruction by displaying the target feedback information obtained based on the instruction on an information display page and broadcasting the target feedback information by voice.
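One possible way the operation response module and the picture display module could cooperate, including the optional gradual display at a target interval duration, is sketched below. The class names, the sleep-based pacing, and the state values are assumptions used only to illustrate the module responsibilities listed above.

import time

class OperationResponseModule:
    def respond(self, target_device: str) -> None:
        # Respond to the control operation by showing the device page.
        print(f"show device page for {target_device}")

class PictureDisplayModule:
    def __init__(self, interval_s: float = 0.5):
        self.interval_s = interval_s  # the "target interval duration"

    def show_frames(self, target_device: str, states: list) -> None:
        # Display one task execution picture per piece of device state information,
        # pausing between pictures so the change appears gradual.
        for state in states:
            print(f"{target_device}: task execution picture for {state}")
            time.sleep(self.interval_s)

response = OperationResponseModule()
display = PictureDisplayModule(interval_s=0.2)
response.respond("curtain_motor")
display.show_frames("curtain_motor", [{"closed_pct": p} for p in (0, 50, 100)])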
Exemplary electronic device
In one embodiment of the present application, an electronic device is provided, including: a memory storing a computer program and a processor that executes the computer program to perform the steps of the interactive processing method based on device control according to the various embodiments of the present application described in the "exemplary method" section above.
The internal structure of the electronic device may be as shown in fig. 10. The electronic device includes a processor, a memory, a network interface, and an input device connected through a system bus. The processor of the electronic device is configured to provide computing and control capabilities. The memory of the electronic device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The network interface of the electronic device is used to connect to and communicate with an external terminal through a network. The computer program is executed by the processor to perform the steps of the interactive processing method based on device control according to the various embodiments of the present application described in the "exemplary method" section above.
The electronic device may further include a display component and a voice component. The display component may be a liquid crystal display or an electronic ink display, and the input device of the electronic device may be a touch layer covering the display component, a key, trackball, or touchpad arranged on the housing of the electronic device, or an external keyboard, touchpad, or mouse.
Those skilled in the art will appreciate that the structure shown in fig. 10 is merely a block diagram of part of the structure related to the present solution and does not constitute a limitation on the electronic devices to which the present solution applies; a particular electronic device may include more or fewer components than those shown, combine certain components, or have a different arrangement of components.
Exemplary Smart Home System
An intelligent home system includes: a smart home device, a gateway, a cloud server, and a central control device, wherein the smart home device, the gateway, and the central control device are used to establish a first communication path, and/or the smart home device, the gateway, the central control device, and the cloud server are used to establish a second communication path;
the central control device comprises a memory storing a computer program and a processor executing the computer program to perform the steps of the interactive processing method based on device control according to various embodiments of the present application described in the section "exemplary method" above in this specification.
In this embodiment, the first communication path may refer to a local area network path in the above "exemplary implementation environment", the second communication path may refer to a wide area network path in the above "exemplary implementation environment", and the specific communication relationship between the devices may also refer to the description in the above "exemplary implementation environment".
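A hypothetical sketch of choosing between the two communication paths is shown here; the availability flag is a placeholder rather than any concrete networking API, and the routing strings only mirror the path descriptions above.

def send_command(device_id: str, command: str, lan_available: bool) -> str:
    # Prefer the first (local area network) path; otherwise use the second path via the cloud.
    if lan_available:
        return f"first path: central control -> gateway -> {device_id}: {command}"
    return f"second path: central control -> cloud server -> gateway -> {device_id}: {command}"

print(send_command("curtain_motor", "close", lan_available=True))
print(send_command("curtain_motor", "close", lan_available=False))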
Exemplary computer program product and storage medium
In addition to the above-described methods and devices, embodiments of the present application may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in the device control based interaction processing method according to various embodiments of the present application described in the "exemplary methods" section of this specification, supra.
The computer program product may be written with program code for performing the operations of the embodiments of the present application in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a storage medium having stored thereon a computer program that is executed by a processor to perform steps in an interactive processing method based on device control according to various embodiments of the present application described in the above-mentioned "exemplary method" section of the present specification.
It will be understood by those skilled in the art that all or part of the processes of the methods in the above embodiments can be implemented by a computer program instructing the relevant hardware. The computer program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that, for a person of ordinary skill in the art, several variations and improvements can be made without departing from the concept of the present application, and these all fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (12)

1. An interactive processing method based on equipment control is characterized by comprising the following steps:
responding to a control operation aiming at target equipment, displaying an equipment page corresponding to the target equipment, wherein the control operation is used for indicating the target equipment to execute a target task and feeding back equipment state information in the process of executing the target task;
and displaying a task execution picture of the target device for executing the target task in the device page based on the device state information.
2. The method according to claim 1, wherein if the target task is a continuous execution task, the device status information includes a plurality of pieces of device status information fed back by the target device in real time during execution of the target task;
the displaying, in the device page, a task execution screen on which the target device executes the target task based on the device state information includes:
and displaying a dynamic task execution picture in the equipment page based on the plurality of pieces of equipment state information so as to display a state change process when the target equipment executes the target task.
3. The method according to claim 2, wherein the displaying a dynamic task execution screen in the device page based on the plurality of pieces of device state information to display a state change process when the target device executes a target task includes:
and acquiring equipment parameters corresponding to the equipment state information, and continuously displaying dynamic task execution pictures in an equipment page based on equipment state pictures corresponding to the equipment parameters respectively.
4. The method according to claim 1, wherein if the target task is a non-continuous task, the displaying a task execution screen of the target device for executing the target task in the device page based on the device state information comprises:
and displaying a task execution picture corresponding to a state after the initial state of the target equipment is changed to the target task execution state in the equipment page based on the equipment state information of the target equipment.
5. The method of claim 1, wherein the presenting, in the device page, a task execution screen of the target device for executing the target task based on the device state information comprises:
displaying virtual equipment corresponding to the target equipment on the equipment page;
and displaying a task execution picture of the virtual equipment for executing the target task on the equipment page based on the equipment state information.
6. The method according to any one of claims 1 to 5, wherein, if the method includes a control operation for a plurality of target devices, the displaying, in the device page, a task execution screen of the target device for executing the target task based on the device state information includes:
respectively displaying a task execution picture of the target equipment for executing the corresponding target task in the corresponding equipment page based on the equipment state information of each target equipment;
and if a switching display instruction is acquired, displaying an equipment page of the target equipment corresponding to the switching display instruction and a task execution picture of the target equipment corresponding to the switching display instruction for executing the corresponding target task according to the switching display instruction.
7. The method according to any one of claims 1-5, wherein the presenting a task execution screen of the target device for executing the target task in the device page based on the device state information comprises:
and according to the equipment state information, gradually displaying each task execution picture of the target equipment for executing the target task based on the target interval duration in the equipment page.
8. The method according to any one of claims 1-5, wherein the exposing the device page corresponding to the target device in response to the control operation for the target device further comprises:
and acquiring voice information, and analyzing the voice information to obtain control operation aiming at the target equipment.
9. The method of claim 8, further comprising:
and responding to the information acquisition instruction, displaying target feedback information acquired based on the information acquisition instruction on an information display page, and broadcasting the target feedback information through voice.
10. An electronic device, comprising: a memory storing a computer program and a processor implementing the steps of the device control based interaction processing method of any one of claims 1-9 when the computer program is executed.
11. An interaction processing apparatus based on device control, comprising:
the operation response module is used for responding to control operation aiming at target equipment and displaying an equipment page corresponding to the target equipment, wherein the control operation is used for indicating the target equipment to execute a target task and feeding back equipment state information in the process of executing the target task;
and the picture display module is used for displaying a task execution picture of the target task executed by the target equipment in the equipment page based on the equipment state information.
12. A storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the device control-based interaction processing method of any one of claims 1 to 9.
CN202210278189.5A 2022-03-21 2022-03-21 Interactive processing method and device based on equipment control and electronic equipment Pending CN114885031A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210278189.5A CN114885031A (en) 2022-03-21 2022-03-21 Interactive processing method and device based on equipment control and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210278189.5A CN114885031A (en) 2022-03-21 2022-03-21 Interactive processing method and device based on equipment control and electronic equipment

Publications (1)

Publication Number Publication Date
CN114885031A true CN114885031A (en) 2022-08-09

Family

ID=82667336

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210278189.5A Pending CN114885031A (en) 2022-03-21 2022-03-21 Interactive processing method and device based on equipment control and electronic equipment

Country Status (1)

Country Link
CN (1) CN114885031A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104135647A (en) * 2014-08-04 2014-11-05 深圳市劲松安防科技有限公司 Intelligent home control system, control method and control device
CN107358007A (en) * 2017-08-14 2017-11-17 腾讯科技(深圳)有限公司 Control the method, apparatus of intelligent domestic system and calculate readable storage medium storing program for executing
CN111147578A (en) * 2019-12-25 2020-05-12 青岛海信智慧家居系统股份有限公司 Method and device for preventing curtain animation from jumping
CN112527170A (en) * 2020-12-11 2021-03-19 深圳诚达伟业电子有限公司 Equipment visualization control method and device and computer readable storage medium
CN113885345A (en) * 2021-10-29 2022-01-04 广州市技师学院(广州市高级技工学校、广州市高级职业技术培训学院、广州市农业干部学校) Interaction method, device and equipment based on intelligent home simulation control system
CN114116090A (en) * 2021-11-08 2022-03-01 深圳Tcl新技术有限公司 Information display method and device, electronic equipment and computer readable storage medium

Similar Documents

Publication Publication Date Title
CN113885345B (en) Interaction method, device and equipment based on intelligent home simulation control system
CN104110787B (en) Method and system for controlling air conditioner
CN110196557B (en) Equipment control method, device, mobile terminal and storage medium
CN107704169B (en) Virtual human state management method and system
WO2018214951A1 (en) Mobile terminal display method, mobile terminal display device, mobile terminal and storage medium
CN114302238B (en) Display method and display device for prompt information in sound box mode
CN108304234A (en) A kind of page display method and device
CN111459388B (en) Information screen display method, display equipment and storage medium for smart home information
CN112415971A (en) Method for controlling conference room by one key, computer equipment and readable medium
CN114465838B (en) Display equipment, intelligent home system and multi-screen control method
CN109783144B (en) Method and device for processing variable in interactive realization of virtual environment and storage medium
WO2018010326A1 (en) Screen display method and device
CN112785802B (en) Intelligent household security system control method and device, electronic equipment and medium
CN114885031A (en) Interactive processing method and device based on equipment control and electronic equipment
CN113301415A (en) Voice searching method suitable for video playing state
CN114915833B (en) Display control method, display device and terminal device
CN110493450B (en) Screen on-off method, folding screen terminal and computer readable storage medium
CN113590238A (en) Display control method, cloud service method, device, electronic equipment and storage medium
CN114339359B (en) Method for preventing screen burn in local screen-lighting mode and display device
CN112019931B (en) Man-machine interaction method and device of smart television, smart television and storage medium
CN115793473A (en) Control method of household equipment and related equipment
CN114630163B (en) Display device and quick start method
WO2023221995A1 (en) Intelligent device control method and electronic device
CN117714805A (en) Display equipment and subtitle display method
WO2023187071A1 (en) Display apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination