CN115314505A - Execution method and device of equipment scene, storage medium and electronic device
- Publication number
- CN115314505A (application number CN202210768538.1A)
- Authority
- CN
- China
- Prior art keywords
- target
- scene
- twin
- equipment
- gateway
- Prior art date
- Legal status
- Pending
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/10—Protocols in which an application is distributed across nodes in the network
- H04L67/1095—Replication or mirroring of data, e.g. scheduling or transport for data synchronisation between network nodes
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The application provides an execution method and device for an equipment scene, a storage medium, and an electronic device. The method comprises the following steps: acquiring a scene execution instruction of a target scene, wherein the target scene comprises a target device operation executed by a target twin device on a target intelligent device; in response to the scene execution instruction, controlling the target twin device to execute the target device operation and displaying the execution flow of the target twin device executing the target device operation; acquiring a scene execution result of the target physical device corresponding to the target twin device, wherein the scene execution result is the execution result of the target scene on the target physical device; and displaying the execution result of the target twin device executing the target scene according to the scene execution result. The method and device solve the problem in the related art that the execution method of a device scene displays a small amount of visual information, resulting in low richness of the visual information acquired.
Description
Technical Field
The present application relates to the field of communications, and in particular, to an apparatus and a method for executing a device scenario, a storage medium, and an electronic apparatus.
Background
In the related art, a device itself or another device may be controlled through the screen of an intelligent device to execute a configured scene, and the execution result of the scene may be displayed on that screen. However, in this way of executing a device scene, the user can only perceive the final execution result of the scene, and the amount of visual information acquired is small, which affects the user experience.
Therefore, the method of executing a device scene in the related art has the problem that the richness of the acquired visual information is low because the amount of displayed visual information is small.
Disclosure of Invention
The embodiments of the present application provide an execution method and device for an equipment scene, a storage medium, and an electronic device, so as to at least solve the problem in the related art that the execution method of a device scene displays a small amount of visual information, resulting in low richness of the acquired visual information.
According to an aspect of an embodiment of the present application, there is provided a method for executing an equipment scenario, including: acquiring a scene execution instruction of a target scene, wherein the target scene comprises target equipment operation executed by target twin equipment on a target intelligent device; responding to the scene execution instruction, controlling the target twin device to execute the target device operation, and displaying an execution flow of the target twin device to execute the target device operation; acquiring a scene execution result of target physical equipment corresponding to the target twin equipment, wherein the scene execution result is an execution result of the target scene on the target physical equipment; and displaying the execution result of the target twin equipment executing the target scene according to the scene execution result.
According to another aspect of the embodiments of the present application, there is also provided an apparatus for executing an equipment scenario, including: a first obtaining unit, configured to obtain a scene execution instruction of a target scene, where the target scene includes a target device operation performed by a target twin device on a target smart device; a control unit configured to control the target twin device to perform the target device operation in response to the scene execution instruction; a first display unit, configured to display an execution flow in which the target twin device executes the target device operation; a second obtaining unit, configured to obtain a scene execution result of a target physical device corresponding to the target twin device, where the scene execution result is an execution result of the target scene on the target physical device; and the second display unit is used for displaying the execution result of the target twin equipment executing the target scene according to the scene execution result.
In one exemplary embodiment, the first display unit includes: the first display module is configured to display an execution flow of the target twin device executing the target device operation at a target position in a target area of a display interface of the target smart device, where a set of twin devices corresponding to a set of physical devices is displayed in the target area according to a positional relationship between the set of physical devices, a device shape and a device state of each twin device in the set of twin devices match a device shape and a device state of the corresponding physical device, and the target position is a position where the target twin device is located.
In one exemplary embodiment, the first display unit includes: and a second display module, configured to, when the target scene includes a plurality of device operations executed by a plurality of twin devices in a preset execution order, sequentially display, in the preset execution order, an execution flow of executing each device operation by a twin device executing each device operation of the plurality of device operations, where the target twin device includes the plurality of twin devices, and the target device operation includes the plurality of device operations, and each device operation is executed by one of the plurality of twin devices.
In one exemplary embodiment, the apparatus further comprises: a determining unit, configured to determine, before the target twin device is controlled to perform the target device operation, the target twin device and the target device operation according to target configuration information of the target scene, where the target configuration information is used to indicate a correspondence between a device operation included in the target scene and a twin device that performs the device operation.
In one exemplary embodiment, the first acquisition unit includes: the generating module is used for responding to a scene trigger operation executed on the target intelligent equipment and generating the scene execution instruction; or, a first receiving module, configured to receive the scene execution instruction sent by the target physical device, where the scene execution instruction and the scene execution result are sent simultaneously; or, the second receiving module is configured to receive the scene execution instruction sent by the other device except the target physical device.
In one exemplary embodiment, the apparatus further comprises: the first sending unit is configured to send the scene execution instruction to the target physical device after the scene execution instruction is generated in response to a scene trigger operation performed on the target smart device, so as to control the target physical device to execute the target device operation included in the target scene.
In one exemplary embodiment, the apparatus further comprises: a receiving unit, configured to receive the scene execution instruction through a first device gateway to which the target smart device belongs after the scene execution instruction is sent to the target physical device; a second sending unit, configured to send the scene execution instruction to the target physical device through the first device gateway when the device gateway to which the target physical device belongs is the first device gateway; a third sending unit, configured to send the scene execution instruction to the second device gateway through the first device gateway when the device gateway to which the target physical device belongs is a second device gateway, so that the second device gateway sends the scene execution instruction to the target physical device.
In one exemplary embodiment, the third transmitting unit includes: a first sending module, configured to send the scene execution instruction from the first device gateway to the second device gateway through a data interaction channel when such a channel exists between the first device gateway and the second device gateway; or a second sending module, configured to send the scene execution instruction from the first device gateway to the second device gateway through the cloud server when no data interaction channel exists between the first device gateway and the second device gateway but both the first device gateway and the second device gateway have established communication connections with the cloud server.
According to another aspect of the embodiments of the present application, there is also provided a computer-readable storage medium, in which a computer program is stored, where the computer program is configured to execute the method for executing the device scenario when running.
According to another aspect of the embodiments of the present application, there is also provided an electronic apparatus, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the method for executing the device scenario through the computer program.
In the embodiments of the application, a scene execution instruction is used to control the twin device to execute the device operations included in a scene, and the execution flow of the twin device performing those operations is displayed. Specifically, a scene execution instruction of a target scene is obtained, wherein the target scene includes a target device operation executed by a target twin device on a target intelligent device; in response to the scene execution instruction, the target twin device is controlled to execute the target device operation, and the execution flow of the target twin device executing the target device operation is displayed; a scene execution result of the target physical device corresponding to the target twin device is acquired, wherein the scene execution result is the execution result of the target scene on the target physical device; and the execution result of the target twin device executing the target scene is displayed according to the scene execution result. Because the execution flow of the device scene is displayed through the twin device during scene execution, the amount of displayed visual information is increased, which achieves the technical effect of improving the richness of the acquired visual information and solves the problem in the related art that the execution method of a device scene provides low richness of visual information because the amount of displayed visual information is small.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and, together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly described below; obviously, those skilled in the art can obtain other drawings from these drawings without inventive effort.
FIG. 1 is a schematic diagram of a hardware environment of an optional execution method of a device scenario according to an embodiment of the present application;
fig. 2 is a schematic flowchart of an optional execution method of a device scenario according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an optional execution method of a device scenario according to an embodiment of the present application;
FIG. 4 is a schematic diagram of another optional execution method of a device scenario according to an embodiment of the present application;
FIG. 5 is a schematic flowchart of another optional execution method of a device scenario according to an embodiment of the present application;
FIG. 6 is a structural block diagram of an optional execution apparatus of a device scenario according to an embodiment of the present application;
fig. 7 is a block diagram of an optional electronic device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only partial embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
According to an aspect of the embodiments of the present application, a method for executing a device scenario is provided. The execution method of the device scenario is widely applied to whole-house intelligent digital control application scenarios such as the Smart Home, smart home device ecology, and Intelligent House ecology. Optionally, in this embodiment, the execution method of the device scenario may be applied to a hardware environment formed by the terminal 102 and the server 104 shown in fig. 1. As shown in fig. 1, the server 104 is connected to the terminal 102 through a network, and may be configured to provide services (such as application services) for the terminal or for a client installed on the terminal; a cloud computing and/or edge computing service may be configured on the server or independently of the server, so as to provide data computing services for the server 104.
The network may include, but is not limited to, at least one of: a wired network, a wireless network. The wired network may include, but is not limited to, at least one of: a wide area network, a metropolitan area network, a local area network. The wireless network may include, but is not limited to, at least one of: WIFI (Wireless Fidelity), Bluetooth. The terminal 102 may be, but is not limited to, a PC, a mobile phone, a tablet computer, a smart air conditioner, a smart range hood, a smart refrigerator, a smart oven, a smart cooktop, a smart washing machine, a smart water heater, smart laundry equipment, a smart dishwasher, smart projection equipment, a smart TV, a smart clothes-drying rack, a smart curtain, smart audio-visual equipment, a smart socket, a smart stereo, a smart speaker, smart fresh-air equipment, smart kitchen and bathroom equipment, smart bathroom equipment, a floor-sweeping robot, a window-cleaning robot, a floor-mopping robot, smart air purification equipment, a smart steam oven, a smart microwave oven, smart kitchen and bathroom appliances, a smart purifier, a smart water dispenser, a smart lock, and the like.
The execution method of the device scenario in the embodiment of the present application may be executed by the server 104, or may be executed by the terminal 102, or may be executed by both the server 104 and the terminal 102. The method for executing the device scenario by the terminal 102 according to the embodiment of the present application may also be executed by a client installed thereon.
Taking the execution of the method for the device scenario in this embodiment by the terminal 102 as an example, fig. 2 is a schematic flowchart of an optional method for executing the device scenario according to the embodiment of the present application, and as shown in fig. 2, the method may include the following steps:
step S202, a scene execution instruction of a target scene is obtained, where the target scene includes a target device operation executed by a target twin device on a target smart device.
The execution method of the device scenario in the present embodiment can be applied to a scenario in which scene twin control is performed on a digital twin device. The digital twin device may include a physical device and a corresponding twin device, the physical device may be a terminal device or an intelligent home device, and the twin device is a virtual device, which may be a virtual device for displaying the corresponding physical device. The smart home devices may be smart home devices located in a user's home, and may be electronic devices equipped with smart chips, such as a smart television, a smart refrigerator, and a smart water heater, and compared with conventional home devices, the smart home devices are added with a computing module, a network interface, an input/output device, and the like, so that the smart home devices in this embodiment have functions of intelligent analysis and intelligent service.
In this embodiment, a scene execution instruction of a target scene including a target device operation executed by a target twin device on a target smart device may be acquired. Optionally, the target twin device may be a virtual device in the target smart device, and each virtual device in the target smart device corresponds to one physical device, that is, each virtual device in the target smart device is a mapping of one corresponding physical device, and the current state of the physical device may be reflected in real time by the virtual device.
Alternatively, the target smart device may be a smart screen device (smart screen, a smart device with a larger screen size) in the home of the user, or may be another smart device with a display screen in the home of the user. In addition, the Display screen may be a Light Emitting Diode (LED) screen, an Organic Light Emitting Diode (OLED) screen, or another type of screen. In this embodiment, the types of the target smart device and the display screen are not limited.
Optionally, the process of obtaining the scene execution instruction of the target scene may be: acquiring a scene execution instruction sent by target physical equipment corresponding to the target twin equipment; or generating a scene execution instruction according to the detected scene trigger operation executed by the user on the target intelligent device; alternatively, the scene execution instruction sent by the other device except the target physical device may be received, or may be obtained by another method.
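To make the options above concrete, the following Python sketch (with hypothetical names; the patent does not prescribe any implementation) illustrates a target smart device obtaining the scene execution instruction from whichever of the three sources produced one.

```python
# A minimal sketch (hypothetical names) of the three ways described above in which the
# target smart device may obtain the scene execution instruction: generating it locally
# from a scene trigger operation, receiving it from the target physical device, or
# receiving it from another device.

from typing import Optional

def obtain_scene_instruction(local_trigger: Optional[dict],
                             from_physical_device: Optional[dict],
                             from_other_device: Optional[dict]) -> Optional[dict]:
    """Return the scene execution instruction from whichever source produced one."""
    if local_trigger is not None:
        # e.g. the user tapped a twin device on the smart screen
        return {"source": "smart_device", "scene": local_trigger["scene"]}
    if from_physical_device is not None:
        return {"source": "physical_device", **from_physical_device}
    if from_other_device is not None:
        return {"source": "other_device", **from_other_device}
    return None

print(obtain_scene_instruction({"scene": "good_morning"}, None, None))
```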
And S204, responding to the scene execution instruction, controlling the target twin device to execute the target device operation, and displaying an execution flow of the target twin device to execute the target device operation.
In this embodiment, after the scene execution instruction of the target scene is acquired, in response to the scene execution instruction, the target smart device may control the target twin device to execute the target device operation, and display an execution flow of the target twin device executing the target device operation.
Optionally, the process of controlling the target twin device to perform the target device operation in response to the scene execution instruction may be: and executing a scene execution animation corresponding to the target device operation, or adjusting the device parameters of the target twin device according to the operation parameters of the target device operation to execute the target device operation. Alternatively, in a case where the target scene contains a plurality of device operations executed by the plurality of twin devices in a preset execution order, the plurality of twin devices may be sequentially controlled to execute the corresponding device operations in the preset execution order. Since there may be a case where the same device performs device operations multiple times in the target scene, the number of the multiple device operations and the number of the multiple twin devices are not necessarily the same, and the number of the multiple device operations may be greater than or equal to the number of the multiple twin devices.
For example, when the plurality of virtual devices included in the scene to be executed (an embodiment of the target scene) are device A, device B, device C, and device D, and the execution order of the plurality of virtual devices is A-B-A-C-D, the devices A, B, C, and D may be controlled to sequentially execute the corresponding device operations according to this execution order.
In controlling the target twin device to perform the target device operation, an execution flow of the target twin device to perform the target device operation may be displayed. Alternatively, the above-mentioned process of displaying the execution flow of the target twin device executing the target device operation may be: when the target scene includes a plurality of device operations executed by a plurality of twin devices according to a preset execution sequence, sequentially displaying, according to the preset execution sequence, an execution flow for executing each device operation by the twin device executing each device operation in the plurality of device operations, where an execution manner of the device operations is similar to that described above and is not described herein again.
Step S206, a scene execution result of the target physical device corresponding to the target twin device is obtained, where the scene execution result is an execution result of the target scene on the target physical device.
In this embodiment, the scene execution instruction may be one that needs to be synchronized to the target physical device (for example, when the scene execution instruction is generated by the target smart device, or when the scene execution instruction can only reach the target physical device through the target smart device), or one that does not need to be synchronized to the target physical device (for example, when the scene execution instruction is not generated by the target smart device and the target physical device obtains it from elsewhere, so the target smart device does not need to forward it). Correspondingly, the target physical device also obtains the scene execution instruction, executes the target device operation in response to the scene execution instruction, and sends the resulting execution result on the target physical device to the target smart device.
For a scene in which the target intelligent device sends the scene execution instruction to the target physical device, the target intelligent device may send the scene execution instruction to the target physical device while controlling the target twin device to execute the target device operation; or after the scene execution instruction of the target scene is acquired, the scene execution instruction is sent to the target physical device, and then the target twin device is controlled to execute the target device operation in response to the scene execution instruction, which is not limited in this embodiment.
For example, when the scene to be executed (an embodiment of the target scene) includes device operations executed by the physical devices A, B, C, and D corresponding to the virtual devices A, B, C, and D, the physical devices A, B, C, and D may be controlled to execute the scene execution instruction or the corresponding device operations in the scene execution instruction.
When the target twin device executes the target device operation, it does not need to obtain the execution result of the target device operation; instead, the target smart device waits for the target physical device to send the scene execution result of the target device operation, wherein the scene execution result is the execution result of the target scene on the target physical device. In the case that the target scene includes a plurality of device operations executed by a plurality of twin devices according to the preset execution order, the target smart device may directly present each operation execution result as it is received, or may present the operation execution results of all device operations in a unified manner after all of them have been received.
It should be noted that the process in which the target physical device executes the target device operation is similar to the process in which the target twin device executes it. The difference is that the target physical device actually executes the scene execution instruction, whereas the target twin device performs a simulated execution on the virtual device; in addition, the target physical device obtains the operation execution result of the target device operation, while the target twin device does not need to obtain it. This is not described in detail in this embodiment.
And step S208, displaying the execution result of the target twin device executing the target scene according to the scene execution result.
In this embodiment, after acquiring the scene execution result, the target smart device may display the execution result of the target twin device executing the target scene according to the scene execution result. Optionally, when the target physical device includes a plurality of physical devices, the process of displaying the execution result of the target twin device executing the target scene according to the scene execution result may be: adjusting the displayed state of the twin device corresponding to each physical device to that physical device's final execution result, according to the final execution result of each of the plurality of physical devices in the scene execution result.
For example, when the physical device a is a lamp and the last execution result is lighting, the virtual device a on the smart screen may be controlled to display as lighting.
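As an illustration of step S208, the following Python sketch (hypothetical names and state format; an assumption, not the patent's implementation) applies the final execution result reported by a physical device to the state of its corresponding twin device, as in the lamp example above.

```python
# A minimal sketch (hypothetical names) of applying the final execution result
# reported by a physical device to its corresponding twin device on the smart screen.

from dataclasses import dataclass, field

@dataclass
class TwinDevice:
    device_id: str
    state: dict = field(default_factory=dict)

def apply_scene_result(twins: dict[str, TwinDevice], scene_result: dict[str, dict]) -> None:
    """Adjust each twin device so its displayed state matches the final state
    reported by the corresponding physical device."""
    for device_id, final_state in scene_result.items():
        twin = twins.get(device_id)
        if twin is not None:
            twin.state.update(final_state)  # e.g. {"power": "on"} for a lamp

# Example: physical lamp A finished the scene in the "on" state,
# so the twin lamp on the smart screen is shown as lit.
twins = {"lamp_A": TwinDevice("lamp_A", {"power": "off"})}
apply_scene_result(twins, {"lamp_A": {"power": "on"}})
print(twins["lamp_A"].state)  # {'power': 'on'}
```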
Through the above steps S202 to S208, a scene execution instruction of a target scene is obtained, where the target scene includes a target device operation executed by a target twin device on a target smart device; in response to the scene execution instruction, the target twin device is controlled to execute the target device operation, and the execution flow of the target twin device executing the target device operation is displayed; a scene execution result of the target physical device corresponding to the target twin device is acquired, where the scene execution result is the execution result of the target scene on the target physical device; and the execution result of the target twin device executing the target scene is displayed according to the scene execution result. This solves the problem in the related art that the execution method of a device scene displays little visual information, and improves the richness of the visual information acquired.
In one exemplary embodiment, displaying the execution flow of the target twin device executing the target device operation includes:
s11, displaying an execution flow of target twin equipment executing target equipment operation on a target position in a target area of a display interface of the target intelligent equipment, wherein a group of twin equipment corresponding to a group of physical equipment is displayed in the target area according to the position relation of the group of physical equipment, the equipment shape and the equipment state of each twin equipment in the group of twin equipment are matched with the equipment shape and the equipment state of the corresponding physical equipment, and the target position is the position of the target twin equipment.
In this embodiment, in order to improve richness of visual information display and convenience of visual information acquisition, a set of twin devices corresponding to a set of physical devices may be displayed in a target area of a display interface of a target smart device according to a positional relationship of the set of physical devices, and a device shape and a device state of each twin device in the set of twin devices are matched with a device shape and a device state of the corresponding physical device, so that a device type of the corresponding physical device and a relative position between the devices may be displayed by the twin devices.
Optionally, the positional relationship of each twin device is also matched to the positional relationship of the corresponding physical device. For example, if the left side of device B in a set of physical devices a, B, C, and D is device C and the right side is device D, the left side of device B in a set of virtual devices a, B, C, and D is also device C and the right side is also device D.
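The following Python sketch (hypothetical data structures, assumed for illustration only) shows one way the device shape, device state, and positional relationship of a set of physical devices could be mirrored onto the corresponding twin devices.

```python
# A minimal sketch (hypothetical structures) of keeping each twin device's shape, state,
# and relative position consistent with its corresponding physical device, as described above.

from dataclasses import dataclass

@dataclass
class PhysicalDeviceInfo:
    device_id: str
    shape: str                       # e.g. "lamp", "curtain"
    state: str                       # e.g. "on", "off"
    position: tuple[float, float]    # position within the home space

@dataclass
class TwinDeviceView:
    device_id: str
    shape: str
    state: str
    position: tuple[float, float]    # position within the target area of the display

def build_twin_views(physical_devices: list[PhysicalDeviceInfo]) -> list[TwinDeviceView]:
    """Mirror device shape, state and positional relationship onto the twin devices."""
    return [TwinDeviceView(p.device_id, p.shape, p.state, p.position) for p in physical_devices]

views = build_twin_views([
    PhysicalDeviceInfo("device_B", "lamp", "off", (0.5, 0.2)),
    PhysicalDeviceInfo("device_C", "curtain", "closed", (0.2, 0.2)),   # left of device B
    PhysicalDeviceInfo("device_D", "purifier", "on", (0.8, 0.2)),      # right of device B
])
print(views[0])
```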
Optionally, the target area may be located in an upper left area, a lower left area, an upper right area, and a lower right area of the display interface, may be located in a middle area of the display interface, may occupy the entire display interface, and may also be located in other areas of the display interface. In order to improve the use experience of the user, the position of the target area on the display interface can be adjusted, and the user can adjust the position of the target area on the display interface according to the preference of the user. Further, the ratio of the target area in the display area (i.e., the area size) may be adjusted.
For example, the user may adjust the presentation area (an example of the above-described target area) of the twin device from the upper right position of the display interface to the middle position of the display interface, and adjust the area size of the presentation area to be large.
It should be noted that, after the user adjusts the proportion of the target area on the display interface, the proportion of each twin device in the set of twin devices on the display interface may also be adjusted accordingly. For example, after the proportion of the target area on the display interface is adjusted from 20% to 40%, the proportion of each virtual device in the target area on the display interface may be adjusted from 5% to 10%.
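A minimal sketch of the proportional scaling described above, assuming the twin devices simply keep their share of the screen proportional to the target area's share:

```python
# A minimal sketch (assumed behaviour, hypothetical numbers) of scaling twin-device
# display size in proportion to the target area's share of the display interface.

def scale_twin_ratio(old_area_ratio: float, new_area_ratio: float, old_twin_ratio: float) -> float:
    """Keep each twin device's share of the screen proportional to the target area's share."""
    return old_twin_ratio * (new_area_ratio / old_area_ratio)

# Example from the text: the target area grows from 20% to 40% of the interface,
# so a twin device previously occupying 5% now occupies 10%.
print(scale_twin_ratio(0.20, 0.40, 0.05))  # -> 0.1
```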
In the process of the target twin device performing the target device operation, an execution flow of the target twin device performing the target device operation may be displayed. Optionally, an execution flow of the target twin device executing the target device operation may be displayed at a target position in a target area of a display interface of the target smart device, where the target position is a position where the target twin device is located.
Through this embodiment, the position and the shape of the twin device are set based on the device position and the device shape of the physical device, and the richness of visual information display and the convenience of visual information acquisition can be improved.
In one exemplary embodiment, displaying the execution flow of the target twin device executing the target device operation includes:
and S21, in the case that the target scene contains a plurality of device operations executed by a plurality of twin devices according to a preset execution sequence, sequentially displaying an execution flow of executing each device operation by the twin device executing each device operation in the plurality of device operations according to the preset execution sequence, wherein the target twin device comprises the plurality of twin devices, the target device operation comprises the plurality of device operations, and each device operation is executed by one twin device in the plurality of twin devices.
In this embodiment, if the target scene includes a plurality of device operations executed by a plurality of twin devices in a preset execution order, that is, the target twin device includes a plurality of twin devices, the target device operation includes a plurality of device operations executed in the preset execution order, and each device operation is executed by one of the twin devices, an execution flow of executing each device operation by the twin device executing each device operation may be sequentially displayed in the preset execution order.
For example, when the scene to be executed includes a plurality of device operations executed by the virtual devices A, B, C, and D, and the order in which the plurality of virtual devices execute their device operations is A-B-C-D, the virtual devices A, B, C, and D may be controlled to sequentially execute the corresponding device operations according to this execution order.
Alternatively, the preset execution sequence may include a plurality of device operations executed simultaneously, and in the process of sequentially displaying the execution flow of each device operation executed by the twin device executing each device operation in the plurality of device operations according to the preset execution sequence, the execution flows of the plurality of device operations executed simultaneously may be displayed simultaneously.
For example, when the scene to be executed includes a plurality of device operations executed by the virtual devices A, B, C, and D, and the execution sequence of the plurality of virtual devices is A-(C, D)-B-D, the devices C and D may be controlled to simultaneously execute the corresponding device operations, and their execution flows may be displayed simultaneously.
By means of this embodiment, when the scene to be executed includes a plurality of device operations executed by a plurality of twin devices according to a preset execution order, the execution flow of each device operation, performed by the twin device that executes it, is displayed in sequence according to the preset execution order, which can improve the accuracy of information acquisition.
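The following Python sketch (hypothetical structures) illustrates walking such a preset execution order, in which operations in the same group are displayed simultaneously and groups are displayed one after another.

```python
# A minimal sketch (hypothetical structures) of walking a preset execution order in which
# some device operations run one after another and some run at the same time.

from typing import Callable

# Each entry is a group of (twin_device_id, operation) pairs; operations within a
# group are shown simultaneously, groups are shown in order, e.g. A -> (C, D) -> B -> D.
ExecutionOrder = list[list[tuple[str, str]]]

def run_scene(order: ExecutionOrder, show_flow: Callable[[str, str], None]) -> None:
    for group in order:
        # All operations in the same group are displayed at the same time.
        for device_id, operation in group:
            show_flow(device_id, operation)

order: ExecutionOrder = [
    [("A", "turn_on")],
    [("C", "open_curtain"), ("D", "start_purifier")],  # simultaneous
    [("B", "set_temperature")],
    [("D", "stop_purifier")],
]
run_scene(order, lambda dev, op: print(f"twin {dev}: executing {op}"))
```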
In one exemplary embodiment, before controlling the target twin device to perform the target device operation, the method further includes:
and S31, determining target twin equipment and target equipment operation according to target configuration information of the target scene, wherein the target configuration information is used for indicating the corresponding relation between the equipment operation contained in the target scene and the twin equipment for executing the equipment operation.
In the present embodiment, before the control target twin device performs the target device operation, the twin device on which the target device operation is to be performed and the device operation performed by it may be determined. Alternatively, the target twin device and the target device operation may be determined according to target configuration information of the target scene, where the target configuration information is used to indicate a correspondence between the device operation included in the target scene and the twin device performing the device operation.
Optionally, according to the target configuration information of the target scene, the process of determining the target twin device and the target device operation may be: firstly, according to the target configuration information, the target equipment operation contained in the target scene is determined, and then the twin equipment executing the target equipment operation is determined as the target twin equipment.
For example, after the scene execution instruction of the scene to be executed is acquired, it may be determined according to the configuration information of the scene to be executed that the device operation included in the scene to be executed is a lighting operation, and then the virtual device A (an electric lamp) is determined as the virtual device that executes the device operation.
Optionally, the target configuration information may be used to indicate a correspondence between a device operation included in the target scene and a twin device that performs the device operation, and may further include an execution parameter of the device operation, and the target twin device may execute the target device operation according to the determined execution parameter in a process of executing the target device operation.
For example, when the lighting of virtual device A has 3 gears and the execution parameter corresponding to the device operation is gear 2, the lighting gear of virtual device A may be adjusted to gear 2.
According to the embodiment, the equipment operation contained in the scene to be executed and the twin equipment for executing the equipment operation are determined according to the configuration information of the scene to be executed, so that the accuracy of determining the twin equipment and the equipment operation can be improved, and the accuracy of displaying the execution flow can be improved.
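As an illustration, the Python sketch below (hypothetical configuration format, not the patent's actual data model) resolves which twin device executes which device operation, and with which execution parameter, from a scene's configuration information.

```python
# A minimal sketch (hypothetical configuration format) of resolving which twin device
# executes which device operation, and with which execution parameter, from the
# scene's configuration information.

scene_config = {
    "scene_id": "evening_lighting",
    "operations": [
        # correspondence between a device operation and the twin device that executes it
        {"twin_device": "lamp_A", "operation": "turn_on", "params": {"brightness_gear": 2}},
    ],
}

def resolve_operations(config: dict) -> list[tuple[str, str, dict]]:
    """Return (twin_device, operation, execution_params) triples for a scene."""
    return [(op["twin_device"], op["operation"], op.get("params", {}))
            for op in config["operations"]]

for twin, op, params in resolve_operations(scene_config):
    # e.g. the lamp twin is switched on and its lighting gear set to gear 2 of 3
    print(f"{twin} -> {op} with {params}")
```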
In one exemplary embodiment, obtaining the scene execution instruction of the target scene includes:
s41, responding to a scene trigger operation executed on the target intelligent equipment, and generating a scene execution instruction; or,
s42, receiving a scene execution instruction sent by the target physical device, wherein the scene execution instruction and a scene execution result are sent simultaneously; or,
and S43, receiving a scene execution instruction sent by other equipment except the target physical equipment.
In this embodiment, the target smart device may obtain the scene execution instruction of the target scene through multiple manners, where the manner of obtaining the scene execution instruction may include, but is not limited to, at least one of the following: automatically generating a scene execution instruction in response to the detected device operation; receiving a scene execution instruction sent by a target physical device (for a scene executed by a plurality of physical devices, a scene execution instruction sent by one of the plurality of physical devices may be received); and receiving scene execution instructions sent by other devices except the target physical device.
As an alternative embodiment, the scenario execution instruction may be generated in response to a scenario-triggered operation performed on the target smart device. Optionally, the scene trigger operation may be a gesture operation performed on a twin device on the target smart device. For example, after a user performs a click operation on a virtual device on a smart screen, the smart screen may generate a scene execution instruction in response to the click operation.
As another optional implementation manner, the scene execution instruction sent by the target physical device may be received, optionally, the scene execution instruction and the scene execution result may be sent separately, and considering that the time for device operation execution is generally short, the scene execution instruction and the scene execution result may also be sent simultaneously. Optionally, before sending the scene execution instruction to the target smart device, the target physical device may acquire or generate the scene execution instruction, where the process of acquiring the scene execution instruction may be: generating a scene execution instruction according to a scene trigger operation executed by a user on target physical equipment; or, receive a scene execution instruction sent by a terminal device of a user, which is not limited in this embodiment.
After acquiring the scene execution instruction, the target physical device may execute the target device operation included in the scene execution instruction, obtain the scene execution result corresponding to the scene execution instruction, and send the scene execution instruction and the scene execution result to the target smart device at the same time. Optionally, the scene execution instruction and the scene execution result may be packaged into one data packet, and the data packet is then sent to the target smart device. For example, after receiving a light-on instruction from the user, physical device A may send the light-on instruction and its execution result to the smart screen.
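A minimal sketch (hypothetical field names) of a physical device packaging the scene execution instruction together with its execution result into one data packet, as described above:

```python
# A minimal sketch (hypothetical field names) of a physical device bundling the scene
# execution instruction with its execution result and sending both in one transmission.

import json
import time

def build_packet(instruction: dict, result: dict) -> bytes:
    """Bundle the scene execution instruction and its execution result into one data packet."""
    packet = {
        "type": "scene_execution_report",
        "timestamp": time.time(),
        "instruction": instruction,
        "result": result,
    }
    return json.dumps(packet).encode("utf-8")

# Example: lamp A received a light-on instruction, executed it, and reports both at once.
packet = build_packet({"scene": "evening_lighting", "operation": "turn_on"},
                      {"status": "success", "final_state": {"power": "on"}})
print(packet)
```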
As another alternative implementation, the user may also trigger the execution of the target scene through another device besides the target physical device, where the other device may be a terminal device, for example a mobile phone, or another device capable of capturing user operations. The target smart device may receive the scene execution instruction sent by such a device other than the target physical device.
Through the embodiment, the scene execution instruction of the scene to be executed is acquired in multiple modes, so that the flexibility of acquiring the scene execution instruction can be improved, and the convenience of scene execution control is further improved.
In an exemplary embodiment, after generating the scenario execution instruction in response to the scenario trigger operation performed on the target smart device, the method further includes:
and S51, sending the scene execution instruction to the target physical device to control the target physical device to execute the target device operation contained in the target scene.
In this embodiment, after a scene execution instruction is generated in response to a scene trigger operation performed on a target smart device, the scene execution instruction may be sent to the target physical device to control the target physical device to execute a target device operation included in a target scene.
Optionally, since there is a risk that the scene execution instruction is stolen in the process of sending the scene execution instruction to the target physical device, the scene execution instruction may be encrypted before being sent to the target physical device, so as to reduce a risk that the scene execution instruction is leaked.
By the embodiment, after the scene execution instruction is generated, the scene execution instruction is sent to the to-be-executed physical device to control the to-be-executed physical device to execute the device operation included in the to-be-executed scene, so that the to-be-executed physical device can accurately acquire the scene execution instruction, and the scene execution instruction is executed accurately.
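The patent does not specify an encryption scheme; as one possible illustration, the Python sketch below encrypts the scene execution instruction with a pre-shared symmetric key before it is sent, using the third-party `cryptography` package.

```python
# A minimal sketch of encrypting the scene execution instruction before sending it to the
# physical device, to reduce the risk of the instruction being stolen in transit.
# Assumes the third-party `cryptography` package and a pre-shared symmetric key; the
# patent does not prescribe a particular encryption scheme.

import json
from cryptography.fernet import Fernet

shared_key = Fernet.generate_key()      # in practice, provisioned to both endpoints
cipher = Fernet(shared_key)

instruction = {"scene": "evening_lighting", "operation": "turn_on"}
ciphertext = cipher.encrypt(json.dumps(instruction).encode("utf-8"))

# The target physical device decrypts with the same key before executing the operation.
plaintext = json.loads(cipher.decrypt(ciphertext))
assert plaintext == instruction
```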
In an exemplary embodiment, after the transmitting the scene execution instruction to the target physical device, the method further includes:
s61, receiving a scene execution instruction through a first device gateway to which a target intelligent device belongs;
s62, under the condition that the device gateway to which the target physical device belongs is a first device gateway, sending a scene execution instruction to the target physical device through the first device gateway;
and S63, under the condition that the device gateway to which the target physical device belongs is a second device gateway, sending the scene execution instruction to the second device gateway through the first device gateway so that the second device gateway sends the scene execution instruction to the target physical device.
In this embodiment, after the target smart device issues the scene execution instruction, the scene execution instruction may be sent to the target physical device via one or more devices. The scene execution instruction is first sent to the first device gateway to which the target smart device belongs, and the first device gateway receives it. For example, as shown in fig. 3, after the smart screen sends the scene execution instruction, the smart screen gateway (an example of the first device gateway described above) may receive the scene execution instruction sent by the smart screen.
Alternatively, after receiving the scene execution instruction, the first device gateway may determine whether the target physical device is a device under the gateway. If the device gateway to which the target physical device belongs is the first device gateway, the first device gateway may send the scene execution instruction to the target physical device.
For example, after a scenario is triggered at a scenario trigger end (an example of the target smart device), the scenario trigger end may send a control instruction (an example of the scenario execution instruction) to its directly connected smart gateway (i.e., the smart screen gateway in fig. 3) for edge scenario execution. The edge scenario execution engine (i.e., the edge rule engine in fig. 4) within the smart screen gateway may analyze, according to the scenario execution rule, whether the scenario attribution gateway (i.e., the smart gateway to which the physical device executing the scene to be executed belongs) is the current gateway (i.e., the smart gateway to which the smart screen belongs), and execute the scenario in different manners accordingly. As shown in fig. 3, if the scenario attribution gateway is the current gateway, the gateway controls its directly connected sub-devices to execute the scenario; that is, when the smart device 1 or the smart device 2 (an embodiment of the target physical device) is connected to the same smart gateway as the smart screen, the smart screen gateway may directly send the scene execution instruction to the smart device 1 or the smart device 2 after receiving it, so that the smart device 1 or the smart device 2 executes the scene to be executed.
Optionally, if the device gateway to which the target physical device belongs is a second device gateway, the second device gateway and the first device gateway may be different device gateways, and the first device gateway may send the scene execution instruction to the second device gateway, so that the second device gateway sends the scene execution instruction to the target physical device.
For example, as shown in fig. 3, in a case where the smart device 3 is not connected to the smart screen gateway but connected to the smart device gateway, the smart screen gateway may transmit the scene execution instruction to the smart device gateway so that the smart device gateway transmits the scene execution instruction to the smart device 3.
Optionally, the process in which the first device gateway sends the scene execution instruction to the second device gateway and the second device gateway sends it to the target physical device may be as follows: when a data interaction channel exists between the first device gateway and the second device gateway, the first device gateway sends the scene execution instruction to the second device gateway through that channel; when no data interaction channel exists between the first device gateway and the second device gateway, but both gateways have established communication connections with the cloud server, the first device gateway sends the scene execution instruction to the second device gateway through the cloud server.
According to the embodiment, the scene execution instruction is sent to the physical equipment in different modes based on the relationship between the intelligent equipment and the equipment gateway where the physical equipment is located, so that the success rate of sending the scene execution instruction can be improved.
In one exemplary embodiment, sending, by the first device gateway, the scenario execution instruction to the second device gateway includes:
s71, under the condition that a data interaction channel exists between the first equipment gateway and the second equipment gateway, sending the scene execution instruction to the second equipment gateway through the data interaction channel by the first equipment gateway; or,
and S72, under the condition that no data interaction channel exists between the first equipment gateway and the second equipment gateway and the first equipment gateway and the second equipment gateway are both in communication connection with the cloud server, the scene execution instruction is sent to the second equipment gateway through the first equipment gateway and the cloud server.
In this embodiment, if the device gateway to which the target physical device belongs is the second device gateway, the first device gateway may send the scene execution instruction to the second device gateway in different manners based on whether there is a data interaction channel between the first device gateway and the second device gateway.
As an optional implementation manner, if a data interaction channel exists between the first device gateway and the second device gateway, the first device gateway may send the scene execution instruction to the second device gateway through the data interaction channel. For example, if the scenario attribution gateway is not the current gateway, it may be checked whether the "small loop" of the scenario attribution gateway (meaning that data interaction can be performed directly between the two gateways) is online; if the small loop is online, a cross-gateway scenario execution instruction is issued, and the scene execution instruction is sent to the scenario attribution gateway for scene execution.
As another optional implementation manner, if there is no data interaction channel between the first device gateway and the second device gateway, and the first device gateway and the second device gateway both establish a communication connection with the cloud server, the first device gateway may send the scene execution instruction to the second device gateway through the cloud server.
For example, if the small loop of the scenario attribution gateway is not online, but the cloud state of the scenario attribution gateway (which indicates whether it is connected to the Internet of Things (IoT) cloud) is online and the cloud state of the smart screen gateway is also online, the edge scenario may be executed through the "large loop": the smart screen gateway sends the scene execution instruction to the IoT cloud, and the IoT cloud sends the scene execution instruction to the scenario attribution gateway, thereby completing the scene execution.
Through the embodiment, if the device gateways to which the intelligent device and the physical device belong are not the same gateway, the scene execution instruction is sent to the physical device in different modes based on whether the gateway device can perform data interaction, and the success rate of sending the scene execution instruction can be improved.
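The routing decision described in this and the preceding embodiments can be summarized by the following Python sketch (hypothetical names; a simplification of the gateway behavior, not the patent's implementation).

```python
# A minimal sketch (hypothetical names) of the routing decision described above: the first
# device gateway delivers the scene execution instruction directly to its own sub-devices,
# forwards it over a gateway-to-gateway channel when one exists, and otherwise relays it
# through the cloud server when both gateways are online with the cloud.

def route_instruction(instruction: dict,
                      target_gateway_id: str,
                      local_gateway_id: str,
                      has_direct_channel: bool,
                      local_cloud_online: bool,
                      target_cloud_online: bool) -> str:
    if target_gateway_id == local_gateway_id:
        return "deliver_to_local_sub_device"      # same gateway: send directly
    if has_direct_channel:
        return "forward_over_gateway_channel"     # "small loop": gateway-to-gateway
    if local_cloud_online and target_cloud_online:
        return "relay_via_cloud_server"           # "large loop": via the IoT cloud
    return "undeliverable"

# Example: the target physical device belongs to another gateway and no direct channel
# exists, but both gateways are connected to the cloud server.
print(route_instruction({"operation": "turn_on"}, "gw-2", "gw-1",
                        has_direct_channel=False,
                        local_cloud_online=True,
                        target_cloud_online=True))   # relay_via_cloud_server
```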
The following explains an execution method of the device scenario in the embodiment of the present application with an optional example. In this optional example, the target smart device is a smart screen, the target physical device is a home physical device, the first device gateway is a smart screen gateway, and the second device gateway is a smart device gateway.
At present, many household appliances in a user's home are equipped with smart screens, but these smart screens are limited to simple interaction for device control and simple presentation of execution results; the control effects and running states of all the household appliances in the home are not presented intelligently through the smart screen. In fact, the smart screen is not only the video entertainment center of the home, but also an information sharing center, a control and management center and a multi-device interaction center, and thus acts as the control hub of smart home life. As an optimal one-to-many service carrier, the smart screen plays an important role in data display, decision and command, remote cooperation, local interaction and other aspects, and can effectively support the implementation of various service scenarios.
This optional example provides an edge-cloud collaborative scene twin control scheme. The scene states, behaviors and function execution effects of the household devices in the user's home space are mapped to the smart screen through digital twin technology. The smart screen can trigger scenes in real time (by voice or manual trigger), and can also display the execution effect of an automatic scene synchronously in real time, thereby realizing a scene digital twin collaboration form in which the virtual and the real are fused. This is an important application and embodiment of digital twins in the fields of the smart home, smart scenes and the Internet of Things.
First, the user can control a scene through the twin devices on the smart screen; after the scene is triggered, the scene control response is carried out by both the twin devices on the smart screen and the physical devices in the user's home. Second, the user can control a scene through the control panel of a home physical device, and the scene control response is carried out by the home physical devices and the smart screen. Finally, the user can control the home devices through a mobile phone APP (Application) or a smart speaker, and the scene control response is likewise carried out by the smart screen together with the devices in the home.
As shown in fig. 3, 4 and 5, the flow of the method for executing the device scenario in this alternative example may include the following steps:
step S502, obtaining an execution instruction through the intelligent screen.
For example, the user can perform scene control through a smart screen to generate a scene execution instruction.
Step S504, the intelligent screen sends the execution instruction to the intelligent screen gateway.
For example, a smart screen may send scene execution instructions to an integrated smart device gateway (i.e., the smart screen gateway described above).
Step S506, the smart screen gateway sends the execution instruction to the smart device gateway.
For example, the integrated smart device gateway sends the cross-gateway scene execution instruction to the scene home gateway (i.e., the smart device gateway described above).
Step S508, the smart device gateway sends the execution instruction to the smart device.
The edge scene SDK (Software Development Kit) and the scene execution engine in the scene home gateway are the core of scene twin control; after receiving the scene execution instruction, they may send it to the smart device. In addition, a reasoning service may analyze the time, water quality, air pollution, temperature, humidity, ultraviolet and other data around the user based on the user's position information, so as to generate personalized reasoning services.
Step S510, the smart device completes a corresponding function according to the execution instruction, and reports an execution result to the smart screen and the IOT cloud.
After the smart device completes the corresponding function, it may call back the device execution result callback service of the edge smart gateway (namely, the smart device gateway) to report the device execution result. The edge scene SDK and the scene execution engine in the smart device gateway may then send the device execution result to the smart screen and the IOT cloud according to the reporting rule. In addition, when the execution instruction was generated by the user through the display screen of a smart device, the device execution result may also be sent to the screen of that smart device.
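Steps S502–S510 can be summarized in the following sketch; the class and method names are assumptions made for illustration and do not appear in this application.

```python
class IotCloud:
    def report(self, result):
        print("IOT cloud received result:", result)


class SmartDevice:
    def __init__(self, name):
        self.name = name

    def execute(self, instruction):
        # The smart device completes the corresponding function (step S510).
        return {"device": self.name, "scene_id": instruction["scene_id"], "success": True}


class SmartDeviceGateway:
    """Scene home gateway hosting the edge scene SDK and scene execution engine."""

    def __init__(self, devices, iot_cloud):
        self.devices = devices
        self.iot_cloud = iot_cloud

    def execute_scene(self, instruction, smart_screen):
        for device in self.devices:              # step S508: dispatch to the smart device
            result = device.execute(instruction)
            self.iot_cloud.report(result)        # step S510: report to the IOT cloud
            smart_screen.show_result(result)     # step S510: report to the smart screen


class SmartScreenGateway:
    def __init__(self, device_gateway):
        self.device_gateway = device_gateway

    def forward(self, instruction, smart_screen):
        # Step S506: cross-gateway delivery to the scene home gateway.
        self.device_gateway.execute_scene(instruction, smart_screen)


class SmartScreen:
    def __init__(self, screen_gateway):
        self.screen_gateway = screen_gateway

    def trigger_scene(self, scene_id):
        # Steps S502/S504: generate the instruction and send it to the smart screen gateway.
        self.screen_gateway.forward({"scene_id": scene_id}, smart_screen=self)

    def show_result(self, result):
        print("smart screen shows:", result)


# Minimal usage of the sketch above.
cloud = IotCloud()
home_gateway = SmartDeviceGateway([SmartDevice("air_conditioner")], cloud)
screen = SmartScreen(SmartScreenGateway(home_gateway))
screen.trigger_scene("go_home")
```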
According to this embodiment, it is first determined whether a smart screen exists among the user's home devices, and if so, whether the smart screen is online. If the smart screen is online, the smart gateway at the scene trigger end synchronizes the scene execution instruction and controls the virtual device to execute it; after the virtual device executes the scene execution instruction, no device execution result needs to be called back. After the real physical device executes the scene execution instruction, the device execution result is synchronously reported, according to the scene rule, to the smart gateway at the scene trigger end, the IOT cloud and the smart screen, thereby realizing scene twin control.
The scene rules support dual control through the twin device and the physical device as well as two-way feedback of the physical device's execution results, enabling global management and control of the home devices and providing a basis for scene decisions towards a comfortable and smart life. Through the smart screen, the user can intuitively and comprehensively monitor the running state of whole-house scenes, and can adjust different devices to the desired conditions according to his or her own wishes, thereby obtaining a comfortable and smart living environment.
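As a sketch of the result synchronization described above, the twin device shown on the smart screen could be updated from the physical device's reported execution result roughly as follows; the data fields are assumptions, not a format defined by this application.

```python
def on_physical_result(twin_registry, result):
    """Mirror a reported physical execution result onto the corresponding twin device."""
    twin = twin_registry.get(result["device_id"])
    if twin is None:
        return  # no twin is displayed for this physical device
    # Keep the twin's state consistent with the real device so the smart screen
    # reflects the actual execution result of the scene.
    twin["state"] = result["state"]
    twin["last_scene"] = result["scene_id"]
    twin["success"] = result["success"]
```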
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method according to the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, where the computer software product is stored in a storage medium (e.g., a ROM (Read-Only Memory)/RAM (random access Memory), a magnetic disk, an optical disk), and includes several instructions for enabling a terminal device (which may be a mobile phone, a computer, a server, or a network device) to execute the methods described in the embodiments of the present application.
According to another aspect of the embodiments of the present application, there is also provided an apparatus for executing the device scenario, which is used for implementing the method for executing the device scenario. Fig. 6 is a block diagram of an execution apparatus for an alternative device scenario according to an embodiment of the present application, and as shown in fig. 6, the apparatus may include:
a first obtaining unit 602, configured to obtain a scene execution instruction of a target scene, where the target scene includes a target device operation performed by a target twin device on a target smart device;
a control unit 604, connected to the first obtaining unit 602, configured to control the target twin device to perform a target device operation in response to the scene execution instruction;
a first display unit 606, connected to the control unit 604, for displaying an execution flow of the target twin device executing the target device operation;
a second obtaining unit 608, connected to the first displaying unit 606, configured to obtain a scene execution result of the target physical device corresponding to the target twin device, where the scene execution result is an execution result of the target scene on the target physical device;
and a second display unit 610, connected to the second obtaining unit 608, for displaying an execution result of the target twin device executing the target scene according to the scene execution result.
It should be noted that the first obtaining unit 602 in this embodiment may be configured to perform the step S202, the control unit 604 and the first display unit 606 in this embodiment may be configured to perform the step S204, the second obtaining unit 608 in this embodiment may be configured to perform the step S206, and the second display unit 610 in this embodiment may be configured to perform the step S208.
Through the above modules, a scene execution instruction of a target scene is acquired, where the target scene includes a target device operation executed by a target twin device on the target smart device; in response to the scene execution instruction, the target twin device is controlled to execute the target device operation, and the execution flow of the target twin device executing the target device operation is displayed; a scene execution result of the target physical device corresponding to the target twin device is acquired, where the scene execution result is the execution result of the target scene on the target physical device; and the execution result of the target twin device executing the target scene is displayed according to the scene execution result. This solves the problem in the related art that the richness of acquired visual information is low because the amount of displayed visual information is small, and thereby improves the richness of visual information acquisition.
In one exemplary embodiment, the first display unit includes:
the first display module is used for displaying an execution flow of target twin equipment executing target equipment operation at a target position in a target area of a display interface of the target intelligent equipment, wherein a group of twin equipment corresponding to a group of physical equipment is displayed in the target area according to the position relation of the group of physical equipment, the equipment shape and the equipment state of each twin equipment in the group of twin equipment are matched with the equipment shape and the equipment state of the corresponding physical equipment, and the target position is the position of the target twin equipment.
In one exemplary embodiment, the first display unit includes:
and a second display module, configured to, in a case where the target scene includes a plurality of device operations executed by the plurality of twin devices in a preset execution order, sequentially display, in the preset execution order, an execution flow in which the twin device executing each of the plurality of device operations executes each of the device operations, wherein the target twin device includes the plurality of twin devices, and the target device operation includes a plurality of device operations each executed by one of the plurality of twin devices.
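The sequential display can be sketched as below; the operation structure and the `show_execution_flow` callback are assumptions made for illustration.

```python
import time

def display_in_order(device_operations, pause_s=1.0):
    """Display the execution flow of each device operation in the preset execution
    order, one twin device at a time."""
    for operation in device_operations:              # list already in the preset order
        twin = operation["twin_device"]
        twin.show_execution_flow(operation["name"])  # animate this operation on its twin
        time.sleep(pause_s)                          # simple pacing between operations
```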
In an exemplary embodiment, the apparatus further includes:
the device comprises a determining unit, a judging unit and a judging unit, wherein the determining unit is used for determining a target twin device and a target device operation according to target configuration information of a target scene before controlling the target twin device to execute the target device operation, and the target configuration information is used for indicating the corresponding relation between the device operation contained in the target scene and the twin device executing the device operation.
In one exemplary embodiment, the first acquisition unit includes:
the generating module is used for responding to scene trigger operation executed on the target intelligent equipment and generating a scene executing instruction; or,
the first receiving module is used for receiving a scene execution instruction sent by target physical equipment, wherein the scene execution instruction and a scene execution result are sent simultaneously; or,
and the second receiving module is used for receiving the scene execution instruction sent by other equipment except the target physical equipment.
In an exemplary embodiment, the apparatus further comprises:
the first sending unit is used for sending the scene execution instruction to the target physical device after the scene execution instruction is generated in response to the scene trigger operation executed on the target intelligent device so as to control the target physical device to execute the target device operation contained in the target scene.
In an exemplary embodiment, the apparatus further includes:
the receiving unit is used for receiving the scene execution instruction through a first device gateway to which the target intelligent device belongs after the scene execution instruction is sent to the target physical device;
a second sending unit, configured to send the scene execution instruction to the target physical device through the first device gateway when the device gateway to which the target physical device belongs is the first device gateway;
and a third sending unit, configured to send the scene execution instruction to the second device gateway through the first device gateway when the device gateway to which the target physical device belongs is the second device gateway, so that the second device gateway sends the scene execution instruction to the target physical device.
In one exemplary embodiment, the third transmitting unit includes:
the first sending module is used for sending the scene execution instruction to the second equipment gateway through the first equipment gateway under the condition that a data interaction channel exists between the first equipment gateway and the second equipment gateway; or,
and the second sending module is used for sending the scene execution instruction to the second device gateway through the cloud server by the first device gateway under the condition that no data interaction channel exists between the first device gateway and the second device gateway and both the first device gateway and the second device gateway are in communication connection with the cloud server.
It should be noted here that the modules described above are the same as the examples and application scenarios implemented by the corresponding steps, but are not limited to the disclosure of the above embodiments. It should be noted that the modules described above as a part of the apparatus may be operated in a hardware environment as shown in fig. 1, and may be implemented by software, or may be implemented by hardware, where the hardware environment includes a network environment.
According to still another aspect of an embodiment of the present application, there is also provided a storage medium. Optionally, in this embodiment, the storage medium may be configured to execute a program code of an execution method of any one of the device scenarios in this embodiment of the present application.
Optionally, in this embodiment, the storage medium may be located on at least one of a plurality of network devices in a network shown in the embodiment.
Optionally, in this embodiment, the storage medium is configured to store program code for performing the following steps:
s1, acquiring a scene execution instruction of a target scene, wherein the target scene comprises target equipment operation executed by target twin equipment on target intelligent equipment;
s2, responding to a scene execution instruction, controlling target twin equipment to execute target equipment operation, and displaying an execution flow of the target twin equipment to execute the target equipment operation;
s3, obtaining a scene execution result of the target physical device corresponding to the target twin device, wherein the scene execution result is an execution result of the target scene on the target physical device;
and S4, displaying the execution result of the target twin equipment executing the target scene according to the scene execution result.
Optionally, the specific example in this embodiment may refer to the example described in the above embodiment, which is not described again in this embodiment.
Optionally, in this embodiment, the storage medium may include but is not limited to: a U disk, a ROM, a RAM, a removable hard disk, a magnetic disk, or an optical disk.
According to another aspect of the embodiments of the present application, there is also provided an electronic apparatus for implementing the method for executing the device scenario, where the electronic apparatus may be a server, a terminal, or a combination thereof.
Fig. 7 is a block diagram of an alternative electronic device according to an embodiment of the present application, as shown in fig. 7, including a processor 702, a communication interface 704, a memory 706 and a communication bus 708, where the processor 702, the communication interface 704 and the memory 706 communicate with each other via the communication bus 708, where,
a memory 706 for storing computer programs;
the processor 702, when executing the computer program stored in the memory 706, performs the following steps:
s1, a scene execution instruction of a target scene is obtained, wherein the target scene comprises target equipment operation executed by target twin equipment on target intelligent equipment;
s2, responding to a scene execution instruction, controlling target twin equipment to execute target equipment operation, and displaying an execution flow of the target twin equipment to execute the target equipment operation;
s3, acquiring a scene execution result of the target physical device corresponding to the target twin device, wherein the scene execution result is an execution result of the target scene on the target physical device;
and S4, displaying the execution result of the target twin equipment executing the target scene according to the scene execution result.
Alternatively, in the present embodiment, the communication bus may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 7, but this is not intended to represent only one bus or type of bus. The communication interface is used for communication between the electronic device and other equipment.
The memory may include RAM, or may include non-volatile memory (non-volatile memory), such as at least one disk memory. Alternatively, the memory may be at least one memory device located remotely from the processor.
As an example, the memory 706 may include, but is not limited to, the first obtaining unit 602, the control unit 604, the first display unit 606, the second obtaining unit 608, and the second display unit 610 in the execution apparatus of the above device scenario. In addition, other module units in the execution apparatus of the above device scenario may also be included, but are not limited thereto, and details are not described again in this example.
The processor may be a general-purpose processor, and may include but is not limited to: a CPU (Central Processing Unit), an NP (Network Processor), and the like; it may also be a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component.
Optionally, for a specific example in this embodiment, reference may be made to the example described in the foregoing embodiment, and this embodiment is not described herein again.
It can be understood by those skilled in the art that the structure shown in Fig. 7 is only an illustration, and the device implementing the method for executing the device scenario may be a terminal device such as a smartphone (e.g., an Android phone, an iOS phone, and the like), a tablet computer, a palmtop computer, a Mobile Internet Device (MID), a PAD, and the like. Fig. 7 does not limit the structure of the above electronic device. For example, the electronic device may include more or fewer components (e.g., a network interface, a display device, etc.) than shown in Fig. 7, or have a configuration different from that shown in Fig. 7.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing hardware associated with the terminal device, where the program may be stored in a computer-readable storage medium, and the storage medium may include: flash disk, ROM, RAM, magnetic or optical disk, and the like.
The above-mentioned serial numbers of the embodiments of the present application are merely for description, and do not represent the advantages and disadvantages of the embodiments.
The integrated unit in the above embodiments, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in the above computer-readable storage medium. Based on such understanding, the part of the technical solutions of the present application that in essence contributes to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium, which includes instructions for causing one or more computer devices (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application.
In the embodiments of the present application, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one type of logical functional division, and other divisions may be implemented in practice, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed coupling or direct coupling or communication connection between each other may be an indirect coupling or communication connection through some interfaces, units or modules, and may be electrical or in other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, and may also be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution provided in the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present application and it should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications should also be considered as the protection scope of the present application.
Claims (11)
1. An execution method of a device scenario, comprising:
acquiring a scene execution instruction of a target scene, wherein the target scene comprises target equipment operation executed by target twin equipment on a target intelligent device;
responding to the scene execution instruction, controlling the target twin device to execute the target device operation, and displaying an execution flow of the target twin device to execute the target device operation;
acquiring a scene execution result of target physical equipment corresponding to the target twin equipment, wherein the scene execution result is an execution result of the target scene on the target physical equipment;
and displaying the execution result of the target twin equipment executing the target scene according to the scene execution result.
2. The method of claim 1, wherein the displaying of the execution flow of the target twin device to execute the target device operation comprises:
and displaying an execution flow of the target twin device executing the target device operation at a target position in a target area of a display interface of the target intelligent device, wherein a group of twin devices corresponding to a group of physical devices are displayed in the target area according to the position relationship of the group of physical devices, the device shape and the device state of each twin device in the group of twin devices are matched with the device shape and the device state of the corresponding physical device, and the target position is the position of the target twin device.
3. The method of claim 1, wherein the displaying of the execution flow of the target twin device to execute the target device operation comprises:
in a case where the target scene includes a plurality of device operations executed by a plurality of twin devices in a preset execution order, sequentially displaying, in the preset execution order, an execution flow in which a twin device executing each of the plurality of device operations executes the each device operation, wherein the target twin device includes the plurality of twin devices, the target device operation includes the plurality of device operations, and the each device operation is executed by one of the plurality of twin devices.
4. The method of claim 1, wherein prior to said controlling said target twin device to perform said target device operation, said method further comprises:
and determining the target twin equipment and the target equipment operation according to target configuration information of the target scene, wherein the target configuration information is used for indicating the corresponding relation between equipment operation contained in the target scene and twin equipment executing equipment operation.
5. The method of any of claims 1 to 4, wherein the obtaining the scene execution instructions of the target scene comprises:
generating the scene execution instruction in response to the scene trigger operation executed on the target intelligent device; or,
receiving the scene execution instruction sent by the target physical device, wherein the scene execution instruction and the scene execution result are sent simultaneously; or,
receiving the scene execution instruction sent by other devices except the target physical device.
6. The method of claim 5, wherein after generating the scenario execution instruction in response to a scenario-triggering operation performed on the target smart device, the method further comprises:
and sending the scene execution instruction to the target physical device to control the target physical device to execute the target device operation contained in the target scene.
7. The method of claim 6, wherein after the sending the scenario execution instruction to the target physical device, the method further comprises:
receiving the scene execution instruction through a first device gateway to which the target intelligent device belongs;
under the condition that the device gateway to which the target physical device belongs is the first device gateway, the scene execution instruction is sent to the target physical device through the first device gateway;
and under the condition that the device gateway to which the target physical device belongs is a second device gateway, sending the scene execution instruction to the second device gateway through the first device gateway, so that the second device gateway sends the scene execution instruction to the target physical device.
8. The method of claim 7, wherein sending, by the first device gateway, the scene execution instructions to the second device gateway comprises:
under the condition that a data interaction channel exists between the first equipment gateway and the second equipment gateway, the scene execution instruction is sent to the second equipment gateway through the data interaction channel by the first equipment gateway; or,
under the condition that no data interaction channel exists between the first device gateway and the second device gateway and both the first device gateway and the second device gateway are in communication connection with a cloud server, the scene execution instruction is sent to the second device gateway through the cloud server by the first device gateway.
9. An apparatus for executing a device scenario, comprising:
a first obtaining unit, configured to obtain a scene execution instruction of a target scene, where the target scene includes a target device operation performed by a target twin device on a target smart device;
a control unit configured to control the target twin device to perform the target device operation in response to the scene execution instruction;
a first display unit, configured to display an execution flow of the target twin device executing the target device operation;
a second obtaining unit, configured to obtain a scene execution result of a target physical device corresponding to the target twin device, where the scene execution result is an execution result of the target scene on the target physical device;
and the second display unit is used for displaying the execution result of the target twin equipment executing the target scene according to the scene execution result.
10. A computer-readable storage medium, comprising a stored program, wherein the program when executed performs the method of any one of claims 1 to 8.
11. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program and the processor is arranged to execute the method of any of claims 1 to 8 by means of the computer program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210768538.1A CN115314505A (en) | 2022-07-01 | 2022-07-01 | Execution method and device of equipment scene, storage medium and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115314505A true CN115314505A (en) | 2022-11-08 |
Family
ID=83855273
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210768538.1A Pending CN115314505A (en) | 2022-07-01 | 2022-07-01 | Execution method and device of equipment scene, storage medium and electronic device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115314505A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113064351A (en) * | 2021-03-26 | 2021-07-02 | 京东数字科技控股股份有限公司 | Digital twin model construction method and device, storage medium and electronic equipment |
CN113703407A (en) * | 2021-08-29 | 2021-11-26 | 河海大学 | Method, system and equipment for constructing robot production line operating system based on digital twin |
WO2022032688A1 (en) * | 2020-08-14 | 2022-02-17 | Siemens Aktiengesellschaft | Method for remote assistance and device |
CN114137917A (en) * | 2021-11-19 | 2022-03-04 | 北京京东乾石科技有限公司 | Device control method, device, electronic device, system and storage medium |
CN114299390A (en) * | 2021-12-27 | 2022-04-08 | 烟台杰瑞石油服务集团股份有限公司 | Method and device for determining maintenance component demonstration video and safety helmet |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | PB01 | Publication | 
 | SE01 | Entry into force of request for substantive examination | 