CN113268004A - Scene creating method and device, computer equipment and storage medium - Google Patents
- Publication number
- CN113268004A (application number CN202110434552.3A)
- Authority
- CN
- China
- Prior art keywords
- scene
- device control
- creation
- control scene
- acquiring
- Prior art date
- Legal status: Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/26—Pc applications
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The embodiment of the application discloses a scene creating method and device, computer equipment and a storage medium. The method can acquire a creation instruction aiming at an equipment control scene; acquire scene parameters matched with the equipment control scene according to the creation instruction; perform backup processing on the equipment control scene according to the scene parameters to generate an initial equipment control scene; acquire an adjusting parameter corresponding to the initial equipment control scene; and adjust the initial equipment control scene based on the adjusting parameter to obtain a target equipment control scene. The convenience and efficiency of scene creation are thereby improved.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to a scene creation method, apparatus, computer device, and storage medium.
Background
With the rapid development of science and technology, intelligent devices are becoming more and more popular. A user can control an intelligent device through a smart home application installed on a mobile phone, and can conveniently control the device by creating a specific control scene in the smart home application. At present, the existing method for creating a new control scene is very cumbersome to operate, which reduces the convenience and efficiency of creating control scenes.
Disclosure of Invention
The embodiment of the application provides a scene creating method and device, computer equipment and a storage medium, which can improve the convenience and efficiency of scene creation.
In order to solve the above technical problem, an embodiment of the present application provides the following technical solutions:
the embodiment of the application provides a scene creating method, which comprises the following steps:
acquiring a creation instruction aiming at a device control scene;
acquiring scene parameters matched with the equipment control scene according to the creation instruction;
performing backup processing on the equipment control scene according to the scene parameters to generate an initial equipment control scene;
acquiring an adjusting parameter corresponding to the initial equipment control scene;
and adjusting the initial equipment control scene based on the adjusting parameters to obtain a target equipment control scene.
In an embodiment, the obtaining an adjustment parameter corresponding to the initial device control scenario includes:
acquiring a scene name and current time;
and determining an adjusting parameter corresponding to the initial equipment control scene according to the scene name and the current time.
In an embodiment, the obtaining an adjustment parameter corresponding to the initial device control scenario includes:
acquiring input voice information or text information;
and performing recognition analysis on the voice information or the text information to determine an adjusting parameter corresponding to the initial equipment control scene.
In an embodiment, the obtaining an adjustment parameter corresponding to the initial device control scenario includes:
acquiring user state information of a user in a use range of equipment corresponding to the equipment control scene;
and determining an adjusting parameter corresponding to the initial equipment control scene according to the user state information.
In an embodiment, the obtaining of the scene parameter matched with the device control scene according to the creation instruction includes:
and according to the creation instruction, obtaining scene parameters matched with the equipment control scene from a local cache of a terminal corresponding to scene creation.
In an embodiment, the obtaining of the scene parameter matched with the device control scene according to the creation instruction includes:
acquiring a scene identifier of the equipment control scene according to the creation instruction;
sending a parameter acquisition request carrying the scene identifier to a server;
and receiving the scene parameters which are returned by the server based on the parameter acquisition request and are matched with the scene identification.
In an embodiment, the performing backup processing on the device control scenario according to the scenario parameter to generate an initial device control scenario includes:
acquiring a scene template corresponding to the equipment control scene;
and fusing the scene parameters with the scene template to obtain an initial equipment control scene.
In one embodiment, the obtaining a scene template corresponding to the device control scene includes:
acquiring a current user using a terminal corresponding to scene creation;
and acquiring a scene template matched with the user identification of the current user and the equipment control scene from a template library.
In an embodiment, the obtaining of the creation instruction for the device control scenario includes:
displaying the created scene list in the scene management interface;
receiving a pressing operation in a preset area of the device control scene in the scene list;
responding to the pressing operation, and displaying, through a popup window, prompt information on whether to back up the scene;
receiving a determination operation input based on the prompt information;
and generating a creation instruction for the device control scene based on the determination operation.
In an embodiment, the obtaining of the creation instruction for the device control scenario includes:
and receiving a trigger operation of a preset backup key, and generating a creation instruction aiming at the equipment control scene based on the trigger operation.
In an embodiment, the obtaining of the creation instruction for the device control scenario includes:
and receiving user voice information input by a user, and generating a creation instruction aiming at the equipment control scene based on the user voice information.
In an embodiment, the obtaining of the creation instruction for the device control scenario includes:
gesture information input by a user is received, and a creation instruction for the equipment control scene is generated based on the gesture information.
In an embodiment, the obtaining of the creation instruction for the device control scenario includes:
receiving track information input in a touch display screen, and generating a creation instruction for the equipment control scene based on the track information.
In an embodiment, the obtaining of the creation instruction for the device control scenario includes:
fingerprint information input by a user is received, and a creation instruction for the equipment control scene is generated based on the fingerprint information.
In an embodiment, the obtaining of the creation instruction for the device control scenario includes:
detecting the current state of a terminal corresponding to scene creation;
and if the current state of the terminal matches the terminal state that triggers scene creation, generating a creation instruction aiming at the equipment control scene.
According to an aspect of the present application, there is also provided a scene creation apparatus, including:
the first acquisition module is used for acquiring a creation instruction aiming at a device control scene;
the second acquisition module is used for acquiring scene parameters matched with the equipment control scene according to the creation instruction;
the generating module is used for carrying out backup processing on the equipment control scene according to the scene parameters to generate an initial equipment control scene;
a third obtaining module, configured to obtain an adjustment parameter corresponding to the initial device control scene;
and the adjusting module is used for adjusting the initial equipment control scene based on the adjusting parameters to obtain a target equipment control scene.
In an embodiment, the third obtaining module is specifically configured to obtain a scene name and a current time; and determine an adjusting parameter corresponding to the initial equipment control scene according to the scene name and the current time.
In an embodiment, the third obtaining module is specifically configured to obtain input voice information or text information; and performing recognition analysis on the voice information or the text information to determine an adjusting parameter corresponding to the initial equipment control scene.
In an embodiment, the third obtaining module is specifically configured to obtain user state information of a user in a use range of the device corresponding to the device control scene; and determine an adjusting parameter corresponding to the initial device control scene according to the user state information.
In an embodiment, the second obtaining module is specifically configured to, according to the creation instruction, obtain a scene parameter matched with the device control scene from a local cache of a terminal corresponding to scene creation.
In an embodiment, the second obtaining module is specifically configured to obtain, according to the creating instruction, a scene identifier of the device control scene; sending a parameter acquisition request carrying the scene identifier to a server; and receiving the scene parameters which are returned by the server based on the parameter acquisition request and are matched with the scene identification.
In one embodiment, the generating module comprises:
the obtaining submodule is used for obtaining a scene template corresponding to the equipment control scene;
and the fusion submodule is used for fusing the scene parameters with the scene template to obtain an initial equipment control scene.
In an embodiment, the obtaining sub-module is specifically configured to obtain a current user using the terminal corresponding to scene creation; and acquire, from a template library, a scene template matched with the user identification of the current user and the equipment control scene.
In an embodiment, the first obtaining module is specifically configured to display a created scene list in a scene management interface; receive a pressing operation in a preset area of the device control scene in the scene list; in response to the pressing operation, display, through a popup window, prompt information on whether to back up the scene; receive a determination operation input based on the prompt information; and generate a creation instruction for the device control scene based on the determination operation.
In an embodiment, the first obtaining module is specifically configured to receive a trigger operation of a preset backup key, and generate a creation instruction for the device control scene based on the trigger operation.
In an embodiment, the first obtaining module is specifically configured to receive user voice information input by a user, and generate a creation instruction for the device control scene based on the user voice information.
In an embodiment, the first obtaining module is specifically configured to receive gesture information input by a user, and generate a creation instruction for the device control scene based on the gesture information.
In an embodiment, the first obtaining module is specifically configured to receive trajectory information input in a touch display screen, and generate a creation instruction for the device control scene based on the trajectory information.
In an embodiment, the first obtaining module is specifically configured to receive fingerprint information input by a user, and generate a creation instruction for the device control scene based on the fingerprint information.
In an embodiment, the first obtaining module is specifically configured to detect a current state of a terminal corresponding to scene creation; and if the current state of the terminal matches the terminal state that triggers scene creation, generate a creation instruction for the device control scene.
According to an aspect of the present application, there is also provided a computer device, including a processor and a memory, where the memory stores a computer program, and the processor executes any of the scene creation methods provided by the embodiments of the present application when calling the computer program in the memory.
According to an aspect of the present application, there is also provided a storage medium for storing a computer program, which is loaded by a processor to execute any of the scene creation methods provided by the embodiments of the present application.
According to the method and the device, a creation instruction for the device control scene can be acquired, scene parameters matched with the device control scene are acquired according to the creation instruction, then the device control scene can be backed up according to the scene parameters, and an initial device control scene is generated; at this time, adjusting parameters corresponding to the initial device control scene may be obtained, and the initial device control scene may be adjusted based on the adjusting parameters to obtain the target device control scene. According to the scheme, the device control scene can be backed up based on the scene parameters, and the initial device control scene is adjusted based on the adjusting parameters, so that the required target device control scene can be quickly obtained, and the convenience and the efficiency of scene creation are improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a schematic flowchart of a scene creation method provided in an embodiment of the present application;
fig. 2 is a schematic diagram of triggering scene creation by pressing according to an embodiment of the present application;
fig. 3 is a schematic diagram of triggering scene creation by a key manner according to an embodiment of the present application;
fig. 4 is a schematic diagram of triggering scene creation through track information according to an embodiment of the present application;
FIG. 5 is a schematic diagram of scene parameters of a device control scene provided in an embodiment of the present application;
fig. 6 is another schematic flowchart of a scene creation method provided in an embodiment of the present application;
fig. 7 is a schematic diagram of a scene creation apparatus provided in an embodiment of the present application;
fig. 8 is a schematic structural diagram of a computer device provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a scene creating method and device, computer equipment and a storage medium (namely a computer readable storage medium). The scene creating method may be applied to a scene creating apparatus, the scene creating apparatus may be specifically integrated in a computer device, the computer device may be a terminal, the terminal may be a mobile phone, a computer, or a wearable device, and the terminal may be in direct or indirect communication connection with a server through a wired or wireless communication manner, and the application is not limited herein. The server may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a Network service, cloud communication, a middleware service, a domain name service, a security service, a Content Delivery Network (CDN), a big data and artificial intelligence platform, but is not limited thereto.
The following are detailed below. It should be noted that the following description of the embodiments is not intended to limit the preferred order of the embodiments.
In this embodiment, a description will be given of a computer device as a terminal, please refer to fig. 1, and fig. 1 is a flowchart illustrating a scene creation method according to an embodiment of the present application. The scene creation method may include:
s101, acquiring a creation instruction aiming at the equipment control scene.
In this embodiment, a terminal such as a mobile phone or a wearable device may control a device through an application (for example, a smart home APP), and may create a device control scene (the device control scene may also be referred to as a control mode) for a device such as an air conditioner, a refrigerator, a television, a water heater, a fan, a lamp, a humidifier, a water dispenser, or an electric cooker. The device control scene may include a manual scene, an automatic scene, and the like, and the automatic scene may include a home mode, an away mode, and the like.
For example, taking an air conditioner as an example, the device control scenes corresponding to the air conditioner may include an afternoon nap scene and a late sleep scene. The afternoon nap scene may include controlling the air conditioner in the bedroom to operate according to the operation parameters of a 26 ℃ cooling mode in the time period of 12:30 to 14:00 on non-working days; the late sleep scene may include controlling the air conditioner in the bedroom to operate according to the operation parameters of a 26 ℃ cooling mode in the time period of 22:30 to 00:00 every day, and according to the operation parameters of a 27 ℃ cooling mode in the time period of 00:00 to 04:00.
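The afternoon-nap example above can be captured in a small data structure. The following Python sketch is purely illustrative: the class names, fields, and values are assumptions made for readability, not the application's actual data model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class SceneAction:
    """One timed action within a device control scene (illustrative fields)."""
    start: str          # e.g. "12:30"
    end: str            # e.g. "14:00"
    mode: str           # e.g. "cooling"
    temperature_c: int  # target temperature in degrees Celsius
    days: str = "daily" # e.g. "daily" or "non-working days"

@dataclass
class DeviceControlScene:
    scene_id: str
    name: str
    device: str
    actions: List[SceneAction] = field(default_factory=list)

# The afternoon-nap scene from the example above, encoded as data.
afternoon_nap = DeviceControlScene(
    scene_id="scene-001",
    name="afternoon nap",
    device="bedroom air conditioner",
    actions=[SceneAction("12:30", "14:00", "cooling", 26, "non-working days")],
)
```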
In one embodiment, obtaining a creation instruction for a device control scene may include: displaying a created scene list in a scene management interface; receiving a pressing operation in a preset area of a device control scene in the scene list; in response to the pressing operation, displaying, through a popup window, prompt information on whether to back up the scene; receiving a determination operation input based on the prompt information; and generating a creation instruction for the device control scene based on the determination operation.
The backup scene may be a new device control scene that is obtained by copying an existing device control scene and has scene parameters and the like consistent with the existing device control scene. The backup scenario may also be referred to as a clone scenario, for example, the device control scenario 1 may be cloned to obtain the device control scenario 2 consistent with the parameters and the like of the device control scenario 1.
In order to improve the convenience of scene creation, device control scenes may be created by a pressing manner. For example, as shown in fig. 2, a created scene list may be displayed in the scene management interface, and the scene list may include at least one created device control scene, for example, an air purifier control scene, an air conditioner control scene, and the like. When a scene needs to be backed up, a pressing operation input by the user may be received in a preset area of a device control scene in the scene list, where the preset area may be the display area of the device control scene, and the pressing operation may be a long-press operation, a click operation, or the like. The long-press operation may be a press whose duration is greater than a preset time threshold and whose pressure value is greater than a preset pressure threshold; the click operation may be a press whose duration is less than or equal to the preset time threshold and whose pressure value is greater than the preset pressure threshold. The preset time threshold and the preset pressure threshold may be flexibly set according to actual needs, and their specific values are not limited here. Then, in response to the pressing operation, a popup window may display prompt information on whether to back up the scene, for example, a prompt "whether to backup scene 2" together with "yes" and "no" selection buttons. When the scene needs to be backed up, a determination operation input by the user based on the prompt information may be received, for example, a click on the "yes" button, and a creation instruction for the device control scene may then be generated based on the determination operation.
It should be noted that, when the created scene list is displayed in the scene management interface, if the pressing operation is received in the preset area of the device control scene, the creation instruction for the device control scene may be generated directly in response to the pressing operation, without displaying the prompt information on whether to back up the scene.
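A minimal sketch of the press classification and confirmation flow described above, assuming normalized pressure values and arbitrary threshold constants; the function names, the popup stub, and the instruction format are illustrative only.

```python
PRESS_TIME_THRESHOLD_S = 0.5   # assumed preset time threshold
PRESS_FORCE_THRESHOLD = 0.2    # assumed preset pressure threshold (normalized)

def classify_press(duration_s: float, pressure: float) -> str:
    """Classify a press in the scene's preset area following the rules above:
    long press if duration > time threshold, click otherwise, and in both
    cases only when the pressure exceeds the pressure threshold."""
    if pressure <= PRESS_FORCE_THRESHOLD:
        return "ignored"
    return "long_press" if duration_s > PRESS_TIME_THRESHOLD_S else "click"

def confirm_via_popup(message: str) -> bool:
    """Stand-in for the popup prompt; a real app would show 'yes'/'no' buttons."""
    print(message)
    return True

def on_scene_pressed(scene_id: str, duration_s: float, pressure: float):
    """Prompt whether to back up the pressed scene and, on confirmation,
    return a creation instruction for that device control scene."""
    if classify_press(duration_s, pressure) == "ignored":
        return None
    if confirm_via_popup(f"Backup {scene_id}?"):
        return {"type": "create_scene", "source_scene": scene_id}
    return None

print(on_scene_pressed("scene 2", duration_s=0.8, pressure=0.5))
```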
In one embodiment, obtaining a creation instruction for a device control scenario may include: and receiving a trigger operation of a preset backup key, and generating a creation instruction aiming at the equipment control scene based on the trigger operation.
In order to improve flexibility of scene creation, a device control scene may be created in a key triggering manner, specifically, a backup key may be preset, where the backup key may be a physical key or a virtual key, and a type and a setting position of the backup key may be flexibly set according to actual needs. For example, as shown in fig. 3, a backup key may be set in the display area of the device control scene, at this time, a trigger operation of the backup key by the user may be received in the display area of the device control scene, and a creation instruction for the device control scene may be generated based on the trigger operation, where the trigger operation may be a click operation, a long press operation, or the like.
In one embodiment, obtaining a creation instruction for a device control scenario may include: and receiving user voice information input by a user, and generating a creation instruction for the equipment control scene based on the user voice information.
In order to improve the convenience and flexibility of scene creation, a device control scene may be created in a voice-triggered manner. Specifically, user voice information input by the user, for example, "please backup scene 1", may be received through a voice collector such as a microphone. Semantic analysis may then be performed on the user voice information to obtain the corresponding semantic information, keywords may be extracted from the semantic information, and a creation instruction for the device control scene may be generated based on the keywords (for example, "backup" and "scene 1").
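As a rough illustration of turning recognized speech into a creation instruction by keyword extraction: a real implementation would rely on proper speech recognition and semantic parsing, and the regular expression and instruction format below are assumptions.

```python
import re

def creation_instruction_from_voice(utterance: str):
    """Very rough keyword matching over recognized speech, e.g. "please backup scene 1"."""
    text = utterance.lower()
    if "backup" not in text and "back up" not in text:
        return None
    match = re.search(r"scene\s*(\d+)", text)
    if not match:
        return None
    return {"type": "create_scene", "source_scene": f"scene {match.group(1)}"}

print(creation_instruction_from_voice("Please backup scene 1"))
# {'type': 'create_scene', 'source_scene': 'scene 1'}
```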
In one embodiment, obtaining a creation instruction for a device control scenario may include: gesture information input by a user is received, and a creation instruction for the device control scene is generated based on the gesture information.
In order to improve the diversity of scene creation, a device control scene can be created in a gesture triggering mode. Specifically, a mapping relationship between target gesture information and a creation instruction may be preset, when a scene needs to be backed up, gesture information input by a user may be collected through a preset camera, where the gesture information may include a waving hand, a scissor hand, a fist or the like, and then it may be determined whether the collected gesture information is the target gesture information, and if the collected gesture information is the target gesture information, a creation instruction for an equipment control scene is generated based on the gesture information; and if the acquired gesture information is not the target gesture information, not generating a creation instruction.
In one embodiment, obtaining a creation instruction for a device control scenario may include: receiving track information input in the touch display screen, and generating a creation instruction for the device control scene based on the track information.
In order to improve the flexibility and convenience of scene creation, a device control scene may be created in a track-triggered manner. Specifically, a mapping relationship between target track information and a creation instruction may be preset. When a scene needs to be backed up, track information input by the user on the touch display screen is received, where the track information may include a click track or a slide track input on the touch display screen; the click track may include one point or multiple points, and the slide track may include a horizontal line track, a vertical line track, a circle track, or a polygonal track. The track information may be input at any position or at a designated position on the touch display screen. For example, as shown in fig. 4, the track information may be input within the display area in which a device control scene (e.g., scene 2) is located on the touch display screen, and a creation instruction for the device control scene (e.g., scene 2) may be generated based on the input track information.
In one embodiment, obtaining a creation instruction for a device control scenario may include: fingerprint information input by a user is received, and a creation instruction for the equipment control scene is generated based on the fingerprint information.
In order to improve the reliability and flexibility of scene creation, a device control scene can be created in a fingerprint-triggered manner. Specifically, a mapping relationship between target fingerprint information and a creation instruction may be preset. When a scene needs to be backed up, fingerprint information input by the user may be collected by a preset fingerprint collector, where the fingerprint information may include the fingerprint of the left thumb, the fingerprint of the right middle finger, the fingerprint of the left index finger, and the like. It may then be determined whether the collected fingerprint information is the target fingerprint information; if so, a creation instruction for the device control scene is generated based on the fingerprint information, and if not, no creation instruction is generated.
In one embodiment, obtaining a creation instruction for a device control scene may include: detecting the current state of a terminal corresponding to scene creation; and if the current state of the terminal matches the terminal state that triggers scene creation, generating a creation instruction for the device control scene.
In order to improve the diversity and flexibility of scene creation, a device control scene may be created in a terminal-state-triggered manner. Specifically, a mapping relationship between a terminal state and a creation instruction may be preset. The terminal state may include the terminal tilting 45 degrees or 60 degrees, shaking, and the like, where shaking may be a back-and-forth movement of the terminal within a preset area range at a preset speed within a preset time period; the preset time period, the preset area range, and the preset speed may be flexibly set according to actual needs. When a scene needs to be backed up, the current state of the terminal (such as a mobile phone) corresponding to scene creation may be detected. If the current state of the terminal matches the terminal state that triggers scene creation, a creation instruction for the device control scene is generated; if it does not match, no creation instruction is generated. The current state matching the trigger state may mean that the two states are completely consistent, or that their similarity reaches a preset similarity threshold, which may be flexibly set according to actual needs.
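The state-matching step could look roughly like the sketch below; the set of trigger states, the similarity function, and the threshold are placeholders rather than values prescribed by the method.

```python
TRIGGER_STATES = {"tilt_45", "tilt_60", "shake"}   # assumed states mapped to creation
SIMILARITY_THRESHOLD = 0.8                          # assumed preset similarity threshold

def state_similarity(current: str, target: str) -> float:
    """Toy similarity: 1.0 for an exact match, 0.0 otherwise.
    A real implementation would compare sensor readings (angle, acceleration)."""
    return 1.0 if current == target else 0.0

def creation_instruction_from_state(current_state: str, scene_id: str):
    """Return a creation instruction when the detected state matches a trigger state."""
    for target in TRIGGER_STATES:
        if state_similarity(current_state, target) >= SIMILARITY_THRESHOLD:
            return {"type": "create_scene", "source_scene": scene_id}
    return None  # no instruction when the state does not match

print(creation_instruction_from_state("shake", "scene 2"))
```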
And S102, acquiring scene parameters matched with the equipment control scene according to the creation instruction.
The scene parameters may be flexibly set according to the type of the device to be controlled, for example, for an air conditioner control scene, the scene parameters may include timing time, temperature, an air supply mode, a cooling mode, a heating mode, an air speed, and the like. For another example, for a humidifier, the scene parameters may include humidity, timing time, temperature, and wind speed, among others.
In one embodiment, acquiring scene parameters matching the device control scene according to the creation instruction may include: and according to the creation instruction, acquiring scene parameters matched with the equipment control scene from a local cache of the terminal corresponding to the scene creation.
When the scene parameters corresponding to the device control scene are stored in a local cache of the terminal (for example, a mobile phone) corresponding to scene creation, if a scene needs to be backed up, the scene parameters matched with the device control scene can be quickly acquired from the local cache of the terminal based on the indication of the creation instruction, thereby improving the efficiency of acquiring the scene parameters.
In one embodiment, acquiring scene parameters matching the device control scene according to the creation instruction may include: acquiring a scene identifier of a device control scene according to the creation instruction; sending a parameter acquisition request carrying a scene identifier to a server; and receiving the scene parameters matched with the scene identification returned by the server based on the parameter acquisition request.
When the server stores the scene parameters corresponding to the device control scene, the scene parameters corresponding to the device control scene may be obtained from the server, for example, when the scene parameters need to be obtained based on the indication of the creation instruction, a scene identifier (e.g., a scene ID) of the device control scene may be obtained, where the scene identifier may be a unique identification identifier of the device control scene on the server, and the scene parameters corresponding to the device control scene may be retrieved through the scene identifier. And then, a parameter acquisition request carrying the scene identifier can be sent to the server, and the scene parameters which are returned by the server based on the parameter acquisition request and are matched with the scene identifier are received.
It should be noted that, when the local cache of the terminal and the server both store the scene parameters corresponding to the device control scene, the scene parameters corresponding to the device control scene may be preferentially obtained from the local cache of the terminal, and when the obtaining from the local cache fails, the scene parameters corresponding to the device control scene are obtained from the server. Or when the scene parameters need to be acquired, whether the scene parameters corresponding to the device control scene are stored in the local cache of the terminal or not can be judged firstly, and if the scene parameters corresponding to the device control scene are stored in the local cache of the terminal, the scene parameters corresponding to the device control scene are acquired from the local cache of the terminal; and if the scene parameters corresponding to the equipment control scene are not stored in the local cache of the terminal, acquiring the scene parameters corresponding to the equipment control scene from the server.
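The cache-first, server-fallback order described in this note can be expressed compactly. The sketch below assumes a dictionary-like local cache and an injected server call, so that transport details stay out of the example; none of the names are prescribed by the method.

```python
from typing import Optional

def get_scene_parameters(scene_id: str, local_cache: dict,
                         fetch_from_server) -> Optional[dict]:
    """Cache-first lookup with server fallback, mirroring the order described above.
    `fetch_from_server` is any callable taking a scene identifier and returning
    the parameters (or None)."""
    params = local_cache.get(scene_id)
    if params is not None:
        return params                      # fast path: local cache hit
    return fetch_from_server(scene_id)     # fallback: ask the server

# Example usage with a stubbed server call.
cache = {"scene-001": {"mode": "cooling", "temperature_c": 26}}
print(get_scene_parameters("scene-001", cache, lambda sid: None))
print(get_scene_parameters("scene-002", cache, lambda sid: {"mode": "heating"}))
```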
And S103, performing backup processing on the equipment control scene according to the scene parameters to generate an initial equipment control scene.
After the scene parameters are obtained, the device control scene may be backed up according to the scene parameters to generate an initial device control scene, where the initial device control scene is consistent with the backed-up device control scene in terms of the device to be controlled, the scene parameters, and other related information.
In an embodiment, performing backup processing on a device control scene according to scene parameters to generate an initial device control scene may include: acquiring a scene template corresponding to the device control scene; and fusing the scene parameters with the scene template to obtain the initial device control scene.
Each device control scene may correspond to a respective scene template. In the process of backing up a scene, the scene template corresponding to the device control scene may be acquired from the local cache of the terminal or from the server, and the acquired scene parameters may then be fused with the scene template to obtain the initial device control scene. For example, when the air conditioner control scene 1 needs to be backed up, the scene template of the air conditioner control scene 1 may be acquired, and the acquired scene parameters of the air conditioner control scene 1 may be filled into the corresponding parameter positions in the scene template, so that an initial device control scene identical to the air conditioner control scene 1 can be obtained, realizing rapid generation of the initial device control scene.
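A minimal sketch of the fusion step, assuming the template exposes a parameter slot to be filled; the dictionary layout and field names are illustrative assumptions.

```python
import copy

def fuse_template(scene_template: dict, scene_parameters: dict) -> dict:
    """Fill the parameter slot of a scene template with the backed-up scene's
    parameters to form the initial device control scene."""
    initial_scene = copy.deepcopy(scene_template)
    initial_scene["parameters"] = dict(scene_parameters)  # copy into the template slot
    return initial_scene

template = {"layout": "default", "device": "air conditioner", "parameters": {}}
params = {"mode": "cooling", "temperature_c": 26, "schedule": "12:30-14:00"}
initial_scene = fuse_template(template, params)
print(initial_scene)
```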
In one embodiment, acquiring a scene template corresponding to a device control scene may include: acquiring a current user using the terminal corresponding to scene creation; and acquiring, from the template library, a scene template matched with the user identification of the current user and the device control scene.
In order to improve the diversity and personalization of initial device control scene generation, the scene background, personalized information, and the like may be set automatically according to the current user, so that backup scenes (that is, initial device control scenes) with different styles can be displayed for different users. For example, a simple backup scene may be displayed for parents, while a backup scene with a cartoon background and operation navigation may be displayed for children. Specifically, mapping relationships between the user identifiers of different users, the scene identifiers of device control scenes, and scene templates may be preset and stored in a template library local to the terminal or on the server. When a scene needs to be backed up, the current user using the terminal corresponding to scene creation may be obtained first; for example, face information of the current user may be collected by a camera of the terminal, and the user identifier of the current user may be determined according to the face information. As another example, the user identifier of the current user may be determined from fingerprint information input by the current user when unlocking the terminal. Then, a scene template matching both the user identifier of the current user and the scene identifier of the device control scene may be obtained from the template library local to the terminal or on the server. Subsequently, the scene parameters and the scene template can be fused to obtain the initial device control scene.
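A hedged sketch of the per-user template lookup, with an assumed template library keyed by (user identifier, scene identifier) and a plain default fallback; the keys and style fields are placeholders.

```python
# Assumed template library keyed by (user identifier, scene identifier).
TEMPLATE_LIBRARY = {
    ("parent", "ac-scene"): {"style": "minimal"},
    ("child", "ac-scene"):  {"style": "cartoon", "guided_navigation": True},
}

def pick_template(user_id: str, scene_id: str) -> dict:
    """Select the template matching both the current user and the scene,
    falling back to a plain default when no personalized entry exists."""
    return TEMPLATE_LIBRARY.get((user_id, scene_id), {"style": "default"})

print(pick_template("child", "ac-scene"))   # cartoon background, guided navigation
```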
And S104, obtaining an adjusting parameter corresponding to the initial equipment control scene.
The adjusting parameters may be flexibly set according to the type of the device to be controlled, for example, for an air conditioning control scene, the adjusting parameters may include a scene name, a timing time, a temperature, an air supply mode, a cooling mode, a heating mode, an air speed, and the like. For another example, for a humidifier, the adjustment parameters may include a scene name, humidity, timing time, temperature, and wind speed, among others.
In an embodiment, obtaining an adjustment parameter corresponding to an initial device control scene may include: acquiring a scene name and the current time; and determining an adjustment parameter corresponding to the initial device control scene according to the scene name and the current time.
In order to improve the convenience of automatic acquisition of the adjustment parameters, a default scene name or a scene name input by the user may be acquired, and the current time (for example, the current Beijing time) may be acquired; the adjustment parameters corresponding to the initial device control scene may then be determined according to the scene name and the current time. For example, for an air conditioner control scene, if the acquired scene name is "bedroom air conditioner control scene" and the acquired current time is 12:30, it may be determined, according to the bedroom air conditioner control scene and the current time, that the parameters of the initial device control scene need to be adjusted to the parameters corresponding to an afternoon nap period, so as to obtain the adjustment parameters.
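As an illustration of deriving adjustment parameters from the scene name and the current time, the sketch below hard-codes an assumed nap window and nap-time parameters; the real mapping would be configured in the application.

```python
from datetime import datetime, time

def adjustment_from_name_and_time(scene_name: str, now: datetime) -> dict:
    """If the scene name mentions a bedroom air conditioner and the current time
    falls in an assumed afternoon-nap window, propose nap-time parameters."""
    nap_window = (time(12, 30), time(14, 0))          # assumed nap period
    if "bedroom" in scene_name and "air" in scene_name \
            and nap_window[0] <= now.time() <= nap_window[1]:
        return {"schedule": "12:30-14:00", "mode": "cooling", "temperature_c": 26}
    return {}                                          # no automatic adjustment

print(adjustment_from_name_and_time("bedroom air conditioner control scene",
                                    datetime(2021, 4, 20, 12, 30)))
```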
In an embodiment, obtaining an adjustment parameter corresponding to an initial device control scene may include: acquiring input voice information or text information; and performing recognition analysis on the voice information or the text information to determine an adjustment parameter corresponding to the initial device control scene.
In order to improve the flexibility and convenience of automatic acquisition of the adjustment parameters, voice information input by the user can be acquired through a voice acquisition unit such as a microphone, for example, "adjust the air conditioner temperature of the newly-built scene to 26 ℃". The acquired voice information can then be recognized and analyzed to obtain, for example, "adjust the air conditioner temperature to 26 ℃", and whether the temperature of the initial air conditioner control scene obtained by backup is consistent with the analyzed temperature is judged. If not, the adjustment parameter corresponding to the initial air conditioner control scene can be determined to be the temperature of 26 ℃; if so, the adjustment parameter can be determined to be maintaining the current temperature of 26 ℃.
Alternatively, an image containing text information may be acquired by the camera and the text information in the image identified, or text information input in the current interface of the terminal may be detected, for example, "adjust the temperature of the water heater in the newly-built scene to 36 ℃". The text information may then be recognized and analyzed to obtain, for example, "adjust the water heater temperature to 36 ℃", and whether the temperature of the initial water heater control scene obtained by backup is consistent with the analyzed temperature is determined. If not, the adjustment parameter corresponding to the initial water heater control scene may be determined to be the temperature of 36 ℃; if so, the adjustment parameter may be determined to maintain the current temperature of 36 ℃. In this way, automatic and rapid determination of the adjustment parameters is achieved without manual modification by the user.
In an embodiment, obtaining an adjustment parameter corresponding to an initial device control scene may include: acquiring user state information of a user within the use range of the equipment corresponding to the equipment control scene; and determining an adjustment parameter corresponding to the initial equipment control scene according to the user state information.
In order to improve the reliability of automatic acquisition of the adjustment parameters, the adjustment parameters may be determined based on user state information. Specifically, the user state information of a user within the use range of the device corresponding to the device control scene may be obtained. For example, the use range of the television corresponding to a living-room television control scene may be determined to be the living room, and the user state information of users in the living room may be obtained; the user state may include a phone-playing state, a walking state, a dining state, a sitting state, and the like, and may further include information such as the number of users and their ages, genders, and habits. As another example, the use range of the air conditioner corresponding to a bedroom air conditioner control scene may be determined to be the bedroom, and the user state information of users in the bedroom may be obtained; the user state may include a sleeping state, whether the user is covered by a quilt, a walking state, a sitting state, and the like, and may likewise include information such as the number of users and their ages, genders, and habits. Then, the adjustment parameters corresponding to the initial device control scene may be determined according to the user state information; for example, when the user state information indicates that user a is in a sleep state, the temperature adjustment parameter corresponding to the initial bedroom air conditioner control scene may be determined to be a corresponding value a.
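The user-state-to-adjustment mapping might be sketched as a simple lookup table; the states and parameter values below are placeholders, not values specified by the method.

```python
# Assumed mapping from observed user state to air-conditioner adjustments.
STATE_ADJUSTMENTS = {
    "sleeping":          {"temperature_c": 27, "wind_speed": "low"},
    "sleeping_no_quilt": {"temperature_c": 28, "wind_speed": "low"},
    "sitting":           {"temperature_c": 26, "wind_speed": "medium"},
}

def adjustment_from_user_state(user_state: str) -> dict:
    """Look up adjustment parameters for the detected user state; unknown
    states yield no automatic adjustment."""
    return STATE_ADJUSTMENTS.get(user_state, {})

print(adjustment_from_user_state("sleeping"))
```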
And S105, adjusting the initial equipment control scene based on the adjusting parameters to obtain a target equipment control scene.
After the initial device control scenario is obtained, the parameters of the initial device control scenario may be adjusted manually or automatically. For example, as shown in fig. 5, a user may control a terminal to enter a parameter setting interface of an initial device control scenario, and modify, delete, or add parameters of the initial device control scenario on the parameter setting interface, so as to obtain a target device control scenario.
After the target device control scene is obtained, the target scene identifier and the target scene parameters of the target device control scene may be uploaded to a server for storage, or the target scene identifier and the target scene parameters of the target device control scene may be stored in a local cache of the terminal in an associated manner.
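Putting the last two steps together, here is a sketch of applying the adjustment parameters to the initial scene and persisting the result locally and on the server; the merge strategy and storage interfaces are assumptions for illustration.

```python
def build_target_scene(initial_scene: dict, adjustments: dict) -> dict:
    """Apply the adjustment parameters on top of the initial scene's parameters
    to obtain the target device control scene (a shallow merge for illustration)."""
    merged = {**initial_scene.get("parameters", {}), **adjustments}
    return {**initial_scene, "parameters": merged}

def persist_target_scene(target_scene: dict, local_cache: dict, upload) -> None:
    """Store the result in the local cache and upload it; `upload` is any callable
    that sends (scene_id, parameters) to the server."""
    local_cache[target_scene["scene_id"]] = target_scene["parameters"]
    upload(target_scene["scene_id"], target_scene["parameters"])

cache = {}
scene = {"scene_id": "scene-002", "parameters": {"mode": "cooling", "temperature_c": 26}}
target = build_target_scene(scene, {"temperature_c": 27})
persist_target_scene(target, cache, lambda sid, p: print("uploaded", sid, p))
```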
According to the method and the device, a creation instruction for the device control scene can be acquired, scene parameters matched with the device control scene are acquired according to the creation instruction, then the device control scene can be backed up according to the scene parameters, and an initial device control scene is generated; at this time, adjusting parameters corresponding to the initial device control scene may be obtained, and the initial device control scene may be adjusted based on the adjusting parameters to obtain the target device control scene. According to the scheme, the device control scene can be backed up based on the scene parameters, and the initial device control scene is adjusted based on the adjusting parameters, so that the required target device control scene can be quickly obtained, and the convenience and the efficiency of scene creation are improved.
The method described in the above embodiments is further illustrated in detail by way of example.
Referring to fig. 6, fig. 6 is a flowchart illustrating a scene creating method according to an embodiment of the present application. The method flow can comprise the following steps:
and S20, the terminal triggers a backup device control scene.
For example, the terminal may generate a creation instruction for the device control scenario through the above-mentioned press trigger manner, key trigger manner, voice information trigger manner, gesture information trigger manner, track information trigger manner, fingerprint information trigger manner, or terminal state trigger manner, so that the backup device control scenario may be triggered based on the creation instruction.
And S21, the terminal sends a parameter acquisition request carrying the scene identifier to the server.
The server may be an Internet of Things (IoT) cloud. When backup of a device control scene is triggered, it indicates that a scene backup operation needs to be performed. At this time, the terminal may send a parameter acquisition request carrying the scene identifier to the server to acquire the scene parameters. For example, the parameter acquisition request may be a Hypertext Transfer Protocol (HTTP) request, and the terminal may write the scene identifier into a preset field of the HTTP request and send the HTTP request containing the scene identifier to the server.
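A sketch of the parameter acquisition request, using the third-party `requests` client for brevity; the endpoint URL and the request/response field names are assumptions, since the text only specifies that the scene identifier is written into a preset field of an HTTP request.

```python
import requests  # third-party HTTP client, used here only for illustration

SERVER_URL = "https://iot.example.com/api/scene/parameters"   # assumed endpoint

def request_scene_parameters(scene_id: str) -> dict:
    """Send a parameter acquisition request carrying the scene identifier in an
    assumed request field, and return the parameters from the server's response."""
    response = requests.post(SERVER_URL, json={"scene_id": scene_id}, timeout=5)
    response.raise_for_status()
    return response.json()["scene_parameters"]   # assumed response field
```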
S22, the server searches the scene parameters corresponding to the scene identification.
The server may extract the scene identifier from the received parameter obtaining request, for example, the parameter obtaining request may be parsed to extract the scene identifier from a preset field. At this time, the server may search for the scene parameter corresponding to the scene identifier from a database in which mapping relationships between different scene identifiers and scene parameters are stored in advance.
And S23, returning the scene parameters to the terminal by the server.
And S24, the terminal generates an initial device control scene based on the scene parameters.
For example, after receiving the scene parameters, the terminal may jump to display a new scene interface, and generate an initial device control scene based on the scene parameters in the new scene interface. For example, a scene template corresponding to the device control scene may be obtained according to the above generation manner, and the scene parameters are fused with the scene template to obtain an initial device control scene.
And S25, modifying the parameters by the terminal, and generating a target device control scene.
The terminal may automatically modify some or all of the parameters of the initial device control scene, or the terminal may receive a modification instruction input by the user and modify some or all of the parameters of the initial device control scene (e.g., add, delete, or modify them) based on the modification instruction. For example, the terminal may automatically determine the adjustment parameters based on the current time, voice information, text information, or user state information, and modify the parameters of the initial device control scene based on the adjustment parameters to obtain the target device control scene.
S26, the terminal sends the target scene identification and the target scene parameters of the target device control scene to the server.
After the target device control scene is generated, the terminal may send a target scene identifier, a target scene parameter, and the like of the target device control scene to the server.
S27, the server stores the target scene identification and the target scene parameters.
S28, the server sends a scene creation success message to the terminal.
And S29, the terminal outputs prompt information of scene creation success.
For example, the terminal may display the prompt information that the scene creation is successful in the newly created scene interface, or output the prompt information that the scene creation is successful through voice broadcast.
In the above embodiments, the descriptions of the embodiments have respective emphasis, and parts that are not described in detail in a certain embodiment may refer to the above detailed description of the scene creation method, and are not described herein again.
In the embodiment of the application, when backup of a device control scene is triggered, the scene parameters are obtained from the server, an initial device control scene is generated based on the scene parameters, and the initial device control scene can then be automatically modified to generate the target device control scene. When a scene similar to an existing device control scene needs to be created, only some parameters need to be modified on the basis of the backup scene, which achieves the purpose of quickly creating a scene and improves the convenience and efficiency of scene creation.
In order to better implement the scene creation method provided by the embodiments of the present application, an embodiment of the present application further provides an apparatus based on the scene creation method. The meanings of the terms are the same as those in the scene creation method described above, and specific implementation details may refer to the description in the method embodiments.
Referring to fig. 7, fig. 7 is a schematic structural diagram of a scene creating apparatus according to an embodiment of the present disclosure, where the scene creating apparatus 300 may include a first obtaining module 301, a second obtaining module 302, a generating module 303, a third obtaining module 304, an adjusting module 305, and the like.
The first obtaining module 301 is configured to obtain a creation instruction for a device control scenario.
And a second obtaining module 302, configured to obtain, according to the creation instruction, a scene parameter matched with the device control scene.
The generating module 303 is configured to perform backup processing on the device control scene according to the scene parameter, and generate an initial device control scene.
A third obtaining module 304, configured to obtain an adjusting parameter corresponding to the initial device control scenario.
And an adjusting module 305, configured to adjust the initial device control scenario based on the adjustment parameter, so as to obtain a target device control scenario.
In an embodiment, the third obtaining module 304 is specifically configured to obtain a scene name and the current time; and determine an adjustment parameter corresponding to the initial device control scene according to the scene name and the current time.
In an embodiment, the third obtaining module 304 is specifically configured to obtain input voice information or text information; and perform recognition analysis on the voice information or the text information to determine an adjustment parameter corresponding to the initial device control scene.
In an embodiment, the third obtaining module 304 is specifically configured to obtain user state information of a user in the use range of the device corresponding to the device control scene; and determine an adjustment parameter corresponding to the initial device control scene according to the user state information.
In an embodiment, the second obtaining module 302 is specifically configured to obtain, according to the creation instruction, the scene parameters matched with the device control scene from the local cache of the terminal used for scene creation.
In an embodiment, the second obtaining module 302 is specifically configured to obtain a scene identifier of the device control scene according to the creation instruction, send a parameter acquisition request carrying the scene identifier to a server, and receive the scene parameters matched with the scene identifier and returned by the server based on the parameter acquisition request.
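A rough sketch of this request/response exchange is shown below; the endpoint path "/scene/parameters" and the JSON field names are assumptions made for illustration, not an interface defined by the embodiments.

```python
# Terminal-side sketch: send a parameter acquisition request carrying the
# scene identifier and return the matching scene parameters from the server.
import json
from urllib import request


def request_scene_parameters(server_url: str, scene_id: str) -> dict:
    payload = json.dumps({"scene_id": scene_id}).encode("utf-8")
    req = request.Request(f"{server_url}/scene/parameters",
                          data=payload,
                          headers={"Content-Type": "application/json"},
                          method="POST")
    with request.urlopen(req) as resp:  # blocking call, kept simple for the sketch
        return json.loads(resp.read().decode("utf-8"))
```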
In one embodiment, the generating module 303 includes:
an acquisition submodule, configured to acquire a scene template corresponding to the device control scene;
and a fusion submodule, configured to fuse the scene parameters with the scene template to obtain the initial device control scene.
In an embodiment, the acquisition submodule is specifically configured to obtain the current user of the terminal used for scene creation, and acquire, from the template library, a scene template matched with the user identifier of the current user and the device control scene.
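As a small illustration of this template lookup and fusion step (the template-library layout keyed by user identifier and scene type is an assumption for the sketch):

```python
# Sketch: pick the template matching the current user and the scene type,
# then overlay the backed-up scene parameters to form the initial scene.
def build_initial_scene(template_library: dict, user_id: str,
                        scene_type: str, scene_parameters: dict) -> dict:
    template = (template_library.get((user_id, scene_type))
                or template_library[("default", scene_type)])  # shared fallback
    # Fusion: template fields first, then overwrite with the scene parameters.
    return {**template, **scene_parameters}
```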
In an embodiment, the first obtaining module 301 is specifically configured to display the created scene list in the scene management interface; receive a pressing operation in a preset area of a device control scene in the scene list; in response to the pressing operation, display, through a popup window, prompt information asking whether to back up the scene; receive a determination operation input based on the prompt information; and generate a creation instruction for the device control scene based on the determination operation.
In an embodiment, the first obtaining module 301 is specifically configured to receive a trigger operation of a preset backup key, and generate a creation instruction for a device control scene based on the trigger operation.
In an embodiment, the first obtaining module 301 is specifically configured to receive user voice information input by a user, and generate a creation instruction for a device control scene based on the user voice information.
In an embodiment, the first obtaining module 301 is specifically configured to receive gesture information input by a user, and generate a creation instruction for a device control scene based on the gesture information.
In an embodiment, the first obtaining module 301 is specifically configured to receive track information input on the touch display screen, and generate a creation instruction for the device control scene based on the track information.
In an embodiment, the first obtaining module 301 is specifically configured to receive fingerprint information input by a user, and generate a creation instruction for a device control scene based on the fingerprint information.
In an embodiment, the first obtaining module 301 is specifically configured to detect the current state of the terminal used for scene creation, and generate a creation instruction for the device control scene if the current state of the terminal matches the terminal state that triggers scene creation.
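The different trigger paths above can all resolve to the same creation instruction. The sketch below illustrates one possible way to model this; the trigger names, the set of trigger states, and the CreationInstruction fields are assumptions for the sketch only.

```python
# Sketch: generate a creation instruction from either a user input
# (long press, backup key, voice, gesture, track, fingerprint) or a
# matching terminal state.
from dataclasses import dataclass
from typing import Optional


@dataclass
class CreationInstruction:
    scene_id: str
    trigger: str


# Terminal states assumed (for illustration) to trigger scene creation.
TRIGGER_STATES = {"charging_at_night", "connected_to_home_wifi"}


def instruction_from_user_input(kind: str, scene_id: str,
                                confirmed: bool = True) -> Optional[CreationInstruction]:
    # The long-press path additionally waits for the popup confirmation.
    if kind == "long_press" and not confirmed:
        return None
    return CreationInstruction(scene_id, trigger=kind)


def instruction_from_terminal_state(current_state: str,
                                    scene_id: str) -> Optional[CreationInstruction]:
    # Only generate an instruction if the current state matches a trigger state.
    if current_state in TRIGGER_STATES:
        return CreationInstruction(scene_id, trigger=f"terminal_state:{current_state}")
    return None
```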
In the embodiment of the application, the first obtaining module 301 may obtain a creation instruction for a device control scene, the second obtaining module 302 obtains scene parameters matched with the device control scene according to the creation instruction, and the generating module 303 may perform backup processing on the device control scene according to the scene parameters to generate an initial device control scene; the third obtaining module 304 may then obtain the adjustment parameter corresponding to the initial device control scene, and the adjusting module 305 adjusts the initial device control scene based on the adjustment parameter to obtain the target device control scene. According to this scheme, the device control scene can be backed up based on the scene parameters, and the initial device control scene can be adjusted based on the adjustment parameter, so that the required target device control scene can be obtained quickly, which improves the convenience and efficiency of scene creation.
Each of the above embodiments has its own emphasis; for parts that are not described in detail in a certain embodiment, reference may be made to the detailed description of the scene creation method above, and details are not repeated here.
An embodiment of the present application further provides a computer device, which may be a terminal such as a mobile phone. Fig. 8 shows a schematic structural diagram of the computer device according to an embodiment of the present application. Specifically:
the computer device may include components such as a processor 401 of one or more processing cores, memory 402 of one or more computer-readable storage media, a power supply 403, and an input unit 404. Those skilled in the art will appreciate that the computer device configuration illustrated in FIG. 8 does not constitute a limitation of computer devices, and may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components. Wherein:
the processor 401 is a control center of the computer device, connects various parts of the entire computer device using various interfaces and lines, and performs various functions of the computer device and processes data by running or executing software programs and/or modules stored in the memory 402 and calling data stored in the memory 402, thereby monitoring the computer device as a whole. Optionally, processor 401 may include one or more processing cores; preferably, the processor 401 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 401.
The memory 402 may be used to store software programs and modules, and the processor 401 executes various functional applications and data processing by running the software programs and modules stored in the memory 402. The memory 402 may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the computer device, and the like. Further, the memory 402 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. Accordingly, the memory 402 may also include a memory controller to provide the processor 401 with access to the memory 402.
The computer device further includes a power supply 403 for supplying power to the various components. Preferably, the power supply 403 is logically connected to the processor 401 via a power management system, so that functions such as charging, discharging, and power consumption management are implemented via the power management system. The power supply 403 may further include one or more of a DC or AC power source, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other such components.
The computer device may also include an input unit 404, the input unit 404 being operable to receive input numeric or character information and to generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control.
Although not shown, the computer device may further include a display unit and the like, which are not described in detail herein. Specifically, in this embodiment, the processor 401 in the computer device loads the executable file corresponding to the process of one or more application programs into the memory 402 according to the following instructions, and the processor 401 runs the application programs stored in the memory 402, thereby implementing various functions as follows:
acquiring a creation instruction for a device control scene, and acquiring scene parameters matched with the device control scene according to the creation instruction; performing backup processing on the device control scene according to the scene parameters to generate an initial device control scene; and acquiring an adjustment parameter corresponding to the initial device control scene, and adjusting the initial device control scene based on the adjustment parameter to obtain a target device control scene.
In one embodiment, when acquiring the adjustment parameter corresponding to the initial device control scene, the processor 401 may perform: acquiring a scene name and the current time, and determining an adjustment parameter corresponding to the initial device control scene according to the scene name and the current time.
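As one possible illustration of how a scene name and the current time could be turned into adjustment parameters (the copy suffix and the next-hour rule are assumptions, not something the embodiments prescribe):

```python
# Sketch: derive adjustment parameters from the scene name and current time.
from datetime import datetime
from typing import Optional


def adjustment_from_name_and_time(scene_name: str,
                                  now: Optional[datetime] = None) -> dict:
    now = now or datetime.now()
    return {
        "name": f"{scene_name} (copy)",                   # distinguish from the source scene
        "trigger_time": f"{(now.hour + 1) % 24:02d}:00",  # shift to the next full hour
    }
```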
In one embodiment, when acquiring the adjustment parameter corresponding to the initial device control scene, the processor 401 may perform: acquiring input voice information or text information, and recognizing and analyzing the voice information or the text information to determine an adjustment parameter corresponding to the initial device control scene.
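A minimal sketch of such recognition analysis on a piece of recognized text is given below; the regular expressions and the parameter names are assumptions for illustration, and a real implementation would rely on proper speech recognition and semantic parsing.

```python
# Sketch: extract adjustment parameters such as temperature and trigger
# time from text like "set the study air conditioner to 24 degrees at 8 pm".
import re


def adjustment_from_text(text: str) -> dict:
    adjustments: dict = {}
    temp = re.search(r"(\d{1,2})\s*degrees", text)
    if temp:
        adjustments["temperature"] = int(temp.group(1))
    when = re.search(r"at\s+(\d{1,2})\s*(am|pm)", text)
    if when:
        hour = int(when.group(1)) % 12 + (12 if when.group(2) == "pm" else 0)
        adjustments["trigger_time"] = f"{hour:02d}:00"
    return adjustments
```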
In one embodiment, when acquiring the adjustment parameter corresponding to the initial device control scene, the processor 401 may perform: acquiring user state information of a user within the use range of the device corresponding to the device control scene, and determining an adjustment parameter corresponding to the initial device control scene according to the user state information.
In an embodiment, when acquiring the scene parameters matched with the device control scene according to the creation instruction, the processor 401 may perform: acquiring, according to the creation instruction, the scene parameters matched with the device control scene from the local cache of the terminal used for scene creation.
In an embodiment, when acquiring the scene parameters matched with the device control scene according to the creation instruction, the processor 401 may perform: acquiring a scene identifier of the device control scene according to the creation instruction; sending a parameter acquisition request carrying the scene identifier to a server; and receiving the scene parameters matched with the scene identifier and returned by the server based on the parameter acquisition request.
In an embodiment, when performing backup processing on the device control scene according to the scene parameters to generate an initial device control scene, the processor 401 may perform: acquiring a scene template corresponding to the device control scene, and fusing the scene parameters with the scene template to obtain the initial device control scene.
In one embodiment, when acquiring the scene template corresponding to the device control scene, the processor 401 may perform: acquiring the current user of the terminal used for scene creation, and acquiring, from the template library, a scene template matched with the user identifier of the current user and the device control scene.
In an embodiment, when obtaining a creation instruction for a device control scene, the processor 401 may perform: displaying the created scene list in the scene management interface; receiving a pressing operation in a preset area of a device control scene in the scene list; in response to the pressing operation, displaying, through a popup window, prompt information asking whether to back up the scene; receiving a determination operation input based on the prompt information; and generating a creation instruction for the device control scene based on the determination operation.
In an embodiment, when obtaining a creation instruction for a device control scene, the processor 401 may perform: receiving a trigger operation on a preset backup key, and generating a creation instruction for the device control scene based on the trigger operation.
In an embodiment, when obtaining a creation instruction for a device control scene, the processor 401 may perform: receiving user voice information input by a user, and generating a creation instruction for the device control scene based on the user voice information.
In an embodiment, when obtaining a creation instruction for a device control scene, the processor 401 may perform: receiving gesture information input by a user, and generating a creation instruction for the device control scene based on the gesture information.
In an embodiment, when obtaining a creation instruction for a device control scene, the processor 401 may perform: receiving track information input on the touch display screen, and generating a creation instruction for the device control scene based on the track information.
In an embodiment, when obtaining a creation instruction for a device control scene, the processor 401 may perform: receiving fingerprint information input by a user, and generating a creation instruction for the device control scene based on the fingerprint information.
In an embodiment, when obtaining a creation instruction for a device control scene, the processor 401 may perform: detecting the current state of the terminal used for scene creation, and generating a creation instruction for the device control scene if the current state of the terminal matches the terminal state that triggers scene creation.
Each of the above embodiments has its own emphasis; for parts that are not described in detail in a certain embodiment, reference may be made to the detailed description of the scene creation method above, and details are not repeated here.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions, the computer instructions being stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided in the various alternative implementations of the above embodiments.
It will be understood by those skilled in the art that all or part of the steps in the methods of the above embodiments may be performed by computer instructions, or by computer instructions controlling associated hardware, and the computer instructions may be stored in a computer-readable storage medium and loaded and executed by a processor. To this end, an embodiment of the present application provides a storage medium storing a computer program; the computer program may include computer instructions and can be loaded by a processor to execute any of the scene creation methods provided by the embodiments of the present application.
For specific implementations of the above operations, reference may be made to the foregoing embodiments, and details are not described here again.
Wherein the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.
Since the computer instructions stored in the storage medium can execute any of the scene creation methods provided in the embodiments of the present application, the beneficial effects achievable by any of those methods can also be achieved; for details, refer to the foregoing embodiments, which are not repeated here.
The scene creation method, scene creation apparatus, computer device, and storage medium provided in the embodiments of the present application are described in detail above. Specific examples are used herein to explain the principles and implementations of the present application, and the description of the above embodiments is only intended to help understand the method and core idea of the present application. Meanwhile, for those skilled in the art, there may be variations in the specific implementations and the application scope according to the idea of the present application. In summary, the content of this specification should not be construed as limiting the present application.
Claims (12)
1. A scene creation method, characterized in that the scene creation method comprises:
acquiring a creation instruction for a device control scene;
acquiring scene parameters matched with the device control scene according to the creation instruction;
performing backup processing on the device control scene according to the scene parameters to generate an initial device control scene;
acquiring an adjustment parameter corresponding to the initial device control scene;
and adjusting the initial device control scene based on the adjustment parameter to obtain a target device control scene.
2. The scene creation method according to claim 1, wherein the acquiring of the adjustment parameter corresponding to the initial device control scene includes:
acquiring a scene name and the current time;
and determining an adjustment parameter corresponding to the initial device control scene according to the scene name and the current time.
3. The scene creation method according to claim 1, wherein the acquiring of the adjustment parameter corresponding to the initial device control scene includes:
acquiring input voice information or text information;
performing recognition analysis on the voice information or the text information to determine an adjustment parameter corresponding to the initial device control scene; or,
acquiring user state information of a user within a use range of a device corresponding to the device control scene;
and determining an adjustment parameter corresponding to the initial device control scene according to the user state information.
4. The scene creation method according to claim 1, wherein the acquiring of the scene parameters matched with the device control scene according to the creation instruction includes:
acquiring, according to the creation instruction, scene parameters matched with the device control scene from a local cache of a terminal used for scene creation; or,
acquiring a scene identifier of the device control scene according to the creation instruction;
sending a parameter acquisition request carrying the scene identifier to a server;
and receiving the scene parameters which are returned by the server based on the parameter acquisition request and are matched with the scene identifier.
5. The scene creation method according to claim 1, wherein the performing backup processing on the device control scene according to the scene parameters to generate an initial device control scene comprises:
acquiring a scene template corresponding to the device control scene;
and fusing the scene parameters with the scene template to obtain an initial device control scene.
6. The scene creation method according to claim 5, wherein the acquiring of the scene template corresponding to the device control scene includes:
acquiring a current user of a terminal used for scene creation;
and acquiring, from a template library, a scene template matched with a user identifier of the current user and the device control scene.
7. The scene creation method according to any one of claims 1 to 6, wherein the acquiring of the creation instruction for the device control scene includes:
displaying the created scene list in the scene management interface;
receiving a pressing operation in a preset area of the device control scene in the scene list;
in response to the pressing operation, displaying, through a popup window, prompt information asking whether to back up the scene;
receiving a determination operation input based on the prompt information;
and generating a creation instruction for the device control scene based on the determination operation.
8. The scene creation method according to any one of claims 1 to 6, wherein the acquiring of the creation instruction for the device control scene includes:
receiving a trigger operation on a preset backup key, and generating a creation instruction for the device control scene based on the trigger operation; or,
receiving user voice information input by a user, and generating a creation instruction for the device control scene based on the user voice information; or,
receiving gesture information input by a user, and generating a creation instruction for the device control scene based on the gesture information; or,
receiving track information input on a touch display screen, and generating a creation instruction for the device control scene based on the track information; or,
receiving fingerprint information input by a user, and generating a creation instruction for the device control scene based on the fingerprint information.
9. The scene creation method according to any one of claims 1 to 6, wherein the acquiring of the creation instruction for the device control scene includes:
detecting a current state of a terminal used for scene creation;
and if the current state of the terminal matches a terminal state that triggers scene creation, generating a creation instruction for the device control scene.
10. A scene creation apparatus, characterized by comprising:
a first obtaining module, configured to obtain a creation instruction for a device control scene;
a second obtaining module, configured to obtain scene parameters matched with the device control scene according to the creation instruction;
a generating module, configured to perform backup processing on the device control scene according to the scene parameters to generate an initial device control scene;
a third obtaining module, configured to obtain an adjustment parameter corresponding to the initial device control scene;
and an adjusting module, configured to adjust the initial device control scene based on the adjustment parameter to obtain a target device control scene.
11. A computer device comprising a processor and a memory, the memory having stored therein a computer program, the processor executing the scene creation method according to any one of claims 1 to 9 when calling the computer program in the memory.
12. A storage medium for storing a computer program which is loaded by a processor to execute the scene creation method of any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110434552.3A CN113268004A (en) | 2021-04-22 | 2021-04-22 | Scene creating method and device, computer equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113268004A true CN113268004A (en) | 2021-08-17 |
Family
ID=77229174
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110434552.3A Pending CN113268004A (en) | 2021-04-22 | 2021-04-22 | Scene creating method and device, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113268004A (en) |
Patent Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20170193703A1 (en) * | 2015-12-31 | 2017-07-06 | Beijing Pico Technology Co., Ltd. | Virtual reality scene implementation method and a virtual reality apparatus |
CN107166645A (en) * | 2017-05-18 | 2017-09-15 | 厦门瑞为信息技术有限公司 | A kind of air conditioning control method analyzed based on indoor scene |
CN107490979A (en) * | 2017-09-26 | 2017-12-19 | 生迪智慧科技有限公司 | Home terminal control method, equipment and system |
CN108803529A (en) * | 2018-07-16 | 2018-11-13 | 珠海格力电器股份有限公司 | Device and method for switching room environment modes based on mobile terminal |
CN109597313A (en) * | 2018-11-30 | 2019-04-09 | 新华三技术有限公司 | Method for changing scenes and device |
CN111650840A (en) * | 2019-03-04 | 2020-09-11 | 华为技术有限公司 | Intelligent household scene arranging method and terminal |
CN110045621A (en) * | 2019-04-12 | 2019-07-23 | 深圳康佳电子科技有限公司 | Intelligent scene processing method, system, smart home device and storage medium |
CN110262261A (en) * | 2019-05-31 | 2019-09-20 | 华为技术有限公司 | A kind of method, Cloud Server and smart home system controlling device service |
CN111158254A (en) * | 2019-12-31 | 2020-05-15 | 青岛海尔科技有限公司 | Method and device for starting scene and mobile phone |
CN112073471A (en) * | 2020-08-17 | 2020-12-11 | 青岛海尔科技有限公司 | Device control method and apparatus, storage medium, and electronic apparatus |
CN112180754A (en) * | 2020-10-20 | 2021-01-05 | 珠海格力电器股份有限公司 | Setting method of intelligent control scene and equipment control system |
CN112306968A (en) * | 2020-11-10 | 2021-02-02 | 珠海格力电器股份有限公司 | Scene establishing method and device |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115883273A (en) * | 2021-09-30 | 2023-03-31 | 青岛海尔科技有限公司 | Scene adjusting method and device, storage medium and electronic device |
CN114237563A (en) * | 2021-12-17 | 2022-03-25 | 中国电信股份有限公司 | Visual program construction method and device, readable medium and electronic equipment |
CN115327932A (en) * | 2022-02-22 | 2022-11-11 | 深圳绿米联创科技有限公司 | Scene creation method and device, electronic equipment and storage medium |
WO2023202633A1 (en) * | 2022-04-22 | 2023-10-26 | 华为技术有限公司 | Method for controlling iot device, and device, iot system and storage medium |
CN117118773A (en) * | 2023-08-28 | 2023-11-24 | 广东金朋科技有限公司 | Scene generation method, system and storage medium |
CN117118773B (en) * | 2023-08-28 | 2024-05-24 | 广东金朋科技有限公司 | Scene generation method, system and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113268004A (en) | Scene creating method and device, computer equipment and storage medium | |
CN110262261B (en) | Method for controlling equipment service, cloud server and intelligent home system | |
CN109974235B (en) | Method and device for controlling household appliance and household appliance | |
WO2016065812A1 (en) | Scenario mode setting-based smart device control method and apparatus | |
WO2016065813A1 (en) | Method and apparatus for customising smart device scenario mode | |
CN108131791A (en) | Control method, device and the server of home appliance | |
WO2022267671A1 (en) | Air conditioner operation mode pushing method and apparatus, and air conditioner | |
AU2016212943A1 (en) | Image processing method and electronic device for supporting the same | |
WO2020125339A1 (en) | Operating mode selection method and smart television | |
US11507619B2 (en) | Display apparatus with intelligent user interface | |
CN111583926B (en) | Continuous voice interaction method and device based on cooking equipment and cooking equipment | |
CN108881353B (en) | Content pushing method and device and computer readable storage medium | |
CN106572131B (en) | The method and system that media data is shared in Internet of Things | |
CN114286156A (en) | Live broadcast interaction method and device, storage medium and computer equipment | |
US20140089238A1 (en) | Information processing device and information processing method | |
CN113495487A (en) | Terminal and method for adjusting operation parameters of target equipment | |
CN113569138A (en) | Intelligent device control method and device, electronic device and storage medium | |
CN108710516A (en) | Acquisition method, device, storage medium and the intelligent terminal of forecast sample | |
CN114257824A (en) | Live broadcast display method and device, storage medium and computer equipment | |
US12120389B2 (en) | Systems and methods for recommending content items based on an identified posture | |
CN111984337B (en) | Operation mode collection method, terminal device, massage device and storage medium | |
CN113596529A (en) | Terminal control method and device, computer equipment and storage medium | |
CN110673737B (en) | Display content adjusting method and device based on intelligent home operating system | |
CN112083655B (en) | Electronic equipment control method and related equipment | |
CN117412088A (en) | Terminal control method and device, storage medium and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||