CN114816197A - Control method and device of intelligent household equipment, storage medium and terminal equipment - Google Patents
- Publication number
- CN114816197A (application number CN202210373430.2A)
- Authority
- CN
- China
- Prior art keywords
- instruction
- intelligent household
- household equipment
- control
- graph
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04847—Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/26—Pc applications
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
Abstract
The embodiments of the application disclose a control method and apparatus for smart home devices, a storage medium, and a terminal device, relating to the field of the Internet of Things. The user draws on a drawing interface; the smart home device to be controlled is determined from the shape of the drawn graphic, and the control parameter value is determined from the graphic's attribute value, after which the smart home device is controlled accordingly. Compared with the prior art, in which smart home devices are controlled through voice recognition, this drawing-based approach broadens the available control methods, makes control richer and more engaging, and can thereby increase user engagement with the control application installed on the terminal device.
Description
Technical Field
The application relates to the field of the Internet of Things, and in particular to a control method and apparatus for smart home devices, a storage medium, and a terminal device.
Background
With the development of Internet of Things technology, smart home devices have become increasingly popular. In the prior art, a user typically uses a smart speaker as the control center for smart home devices: the smart speaker collects the user's voice through a microphone, obtains a voice control instruction and identifies the targeted smart home device through speech recognition, and then sends the voice control instruction to the corresponding smart home device.
Disclosure of Invention
The embodiments of the application provide a control method and apparatus for smart home devices, a storage medium, and a terminal device. The technical scheme is as follows:
in a first aspect, an embodiment of the present application provides a method for controlling smart home devices, where the method includes:
displaying a drawing interface through a display unit;
receiving a drawing instruction of a user, and drawing a first graph on the drawing interface based on the drawing instruction;
receiving a user's graphic attribute modification instruction for the first graphic, and modifying an attribute value of the first graphic based on the graphic attribute modification instruction;
generating a first control instruction according to the modified graph attribute value; wherein the first control instruction carries a control parameter value;
determining first smart home equipment associated with the first graph in an equipment library;
and sending the first control instruction to the first intelligent household equipment.
In a second aspect, an embodiment of the present application provides a control apparatus for smart home devices, the apparatus includes:
a display unit for displaying a drawing interface;
the drawing unit is used for receiving a drawing instruction of a user and drawing a first graph on the drawing interface based on the drawing instruction;
the modification unit is used for receiving a user's graphic attribute modification instruction for the first graphic and modifying an attribute value of the first graphic based on the graphic attribute modification instruction;
the generating unit is used for generating a first control instruction according to the modified graph attribute value; wherein the first control instruction carries a control parameter value;
the determining unit is used for determining first intelligent household equipment associated with the first graph in an equipment library;
and the transceiver unit is used for sending the first control instruction to the first intelligent household equipment.
In a third aspect, embodiments of the present application provide a computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the above-mentioned method steps.
In a fourth aspect, an embodiment of the present application provides a terminal device, which may include: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the above-mentioned method steps.
The beneficial effects brought by the technical scheme provided by some embodiments of the application at least comprise:
the user draws on a drawing interface; the smart home device to be controlled is determined from the shape of the drawn graphic, and the control parameter value is determined from the graphic's attribute value, after which the smart home device is controlled accordingly. Compared with the prior art, in which smart home devices are controlled through voice recognition, this drawing-based approach broadens the available control methods, makes control richer and more engaging, and can thereby increase user engagement with the control application installed on the terminal device.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic diagram of a network architecture provided in an embodiment of the present application;
fig. 2 is a schematic flowchart of a control method of smart home devices provided in an embodiment of the present application;
fig. 3 is a schematic diagram illustrating a principle of controlling an intelligent desk lamp according to an embodiment of the present application;
fig. 4 to 6 are user interfaces of a binding graph and smart home devices provided in an embodiment of the present application;
FIG. 7 is a schematic flow chart of linkage control provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of a linkage control provided by an embodiment of the present application;
FIG. 9 is a schematic flow chart of marker trigger control provided by an embodiment of the present application;
Figs. 10 to 12 are user interfaces for binding a marker and a trigger position provided by embodiments of the present application;
fig. 13 is a schematic structural diagram of a control device of a smart home device provided in the present application;
fig. 14 is a schematic structural diagram of a terminal device provided in the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be noted that, the control method for the smart home devices provided in the present application is generally executed by the terminal device, and accordingly, the control device for the smart home devices is generally disposed in the terminal device.
Fig. 1 shows an exemplary system architecture of a control method of a smart home device or a control apparatus of a smart home device, which may be applied to the present application.
As shown in fig. 1, the system architecture may include a terminal device 101, a smart lamp 102, and a smart speaker 103. The terminal device 101, the smart lamp 102, and the smart speaker 103 may communicate with one another through a network, which serves as the medium providing communication links among these units. The network may include various types of wired or wireless communication links, for example: wired communication links include optical fiber, twisted pair, or coaxial cable, and wireless communication links include Bluetooth, Wireless Fidelity (Wi-Fi), or microwave communication links.
The terminal device 101 has a control function and is configured to send control instructions to each smart home device so as to control it to perform the corresponding operation, for example: turning on the air conditioner, turning on a light, closing the curtains, or turning on the water heater.
The terminal device may be any of a variety of devices having a display screen, including but not limited to smart phones, tablet computers, laptop computers, desktop computers, and the like.
The terminal device is also provided with a camera for capturing video streams or images. The display device of the terminal device of the present application may be a cathode ray tube (CRT) display, a light-emitting diode (LED) display, an electronic ink screen, a liquid crystal display (LCD), a plasma display panel (PDP), or the like. The user can view displayed information such as text, pictures, and videos on the display device of the terminal device.
It should be understood that the numbers of terminal devices and smart home devices in fig. 1 are merely illustrative. Any number of terminal devices and smart home devices may be used according to implementation requirements.
The following describes in detail a control method for smart home devices according to an embodiment of the present application with reference to fig. 2. The control device of the smart home device in the embodiment of the present application may be the terminal device shown in fig. 1.
Referring to fig. 2, a schematic flow chart of a control method of smart home devices is provided in an embodiment of the present application. As shown in fig. 2, the method of the embodiment of the present application may include the steps of:
and S201, displaying a drawing interface through a display unit.
The terminal device displays a drawing interface through the display unit. The drawing interface includes a canvas and a toolbar, and the user can interact with it through input units such as a mouse, a keyboard, or a touch screen, so as to draw simple graphics on the drawing interface.
S202, receiving a drawing instruction of a user, and drawing a first graph on a drawing interface based on the drawing instruction.
Wherein, the drawing instruction is used for drawing a first graph required by a user on the drawing interface, and the drawing instruction can be triggered by the user through the input unit, such as: the user draws a first graphic on the drawing interface through the touch screen.
S203, receiving a graphic attribute modification instruction of the first graphic of the user, and modifying the attribute value of the first graphic based on the graphic attribute modification instruction.
The graphic attribute modification instruction is used for modifying the attribute value of the first graphic, and the attribute value includes but is not limited to a color value, a size value, or a position coordinate, and the like. For example: and triggering and generating a graphic attribute modification instruction aiming at the first graphic by the user through the input unit.
And S204, generating a first control instruction according to the modified graph attribute value.
The terminal device determines a control parameter value associated with the modified graphic attribute value according to the mapping relation, and then generates a first control instruction according to the control parameter value.
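The mapping relation in S204 can be sketched as a small lookup table. This is an illustrative assumption, not the patent's implementation; the attribute names (`color`, `size`) and parameter names (`light_color`, `brightness`) are hypothetical:

```python
# Hypothetical sketch of S204: look up the control parameter value
# associated with a modified graphic attribute value, then wrap it in
# a first control instruction. All names and values are illustrative.

ATTRIBUTE_TO_PARAMETER = {
    # (attribute name, attribute value) -> control parameter value
    ("color", "red"): {"light_color": "red"},
    ("color", "blue"): {"light_color": "blue"},
    ("size", "large"): {"brightness": 100},
    ("size", "small"): {"brightness": 30},
}

def generate_control_instruction(attribute: str, value: str) -> dict:
    """Build a first control instruction carrying the mapped control parameter value."""
    params = ATTRIBUTE_TO_PARAMETER.get((attribute, value))
    if params is None:
        raise KeyError(f"no control parameter mapped for {attribute}={value!r}")
    return {"type": "control", "params": params}
```

In this sketch an unmapped attribute raises an error; a real implementation would more likely ignore the modification or prompt the user.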
S205, determining first intelligent household equipment associated with the first graph in an equipment library.
The device library includes a plurality of smart home devices managed by the user. The terminal device extracts the shape parameter value of the first graphic and determines the associated first smart home device in the device library according to the shape parameter value, for example: a circular first graphic is associated with the smart desk lamp in the device library, while a rectangular first graphic is associated with the smart speaker in the device library.
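As a minimal sketch of S205, assuming the device library stores shape-to-device associations as a simple mapping (the patent does not specify the storage layout, and the identifiers below are hypothetical):

```python
# Hypothetical sketch of S205: the device library maps the shape
# parameter value of the drawn graphic to the bound smart home device.
# The shape names and device identifiers are illustrative assumptions.

DEVICE_LIBRARY = {
    "circle": "smart_desk_lamp",
    "rectangle": "smart_speaker",
}

def find_associated_device(shape: str) -> str:
    """Return the first smart home device associated with the drawn shape."""
    device = DEVICE_LIBRARY.get(shape)
    if device is None:
        raise LookupError(f"no device bound to shape {shape!r}")
    return device
```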
S206, sending a first control instruction to the first intelligent household equipment.
The terminal device sends the first control instruction to the first smart home device. The first smart home device parses the first control instruction to obtain the control parameter value and performs the control operation based on it; the control operation includes but is not limited to: adjusting brightness, adjusting volume, opening the curtains, turning on the air conditioner, and the like. The control parameter values may include device attribute values and time attribute values, for example: device attribute values include light color, volume, temperature, humidity, and the like, while time attribute values include closing time, frequency, opening time, and the like. The specific device attribute values and time attribute values may be determined according to the actual usage scenario, which is not limited in this application.
Referring to fig. 3, the process of the control method for smart home devices according to the present application is described below with a specific embodiment: the terminal device is provided with a touch screen and a display screen. The terminal device displays a drawing interface on the display unit; the user triggers a drawing instruction through the touch screen, and the terminal device draws a circle on the drawing interface based on the drawing instruction. The user then performs a color filling operation on the circle through the touch screen (for example, filling it with red), which triggers a color value modification instruction for the circle, and the color of the circle is changed to red based on this instruction. A control instruction is then generated based on the red color value, the associated smart desk lamp is determined in the device library according to the circle's shape parameter value, and the terminal device sends the control instruction to the smart desk lamp to instruct it to emit red light.
According to this embodiment, smart home devices are controlled by drawing. Compared with the prior art, in which smart home devices are controlled through voice recognition, this broadens the available control methods, makes control richer and more engaging, and can thereby increase user engagement with the control application installed on the terminal device.
In one or more possible embodiments, before S201, the method further includes: binding the first graph and the first intelligent household equipment, wherein the binding process comprises the following steps:
acquiring an image of first intelligent household equipment through a camera;
inquiring matched intelligent household equipment in an equipment library according to the characteristic information of the image;
selecting the intelligent household equipment from the matched intelligent household equipment based on a selection instruction of a user;
displaying a drawing interface through a display unit, and drawing a graph in the drawing interface;
and binding the drawn graph with the selected intelligent household equipment.
The binding process is described below with reference to the user interfaces of figs. 4, 5, and 6: the terminal device first displays the user interface of fig. 4, which includes a shooting button. When a trigger operation on the shooting button is detected, the camera is invoked to photograph the first smart home device to be bound, and the captured image is displayed in the user interface of fig. 4. The terminal device then extracts feature information from the image, including texture features, shape features, color features, and the like, and searches the device library for matching smart home devices according to the feature information; there may be multiple matches, for example: in the user interface shown in fig. 5, the matched smart home devices are a smart desk lamp, a smart ceiling light, and a bedside lamp, each displayed with its product model, high-frequency usage time, and linkage scenarios. After the user selects the smart desk lamp from the matched devices through the touch screen, the terminal device jumps to the user interface of fig. 6, which includes a canvas; the graphic drawn by the user in the canvas is a circle, and after the user clicks the confirm button, the drawn circle is bound to the smart desk lamp.
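The feature-matching query can be sketched as follows, under the assumption that each device entry in the library carries a precomputed feature vector and that similarity is scored with cosine similarity; the vectors, device names, and threshold are all hypothetical, and the patent does not specify the matching algorithm:

```python
import math

# Hypothetical device library: device name -> precomputed feature vector
# (e.g. summarizing texture, shape, and color features of a reference image).
DEVICE_FEATURES = {
    "smart_desk_lamp": [0.9, 0.1, 0.8],
    "smart_ceiling_light": [0.8, 0.2, 0.7],
    "smart_speaker": [0.1, 0.9, 0.2],
}

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_devices(query_features, threshold=0.9):
    """Return all library devices whose features resemble the photographed image's."""
    return [name for name, feats in DEVICE_FEATURES.items()
            if cosine_similarity(query_features, feats) >= threshold]
```

With these illustrative vectors, photographing a lamp-like object returns both lamp entries, matching the description that several candidate devices may be offered for the user to choose from.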
Further, referring to fig. 7, the method for controlling smart home devices according to the present application further includes:
and S701, drawing a second graph and a third graph on a drawing interface.
S702, after a connecting line is drawn between the second graphic and the third graphic based on a connection instruction of the user, the second smart home device associated with the second graphic and the third smart home device associated with the third graphic are bound.
Linkage control is performed between the bound second and third smart home devices. Linkage-controlled parameters include but are not limited to volume, brightness, color, temperature, wind speed, and the like, and the two linked parameters may be the same or different, for example: the linkage parameter of the second smart home device is temperature and that of the third smart home device is wind speed; that is, the wind speed of the third smart home device changes as the temperature of the second smart home device changes. The two linked parameters may be determined according to actual requirements, which is not limited in this application.
And S703, sending a second control instruction to the second intelligent household equipment.
And the second control instruction carries a second control parameter value.
And S704, generating a third control command according to the second control parameter value.
Wherein the third control instruction carries a third control parameter value, the third control parameter value being related to the second control parameter value, for example: the third control parameter value is brightness, the second control parameter value is volume, and the volume and the brightness are in a proportional relation.
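A minimal sketch of the proportional relationship used in S704, assuming a preconfigured ratio (the value 1.5, the cap of 100, and the parameter names are illustrative assumptions, not taken from the patent):

```python
# Hypothetical sketch of S704: derive the third control parameter value
# (brightness) from the second control parameter value (volume) via a
# preconfigured proportional relationship. Ratio and cap are illustrative.

LINKAGE_RATIO = 1.5  # brightness units per volume unit (assumed)

def linked_brightness(volume: int, max_brightness: int = 100) -> int:
    """Compute the linked brightness so it rises and falls with the volume."""
    return min(max_brightness, round(volume * LINKAGE_RATIO))
```

Any monotonic mapping would serve equally well here; the proportional form simply matches the example of volume and brightness being in a proportional relationship.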
S705, sending a third control instruction to third intelligent household equipment.
For example, referring to fig. 8, the following describes a process of performing linkage control on the smart desk lamp and the smart speaker:
the user draws a rectangle and a circle on the drawing interface through the touch screen; the rectangle is associated with the smart speaker in the device library, and the circle is associated with the smart desk lamp in the device library. The terminal device draws a connecting line between the rectangle and the circle based on the user's connection instruction, and after the connecting line is drawn, the terminal device binds the smart speaker and the smart desk lamp. The terminal device is preconfigured with the parameters for linkage control between the smart speaker and the smart desk lamp: the speaker's control parameter is volume and the desk lamp's control parameter is brightness, i.e., when the speaker's volume increases, the desk lamp's brightness increases; when the speaker's volume decreases, the desk lamp's brightness decreases accordingly, achieving the linkage effect.
It should be noted that, in this embodiment, linkage control may be performed on two or more smart home devices, and both the linked parameters and the numerical relationships between their values may be set according to actual requirements, improving the flexibility of control. Performing linkage control on multiple smart home devices by drawing can further increase the fun of the control mode and user engagement.
In one or more possible embodiments, referring to fig. 9, the method for controlling smart home devices further includes:
and S901, detecting the relative position relation between the marker and a preset trigger position through a camera.
The terminal device collects images of the marker through one or more cameras deployed indoors, as well as images of a preset trigger position. The trigger position is an area preset by the user, for example: a storage tray, a bedside table, or another area. The terminal device recognizes the marker and the trigger position in the image based on an object recognition algorithm, and then determines the relative positional relationship between the marker and the trigger position.
And S902, if the marker is located in the trigger position, acquiring the current time.
When the terminal device detects, based on image detection, that the marker overlaps the trigger position, it determines that the marker is located within the trigger position, and the current time can then be obtained from the system clock.
And S903, if the current time is in the trigger time period associated with the trigger position, determining a control instruction sequence associated with the trigger position.
The trigger position is associated in advance with a trigger time period, which is a time interval, and the marker is associated with a control instruction sequence. The control instruction sequence includes a plurality of control instructions arranged in a certain order, each corresponding to one smart home device, for example: turn on the smart desk lamp, then turn on the air conditioner, then turn on the water heater.
And S904, sequentially sending control instructions to the intelligent household equipment according to the instruction sequence in the control instruction sequence.
For example, the marker is a key and the preset trigger position is a storage tray at the entrance. When the terminal device detects, based on image recognition, that the key has been placed in the storage tray, the current time is 19:30. The terminal device determines that the trigger time period associated with the storage tray is 18:00 to 24:00, so the current time falls within the trigger time period, and the control instruction sequence associated with the key is obtained: turn-on-light instruction → close the curtains → turn-on-water-heater instruction. The terminal device then sequentially sends a turn-on instruction to the smart light, a close instruction to the smart curtain, and a boil-water instruction to the water heater.
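The trigger check and sequential dispatch of S902 to S904 can be sketched as follows; the device names, instruction names, and minutes-since-midnight representation are assumptions for illustration, and the midnight-crossing case covers periods such as 12:00 to 2:00:

```python
# Hypothetical sketch of S902-S904: check whether the current time falls
# inside the trigger time period bound to the trigger position, then send
# the marker's control instructions to each device in order.

def in_trigger_period(now_min: int, start_min: int, end_min: int) -> bool:
    """Times are minutes since midnight; 24:00 is passed as 1440."""
    if start_min <= end_min:
        return start_min <= now_min <= end_min
    # period crosses midnight, e.g. 12:00-2:00
    return now_min >= start_min or now_min <= end_min

def dispatch_sequence(sequence, send):
    """Send each (device, instruction) pair in order via the given transport."""
    for device, instruction in sequence:
        send(device, instruction)

# Illustrative sequence bound to the "key" marker.
KEY_SEQUENCE = [
    ("smart_light", "turn_on"),
    ("smart_curtain", "close"),
    ("water_heater", "boil"),
]

sent = []
if in_trigger_period(19 * 60 + 30, 18 * 60, 24 * 60):  # 19:30 within 18:00-24:00
    dispatch_sequence(KEY_SEQUENCE, lambda dev, ins: sent.append((dev, ins)))
```

Here `send` is a stand-in for whatever transport actually delivers instructions to the devices (Wi-Fi, Bluetooth, etc.), which the patent leaves open.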
For another example, the marker is a pair of glasses and the preset trigger position is the surface of a bedside table. When the terminal device detects, based on image recognition, that the glasses have been placed on the bedside table, the current time is obtained as 22:30. The terminal device determines that the trigger time period associated with the bedside table is 22:00 to 24:00, so the current time falls within the trigger time period, and the control instruction sequence associated with the glasses is obtained: turn-off-light instruction → close the bedroom curtains. The terminal device then sequentially sends a turn-off instruction to the smart light and a close instruction to the bedroom's smart curtain.
Further, before S901 of the present application, the method further includes:
collecting an image of the marker through a camera;
selecting, based on a selection instruction of the user, a pixel area of the trigger position in a preset 3D whole-house image, and binding the pixel area of the trigger position with the image of the marker;
configuring the control instruction sequence for the marker, and configuring the trigger time period for the trigger position.
The configuration process of the markers of the present application is described below with respect to the user interfaces of fig. 10, 11, and 12:
the terminal device displays the configuration interface of fig. 10, which includes a button for photographing the marker object. When the terminal device detects a trigger action on this button, it displays the shooting interface, collects an image of the marker through the camera, and identifies the marker as a key through image recognition. The configuration interface also includes a 3D whole-house position selection button; when the terminal device detects a trigger action on this button, it displays the 3D whole-house image shown in fig. 11. Because the 3D whole-house image is large, control buttons are provided on it for moving up, down, left, or right, making it convenient to browse the entire image. The terminal device then, based on the user's selection instruction, selects the storage tray at the entrance in the 3D whole-house image as the trigger area and binds the pixel area of the trigger area with the image of the key. Next, the terminal device configures the name, members, start mode, and sequence of smart products (i.e., the control instruction sequence) based on the configuration interface of fig. 12. After configuration is completed, as shown in fig. 10, the control instruction sequence associated with the key is: turn on the light → close the curtains → turn on the water heater, and the trigger time period associated with the storage tray is 12:00 to 2:00.
In this embodiment, the positional relationship between the marker and the trigger position triggers a specific control instruction sequence, and the control instructions are then sent to the smart home devices in order, so that the smart home devices are controlled without any explicit user operation.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Please refer to fig. 13, which shows a schematic structural diagram of a control apparatus of a smart home device according to an exemplary embodiment of the present application, hereinafter referred to as the apparatus 13 for short. The apparatus 13 may be implemented by software, hardware, or a combination of both as all or part of a terminal device. The apparatus 13 comprises: a display unit 1301, a drawing unit 1302, a modification unit 1303, a generation unit 1304, a determination unit 1305, and a transceiving unit 1306.
A display unit 1301 for displaying a drawing interface;
a drawing unit 1302, configured to receive a drawing instruction of a user, and draw a first graphic on the drawing interface based on the drawing instruction;
a modifying unit 1303, configured to receive a graphic attribute modifying instruction of the first graphic of the user, and modify an attribute value of the first graphic based on the graphic attribute modifying instruction;
a generating unit 1304, configured to generate a first control instruction according to the modified graphic attribute value; wherein the first control instruction carries a control parameter value;
a determining unit 1305, configured to determine, in a device library, a first smart home device associated with the first graph;
the transceiving unit 1306 is configured to send the first control instruction to the first smart home device.
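The pipeline of units 1302-1306 can be illustrated with a short sketch: a modified graphic attribute value is mapped to a control parameter value and wrapped in a control instruction addressed to the bound device. The function name, the attribute-to-parameter mapping, and the device id are hypothetical assumptions for illustration; the patent leaves the concrete mapping unspecified.

```python
# Hypothetical sketch of the generation unit: a graphic attribute value
# (e.g. the size of a drawn circle) becomes a control parameter value
# (e.g. lamp brightness) carried by the control instruction.

def generate_control_instruction(device_id: str, attribute: str, value: int) -> dict:
    """Map a modified graphic attribute to a control instruction carrying a parameter value."""
    attribute_to_parameter = {"size": "brightness", "color": "color_temperature"}
    parameter = attribute_to_parameter.get(attribute, attribute)
    return {"device_id": device_id, "parameter": parameter, "value": value}

# The user enlarges the graphic bound to the living-room lamp:
instruction = generate_control_instruction("lamp-01", "size", 80)
print(instruction)  # {'device_id': 'lamp-01', 'parameter': 'brightness', 'value': 80}
```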
In one or more possible embodiments, the graphic property modification instructions are for modifying a color of the first graphic, adjusting a size of the first graphic, or adjusting a position of the first graphic.
In one or more possible embodiments, the apparatus further comprises:
the configuration unit is used for acquiring an image of the first intelligent household equipment through the camera;
inquiring matched intelligent household equipment in an equipment library according to the characteristic information of the image;
selecting the intelligent household equipment from the matched intelligent household equipment based on a selection instruction of a user;
displaying a drawing interface through a display unit, and drawing a graph in the drawing interface;
and binding the drawn graph with the selected intelligent household equipment.
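The device-library lookup described by the configuration unit can be sketched as a nearest-neighbor search over image feature vectors. The feature vectors, device ids, and distance metric here are hypothetical; the patent only states that matching devices are queried from the device library according to the feature information of the captured image.

```python
# Hypothetical sketch: the captured image is reduced to a feature
# vector, and the device library is searched for the closest entries,
# which are then offered to the user for selection.
import math

DEVICE_LIBRARY = {
    "lamp-01": [0.9, 0.1, 0.3],     # hypothetical stored feature vectors
    "curtain-01": [0.2, 0.8, 0.5],
    "heater-01": [0.4, 0.4, 0.9],
}

def match_devices(features, top_k=2):
    """Return the device ids whose stored features are nearest to `features`."""
    def distance(vec):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(features, vec)))
    ranked = sorted(DEVICE_LIBRARY, key=lambda d: distance(DEVICE_LIBRARY[d]))
    return ranked[:top_k]

print(match_devices([0.85, 0.15, 0.25]))  # the lamp ranks first
```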
In one or more possible embodiments, the apparatus further comprises:
the linkage control unit is used for drawing a second graph and a third graph on the drawing interface;
after a connecting line is drawn between the second graph and the third graph based on a connecting instruction of a user, binding second intelligent household equipment associated with the second graph and third intelligent household equipment associated with the third graph;
sending a second control instruction to the second intelligent household equipment; wherein the second control instruction carries a second control parameter value;
generating a third control instruction according to the second control parameter value; wherein the third control instruction carries a third control parameter value;
and sending a third control instruction to the third intelligent household equipment.
In one or more possible embodiments, the second control parameter value is volume, the third control parameter value is brightness, and the brightness is positively correlated with the volume.
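The linkage rule above can be sketched in a few lines: the second control parameter (speaker volume) drives the third (lamp brightness) through a positive correlation. The linear mapping is an assumption made for illustration; the patent only requires that the two values rise and fall together.

```python
# Sketch of the volume -> brightness linkage; the linear scaling is a
# hypothetical choice, any monotonically increasing map would satisfy
# the stated positive correlation.

def brightness_from_volume(volume: int, max_volume: int = 100, max_brightness: int = 100) -> int:
    """Derive a brightness value that is positively correlated with the volume."""
    volume = max(0, min(volume, max_volume))  # clamp to the valid range
    return round(volume * max_brightness / max_volume)

assert brightness_from_volume(30) < brightness_from_volume(70)  # positive correlation
print(brightness_from_volume(50))  # 50
```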
In one or more possible embodiments, the apparatus further comprises:
the trigger control unit is used for detecting the relative position relation between the marker and a preset trigger area through the camera;
if the marker is located in the trigger area, acquiring the current time;
if the current time is within the trigger time period associated with the trigger area, determining a control instruction sequence associated with the trigger area;
and sequentially sending control instructions to the intelligent household equipment according to the instruction sequence in the control instruction sequence.
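The trigger-control unit's decision flow can be sketched as follows. The function signature and the `send` callback are hypothetical; the sketch only mirrors the three stated conditions: the marker lies in the trigger area, the current time falls in the configured period, and the instruction sequence is then dispatched in its configured order.

```python
# Sketch of the trigger-control unit under assumed names: when the
# camera places the marker inside the trigger area and the current
# hour falls in the configured period, the associated instruction
# sequence is dispatched in order through `send`.

def run_trigger(marker_in_area: bool, current_hour: int,
                period: tuple, sequence: list, send) -> bool:
    """Dispatch `sequence` through `send` if the marker and time conditions hold."""
    start, end = period
    in_period = (start <= current_hour < end) if start <= end \
        else (current_hour >= start or current_hour < end)  # period may wrap past midnight
    if marker_in_area and in_period:
        for command in sequence:   # preserve the configured instruction order
            send(command)
        return True
    return False

sent = []
run_trigger(True, 13, (12, 2), ["turn_on_lamp", "close_curtain"], sent.append)
print(sent)  # ['turn_on_lamp', 'close_curtain']
```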
In one or more possible embodiments, the configuration unit is further configured to:
collecting an image of the marker through a camera;
selecting, based on a selection instruction of a user, a pixel area of a trigger position in a preset 3D whole-house image, and binding the pixel area of the trigger position with the image of the marker;
configuring an instruction sequence set for the marker, and configuring a trigger time period for the trigger position.
It should be noted that, when the apparatus 13 provided in the foregoing embodiment executes the control method of the smart home device, only the division of the functional modules is illustrated, and in practical applications, the functions may be distributed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules, so as to complete all or part of the functions. In addition, the control device of the smart home device provided in the above embodiment and the control method embodiment of the smart home device belong to the same concept, and details of the implementation process are described in the method embodiment, and are not described herein again.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
An embodiment of the present application further provides a computer storage medium, where the computer storage medium may store a plurality of instructions, where the instructions are suitable for being loaded by a processor and executing the method steps in the embodiment shown in fig. 2, and a specific execution process may refer to a specific description of the embodiment shown in fig. 2, which is not described herein again.
The application also provides a computer program product, which stores at least one instruction, and the at least one instruction is loaded and executed by the processor to implement the control method of the smart home device according to the above embodiments.
Please refer to fig. 14, which provides a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 14, the terminal device 1400 may include: at least one processor 1401, at least one network interface 1404, a user interface 1403, memory 1405, at least one communication bus 1402.
The communication bus 1402 is used to realize connection communication among these components.
The user interface 1403 may include a display screen (Display) and a camera (Camera); optionally, the user interface 1403 may also include a standard wired interface and a standard wireless interface.
The network interface 1404 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
The memory 1405 may include a random access memory (RAM) or a read-only memory (ROM). Optionally, the memory 1405 includes a non-transitory computer-readable medium. The memory 1405 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 1405 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the above-described method embodiments, and the like; the data storage area may store the data referred to in the above method embodiments. Optionally, the memory 1405 may be at least one storage device located remotely from the processor 1401. As shown in fig. 14, the memory 1405, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and an application program.
In the terminal device 1400 shown in fig. 14, the user interface 1403 is mainly used to provide an input interface for the user and to acquire data input by the user; the processor 1401 may be configured to invoke the application program stored in the memory 1405 and specifically execute the method shown in fig. 2; for the specific process, reference may be made to the description of fig. 2, which is not repeated here.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium can be a magnetic disk, an optical disk, a read-only memory or a random access memory.
The above disclosure is only a preferred embodiment of the present application and is not to be construed as limiting the scope of the present application; the present application is not limited thereto, and equivalent variations and modifications made according to the present application still fall within its scope.
Claims (10)
1. A control method of intelligent household equipment is characterized by comprising the following steps:
displaying a drawing interface through a display unit;
receiving a drawing instruction of a user, and drawing a first graph on the drawing interface based on the drawing instruction;
receiving a graphic attribute modification instruction of the first graphic of a user, and modifying an attribute value of the first graphic based on the graphic attribute modification instruction;
generating a first control instruction according to the modified graph attribute value; wherein the first control instruction carries a control parameter value;
determining first smart home equipment associated with the first graph in an equipment library;
and sending the first control instruction to the first intelligent household equipment.
2. The method of claim 1, wherein the graphic attribute modification instruction is configured to modify a color of the first graphic, adjust a size of the first graphic, or adjust a position of the first graphic.
3. The method according to claim 1 or 2, wherein before displaying the drawing interface through the display unit, the method further comprises:
acquiring an image of first intelligent household equipment through a camera;
inquiring matched intelligent household equipment in an equipment library according to the characteristic information of the image;
selecting the intelligent household equipment from the matched intelligent household equipment based on a selection instruction of a user;
displaying a drawing interface through a display unit, and drawing a graph in the drawing interface;
and binding the drawn graph with the selected intelligent household equipment.
4. The method of claim 3, further comprising:
drawing a second graph and a third graph on the drawing interface;
after a connecting line is drawn between the second graph and the third graph based on a connecting instruction of a user, binding second intelligent household equipment associated with the second graph and third intelligent household equipment associated with the third graph;
sending a second control instruction to the second intelligent household equipment; wherein the second control instruction carries a second control parameter value;
generating a third control instruction according to the second control parameter value; wherein the third control instruction carries a third control parameter value;
and sending a third control instruction to the third intelligent household equipment.
5. The method of claim 4, wherein the second control parameter value is volume, the third control parameter value is brightness, and the brightness and the volume are positively correlated.
6. The method of claim 4 or 5, further comprising:
detecting a relative position relation between the marker and a preset trigger area through a camera;
if the marker is located in the trigger area, acquiring the current time;
if the current time is within the trigger time period associated with the trigger area, determining a control instruction sequence associated with the trigger area;
and sequentially sending control instructions to the intelligent household equipment according to the instruction sequence in the control instruction sequence.
7. The method according to claim 6, wherein before detecting the relative position relationship between the marker and the preset trigger area through the camera, the method further comprises:
collecting an image of the marker through a camera;
selecting, based on a selection instruction of a user, a pixel area of a trigger position in a preset 3D whole-house image, and binding the pixel area of the trigger position with the image of the marker;
configuring an instruction sequence set for the marker, and configuring a trigger time period for the trigger position.
8. A control apparatus of intelligent household equipment, characterized by comprising:
a display unit for displaying a drawing interface;
the drawing unit is used for receiving a drawing instruction of a user and drawing a first graph on the drawing interface based on the drawing instruction;
the modification unit is used for receiving a graphic attribute modification instruction of the first graphic of a user and modifying an attribute value of the first graphic based on the graphic attribute modification instruction;
the generating unit is used for generating a first control instruction according to the modified graph attribute value; wherein the first control instruction carries a control parameter value;
the determining unit is used for determining first intelligent household equipment associated with the first graph in an equipment library;
and the transceiver unit is used for sending the first control instruction to the first intelligent household equipment.
9. A computer storage medium, characterized in that it stores a plurality of instructions adapted to be loaded by a processor and to carry out the method steps according to any one of claims 1 to 7.
10. A terminal device, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps of any of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210373430.2A CN114816197A (en) | 2022-04-11 | 2022-04-11 | Control method and device of intelligent household equipment, storage medium and terminal equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114816197A true CN114816197A (en) | 2022-07-29 |
Family
ID=82535476
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210373430.2A Pending CN114816197A (en) | 2022-04-11 | 2022-04-11 | Control method and device of intelligent household equipment, storage medium and terminal equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114816197A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120065795A1 (en) * | 2010-09-10 | 2012-03-15 | Blackshaw Andrew L | System and method for operating an economizer cycle of an air conditioner |
CN103701820A (en) * | 2013-12-30 | 2014-04-02 | 广州中大数字家庭工程技术研究中心有限公司 | IMS (IP multimedia subsystem)-based digital family interaction education system |
CN106773750A (en) * | 2016-11-11 | 2017-05-31 | 北京小米移动软件有限公司 | Equipment method for displaying image and device |
CN109491263A (en) * | 2018-12-13 | 2019-03-19 | 深圳绿米联创科技有限公司 | Intelligent home equipment control method, device, system and storage medium |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115314535A (en) * | 2022-07-30 | 2022-11-08 | 远光软件股份有限公司 | Control method and device of Internet of things equipment, storage medium and computer equipment |
CN115314535B (en) * | 2022-07-30 | 2024-04-02 | 远光软件股份有限公司 | Control method and device of Internet of things equipment, storage medium and computer equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||