CN111443843A - Device control method, storage medium, and electronic device - Google Patents


Info

Publication number
CN111443843A
Authority
CN
China
Prior art keywords: control, contextual model, equipment, control unit, dimensional
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010224486.2A
Other languages: Chinese (zh)
Other versions: CN111443843B
Inventor
郑艳艳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Application filed by Gree Electric Appliances Inc of Zhuhai filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN202010224486.2A priority Critical patent/CN111443843B/en
Publication of CN111443843A publication Critical patent/CN111443843A/en
Application granted granted Critical
Publication of CN111443843B publication Critical patent/CN111443843B/en
Current legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00: Systems controlled by a computer
    • G05B15/02: Systems controlled by a computer electric
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/26: Pc applications
    • G05B2219/2642: Domotique, domestic, home control, automation, smart house
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention discloses a device control method, a storage medium and an electronic device, relating to the field of communications technology. The method comprises the following steps: controlling a three-dimensional control on a display interface to rotate according to an operation gesture so as to select a required device contextual model, wherein the outer surface of the three-dimensional control is provided with control units associated with device contextual models; and controlling the operation mode of the corresponding device according to the required device contextual model. The beneficial effects of the invention are that device contextual models can be switched rapidly, the operation is simple and engaging, and the user experience can be greatly improved.

Description

Device control method, storage medium, and electronic device
Technical Field
The present invention relates to the field of communications technologies, and in particular, to an apparatus control method, a storage medium, and an electronic apparatus.
Background
With the improvement of living standards, demand for smart home products keeps growing. To make smart home products more convenient, manufacturers provide contextual models (scene modes), so that an operation that would otherwise require several actions is reduced to a one-key start. Examples include a leaving-home scene, a returning-home scene, a guest scene, a dining scene, a cinema scene, a sleeping scene and a night mode.
However, the apps of most manufacturers score poorly on usability when setting up a contextual model: the setting process is cumbersome, and users are confused by complicated interaction patterns, which results in poor user experience. For example, the contextual model is typically operated through a drop-down on the home page, but the home page gives no interaction hint, so users cannot find it. Alternatively, contextual models are added through a mechanical procedure whose commands are hard to understand, so users cannot complete the operation task.
Disclosure of Invention
In view of the technical problems of existing smart home products, namely poor usability of contextual model setting, a cumbersome setting process, and interaction patterns that users cannot understand, the invention provides a device control method, a storage medium and an electronic device.
In a first aspect, an embodiment of the present invention provides an apparatus control method, including:
controlling a three-dimensional control on a display interface to rotate according to the operation gesture so as to select a required equipment contextual model; the external surface of the three-dimensional control is provided with a control unit related to the equipment scene mode;
and controlling the operation mode of the corresponding equipment according to the required equipment contextual model.
Optionally, according to the operation gesture, controlling a stereoscopic control on the display interface to rotate to select a desired device contextual model, including:
according to an operation gesture, controlling the three-dimensional control to rotate by taking a preset point as a rotation center and taking the moving direction of the operation gesture as a rotation direction, so that a partial area of the outer surface of the three-dimensional control faces a preset direction;
and selecting a corresponding device contextual model from the control units on the partial area of the outer surface of the three-dimensional control facing the preset direction, and determining the device contextual model as the required device contextual model.
Optionally, the method further comprises:
when the three-dimensional control has remained stationary for a preset duration, selecting a corresponding device contextual model from the control units on the partial area of the outer surface of the three-dimensional control facing the preset orientation, and determining it as the required device contextual model.
Optionally, selecting a corresponding device contextual model from control units on a partial region of the external surface of the stereoscopic control facing a preset orientation, and determining the device contextual model as a required device contextual model, including:
judging the number of control units on a partial area of the outer surface of the three-dimensional control facing a preset direction;
when only one control unit is arranged on a partial area of the outer surface of the three-dimensional control facing a preset direction, determining the equipment contextual model corresponding to the control unit as the required equipment contextual model.
Optionally, selecting a corresponding device contextual model from control units on a partial region of the external surface of the stereoscopic control facing a preset orientation, and determining the device contextual model as a required device contextual model, including:
judging the number of control units on a partial area of the outer surface of the three-dimensional control facing a preset direction;
when a plurality of control units are arranged on the partial region of the outer surface of the three-dimensional control facing the preset orientation, comparing the areas of the projections, onto the preset orientation, of the sub-surfaces on which the control units of the respective device contextual models are located;
and determining the device contextual model corresponding to the control unit on the sub-surface whose projection has the largest area as the required device contextual model.
Optionally, selecting a corresponding device contextual model from control units on a partial region of the external surface of the stereoscopic control facing a preset orientation, and determining the device contextual model as a required device contextual model, including:
and responding to a mode selection command, determining a selected control unit from the control units on the partial area of the outer surface of the three-dimensional control facing to the preset direction, and determining the equipment contextual model corresponding to the control unit as the required equipment contextual model.
Optionally, the preset point comprises a center point of the stereoscopic control, and/or
The three-dimensional control comprises one of a cube control, a sphere control and a cone control.
Optionally, the method further comprises:
and responding to a parameter setting command, and displaying a parameter setting interface corresponding to the corresponding control unit, wherein the parameter setting interface is used for setting the parameters of the equipment contextual model corresponding to the control unit.
Optionally, the method further comprises:
associating an intelligent device with the device contextual model corresponding to a control unit by moving the intelligent device's tag into the area of the outer surface of the three-dimensional control where that control unit is located;
and/or
removing the association between an intelligent device and the device contextual model corresponding to a control unit by moving the intelligent device's tag out of the area of the outer surface of the three-dimensional control where that control unit is located.
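The tag-based association and dissociation above can be sketched as follows. This is a minimal illustration under the assumption that the drop target has already been hit-tested to a profile region; the class and method names are not from the patent.

```python
# Hypothetical sketch: dropping a smart-device tag on a control unit's
# region links the device to that profile; dragging it out removes the link.

class ProfileAssociations:
    def __init__(self):
        # profile name -> set of associated device ids
        self._links = {}

    def drop_tag(self, device_id, region_profile):
        """Tag dropped inside a control unit's region: associate the device."""
        self._links.setdefault(region_profile, set()).add(device_id)

    def remove_tag(self, device_id, region_profile):
        """Tag moved out of the region: remove the association."""
        self._links.get(region_profile, set()).discard(device_id)

    def devices_for(self, profile):
        """Devices currently associated with a profile, in sorted order."""
        return sorted(self._links.get(profile, set()))
```

A real implementation would resolve the drop coordinates to a region on the rotating control before calling these methods.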
In a second aspect, an embodiment of the present invention provides a storage medium on which program code is stored; when the program code is executed by a processor, the device control method according to any one of the above embodiments is implemented.
In a third aspect, an embodiment of the present invention provides an electronic device comprising a memory and a processor, the memory storing program code executable on the processor; when the program code is executed by the processor, the device control method according to any one of the foregoing embodiments is implemented.
According to the device control method provided by the embodiment of the invention, the three-dimensional control on the display interface is rotated so that different device contextual models are displayed during rotation, and the required device contextual model is selected, whereby the smart home product is controlled accordingly. The method can therefore switch device contextual models rapidly; it is simple to operate, engaging, and can greatly improve the user experience.
Drawings
The scope of the present disclosure may be better understood by reading the following detailed description of exemplary embodiments in conjunction with the accompanying drawings. Wherein the included drawings are:
FIG. 1 is a schematic diagram illustrating a device contextual model setting mode;
FIG. 2 is a schematic diagram illustrating another device contextual model setting mode;
FIG. 3 is a schematic flowchart of a device control method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of device contextual model settings on a cube control;
FIG. 5 is a schematic diagram of device contextual model settings on a sphere control;
FIG. 6 is a flowchart of a device control method according to a second embodiment of the present invention;
FIG. 7 is a schematic diagram of switching device contextual models by rotating a three-dimensional control;
FIG. 8 is a schematic diagram of a plurality of device contextual models of a three-dimensional control facing a preset orientation;
FIG. 9 is a schematic interface diagram of switching device contextual models by rotating a three-dimensional control;
FIG. 10 is an interface diagram of switching a device contextual model according to a selection instruction;
FIG. 11 is an interface diagram of a parameter setting interface;
FIG. 12 is an interface diagram of associating smart device tags with device contextual models.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, an implementation of the present invention is described in detail below with reference to the accompanying drawings and embodiments, so that it can be fully understood how technical means are applied to solve the technical problems and achieve the technical effects.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. However, the present invention may be practiced in other ways than those specifically described herein, and therefore the scope of the present invention is not limited by the specific embodiments disclosed below.
The scene mode refers to a whole set of response modes selected according to different scenes. The device contextual model of the smart home refers to device parameters of the smart home set by a user under different scenes according to requirements. For example, a "sleep" scenario may select a mode that only leaves the sleep light on, the air conditioner or heater on, and the other lights off.
At present, the setting modes of the device contextual model commonly used in the smart home application program include the following modes:
fig. 1 is a schematic diagram illustrating a device contextual model setting mode, as shown in fig. 1, contextual models such as "home", "go out", "work", and "music" are tiled on a display interface, then different contextual models are displayed in an interactive mode of sliding left and right on the display interface, and a user performs scene switching by clicking the contextual model on the display interface.
Fig. 2 is a schematic diagram illustrating another device contextual model setting manner, and as shown in fig. 2, contextual models such as "home", "music", "work", "out", "sleep", and "entertainment" are tiled on a display interface, and a user performs scene switching by clicking the contextual model on the display interface.
However, the above device contextual model setting methods are based on a "tile + click" interaction. Once a user has many contextual models, not only does finding the desired one become difficult, but the space of the display interface is further squeezed; moreover, the interaction is monotonous and lacks interest, which degrades the user experience.
In view of the above technical problems, the inventor of the present disclosure improves a setting manner of a contextual model of a smart home application, and a device control method proposed by the present disclosure is described below with reference to the accompanying drawings.
Example one
According to an embodiment of the present invention, an apparatus control method is provided, and fig. 3 shows a flowchart of an apparatus control method according to an embodiment of the present invention, and as shown in fig. 3, the apparatus control method may include: step 110 to step 120.
In step 110, controlling the stereoscopic control on the display interface to rotate according to the operation gesture so as to select a required device contextual model; and a control unit related to the equipment scene mode is arranged on the outer surface of the three-dimensional control.
Here, the stereoscopic control is a three-dimensional virtual stereoscopic control displayed on the display interface, and the stereoscopic control may be one of a cube control, a sphere control, and a cone control. And a control unit related to the equipment scene mode is arranged on the outer surface of the three-dimensional control.
The device profile setting will be described below with reference to fig. 4 and 5.
When the three-dimensional control is a cube control, a control unit related to a device contextual model can be arranged on each face of the cube control, for example at the centre of each face. Each face of the cube control can then represent one control unit related to a device contextual model, and the user rotates the cube control to switch between device contextual models. FIG. 4 shows a schematic diagram of device contextual model settings on a cube control: as shown in FIG. 4, the device contextual models of scene A, scene B and scene C are arranged on three faces of the cube control. When the three-dimensional control is a cone control, it is set up in the same way as the cube control, which is not described separately here.
When the three-dimensional control is a sphere control, its outer surface is evenly divided into several sub-surfaces, a device contextual model is arranged at the centre of each sub-surface, and each sub-surface represents one control unit; the user rotates the sphere control to switch between device contextual models. FIG. 5 shows a schematic diagram of device contextual model settings on a sphere control: as shown in FIG. 5, the outer surface of the sphere control is evenly divided into 8 equal sub-surfaces, each carrying a device contextual model, for example scene C and scene D, whose sub-surfaces each represent one control unit. The division in FIG. 5 merely illustrates an even division of the sphere control; other divisions may be used in practice.
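One natural way to realise the 8-way even division of the sphere's surface is by octants, classifying a surface point by the signs of its coordinates. This is a hedged sketch; the function names and the octant encoding are assumptions for illustration, not from the patent.

```python
# Map a point on the sphere control's surface to one of 8 equal octant
# sub-surfaces, each of which can carry one control unit / contextual model.

def sub_surface_index(x, y, z):
    """Index (0..7) of the octant sub-surface a surface point lies on,
    encoded from the signs of its coordinates."""
    return ((x >= 0) << 2) | ((y >= 0) << 1) | (z >= 0)

def profile_at(x, y, z, octant_profiles):
    """Look up the device contextual model assigned to the touched octant."""
    return octant_profiles[sub_surface_index(x, y, z)]
```

A touch on the rendered sphere would first be unprojected to a surface point, then classified with `sub_surface_index`.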
It should be noted that one face of the three-dimensional control may represent at least one device contextual model, but a contextual model need not be set on every face. For example, a cube control has six faces which can correspond to six device contextual models; if the user has not set a contextual model on one of the faces, selecting that face triggers no contextual model switch.
In step 120, the operation mode of the corresponding device is controlled according to the required device profile.
Here, the device profile includes parameter settings of one or more devices. For example, when in a "sleep" scenario, the user's settings for the device include "turn on sleep light, air conditioner set to sleep mode". Then control instructions to turn on the sleep light and set the air conditioner to the sleep mode are executed when the "sleep" scenario is determined to be the desired device scenario mode.
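The execution of a selected contextual model, as in the "sleep" example above, can be sketched as a mapping from devices to parameter settings. The device names, parameter keys and function names here are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch: a device contextual model stores per-device parameter
# settings; applying it sends each device its settings.

PROFILES = {
    "sleep": {
        "sleep_light": {"power": "on"},
        "air_conditioner": {"mode": "sleep"},
        "main_light": {"power": "off"},
    },
}

def apply_profile(name, send_command, profiles=PROFILES):
    """Control the operation mode of each device per the chosen profile."""
    for device, params in profiles[name].items():
        send_command(device, params)
```

In a real app `send_command` would issue the network request to each smart home device.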
In this embodiment, the three-dimensional control on the display interface is rotated so that different device contextual models are displayed during rotation, the required device contextual model is selected according to a user instruction, and the smart home product is controlled accordingly. Device contextual models can thus be switched rapidly; the operation is simple and engaging, and the user experience can be greatly improved.
Example two
On the basis of the above embodiment, a second embodiment of the present invention may further provide an apparatus control method. Fig. 6 shows a flowchart of a device control method according to a second embodiment of the present invention, and as shown in fig. 6, the device control method may include: step 210 to step 230.
In step 210, according to an operation gesture, the stereoscopic control is controlled to rotate with a preset point as a rotation center and with a moving direction of the operation gesture as a rotation direction, so that a partial region of an outer surface of the stereoscopic control faces a preset orientation.
Here, the user controls the stereoscopic control to rotate through an operation gesture, so that different device contextual models are displayed in the rotating process. The rotation of the three-dimensional control takes a preset point as a rotation center and takes the moving direction of the operation gesture as a rotation direction. Namely, the three-dimensional control takes a preset point as a rotation center and rotates along with the operation gesture. Wherein the preset point is preferably a center point of the stereoscopic control.
It is worth mentioning that the operation gesture may be a long press. For example, the user triggers the stereoscopic control to rotate by pressing the touch screen for a long time. The stereo control rotates along with the moving direction of the operation gesture of the user on the touch screen.
Additionally, prior to step 210, stereoscopic control activation may be triggered by a gesture or other triggering action. For example, the stereoscopic control is hidden in the display interface in the standby state. And the user clicks or slides at a preset position of the display interface to trigger the three-dimensional control to be started, so that the three-dimensional control is displayed.
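The rotation described in step 210, with the centre point as the rotation centre and the gesture's moving direction as the rotation direction, could be sketched as follows. The sensitivity constant, function names and the convention that horizontal drags yaw the control are assumptions for illustration only.

```python
import math

# Hedged sketch: map a long-press drag delta to rotation of the control
# about its centre; horizontal movement yaws, vertical movement pitches.

def rotation_from_drag(dx, dy, sensitivity=0.01):
    """Convert a touch-drag delta in pixels to (yaw, pitch) in radians."""
    return dx * sensitivity, dy * sensitivity

def rotate_y(v, angle):
    """Rotate a 3-D vector about the vertical (y) axis, i.e. apply yaw,
    keeping the control's centre point fixed."""
    x, y, z = v
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, -s * x + c * z)
```

Each frame, the accumulated yaw/pitch would be applied to the face normals of the control before rendering.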
In step 220, a corresponding device contextual model is selected from the control units on the partial region of the outer surface of the stereoscopic control facing the preset orientation, and is determined as the required device contextual model.
Here, the user rotates the three-dimensional control so that a currently occluded device contextual model is brought to the preset orientation, which makes it convenient to select the required device contextual model. The selection process is as follows: the partial region where the required device contextual model is located is rotated to the preset orientation, then the corresponding device contextual model is selected from the control units on the partial region of the outer surface of the three-dimensional control facing the preset orientation and determined as the required device contextual model.
In step 230, the operation mode of the corresponding device is controlled according to the required device profile.
Here, the device profile includes parameter settings of one or more devices. For example, when in a "sleep" scenario, the user's settings for the device include "turn on sleep light, air conditioner set to sleep mode". Then control instructions to turn on the sleep light and set the air conditioner to the sleep mode are executed when the "sleep" scenario is determined to be the desired device scenario mode.
In this way, the three-dimensional control is rotated so that the face carrying the required device contextual model moves to the preset orientation, thereby switching the device contextual model. Device contextual models can thus be switched rapidly; the operation is simple and engaging, and the user experience can be greatly improved.
In an optional embodiment, the method further comprises:
when the three-dimensional control has remained stationary for a preset duration, selecting a corresponding device contextual model from the control units on the partial area of the outer surface of the three-dimensional control facing the preset orientation, and determining it as the required device contextual model.
When the user rotates the three-dimensional control to switch device contextual models, the face of the target contextual model is rotated to the preset orientation; once the control has stayed stationary for the preset duration, the corresponding device contextual model is selected from the control units on the partial region of the outer surface facing the preset orientation and determined as the required one. For example, after the previously occluded face of "scene A" is rotated to the preset orientation, a timer starts; when the timer reaches the preset duration, the device contextual model on the face at the preset orientation is selected and determined as the required device contextual model.
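The stationary-for-a-preset-duration check can be sketched as a small dwell timer fed once per frame. This is a minimal illustration; the class name, the frame-driven `update` interface and the default duration are assumptions, not from the patent.

```python
# Hedged sketch: confirm a selection only after the control has been
# stationary for a preset duration, avoiding accidental mid-rotation switches.

class DwellSelector:
    def __init__(self, dwell_seconds=1.0):
        self.dwell = dwell_seconds
        self._still_since = None  # time the control last came to rest

    def update(self, is_rotating, now):
        """Feed the control state each frame with a monotonic timestamp;
        returns True once the stationary time reaches the preset duration."""
        if is_rotating:
            self._still_since = None
            return False
        if self._still_since is None:
            self._still_since = now
        return now - self._still_since >= self.dwell
```

In practice `now` would come from a monotonic clock such as `time.monotonic()`.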
This avoids accidental touches or unintended switching of device contextual models during rotation.
In an optional embodiment, in step 220, selecting a corresponding device contextual model from control units on a partial region of the external surface of the stereoscopic control facing a preset orientation, and determining the device contextual model as a required device contextual model includes:
judging the number of control units on a partial area of the outer surface of the three-dimensional control facing a preset direction;
when only one control unit is arranged on a partial area of the outer surface of the three-dimensional control facing a preset direction, determining the equipment contextual model corresponding to the control unit as the required equipment contextual model.
Here, the user selects a desired device profile by controlling the stereoscopic control to rotate. When only one control unit is arranged on a partial region of the outer surface of the three-dimensional control facing the preset direction, the selection process is to rotate the partial region where the required device contextual model is located to the preset direction, and then the corresponding device contextual model on the partial region located at the preset direction is determined as the device contextual model required by the user.
Assume that the direction facing the display interface is taken as the preset orientation: when one face of the three-dimensional control is rotated to face the display interface, the device contextual model on that face is determined as the required one. FIG. 7 shows a schematic diagram of switching device contextual models by rotating the three-dimensional control. As shown in FIG. 7, if in the initial state the face carrying "scene A" is at the preset orientation, "scene A" is the device contextual model currently set by the user. When the user needs to switch from scene A to scene C, the three-dimensional control is rotated, and the device contextual models on the other faces, such as scene B and scene C, are displayed during rotation. When the user rotates the face carrying "scene C" to face the display interface, the switch from scene A to scene C is accomplished. It should be noted that, in a preferred embodiment, "scene C" is determined as the required contextual model only after its face has been held at the preset orientation for the preset duration.
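A minimal sketch of deciding which face currently faces the display is to compare each face's outward unit normal with the viewing direction. The convention that +z points toward the display, and all names, are illustrative assumptions.

```python
# Hedged sketch: the face whose outward normal is most aligned with the
# preset orientation (toward the display) determines the current profile.

def facing_profile(face_normals, view_dir=(0.0, 0.0, 1.0)):
    """face_normals maps a profile name to its face's outward unit normal;
    return the profile whose face points most directly at the display."""
    def dot(a, b):
        return sum(p * q for p, q in zip(a, b))
    return max(face_normals, key=lambda name: dot(face_normals[name], view_dir))
```

After each rotation the normals are updated and this check re-run; combined with the dwell timer, it yields the required contextual model.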
In an optional embodiment, in step 220, selecting a corresponding device contextual model from control units on a partial region of the external surface of the stereoscopic control facing a preset orientation, and determining the device contextual model as a required device contextual model includes:
judging the number of control units on the partial region of the outer surface of the three-dimensional control facing the preset orientation;
when a plurality of control units are arranged on the partial region of the outer surface of the three-dimensional control facing the preset orientation, comparing the areas of the projections, onto the preset orientation, of the sub-surfaces where the control units of the respective device contextual models are located;
and determining the device contextual model corresponding to the control unit on the sub-surface whose projection has the largest area as the required device contextual model.
Here, when the user rotates the stereoscopic control, several surfaces of the stereoscopic control may end up facing the preset orientation at the same time. Therefore, when a plurality of control units are arranged on the partial region of the outer surface of the stereoscopic control facing the preset orientation, the device contextual model on the sub-surface whose projection onto the preset orientation has the largest area is determined as the required device contextual model. This avoids the situation in which the required device contextual model cannot be determined because improper user operation has left two outer surfaces at the preset orientation.
Fig. 8 is a schematic diagram illustrating a plurality of device contextual models of the stereoscopic control facing the preset orientation. As shown in Fig. 8, assuming that the direction facing the display interface is taken as the preset orientation, two device contextual models, "scene A" and "scene B", can be found at the preset orientation. In this case, the surfaces where "scene A" and "scene B" are located are projected onto the plane facing the display interface, and the device contextual model corresponding to the projection with the largest area is selected as the required device contextual model.
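The largest-projection rule can be sketched numerically: a surface's projected area on the plane facing the display interface is its area scaled by the cosine between its outward normal and the viewing direction, and the surface with the largest projection wins. This is an illustrative sketch under those geometric assumptions; the function and argument names are not from the patent.

```python
def pick_front_scene(faces, view_dir):
    """faces: list of (scene_name, unit_normal, face_area) tuples.
    The projected area of a face on the plane perpendicular to view_dir
    is face_area * cos(angle between normal and view_dir); back-facing
    surfaces (negative cosine) project nothing. Returns the scene whose
    projection is largest, or None if no face is front-facing."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    best_scene, best_proj = None, 0.0
    for scene, normal, area in faces:
        proj = area * max(0.0, dot(normal, view_dir))
        if proj > best_proj:
            best_scene, best_proj = scene, proj
    return best_scene
```

For a cube caught mid-rotation, the face tilted mostly toward the viewer ("scene A" below, cosine 0.8) beats the one tilted away ("scene B", cosine 0.6), which is exactly the disambiguation Fig. 8 describes.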
The above embodiments are explained below by way of an example:
Fig. 9 shows an interface diagram of rotating the stereoscopic control to switch the device contextual model. As shown in Fig. 9, the interfaces are presented in the order (a)-(b)-(c)-(d).
As shown in interface (a), the user inputs an operation gesture, which may be a long press, a double click, or a slide, at a preset position of the display interface, so as to switch to interface (b). As shown in interface (b), a stereoscopic control for switching the device contextual model is displayed in the display interface according to the operation gesture. At this moment, the surface of the stereoscopic control bearing the "home scene" faces the preset orientation, so the device contextual model executed by the smart home products is the "home scene". When the user wants to switch to the "sleep scene", the user rotates the stereoscopic control with the operation gesture, as shown in interface (c). When the user rotates the "sleep scene" to the preset orientation and keeps it there for a preset time, for example 1 second, the switching of the device contextual model is completed, as shown in interface (d).
In another optional embodiment, selecting a corresponding device contextual model from the control units on the partial region of the outer surface of the stereoscopic control facing the preset orientation, and determining it as the required device contextual model, includes:
and responding to a mode selection command, determining a selected control unit from the control units on the partial area of the outer surface of the three-dimensional control facing to the preset direction, and determining the equipment contextual model corresponding to the control unit as the required equipment contextual model.
Here, the user rotates the stereoscopic control so that an occluded device contextual model is brought to the preset orientation, which makes it convenient for the user to select the required device contextual model. The selection process is as follows: after the outer-surface region where the required device contextual model is located has been rotated to the preset orientation, the required contextual model is selected from the device contextual models on the stereoscopic control according to a selection instruction. The selection instruction may be a click or double click on an outer surface of the stereoscopic control, so that the required device contextual model is selected according to the control unit the user clicks or double-clicks.
Therefore, by rotating the stereoscopic control and switching the device contextual model according to the user's selection instruction, the device contextual model can be switched rapidly; the operation is simple and engaging, and the user experience can be greatly improved.
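Resolving a click or double click to the control unit under it is a plain hit test against the screen regions of the currently visible surfaces. Below is a minimal sketch, assuming each face's control unit occupies an axis-aligned screen rectangle; the function name and region layout are illustrative, not from the patent.

```python
def scene_at_point(face_regions, tap):
    """face_regions: dict mapping a scene name to the (x0, y0, x1, y1)
    screen rectangle of that face's control unit; tap: (x, y) position
    of the click or double click. Returns the tapped scene, or None if
    the tap landed outside every control unit."""
    x, y = tap
    for scene, (x0, y0, x1, y1) in face_regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return scene
    return None
```

A tap inside the region of the "away scene" control unit selects the "away scene"; a tap outside all faces leaves the current device contextual model unchanged.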
The above embodiments are explained below by way of an example:
Fig. 10 shows an interface diagram of switching the device contextual model according to a selection instruction. As shown in Fig. 10, the interfaces are presented in the order (e)-(f)-(g).
As shown in interface (e), the user inputs an operation gesture, which may be a long press, a double click, or a slide, at a preset position of the display interface, so as to switch to interface (f). As shown in interface (f), a stereoscopic control for switching the device contextual model is displayed in the display interface according to the operation gesture. When the user needs to switch the device contextual model, as shown in interface (g), the user rotates the stereoscopic control with the operation gesture so that the stereoscopic control displays different device contextual models, and the required device contextual model is determined according to the user's selection instruction. For example, when the user clicks the "away scene" on the stereoscopic control, the smart home products switch to the "away scene".
In an optional embodiment, the method further comprises:
and responding to a parameter setting command, and displaying a parameter setting interface corresponding to the corresponding control unit, wherein the parameter setting interface is used for setting the parameters of the equipment contextual model corresponding to the control unit.
Here, the parameter setting command may be a click or long press on a control unit of the stereoscopic control. The user can display the parameter setting interface by clicking or long-pressing the control unit on one sub outer surface of the stereoscopic control, and then adjust, on that interface, the parameter settings of the device contextual model corresponding to the control unit.
Fig. 11 shows a schematic diagram of the parameter setting interface. As shown in Fig. 11, when the user needs to check or set a device contextual model, the user clicks the sub outer surface where the "home" scene of the stereoscopic control is located, so that the intelligent devices and parameter settings in the "home" scene are displayed, for example "the living room air conditioner is turned on, set to 26 °C with low fan speed" or "the hallway light is turned on with warm light". The parameter settings of the device contextual model corresponding to the control unit can also be adjusted in this interface. For example, the user may turn off the living room air conditioner or adjust it from 26 °C to 24 °C.
Therefore, the user can quickly view or adjust the parameter setting of the device contextual model.
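The per-scene device parameters shown in Fig. 11 can be modeled as a small data structure, with edits made in the parameter setting interface mapped to updates on it. This is a minimal sketch; the `SceneMode` and `DeviceSetting` names are hypothetical, and the example values mirror the "home" scene described above.

```python
from dataclasses import dataclass, field

@dataclass
class DeviceSetting:
    device: str
    power: str = "on"
    params: dict = field(default_factory=dict)

@dataclass
class SceneMode:
    name: str
    devices: dict = field(default_factory=dict)  # device name -> DeviceSetting

    def set_param(self, device, key, value):
        """One edit on the parameter setting interface becomes one update."""
        self.devices[device].params[key] = value

# The "home" scene as described in Fig. 11.
home = SceneMode("home", {
    "living room air conditioner": DeviceSetting(
        "living room air conditioner",
        params={"temperature": 26, "fan speed": "low"}),
    "hallway light": DeviceSetting("hallway light", params={"tone": "warm"}),
})

# The user adjusts the air conditioner from 26 °C to 24 °C.
home.set_param("living room air conditioner", "temperature", 24)
```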
In an optional embodiment, the method further comprises:
associating the intelligent equipment with the equipment contextual model corresponding to the control unit by moving the intelligent equipment label to the corresponding area on the outer surface of the three-dimensional control where the corresponding control unit is located;
and/or;
and removing the association between the intelligent equipment and the equipment contextual model corresponding to the control unit by moving the label of the intelligent equipment out of the corresponding area on the outer surface of the three-dimensional control where the corresponding control unit is located.
In this embodiment, a method for rapidly managing the intelligent devices in a device contextual model is provided. For example, suppose a device contextual model includes intelligent devices such as the "air purifier", "hallway light" and "living room light" together with their parameter settings, and the user needs to add the "living room air conditioner" and its parameter settings to this device contextual model. The user only needs to move the intelligent device label of the "living room air conditioner" onto the partial region of the outer surface of the stereoscopic control where the device contextual model is located, so that the "living room air conditioner" is associated with the device contextual model.
Conversely, if a device contextual model includes intelligent devices such as the "air purifier", "hallway light", "living room light" and "living room air conditioner" together with their parameter settings, and the user needs to disassociate the "living room air conditioner" from the device contextual model, the user moves the corresponding intelligent device label out of the partial region of the outer surface of the stereoscopic control where the device contextual model is located, so that the intelligent device corresponding to the label is disassociated from the device contextual model.
Fig. 12 is a schematic diagram of an interface for associating intelligent device labels with device contextual models. As shown in Fig. 12, the interfaces are presented in the order interface 01 to interface 02.
When the user needs to add the "living room air conditioner" to the "going home" scene, the intelligent device label of the "living room air conditioner" is dragged onto the sub outer surface where the "going home" scene is located, so that the "living room air conditioner" is added to the "going home" scene; this process is demonstrated from interface 01 to interface 02. Linkage of the living room air conditioner is then triggered when the "going home" scene is started. When the user needs to remove the "living room air conditioner" from the "going home" scene, the intelligent device label of the "living room air conditioner" can be dragged out of the sub outer surface where the "going home" scene is located, so that the association between the "living room air conditioner" and the "going home" scene is cancelled.
It should be noted that the operation of associating an intelligent device label with a device contextual model may also be performed in the parameter setting interface. For example, when the user needs to check or set the device contextual model, the user clicks the sub outer surface where the "home" scene of the stereoscopic control is located, so that the intelligent devices and parameter settings in the "home" scene are displayed.
When the user needs to add the "living room air conditioner" to the "going home" scene, the user can drag the intelligent device label corresponding to the "living room air conditioner" onto the sub outer surface where the "going home" scene is located in the parameter setting interface, so that the "living room air conditioner" is associated with the "going home" scene.
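The drag-on/drag-off association of intelligent device labels with scene modes reduces to set membership per scene. A minimal sketch follows; the class and method names are illustrative, not from the patent.

```python
class SceneModeBinder:
    """Tracks which intelligent devices are associated with each device
    contextual model (scene mode)."""

    def __init__(self):
        self.scenes = {}  # scene name -> set of associated device names

    def drop_tag_on_scene(self, device, scene):
        """Label dragged onto the scene's sub outer surface: associate."""
        self.scenes.setdefault(scene, set()).add(device)

    def drag_tag_off_scene(self, device, scene):
        """Label dragged off the scene's sub outer surface: disassociate."""
        self.scenes.get(scene, set()).discard(device)

    def devices_in(self, scene):
        """Devices linked when this scene mode is started, sorted by name."""
        return sorted(self.scenes.get(scene, set()))
```

Dropping the "living room air conditioner" label on the "going home" scene adds it to that scene's linkage; dragging it off removes it, leaving the other associated devices untouched.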
EXAMPLE III
According to an embodiment of the present invention, there is also provided a storage medium having stored thereon program code, which when executed by a processor, implements the device control method according to any one of the above-described embodiments.
Example four
According to an embodiment of the present invention, there is also provided an electronic device, including a memory and a processor, where the memory stores program code executable on the processor, and the program code, when executed by the processor, implements the device control method according to any one of the above embodiments.
The technical scheme of the present invention has been explained in detail above with reference to the accompanying drawings, in view of the technical problems in the related art that the scene mode setting of existing smart home products is hard to use, the setting process is troublesome, and users cannot understand complex interaction modes. The present invention provides a device control method, a storage medium and an electronic device, in which a stereoscopic control on a display interface is rotated so that different device contextual models are displayed during the rotation, the required device contextual model is selected according to a user instruction, and the smart home products are controlled according to the required device contextual model. Therefore, the device control method provided by the embodiments of the present invention can switch the device contextual model rapidly; the operation is simple and engaging, and the user experience can be greatly improved.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed.
Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, or all or part of the technical solution, can be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Although the embodiments of the present invention have been described above, the above description is only for the convenience of understanding the present invention, and is not intended to limit the present invention. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (11)

1. An apparatus control method characterized by comprising:
controlling a three-dimensional control on a display interface to rotate according to the operation gesture so as to select a required equipment contextual model; the external surface of the three-dimensional control is provided with a control unit related to the equipment scene mode;
and controlling the operation mode of the corresponding equipment according to the required equipment contextual model.
2. The device control method according to claim 1, wherein controlling a stereoscopic control on a display interface to rotate according to an operation gesture to select a desired device contextual model comprises:
according to an operation gesture, controlling the three-dimensional control to rotate by taking a preset point as a rotation center and taking the moving direction of the operation gesture as a rotation direction, so that a partial area of the outer surface of the three-dimensional control faces a preset direction;
and selecting a corresponding device contextual model from the control units on the partial area of the outer surface of the three-dimensional control facing the preset direction, and determining the device contextual model as the required device contextual model.
3. The device control method according to claim 2, characterized by further comprising:
and when the three-dimensional control has remained stationary for a preset duration, selecting a corresponding device contextual model from the control units on the partial region of the outer surface of the three-dimensional control facing the preset orientation, and determining the device contextual model as the required device contextual model.
4. The device control method according to claim 2 or 3, wherein selecting a corresponding device contextual model from the control units on the partial region of the outer surface of the stereoscopic control facing the preset orientation to determine the device contextual model as the required device contextual model comprises:
judging the number of control units on a partial area of the outer surface of the three-dimensional control facing a preset direction;
when only one control unit is arranged on a partial area of the outer surface of the three-dimensional control facing a preset direction, determining the equipment contextual model corresponding to the control unit as the required equipment contextual model.
5. The device control method according to claim 2 or 3, wherein selecting a corresponding device contextual model from the control units on the partial region of the outer surface of the stereoscopic control facing the preset orientation to determine the device contextual model as the required device contextual model comprises:
judging the number of control units on a partial area of the outer surface of the three-dimensional control facing a preset direction;
when a plurality of control units are arranged on the partial region of the outer surface of the three-dimensional control facing the preset orientation, comparing the areas of the projections, onto the preset orientation, of the sub-surfaces where the control units of the respective device contextual models are located;
and determining the device contextual model corresponding to the control unit on the sub-surface whose projection has the largest area as the required device contextual model.
6. The device control method according to claim 2 or 3, wherein selecting a corresponding device contextual model from the control units on the partial region of the outer surface of the stereoscopic control facing the preset orientation to determine the device contextual model as the required device contextual model comprises:
and responding to a mode selection command, determining a selected control unit from the control units on the partial area of the outer surface of the three-dimensional control facing to the preset direction, and determining the equipment contextual model corresponding to the control unit as the required equipment contextual model.
7. The device control method according to claim 2, wherein the preset point comprises a center point of the stereoscopic control, and/or
The three-dimensional control comprises one of a cube control, a sphere control and a cone control.
8. The apparatus control method according to claim 1, characterized in that the method further comprises:
and responding to a parameter setting command, and displaying a parameter setting interface corresponding to the corresponding control unit, wherein the parameter setting interface is used for setting the parameters of the equipment contextual model corresponding to the control unit.
9. The apparatus control method according to claim 1, characterized in that the method further comprises:
associating the intelligent equipment with the equipment contextual model corresponding to the control unit by moving the intelligent equipment label to the corresponding area on the outer surface of the three-dimensional control where the corresponding control unit is located;
and/or;
and removing the association between the intelligent equipment and the equipment contextual model corresponding to the control unit by moving the label of the intelligent equipment out of the corresponding area on the outer surface of the three-dimensional control where the corresponding control unit is located.
10. A storage medium having program code stored thereon, wherein the program code, when executed by a processor, implements the device control method according to any one of claims 1 to 9.
11. An electronic device, characterized in that the electronic device comprises a memory, a processor, the memory having stored thereon program code executable on the processor, the program code, when executed by the processor, implementing the device control method according to any one of claims 1 to 9.
CN202010224486.2A 2020-03-26 2020-03-26 Device control method, storage medium, and electronic device Active CN111443843B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010224486.2A CN111443843B (en) 2020-03-26 2020-03-26 Device control method, storage medium, and electronic device


Publications (2)

Publication Number Publication Date
CN111443843A true CN111443843A (en) 2020-07-24
CN111443843B CN111443843B (en) 2021-06-15


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112379807A (en) * 2020-11-30 2021-02-19 北京城市网邻信息技术有限公司 Business information display method and device, electronic equipment and computer readable medium
CN115291766A (en) * 2022-07-21 2022-11-04 珠海格力电器股份有限公司 Interaction method, interaction device, electronic equipment and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013091493A1 (en) * 2011-12-19 2013-06-27 厦门万安智能股份有限公司 Intelligent home centralized control device with environmental adaptive scene mode
CN107589884A (en) * 2017-07-18 2018-01-16 朱小军 A kind of 3D stereoscopic displays exchange method and intelligent mobile terminal
CN107765556A (en) * 2016-08-23 2018-03-06 广州零号软件科技有限公司 The smart home product human-computer interaction interface that solid figure is shown
CN108873730A (en) * 2018-08-31 2018-11-23 珠海格力电器股份有限公司 Control method and device for household equipment
CN110554615A (en) * 2019-09-06 2019-12-10 珠海格力电器股份有限公司 method and device for centralized control and management of intelligent household equipment





Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant