CN108037743B - Scene sharing method, scene construction method, UE (user Equipment) equipment and household intelligent system - Google Patents


Info

Publication number
CN108037743B
CN108037743B · CN201711245697.9A
Authority
CN
China
Prior art keywords
equipment, actual, virtual scene, scene, virtual
Prior art date
Legal status
Active
Application number
CN201711245697.9A
Other languages
Chinese (zh)
Other versions
CN108037743A (en)
Inventor
王建龙
Current Assignee
Jiangsu Mulin Zhizao Technology Co.,Ltd.
Original Assignee
Jiangsu Mu Lin Intelligent Electric Appliance Co Ltd
Priority date
Filing date
Publication date
Application filed by Jiangsu Mu Lin Intelligent Electric Appliance Co Ltd filed Critical Jiangsu Mu Lin Intelligent Electric Appliance Co Ltd
Priority to CN202010080621.0A priority Critical patent/CN111221259B/en
Priority to CN201711245697.9A priority patent/CN108037743B/en
Publication of CN108037743A publication Critical patent/CN108037743A/en
Application granted granted Critical
Publication of CN108037743B publication Critical patent/CN108037743B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/26 Pc applications
    • G05B2219/2642 Domotique, domestic, home control, automation, smart house
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The invention relates to a scene sharing method, a scene construction method, a UE (User Equipment) device and a home intelligent system. In the scene sharing method, a virtual scene corresponding to the current arrangement scene of each actual device is published by the UE device through an identification code, so that other UE devices can scan the code and retrieve the layout data of the virtual scene. The invention allows each UE device to be laid out according to the actual placement scene of the actual devices so as to construct a virtual scene, and to share that virtual scene so that more UE devices can access and invoke it, thereby enabling multiple UE devices to control multiple actual devices.

Description

Scene sharing method, scene construction method, UE (user Equipment) equipment and household intelligent system
Technical Field
The invention belongs to the field of intelligent control, and particularly relates to a scene sharing method, a scene construction method, UE (user equipment) equipment and a home intelligent system.
Background
Currently, a plurality of controlled devices are controlled by User Equipment (UE) such as, but not limited to, a mobile phone, a personal computer, or an iPad.
However, in traditional smart homes, each controlled device is often controlled independently, and the specific position of each controlled device must be remembered only by its serial number. As a result, the user cannot reliably match a number to a specific controlled device, which introduces uncertainty into the selection and control of the controlled devices.
Therefore, how to perform a layout according to the actual placement scene of the actual devices, and how to share the resulting layout so that more UE devices can access and invoke it, is a technical problem to be solved in the art.
Disclosure of Invention
The invention aims to provide a virtual scene sharing method, a virtual scene construction method, a scene-setting UE device and a home intelligent system.
In order to achieve the above purpose, the embodiment of the present invention adopts the following technical solutions:
the invention provides a virtual scene sharing method, in which a virtual scene corresponding to the current arrangement scene of each actual device is published by a UE device through an identification code, so that other UE devices can scan the code and obtain the layout data of the virtual scene.
Further, the virtual scene sharing method further includes:
the UE equipment is suitable for respectively creating corresponding virtual scenes for the current arrangement scenes of the actual equipment in the rooms, and sharing the virtual scenes of the corresponding rooms through the corresponding identification codes.
Further, the virtual scene is constructed by the UE device;
the position data of each actual device in the virtual scene constructed by the UE device is stored in the control module of the respective actual device, and other UE devices are adapted to retrieve the position data of the virtual scene from those control modules; or
the position data corresponding to each actual device in the virtual scene constructed by the UE device is stored in the UE device itself and published and shared through the identification code, and other UE devices obtain the position data of the virtual scene by reading the identification code; and
after the other UE devices obtain the position data of the corresponding virtual scene, the adaptation identifiers corresponding to the actual devices are arranged in the virtual scene area of those UE devices according to the corresponding position data.
In another aspect, the present invention further provides a virtual scene construction method, including:
in the creation of the virtual scene,
the UE device is adapted to receive the broadcast signal sent by each actual device, obtain each device code from the broadcast signals, and judge from the device code whether an actual device is a newly added actual device.
Further, the method of judging from the device code whether an actual device is a newly added actual device includes:
if the device code is not recorded in the current UE device, judging that the actual device is newly added; wherein
the device code includes the MAC address of the control module within the actual device.
Further, the UE device is adapted to turn on the broadcast signal of a connected actual device and to set a turn-off delay time for the broadcast signal;
when the UE device receives the broadcast signals sent by the actual devices, it is adapted to create a virtual scene according to the current arrangement scene of the actual devices; and
after the virtual scene is created, the UE device publishes it through the identification code so that other UE devices can scan the code and obtain the virtual scene.
Further, the identification code is adapted to store corresponding location data of the actual device in the virtual scene.
In a third aspect, the present invention further provides a scene determination method.
The scene determination method comprises: the UE device is adapted to judge the current arrangement scene in which it is located according to the perceived signal strength of each actual device, and then to invoke the virtual scene corresponding to that arrangement scene.
Further, the UE device is adapted to create a corresponding virtual scene for the current arrangement scene of the actual devices in each of a plurality of rooms, and to store the layout data of the virtual scenes in the UE device;
after sensing which actual device has the strongest signal, the UE device invokes the virtual scene corresponding to that actual device according to the device code recorded for it in the UE device.
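The strongest-signal rule above can be illustrated with a minimal sketch. This is not from the patent text: the function and dictionary names are illustrative, and BLE RSSI in dBm is assumed as the perceived signal strength.

```python
# Hypothetical sketch: choose the virtual scene of the actual device whose
# signal is strongest, e.g. BLE RSSI in dBm (least negative = strongest).

def determine_current_scene(rssi_by_device, device_to_scene):
    """rssi_by_device: {device_code: rssi_dBm}; device_to_scene: {device_code: scene_name}."""
    if not rssi_by_device:
        return None
    # The strongest perceived signal marks the nearest actual device.
    strongest = max(rssi_by_device, key=rssi_by_device.get)
    return device_to_scene.get(strongest)

scene = determine_current_scene(
    {"AA:BB:CC:01": -72, "AA:BB:CC:02": -48, "AA:BB:CC:03": -90},
    {"AA:BB:CC:01": "bedroom", "AA:BB:CC:02": "living room", "AA:BB:CC:03": "kitchen"},
)
```

Here the device at -48 dBm is nearest, so the living-room scene would be invoked.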
In a fourth aspect, the present invention further provides a UE device.
The UE apparatus includes:
the scene construction module is used for constructing a virtual scene according to the current arrangement scene of each actual device;
and the publishing module is used for publishing and sharing, through the identification code, the position data of each actual device in the virtual scene constructed by the UE device.
Further, the UE device further includes:
the scanning module acquires a virtual scene constructed by another UE device by identifying the identification code;
the scene construction module is further adapted to construct a virtual scene according to the identification code.
Further, the UE device further includes:
the wireless module establishes a connection with the control module in the actual device; and
a control instruction is sent to the control module of an actual device by operating the adaptation identifier corresponding to that actual device in the virtual scene.
Further, the UE device is further adapted to automatically identify a virtual scene corresponding to an area where the UE device is currently located by the scene determination method.
In a fifth aspect, the present invention further provides a home intelligent system, comprising:
at least one UE device;
at least one actual device controlled by the UE device;
the UE device is adapted to construct a corresponding virtual scene according to the current arrangement scene of the actual devices, and to publish the virtual scene through an identification code so that other UE devices can scan the code and obtain the layout data of the virtual scene.
Further, the UE device is adapted to create a virtual scene by the virtual scene construction method.
Further, the UE device is adapted to share the virtual scene by the virtual scene sharing method.
Further, the UE device is further adapted to automatically identify a virtual scene corresponding to an area where the UE device is currently located by the scene determination method.
The scene sharing method, scene construction method, UE device and home intelligent system of the invention have the advantage that a UE device can be laid out according to the actual placement scene of the actual devices to construct a virtual scene, and that virtual scene can be shared so that more UE devices can access and invoke it, thereby enabling multiple UE devices to control multiple actual devices.
Drawings
The invention is further illustrated with reference to the following figures and examples.
Fig. 1 is a flowchart of embodiment 1 of the virtual scene sharing method of the present invention;
Fig. 2 is a flowchart of embodiment 2 of the virtual scene sharing method of the present invention;
Fig. 3 is a flowchart of embodiment 3 of the virtual scene sharing method of the present invention;
Fig. 4 is a flowchart of embodiment 4 of the virtual scene sharing method of the present invention;
Fig. 5 is a schematic diagram of the virtual scene construction of the present invention;
Fig. 6 is a first functional block diagram of the UE device of the present invention;
Fig. 7 is a second functional block diagram of the UE device of the present invention.
In the figures: display area 1, layout schematic region 101, adaptation identifier 2, independent trigger sub-identifier 201, and test key 3.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Currently, each UE device (hereinafter referred to as UE or UE device), such as but not limited to a mobile phone, a personal computer, or an iPad, is used to control a plurality of actual devices.
However, in traditional smart homes, each controlled device is often controlled independently, and the specific position of each controlled device must be remembered only by its serial number. As a result, the user cannot reliably match a number to a specific controlled device, which introduces uncertainty into the selection and control of the controlled devices.
Therefore, it is important to construct a virtual scene that matches the actual scene. After one UE device has constructed the virtual scene, other UE devices can obtain it and use it directly to control the actual devices, which simplifies the setup of intelligent control and makes operation more user-friendly.
In order to solve the above problem, this embodiment provides a virtual scene sharing method adapted to publish a virtual scene corresponding to the current arrangement scene of each actual device so that other UE devices can invoke it. With this method, once the virtual scene is shared, multiple UE devices can access and control multiple actual devices within the same arrangement scene.
The technical solution of the above embodiments of the present invention will be described in detail by using several specific embodiments.
Example 1
Fig. 1 is a flowchart of embodiment 1 of the virtual scene sharing method according to the present invention.
As shown in fig. 1, the method of this embodiment may include:
step S101, creating and publishing a virtual scene;
step S102, other UE devices invoke the virtual scene.
An actual device represents a controlled object or controlled device, specifically one or more home intelligent devices, including but not limited to an electric sofa, a television, an air conditioner, a lamp, an electric window, and an electric curtain.
The virtual scene published in step S101 may be constructed by any one or more UE devices according to the positions of the actual devices in the actual layout scene, or may be preset in a UE device in a pre-stored manner.
The UE devices may be interconnected with the actual devices by directly connecting a plurality of UE devices to the control module within each actual device. With this direct-connection mode, the position data corresponding to each actual device in the virtual scene constructed by a UE device can be stored in the control module of that actual device; when a UE device invokes the virtual scene, it is adapted to obtain the corresponding position data from each control module and to arrange the adaptation identifiers corresponding to the actual devices in its virtual scene area according to that position data. Alternatively, the position data corresponding to each actual device in the virtual scene constructed by the UE device can be stored in the UE device itself and published and shared through the identification code, so that other UE devices obtain the position data of the virtual scene by reading the identification code.
After the other UE devices obtain the position data of the corresponding virtual scene, the adaptation identifiers corresponding to the actual devices are arranged in the virtual scene area of those UE devices according to the corresponding position data.
The identification code may be, but is not limited to, a one-dimensional (bar) code or a two-dimensional (QR) code that can be scanned and read by a scanning device such as a camera or a scanning gun.
The position data may be expressed as a grid cell in the virtual scene, obtained from the horizontal and vertical coordinates of the cell occupied by the adaptation identifier; the position data further includes the orientation of the actual device.
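The position data described above can be sketched as a small record type. This is a hedged illustration, not the patent's data format: the class and field names are assumptions.

```python
# Hypothetical sketch of the per-device position data: a grid cell given by
# horizontal and vertical coordinates plus the orientation of the actual device.
from dataclasses import dataclass, asdict

@dataclass
class PositionData:
    row: int          # vertical grid coordinate in the layout schematic region
    col: int          # horizontal grid coordinate
    heading_deg: int  # orientation of the actual device, in degrees
    device_code: str  # e.g. the MAC address of the device's control module

sofa = PositionData(row=3, col=5, heading_deg=180, device_code="AA:BB:CC:DD:EE:01")
record = asdict(sofa)  # dictionary form, convenient for storage or sharing
```

A record like this could be stored either in the device's control module or in the identification-code payload, matching the two sharing modes above.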
In step S102, other UE devices invoke the virtual scene.
Specifically, a UE device may obtain the existing virtual scenes of other UE devices by scanning the identification code. Multiple UE devices may each publish the virtual scenes they have created through identification codes, and a newly added UE device may obtain the layout data of a corresponding virtual scene through the identification code and then choose whether to invoke it or to create a new virtual scene of its own.
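The patent does not specify what the identification code carries; a minimal sketch, assuming a JSON payload that any 2D-code library could encode, might look as follows. The helper names are hypothetical.

```python
# Hedged sketch: serialize / deserialize scene layout data for an
# identification-code payload. JSON is an assumption, not from the patent.
import json

def encode_scene_payload(scene_name, positions):
    """Serialize the layout data so it can be carried inside a 2D code."""
    return json.dumps({"scene": scene_name, "positions": positions}, sort_keys=True)

def decode_scene_payload(payload):
    """What a scanning UE device would do after reading the identification code."""
    data = json.loads(payload)
    return data["scene"], data["positions"]

payload = encode_scene_payload(
    "living room",
    [{"device": "AA:BB:CC:01", "row": 3, "col": 5, "heading_deg": 180}],
)
name, positions = decode_scene_payload(payload)
```

The scanning UE device would then arrange the adaptation identifiers in its virtual scene area according to the decoded position data.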
The control module may include, but is not limited to, a processor module, a BLE Bluetooth module, a WiFi module and a storage module; as will be clear to those skilled in the art, any control module used for smart-home control that has wireless communication and can interact with the UE device falls within the protection scope of the present invention.
In other embodiments, any technical solution that implements virtual scene publishing for other UE devices to invoke is within the scope of the present invention.
By sharing the configured virtual scene, this embodiment is particularly suitable for household users: setup operations for the household are reduced, and after one family member has built the virtual scene it can be shared with the other family members, giving the user a better experience.
Example 2
Fig. 2 is a flowchart of embodiment 2 of the virtual scene sharing method of the present invention.
In step S201, the UE device is adapted to create a corresponding virtual scene for the current arrangement scene of the actual devices in each of a plurality of rooms.
In step S202, the virtual scene of each room is shared through the corresponding identification code.
This embodiment mainly addresses the case where the actual devices are distributed across a plurality of rooms: an identification code can be generated for each individual room, and a user can scan the identification code of a particular room as needed to obtain the layout data of that room's virtual scene in a targeted manner.
This reduces the size of each identification code, facilitates rapid scanning, and improves the user experience.
Example 3
Fig. 3 is a flowchart of embodiment 3 of the virtual scene sharing method of the present invention; as shown in Fig. 3, the method of this embodiment may include:
step S301, creating and publishing the virtual scene.
The execution process of step S301 is similar to the execution process of step S101 in embodiment 1, and is not described here again.
Step S302, the position data of each actual device in the virtual scene constructed by the UE device is stored in the control module of the respective actual device, and other UE devices are adapted to retrieve the position data of the virtual scene from those control modules.
After the other UE devices obtain the position data of the corresponding virtual scene, the adaptation identifiers corresponding to the actual devices are arranged in the virtual scene area of those UE devices according to the corresponding position data.
In this way, for the same actual arrangement scene, each UE device can create a virtual scene layout that matches its user's personal habits, making the setup more open and personalized, so that control of the actual devices better suits individual habits.
Example 4
Fig. 4 is a flowchart of embodiment 4 of the virtual scene sharing method of the present invention; as shown in Fig. 4, the method of this embodiment may include:
step S401, one or more UE devices construct a virtual scene and issue the virtual scene to other UE devices; and naming the constructed virtual scene.
Step S402, when the virtual scene is released, the name of the virtual scene is disclosed.
A UE device that needs to invoke a virtual scene is adapted to present all published virtual scenes for invocation;
in the UE device, the virtual scenes published by the various UE devices are displayed on a page, and the UE device that needs to invoke a virtual scene (which may be a newly added UE device) is adapted to select the scene to invoke by its name.
When the position data corresponding to each actual device in a virtual scene constructed by a UE device is stored in the control module of the respective actual device, it is stored under the number of the scene name. When a UE device invokes the virtual scene, it is adapted to obtain the corresponding position data from each control module according to that named number, and to arrange the adaptation identifiers corresponding to the actual devices in its virtual scene area according to the corresponding position data, thereby completing the invocation.
Since each control module saves the position data of its own actual device, dependence on the integrated controller of traditional smart-home control is reduced; the hardware of the whole home intelligent system can be simpler, and the user experience is better.
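The decentralized storage of embodiment 4 can be sketched as follows. This is an illustrative assumption: an in-memory class stands in for a real control module, and all names are hypothetical.

```python
# Hedged sketch of embodiment 4: each control module stores the position data
# of its own actual device, keyed by the named virtual scene, so a UE device
# can rebuild a scene without a central integrated controller.

class ControlModule:
    def __init__(self, device_code):
        self.device_code = device_code
        self.positions = {}  # scene name -> position data for this one device

    def save_position(self, scene_name, position):
        self.positions[scene_name] = position

    def load_position(self, scene_name):
        return self.positions.get(scene_name)

def recall_scene(scene_name, control_modules):
    """Invoke a named scene by gathering per-device position data from each module."""
    return {m.device_code: m.load_position(scene_name)
            for m in control_modules
            if m.load_position(scene_name) is not None}

tv = ControlModule("AA:BB:CC:01")
tv.save_position("living room", {"row": 0, "col": 2, "heading_deg": 90})
layout = recall_scene("living room", [tv])
```

A UE device that knows only the scene name can thus reassemble the full layout from the modules it is connected to.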
Specifically, presenting all published virtual scenes means that each virtual scene can be previewed, so that other users can view the scenes through their UE devices and select and invoke the virtual scene that suits their personal habits, which facilitates the sharing, selection and invocation of virtual scenes.
Example 5
On the basis of the foregoing embodiments 1 to 3, this embodiment provides a virtual scene construction method; Fig. 5 is a schematic diagram of the virtual scene construction of the present invention.
In the creation of the virtual scene:
the UE device is adapted to receive the broadcast signal sent by each actual device, obtain each device code from the broadcast signals, and judge from the device code whether an actual device is a newly added actual device.
The method of judging from the device code whether an actual device is a newly added actual device comprises:
if the device code is not recorded in the current UE device, judging that the actual device is newly added; wherein
the device code may include, but is not limited to, the MAC address of the control module in the actual device, or the ID number of the control module.
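The new-device check above reduces to a set-membership test. A minimal sketch, with illustrative function and variable names:

```python
# Hedged sketch: a broadcast device code that is not yet recorded on this UE
# device marks a newly added actual device.

def is_new_device(device_code, known_codes):
    """True if this device code has not been recorded in the current UE device."""
    return device_code not in known_codes

known_codes = {"AA:BB:CC:DD:EE:01", "AA:BB:CC:DD:EE:02"}  # e.g. control-module MACs
new = is_new_device("AA:BB:CC:DD:EE:03", known_codes)
```

Because MAC addresses (and control-module ID numbers) are unique per module, this test is unambiguous for distinct hardware, even when two actual devices are the same product.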
The UE device is adapted to turn on the broadcast signal of a connected actual device and to set a turn-off delay time for the broadcast signal. When the UE device receives the broadcast signals sent by the actual devices, it is adapted to create a virtual scene according to the current arrangement scene of the actual devices, and after the virtual scene is created it is published through the identification code so that other UE devices can scan the code and obtain the virtual scene.
The identification code may store corresponding location data of the actual device in the virtual scene.
An association relationship is established between each actual device and the corresponding adaptation identifier 2 in the display area 1 on the screen of the UE device, and the adaptation identifiers are distributed in the virtual scene according to the placement positions of the corresponding actual devices in the actual scene. After the virtual scene layout is completed, the virtual scene may be published and shared using the virtual scene sharing method of embodiments 1 to 3 above; alternatively, the associated adaptation identifier 2 corresponding to an actual device may be selected in the virtual scene to perform a specific control operation on that actual device.
Each actual device has a corresponding adaptation identifier 2 in the virtual scene, stored in an identifier database. The adaptation identifier 2 takes the form of an icon matched to the shape of the specific device so that it is easy to recognize, and the virtual scene is suitably constructed in the layout schematic region 101 within the display area 1. Through these icons, the controlled object or controlled device can be identified and operated more accurately, improving the user experience.
Specifically, the method of establishing the association relationship between each actual device and the corresponding adaptation identifier in the virtual scene includes the following.
The identifier database required for virtual scene construction is preset; this specifically means that the adaptation identifiers 2 corresponding to the devices in the virtual scene are stored in advance in the identifier database, each adaptation identifier 2 corresponding to a device number.
The adaptation identifier 2 corresponding to an actual device is obtained from the identifier database as follows: the device number stored in the control module of the actual device is sent to the UE device, and the UE device, according to the received device code, retrieves from the identifier database the adaptation identifier 2 matching that device number. This completes the pairing, obtains the adaptation identifier 2 corresponding to the actual device, and displays it in the layout schematic region 101.
The layout is then performed in the layout schematic region 101 according to the placement positions of the actual devices in the actual scene, so as to construct the virtual scene. The placing operation may be performed manually according to the actual scene, for example by dragging the adaptation identifier 2; in other embodiments, any means by which the adaptation identifiers 2 are placed according to the actual devices in the actual scene falls within the protection scope of the present application.
Taking the sofa as an example: in Fig. 5, the adaptation identifiers 2 for the sofa comprise two electric single-seat units and one electric two-seat unit, arranged in a U shape.
Because identical actual devices carry the same product designation, during actual placement it often cannot be determined which adaptation identifier 2 corresponds to which actual device; therefore, when two or more identical adaptation identifiers 2 are placed according to the actual scene, placement errors often occur.
Consider a combined sofa: since an electric sofa is generally controlled by independent control modules, as shown in Fig. 5, a preset combined electric sofa has two single seats corresponding to two independent control boxes (control modules). Two identical sofa adaptation identifiers 2 then appear, and the user cannot accurately tell which adaptation identifier 2 corresponds to which single seat, so the placement positions in the constructed virtual scene may not match the actual scene, which brings considerable inconvenience to control.
In order to solve the above problem, in this embodiment, the method for establishing an association relationship between each actual device and the corresponding adaptation identifier in the virtual scene further includes:
after the adaptation identifiers 2 are presented, an association test is performed on the actual devices in the virtual scene by triggering the adaptation identifiers, so as to establish the association relationship. That is, an association test is performed on the identical adaptation identifiers 2 corresponding to a plurality of identical actual devices to determine the placement position of each actual device in the actual scene, and the adaptation identifiers 2 that have passed the association test are laid out in the virtual scene according to the placement positions of the corresponding actual devices in the actual scene.
For example, for the combined electric sofa shown in Fig. 5, the association test may use the test key 3 for individual identification within the adaptation identifier 2 to implement the test function. The test function sends a test control action to the two single seats or the one double seat in turn, and the actions of the actual sofas are observed in sequence: if a particular sofa acts, its position in the actual scene is determined, and the adaptation identifier 2 corresponding to that sofa is placed at the corresponding position; the U-shaped layout is constructed in the virtual scene by analogy.
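The association test can be sketched as a loop over the candidate control modules, with the user confirming which physical unit moved. This is a hedged illustration: the function names and the stubbed I/O are assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the association test for identical adaptation
# identifiers: actuate each candidate control module in turn via its test key,
# and let the user report which physical unit acted and where it sits.

def run_association_test(candidate_modules, send_test_action, ask_user_position):
    """Map each control-module code to the placement position the user observed."""
    mapping = {}
    for module_code in candidate_modules:
        send_test_action(module_code)                          # e.g. briefly run the sofa motor
        mapping[module_code] = ask_user_position(module_code)  # user reports where that unit is
    return mapping

# Example with stubbed I/O: the "user" reports fixed seat positions.
observed = {"box-1": "left seat", "box-2": "right seat"}
mapping = run_association_test(
    ["box-1", "box-2"],
    send_test_action=lambda code: None,           # stand-in for the real test command
    ask_user_position=lambda code: observed[code],
)
```

Once the mapping is known, each adaptation identifier 2 can be dragged to the grid cell matching its unit's actual position.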
In this embodiment, in order to accommodate the diversity of devices, the following control method is adopted to facilitate control of the independent sub-devices within certain combined devices.
When an actual device comprises a plurality of independent sub-devices, i.e. is a combined device composed of a plurality of independent sub-devices, independent trigger sub-identifiers 201 matched with the independent sub-devices are marked off on the adaptation identifier 2 corresponding to the combined device. When triggered, an independent trigger sub-identifier is adapted to call up the control interface corresponding to its independent sub-device.
The specific position of the independent sub-device in the actual device can be determined according to the actual scene.
Taking two seats in the combined electric sofa as an example, each of the two seats may be equivalent to an independent sub-device, and here, the adaptive identifier 2 corresponding to the two seats is divided into independent trigger sub-identifiers 201 capable of representing left and right sofa positions.
The control module of the two-seat sofa can determine which motors drive the left and right seat positions, and thereby determine the independent trigger sub-identifiers 201 corresponding to the left and right seat positions, so that the matching relation between the independent sub-devices and the independent trigger sub-identifiers 201 is established automatically. Clicking an independent trigger sub-identifier 201 calls the control interface corresponding to that independent sub-device for independent control.
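A minimal sketch of how one adaptation identifier for a combined device might be divided into independent trigger sub-identifiers, each resolving to its own sub-device control interface when triggered. The class and the `"motor_1"`/`"motor_2"` names are illustrative assumptions:

```python
# Hypothetical sketch: an adaptation identifier for a combined device is
# divided into independent trigger sub-identifiers, one per sub-device;
# triggering a sub-identifier calls up that sub-device's control interface.

class AdaptationIdentifier:
    def __init__(self, device_name, sub_devices):
        # sub_devices maps sub-identifier -> control interface,
        # e.g. {"left_seat": "motor_1", "right_seat": "motor_2"}
        self.device_name = device_name
        self.sub_identifiers = dict(sub_devices)

    def trigger(self, sub_id):
        """Return the control interface of the matching independent sub-device."""
        return self.sub_identifiers[sub_id]

# Two-seat sofa: each seat is an independent sub-device.
sofa = AdaptationIdentifier("two_seat_sofa",
                            {"left_seat": "motor_1", "right_seat": "motor_2"})
```

Triggering `sofa.trigger("left_seat")` would thus select the left seat's control interface independently of the right seat.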
To allow multiple UE devices to control multiple actual devices or multiple independent sub-devices (hereinafter collectively referred to as controlled devices), contention for a given controlled device must be avoided, especially for electric sofas, so exclusive control is achieved as follows. Locking an independent trigger sub-identifier 201 or an adaptation identifier 2 locks the corresponding independent sub-device or actual device, so that it cannot be controlled by other UE devices. The UE device that performed the locking operation may actively unlock the locked independent trigger sub-identifier 201 or adaptation identifier 2 to release the corresponding independent sub-device or actual device. A locked independent trigger sub-identifier 201 or adaptation identifier 2 may also self-unlock to release the independent sub-device or actual device when the locking UE device goes offline (i.e. after the UE device that performed the locking operation is disconnected from the locked independent sub-device or actual device).
After a UE device locks a controlled device, other UE devices are prevented from controlling it. The UE device sends the locking instruction to the control module corresponding to the controlled device, and the control module pushes the locking instruction to the other UE devices; a locking mark is added to the independent trigger sub-identifier 201 or adaptation identifier 2 of the controlled device in the virtual scene of each other UE device, which can then no longer operate it. In this way the locked or unlocked state of the independent trigger sub-identifier 201 or adaptation identifier 2 is displayed synchronously in the virtual scenes presented by the other UE devices, and the control state of the controlled device is published in real time.
If the actual device is an electric sofa, the locking operation effectively prevents other UE devices from interfering with the locked sofa, which improves the user experience and also avoids misoperation.
For a two-seat sofa, the left and right seat positions can be locked separately.
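The exclusive-control scheme above can be sketched as a small lock-ownership model: one UE acquires the lock, other UEs' control attempts are rejected, and the lock is released either by the owner or automatically when the owner goes offline. The class below is an assumed illustration, not the patent's implementation:

```python
# Hypothetical sketch of exclusive control: a controlled device tracks
# which UE holds its lock, rejects control from other UEs, and
# self-unlocks when the locking UE goes offline.

class ControlledDevice:
    def __init__(self, name):
        self.name = name
        self.locked_by = None  # UE id currently holding the lock

    def lock(self, ue_id):
        """Acquire the lock; succeeds if free or already owned by ue_id."""
        if self.locked_by is None:
            self.locked_by = ue_id
            return True
        return self.locked_by == ue_id

    def control(self, ue_id, action):
        """Execute a control action unless another UE holds the lock."""
        if self.locked_by not in (None, ue_id):
            raise PermissionError(f"{self.name} is locked by {self.locked_by}")
        return f"{self.name}: {action}"

    def unlock(self, ue_id):
        """Active unlock by the UE that performed the locking operation."""
        if self.locked_by == ue_id:
            self.locked_by = None

    def on_ue_offline(self, ue_id):
        """Self-unlock when the locking UE disconnects."""
        self.unlock(ue_id)
```

For a two-seat sofa, the left and right seat positions would each be modeled as a separate `ControlledDevice`, so they can be locked independently.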
In other embodiments, the virtual scene configuration method can be widely applied to smart home control; in particular, locking operation of lamps, televisions, air conditioners, electric windows, electric curtains and the like all falls within the protection scope of the present application.
The virtual scene construction method performs the association test on the actual devices through the adaptation identifiers, which simplifies configuration: a process that traditionally had to be carried out by technicians can now be completed by ordinary users, reducing after-sales cost.
Example 6
In order to automatically adapt and call the virtual scene corresponding to the actual scene in which the actual devices are located, according to the actual devices placed in an area once a client reaches that area, the present embodiment provides a scene determination method.
The scene determination method comprises: the UE device is adapted to determine the current arrangement scene in which the actual devices are located according to the perceived signal strength of each actual device, and then to call the virtual scene corresponding to that arrangement scene.
Specifically, the UE device is adapted to create corresponding virtual scenes for current arrangement scenes of actual devices in multiple rooms, and store arrangement data of the virtual scenes in the UE device; after sensing the actual equipment with the strongest signal, the UE equipment calls the virtual scene corresponding to the actual equipment according to the equipment code of the actual equipment in the UE equipment.
The device code may be the MAC address of the control module in the actual device, and the layout data of each virtual scene includes the MAC address of the control module in each of its actual devices. That is, after the MAC address corresponding to an actual device is obtained, that MAC address is searched for in the stored layout data of the multiple virtual scenes, and the layout data of the virtual scene containing the actual device is located through the MAC address, so that the virtual scene corresponding to the arrangement scene is loaded automatically.
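The lookup above can be sketched as follows: pick the actual device with the strongest perceived signal, then find which stored virtual scene's layout data contains that device's control-module MAC address. The function name and data shapes are illustrative assumptions:

```python
# Hypothetical sketch of scene determination: the device with the
# strongest perceived signal identifies the arrangement scene; its
# control module's MAC address is looked up in the stored layout data.

def find_scene(scenes, perceived):
    """Return the name of the virtual scene to load, or None.

    scenes: {scene_name: set of control-module MAC addresses}.
    perceived: {mac: signal strength in dBm (higher = stronger)}.
    """
    strongest_mac = max(perceived, key=perceived.get)  # strongest signal
    for name, macs in scenes.items():
        if strongest_mac in macs:
            return name  # load this virtual scene's layout data
    return None
```

Carrying the UE into the living room, where the living-room devices report the strongest signal, would thus auto-load the living-room virtual scene.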
Once a user carries the UE device into a certain area, the virtual scene corresponding to that area is adapted automatically, which raises the intelligence level of the UE device and improves the user experience.
Example 7
FIG. 6 is a first functional block diagram of a UE apparatus of the present invention;
as shown in fig. 6, the present embodiment provides a UE device, which can be used for setting a virtual scene and controlling an actual device, and includes:
the scene construction module is used for constructing a virtual scene according to the current arrangement scene of each actual device;
the issuing module publishes and shares, through the identification code, the position data corresponding to each actual device in the virtual scene constructed by the UE device; alternatively, the position data may be transmitted to and stored in the control module of each actual device.
In this embodiment, the UE device can lay out the display interface according to the actual placement scene of the actual devices and share the virtual scene, so that more UE devices can access and call it, thereby enabling multiple UE devices to control multiple actual devices.
The sharing method comprises: publishing and sharing, through the identification code, the position data corresponding to each actual device in the virtual scene constructed by the UE device; and/or storing the position data of each actual device in the virtual scene constructed by the UE device into the control module of that actual device. Other UE devices may obtain the layout data of a specific virtual scene from the identification code, or obtain the position data of the corresponding actual devices by connecting to each control module, and thereby form the layout data of the corresponding virtual scene in the UE device.
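The identification-code path can be sketched as a serialize/deserialize round trip: the publishing UE encodes each device's position data into a payload carried by the identification code (e.g. a QR code), and another UE decodes it to rebuild the same layout. The JSON encoding is an assumption for illustration; the patent does not specify the payload format:

```python
# Hypothetical sketch of sharing layout data via an identification code:
# the publishing UE serializes per-device position data; the scanning UE
# decodes the payload and reconstructs the same virtual scene layout.
import json

def publish_layout(layout):
    """Serialize layout data into the payload carried by the code.

    layout: {mac: {"x": ..., "y": ..., "orientation": ...}}.
    """
    return json.dumps(layout, sort_keys=True)

def load_layout(payload):
    """Rebuild the layout data on another UE from the scanned payload."""
    return json.loads(payload)
```

The scanning UE can then arrange the adaptation identifiers in its own virtual scene area according to the decoded position data.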
The implementation processes of the scene building module and the publishing module are similar to the steps of creating and publishing the virtual scene in step S101, step S201, step S301 and embodiment 4 in the foregoing embodiments, and are not described again here.
Fig. 7 is a second schematic block diagram of the UE apparatus of the present invention.
As shown in fig. 7, in this embodiment, the UE device further includes:
the scanning module acquires a virtual scene constructed by another UE device by identifying the identification code; and the scene construction module is further adapted to construct a virtual scene according to the identification code.
Specifically, the virtual scene is constructed by reading the identification code: the identification code contains the position data, orientation data and the like of the actual devices, so the UE device can read the identification code, interpret the position and orientation data of each actual device, and complete construction of the virtual scene.
Fig. 7 is a third schematic block diagram of a UE device of the present invention.
As shown in fig. 7, in this embodiment, the UE device further includes:
the wireless module establishes a connection relation with a control module in the actual equipment;
and sending a control instruction to a control module of the actual equipment by operating the adaptation identification corresponding to the actual equipment in the virtual scene.
In this embodiment, the wireless module may be, but is not limited to, a BLE Bluetooth module, a ZigBee module, a WiFi module, or another wireless module capable of data exchange.
To allow multiple UE devices to control multiple actual devices or multiple independent sub-devices (hereinafter collectively referred to as controlled devices), contention for a given controlled device must be avoided, especially for electric sofas, so exclusive control is achieved as follows. Locking an independent trigger sub-identifier 201 or an adaptation identifier 2 locks the corresponding independent sub-device or actual device, so that it cannot be controlled by other UE devices. The UE device that performed the locking operation may actively unlock the locked independent trigger sub-identifier 201 or adaptation identifier 2 to release the corresponding independent sub-device or actual device. A locked independent trigger sub-identifier 201 or adaptation identifier 2 may also self-unlock to release the independent sub-device or actual device when the locking UE device goes offline (i.e. after the UE device that performed the locking operation is disconnected from the locked independent sub-device or actual device).
After a UE device locks a controlled device, other UE devices are prevented from controlling it. The UE device sends the locking instruction to the control module corresponding to the controlled device, and the control module pushes the locking instruction to the other UE devices; a locking mark is added to the independent trigger sub-identifier 201 or adaptation identifier 2 of the controlled device in the virtual scene of each other UE device, which can then no longer operate it. In this way the locked or unlocked state of the independent trigger sub-identifier 201 or adaptation identifier 2 is displayed synchronously in the virtual scenes presented by the other UE devices, and the control state of the controlled device is published in real time.
If the actual device is an electric sofa, the locking operation effectively prevents other UE devices from interfering with the locked sofa, which improves the user experience and also avoids misoperation.
In other embodiments, the UE device can be widely applied to smart home control; the actual devices include, but are not limited to, lamps, televisions, air conditioners, electric windows and electric curtains, and other modes of locking operation are also within the protection scope of the present application.
The UE device is further adapted to automatically identify a virtual scene corresponding to an area where the current UE device is located, by using the scene determination method in the above embodiment.
Example 8
This embodiment 8 provides a house intelligent system, includes:
at least one UE device;
at least one actual device controlled by the UE device;
the UE equipment is suitable for constructing a corresponding virtual scene according to the current arrangement scene of the actual equipment, and issuing the virtual scene through the identification code so as to be scanned and called by other UE equipment to obtain the arrangement data of the virtual scene.
The UE device is adapted to create a virtual scene by the virtual scene construction method in the above embodiment.
The UE device is adapted to share the virtual scene by the virtual scene sharing method in the above embodiments.
The UE device is further adapted to automatically identify a virtual scene corresponding to an area where the current UE device is located, by using the scene determination method in the above embodiment.
According to the scene sharing method, the scene construction method, the UE device and the home intelligent system described above, the UE device can construct a virtual scene laid out according to the actual placement scene of the actual devices and share that virtual scene by means of the identification code, so that more UE devices can access and call the virtual scene by scanning the identification code. Other UE devices thereby obtain the layout data of the virtual scene conveniently and quickly, the actual devices are controlled, and the user experience is improved.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
In the description of the present invention, it should also be noted that, unless otherwise explicitly specified or limited, the terms "disposed," "mounted," "connected," and "connected" are to be construed broadly and may, for example, be fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
In the description of the present invention, it should be noted that the terms "upper", "lower", "left", "right", "inner", "outer", and the like indicate orientations or positional relationships based on orientations or positional relationships shown in the drawings or orientations or positional relationships that the products of the present invention are conventionally placed in use, and are used only for convenience in describing the present invention and for simplicity in description, and do not indicate or imply that the referred devices or elements must have a specific orientation, be constructed in a specific orientation, and be operated, and thus, should not be construed as limiting the present invention.
The above is only a preferred embodiment of the present invention, and is not intended to limit the present invention, and various modifications and changes will occur to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.

Claims (13)

1. A virtual scene sharing method, applied where a user cannot accurately match an adaptation identifier on user equipment to a specific controlled device, the method comprising:
issuing virtual scenes corresponding to the current arrangement scenes of each actual device in the UE device through the identification codes so as to be scanned and called by other UE devices to obtain the arrangement data of the virtual scenes;
constructing, by a UE device, the virtual scene;
respectively storing the position data of each actual device in the virtual scene constructed by the UE device into a control module of each actual device, wherein other UE devices are suitable for calling the position data in the virtual scene from the control module, respectively obtaining corresponding position data from each control module when the virtual scene is called by other UE devices, and respectively arranging the adaptation identifications corresponding to each actual device in the virtual scene area of the UE device according to the corresponding position data; or
Storing position data corresponding to each actual device in a virtual scene in the virtual scene constructed by the UE device in the UE device, issuing and sharing the position data through the identification code, and obtaining the position data in the virtual scene by other UE devices through reading the identification code; and
after other UE equipment obtains the position data of the corresponding virtual scene, the adaptation identification corresponding to each actual equipment is respectively arranged in the virtual scene area in the UE equipment according to the corresponding position data;
after the adaptation identifiers are presented, performing an association test on the actual devices in the virtual scene by triggering the adaptation identifiers to establish the association relationship, that is, performing an association test on identical adaptation identifiers corresponding to a plurality of identical actual devices to determine the placement positions of the actual devices in the actual scene, and laying out the adaptation identifiers that have passed the association test in the virtual scene according to the placement positions of the corresponding actual devices in the actual scene.
2. The virtual scene sharing method according to claim 1,
the virtual scene sharing method further comprises the following steps:
the UE equipment is suitable for respectively creating corresponding virtual scenes for the current arrangement scenes of the actual equipment in the rooms, and sharing the virtual scenes of the corresponding rooms through the corresponding identification codes.
3. A virtual scene construction method for implementing the virtual scene sharing method according to claim 1, comprising:
in the creation of the virtual scene,
the UE equipment is suitable for receiving broadcast signals sent by each actual device, acquiring each equipment code from the broadcast signals, and judging whether the actual device is newly added actual device or not according to the equipment codes;
the UE equipment is suitable for opening a broadcast signal of connected actual equipment and setting the closing delay time of the broadcast signal;
when the UE equipment receives broadcast signals sent by each actual equipment, the UE equipment is suitable for creating a virtual scene according to the current arrangement scene of the actual equipment, and
the UE equipment is issued through the identification code after the virtual scene is created so as to be scanned and called by other UE equipment to obtain the virtual scene;
and establishing an incidence relation between each actual device and the corresponding adaptation identifier in a display area in a screen of the UE device, and distributing the adaptation identifiers in the virtual scene according to the placement positions of the corresponding actual devices in the actual scene.
4. The virtual scene construction method according to claim 3,
the method for judging whether the actual device is the newly added actual device according to the device code comprises the following steps:
if the equipment code is not recorded in the current UE equipment, judging that the actual equipment is newly added; wherein
The device encoding includes: MAC address of control module within the actual device.
5. The virtual scene construction method according to claim 4,
the identification code is adapted to store corresponding location data of the actual device in the virtual scene.
6. A scene judgment method based on the virtual scene construction method of claim 3,
the UE equipment is suitable for judging the current arrangement scene where the actual equipment is located according to the perceived signal strength of each actual equipment, and then calling the virtual scene corresponding to the arrangement scene;
the UE equipment is suitable for respectively creating corresponding virtual scenes aiming at the current arrangement scenes of actual equipment in a plurality of rooms and storing the arrangement data of the virtual scenes into the UE equipment;
after sensing the actual equipment with the strongest signal, the UE equipment calls the virtual scene corresponding to the actual equipment according to the equipment code of the actual equipment in the UE equipment.
7. The UE device of the virtual scene construction method according to claim 3, comprising:
the scene construction module is used for constructing a virtual scene according to the current arrangement scene of each actual device;
the issuing module issues and shares the position data, corresponding to each actual device in the virtual scene, in the virtual scene constructed by each UE device through the identification code;
the UE device further includes:
the scanning module acquires a virtual scene constructed by another UE device by identifying the identification code;
the scene construction module is further adapted to construct a virtual scene according to the identification code.
8. The UE apparatus of claim 7,
the UE device further includes:
the wireless module establishes a connection relation with a control module in the actual equipment;
and sending a control instruction to a control module of the actual equipment by operating the adaptation identification corresponding to the actual equipment in the virtual scene.
9. The UE apparatus of claim 8,
the UE device is further adapted to automatically identify a corresponding virtual scene of an area where the UE device is currently located by the scene determination method according to claim 6.
10. A home intelligent system based on the virtual scene construction method of claim 3, comprising:
at least one UE device;
at least one actual device controlled by the UE device;
the UE equipment is suitable for constructing a corresponding virtual scene according to the current arrangement scene of the actual equipment, and issuing the virtual scene through the identification code so as to be scanned and called by other UE equipment to obtain the arrangement data of the virtual scene.
11. The home intelligence system of claim 10,
the UE device is adapted to create a virtual scene by the virtual scene construction method according to claim 4 or 5.
12. The home intelligence system of claim 11,
the UE device is adapted to share a virtual scene by the virtual scene sharing method of claim 2.
13. The home intelligence system of claim 12,
the UE device is further adapted to automatically identify a corresponding virtual scene of an area where the UE device is currently located by the scene determination method according to claim 6.
CN201711245697.9A 2017-12-01 2017-12-01 Scene sharing method, scene construction method, UE (user Equipment) equipment and household intelligent system Active CN108037743B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010080621.0A CN111221259B (en) 2017-12-01 2017-12-01 Sofa control household intelligent system and sofa household scene layout method
CN201711245697.9A CN108037743B (en) 2017-12-01 2017-12-01 Scene sharing method, scene construction method, UE (user Equipment) equipment and household intelligent system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711245697.9A CN108037743B (en) 2017-12-01 2017-12-01 Scene sharing method, scene construction method, UE (user Equipment) equipment and household intelligent system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202010080621.0A Division CN111221259B (en) 2017-12-01 2017-12-01 Sofa control household intelligent system and sofa household scene layout method

Publications (2)

Publication Number Publication Date
CN108037743A CN108037743A (en) 2018-05-15
CN108037743B true CN108037743B (en) 2020-01-21

Family

ID=62094843

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202010080621.0A Active CN111221259B (en) 2017-12-01 2017-12-01 Sofa control household intelligent system and sofa household scene layout method
CN201711245697.9A Active CN108037743B (en) 2017-12-01 2017-12-01 Scene sharing method, scene construction method, UE (user Equipment) equipment and household intelligent system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN202010080621.0A Active CN111221259B (en) 2017-12-01 2017-12-01 Sofa control household intelligent system and sofa household scene layout method

Country Status (1)

Country Link
CN (2) CN111221259B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114153307A (en) * 2020-09-04 2022-03-08 中移(成都)信息通信科技有限公司 Scene block processing method, device, electronic equipment and computer storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106647310A (en) * 2016-11-30 2017-05-10 芜湖美智空调设备有限公司 Household electrical appliance linkage starting method and household electrical appliance linkage starting system
CN107094107A (en) * 2017-05-09 2017-08-25 捷开通讯(深圳)有限公司 Intelligent domestic system and control method, mobile terminal, with store function device

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106155002B (en) * 2015-04-17 2020-05-22 乐金电子研发中心(上海)有限公司 Intelligent household system
CN105022281A (en) * 2015-07-29 2015-11-04 中国电子科技集团公司第十五研究所 Intelligent household control system based on virtual reality
US20170091999A1 (en) * 2015-09-25 2017-03-30 Rafael Blumenfeld Method and system for determining a configuration of a virtual robot in a virtual environment
CN106569408B (en) * 2015-10-09 2021-04-27 阿里巴巴集团控股有限公司 Processing system, method and device for personalized information of intelligent equipment
CN105388453B (en) * 2015-12-09 2017-10-17 小米科技有限责任公司 The method and device of positioning intelligent equipment
CN105827610B (en) * 2016-03-31 2020-01-31 联想(北京)有限公司 information processing method and electronic equipment
CN106127844A (en) * 2016-06-22 2016-11-16 民政部零研究所 Mobile phone users real-time, interactive access long-range 3D scene render exchange method
CN106095114A (en) * 2016-06-29 2016-11-09 宁波市电力设计院有限公司 Electric power industry based on VR technology expands engineering aid system and method for work thereof
CN106249607A (en) * 2016-07-28 2016-12-21 桂林电子科技大学 Virtual Intelligent household analogue system and method
CN106506287A (en) * 2016-09-29 2017-03-15 杭州鸿雁智能科技有限公司 Scenery control method and system based on ZigBee
CN106713082A (en) * 2016-11-16 2017-05-24 惠州Tcl移动通信有限公司 Virtual reality method for intelligent home management
CN106713118B (en) * 2016-11-29 2020-04-10 深圳信息职业技术学院 Remote control system and method based on intelligent routing and chat room mechanism

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106647310A (en) * 2016-11-30 2017-05-10 芜湖美智空调设备有限公司 Household electrical appliance linkage starting method and household electrical appliance linkage starting system
CN107094107A (en) * 2017-05-09 2017-08-25 捷开通讯(深圳)有限公司 Intelligent domestic system and control method, mobile terminal, with store function device

Also Published As

Publication number Publication date
CN111221259A (en) 2020-06-02
CN108037743A (en) 2018-05-15
CN111221259B (en) 2023-10-17

Similar Documents

Publication Publication Date Title
CN102984039B (en) The intelligent control method of intelligent gateway, intelligent domestic system and home appliance
CN102945029B (en) Intelligent gateway, smart home system and intelligent control method for home appliance equipment
CN104885406B (en) For the method and apparatus in domestic network system medium-long range control household equipment
US10848789B2 (en) Gateway device and system and method for use of same
CN105373165B (en) The power outlet wireless access point device of networking life and working space
US9258508B2 (en) IR pairing for RF4CE remote controls
CN203151535U (en) Intelligent gateway and intelligent household system
CN110262274A (en) Smart home device control display methods and system based on Internet of Things operating system
CN107203144B (en) Intelligent household control method and system and integrated controller
CN103246267B (en) There is remote control and the interface creating method thereof of three-dimensional user interface
CN105137841B (en) Remote control method and device based on intelligent socket
CN108919663A (en) A kind of management method and electronic equipment of smart home device
CN103438549A (en) System and method for centralized control of central air conditioner based on graphical interface
KR20150059081A (en) Method and apparatus for controlling a group of home devices in a home network system
CN106597860A (en) Household electrical appliance control system, and control device, construction method and control method thereof
CN104780470A (en) Household appliance control method and household appliance control terminal
KR101835176B1 (en) Responder device binding in a wireless system
CN105573132A (en) Control method for household gateway, intelligent household system and household electrical appliance
US20190213819A1 (en) Management device, control method, and program
CN105825656A (en) Electric appliance control method and device, air conditioner and remote control
CN112789828B (en) Intelligent adaptation of remote control functions in a local area network
CN108037743B (en) Scene sharing method, scene construction method, UE (user Equipment) equipment and household intelligent system
KR20190050485A (en) The UI(user interface) management server and UI management server control method
WO2019047288A1 (en) Scene control method, locking method, user equipment, and sofa control system
KR20150005800A (en) Remote controlled home appliances and mobile communication terminal and system for managing of home appliances therefor

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20191120

Address after: 213000 Jiangsu Province, Changzhou city Wujin District Henglin town Cui North Industrial Park Road.

Applicant after: Jiangsu Mu Lin intelligent electric appliance Co., Ltd

Address before: 213000 Jiangsu city of Changzhou Province Wang Zhen Bin Road No. 10

Applicant before: Wang Jianlong

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 213000 Bangshang Road, cuibei Industrial Park, Henglin Town, Wujin District, Changzhou City, Jiangsu Province

Patentee after: Jiangsu Mulin Zhizao Technology Co.,Ltd.

Address before: 213000 Bangshang Road, cuibei Industrial Park, Henglin Town, Wujin District, Changzhou City, Jiangsu Province

Patentee before: JIANGSU MULIN INTELLIGENT ELECTRIC CO.,LTD.

CP01 Change in the name or title of a patent holder