CN107290972B - Equipment control method and device - Google Patents

Info

Publication number
CN107290972B
Authority
CN
China
Prior art keywords
head
target device
control instruction
action type
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710542645.1A
Other languages
Chinese (zh)
Other versions
CN107290972A (en)
Inventor
梁效富
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics China R&D Center
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics China R&D Center and Samsung Electronics Co Ltd
Priority to CN201710542645.1A
Publication of CN107290972A
Application granted
Publication of CN107290972B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00: Systems controlled by a computer
    • G05B15/02: Systems controlled by a computer electric
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00: Programme-control systems
    • G05B19/02: Programme-control systems electric
    • G05B19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00: Program-control systems
    • G05B2219/20: Pc systems
    • G05B2219/26: Pc applications
    • G05B2219/2642: Domotique, domestic, home control, automation, smart house
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The application discloses a device control method and apparatus. One embodiment of the method comprises: acquiring first motion posture information of a head wearing a head-mounted device; analyzing the acquired first motion posture information and determining a head action type matched with the first motion posture information; determining a control instruction for a target device based on the determined head action type; and sending the control instruction to the target device, so that the target device performs a corresponding operation in response to receiving the control instruction. This embodiment enriches the means of device control.

Description

Equipment control method and device
Technical Field
The present application relates to the field of computer technologies, and in particular, to a device control method and apparatus.
Background
With the development of immersive devices such as Virtual Reality (VR) devices, Augmented Reality (AR) devices, and Mixed Reality (MR) devices, and with the popularization of smart home systems based on Internet of Things technology, how to control the devices in a smart home system while using an immersive device is a problem currently faced.
In the prior art, a control instruction for a target device is usually generated from voice uttered by the user or from input of an external controller such as a handheld controller. However, this approach is complex to implement and inefficient for control.
Disclosure of Invention
The present application aims to propose an improved device control method and apparatus to solve the technical problems mentioned in the background section above.
In a first aspect, an embodiment of the present application provides an apparatus control method, where the method includes: acquiring first motion posture information of a head wearing the head-mounted equipment; analyzing the acquired first motion attitude information, and determining a head action type matched with the first motion attitude information; determining a control instruction for the target device based on the determined head action type; and sending a control instruction to the target equipment so that the target equipment can perform corresponding operation in response to the received control instruction.
In this embodiment, the target device is determined via the following steps: presenting an image of a controlled environment including a controlled device, the image further including tagging information for tagging the controlled device; determining the marked controlled equipment in the image according to the second motion posture information of the head wearing the head-mounted equipment; in response to receiving an instruction to determine the marked controlled device as the target device, determining the marked controlled device as the target device.
In the embodiment, the head-mounted device and the target device are in communication connection with the control device; and determining control instructions for the target device based on the determined head action type, including: sending a message including the determined head action type and an identification of the target device to the control device, determining, by the control device, a control instruction for the target device based on the determined head action type; and sending a control instruction to the target device, so that the target device performs corresponding operation in response to receiving the control instruction, including: and sending a control instruction to the target equipment through the control equipment, so that the target equipment responds to the received control instruction to perform corresponding operation.
In this embodiment, parsing the acquired first motion posture information and determining a head action type matching the first motion posture information includes: and determining the head action type matched with the first motion attitude information in response to the analyzed fact that the acceleration of the head-mounted device in the preset time length is smaller than the preset threshold value.
In this embodiment, determining the head motion type matching the first motion gesture information includes: if the first motion posture information comprises information that the head-mounted device completes clockwise acceleration within a first preset time length, determining that the head action type is clockwise rotation action of the head; and if the first motion posture information comprises information that the head-mounted device completes anticlockwise acceleration within a second preset time length, determining that the head action type is head anticlockwise rotation action.
In this embodiment, the types of operations that the target device can perform include turning on or off; and determining control instructions for the target device based on the determined head action type, including: generating a control instruction for instructing the target device to turn on in response to the determined head action type being a head clockwise turning action, and generating a control instruction for instructing the target device to turn off in response to the determined head action type being a head counterclockwise turning action; or generating a control instruction for instructing the target device to turn off in response to the determined type of the head action being the head clockwise turning action, and generating a control instruction for instructing the target device to turn on in response to the determined type of the head action being the head counterclockwise turning action.
In the present embodiment, the types of operations that the target apparatus can perform include adjustment of the control amount; and determining control instructions for the target device based on the determined head action type, including: generating a control instruction for instructing the target device to increase the control amount in response to the determined head action type being a head clockwise turning action, and generating a control instruction for instructing the target device to decrease the control amount in response to the determined head action type being a head counterclockwise turning action; or generating a control instruction for instructing the target apparatus to decrease the control amount in response to the determined head action type being the head clockwise turning action, and generating a control instruction for instructing the target apparatus to increase the control amount in response to the determined head action type being the head counterclockwise turning action.
In this embodiment, determining the head motion type matching the first motion gesture information includes: and if the first motion posture information comprises information that the head-mounted device completes downward acceleration and/or upward acceleration within a third preset time period, determining that the head action type is nodding head action.
In this embodiment, the types of operations that the target device can perform include turning on or off; and determining control instructions for the target device based on the determined head action type, including: and in response to the fact that the determined head action type is nodding head action, if the target device is determined to be in the on state, generating a control instruction for indicating the target device to be off, and if the target device is determined to be in the off state, generating a control instruction for indicating the target device to be on.
In a second aspect, an embodiment of the present application provides a device control apparatus, including: an acquisition unit configured to acquire first motion posture information of a head wearing the head-mounted device; a parsing unit configured to parse the acquired first motion posture information and determine a head action type matched with the first motion posture information; a determination unit configured to determine a control instruction for the target device based on the determined head action type; and a sending unit configured to send the control instruction to the target device, so that the target device performs a corresponding operation in response to receiving the control instruction.
In this embodiment, the device further includes a target device determining unit, where the target device determining unit is configured to: presenting an image of a controlled environment including a controlled device, the image further including tagging information for tagging the controlled device; determining the marked controlled equipment in the image according to the second motion posture information of the head wearing the head-mounted equipment; in response to receiving an instruction to determine the marked controlled device as the target device, determining the marked controlled device as the target device.
In the embodiment, the head-mounted device and the target device are in communication connection with the control device; and the determining unit is further configured to: sending a message including the determined head action type and an identification of the target device to the control device, determining, by the control device, a control instruction for the target device based on the determined head action type; and the sending unit is further configured to: and sending a control instruction to the target equipment through the control equipment, so that the target equipment responds to the received control instruction to perform corresponding operation.
In this embodiment, the parsing unit is further configured to: and determining the head action type matched with the first motion attitude information in response to the analyzed fact that the acceleration of the head-mounted device in the preset time length is smaller than the preset threshold value.
In this embodiment, the parsing unit is further configured to: if the first motion posture information comprises information that the head-mounted device completes clockwise acceleration within a first preset time length, determining that the head action type is clockwise rotation action of the head; and if the first motion posture information comprises information that the head-mounted device completes anticlockwise acceleration within a second preset time length, determining that the head action type is head anticlockwise rotation action.
In this embodiment, the types of operations that the target device can perform include turning on or off; and the determining unit is further configured to: generating a control instruction for instructing the target device to turn on in response to the determined head action type being a head clockwise turning action, and generating a control instruction for instructing the target device to turn off in response to the determined head action type being a head counterclockwise turning action; or generating a control instruction for instructing the target device to turn off in response to the determined type of the head action being the head clockwise turning action, and generating a control instruction for instructing the target device to turn on in response to the determined type of the head action being the head counterclockwise turning action.
In the present embodiment, the types of operations that the target apparatus can perform include adjustment of the control amount; and the determining unit is further configured to: generating a control instruction for instructing the target device to increase the control amount in response to the determined head action type being a head clockwise turning action, and generating a control instruction for instructing the target device to decrease the control amount in response to the determined head action type being a head counterclockwise turning action; or generating a control instruction for instructing the target apparatus to decrease the control amount in response to the determined head action type being the head clockwise turning action, and generating a control instruction for instructing the target apparatus to increase the control amount in response to the determined head action type being the head counterclockwise turning action.
In this embodiment, the parsing unit is further configured to: and if the first motion posture information comprises information that the head-mounted device completes downward acceleration and/or upward acceleration within a third preset time period, determining that the head action type is nodding head action.
In this embodiment, the types of operations that the target device can perform include turning on or off; and the determining unit is further configured to: and in response to the fact that the determined head action type is nodding head action, if the target device is determined to be in the on state, generating a control instruction for indicating the target device to be off, and if the target device is determined to be in the off state, generating a control instruction for indicating the target device to be on.
In a third aspect, an embodiment of the present application provides an apparatus control system, including: the head-mounted device is in communication connection with the target device; the head-mounted device is used for acquiring first motion attitude information of a head wearing the head-mounted device, analyzing the acquired first motion attitude information, determining a head action type matched with the first motion attitude information, determining a control instruction aiming at the target device based on the determined head action type, and sending the control instruction to the target device; and the target equipment is used for responding to the received control instruction to carry out corresponding operation.
In a fourth aspect, an embodiment of the present application provides another device control system, including: the system comprises head-mounted equipment, control equipment and target equipment, wherein the control equipment is in communication connection with the head-mounted equipment and the target equipment; the head-mounted device is used for acquiring first motion posture information of a head wearing the head-mounted device, analyzing the acquired first motion posture information, determining a head action type matched with the first motion posture information, and sending a message comprising the determined head action type and an identifier of the target device to the control device; a control device for determining a control instruction for the target device based on the determined head action type in response to receiving the message, and transmitting the control instruction to the target device; and the target equipment is used for responding to the received control instruction to carry out corresponding operation.
In a fifth aspect, an embodiment of the present application provides an apparatus, including: one or more processors; storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to carry out the method as described above in relation to the first aspect.
In a sixth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the method as described above in the first aspect.
According to the device control method and apparatus provided by the embodiments of the application, first motion posture information of a head wearing the head-mounted device is acquired; the acquired first motion posture information is analyzed and a head action type matched with it is determined; finally, a control instruction for the target device is determined based on the determined head action type and sent to the target device, so that the target device performs a corresponding operation in response to receiving the control instruction. The means of device control are thereby enriched.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a schematic flow chart diagram illustrating one embodiment of a device control method according to the present application;
FIG. 3 is a schematic diagram of an application scenario of a device control method according to the present application;
FIG. 4 is a schematic flow chart diagram of yet another embodiment of a device control method according to the present application;
FIG. 5 is an exemplary block diagram of one embodiment of a device control apparatus according to the present application;
FIG. 6 is a block diagram of a computer system suitable for use in implementing the apparatus of an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the device control method or device control apparatus of the present application may be applied.
As shown in fig. 1, system architecture 100 may include a head-mounted device 101, a target device 102, and a network 104. The network 104 provides the medium for a communication link between the head-mounted device 101 and the target device 102, and may include various connection types, such as wired links, wireless communication links, or fiber optic cables.
The head-mounted device 101 may be any of various immersive devices, such as a VR headset or AR glasses. It may integrate hardware such as a display device, a processor, and a memory, or it may be combined with a mobile terminal so that some of its functions are implemented by hardware in the mobile terminal. The head-mounted device 101 may acquire first motion posture information of the head on which it is worn, analyze the acquired first motion posture information, determine a head action type matched with the first motion posture information, then determine a control instruction for the target device based on the determined head action type, and finally send the control instruction to the target device 102, so that the target device 102 performs a corresponding operation in response to receiving the control instruction.
The target device 102 may be various intelligent devices, or devices that can be controlled using an intelligent switch, such as audio-video devices, lighting devices, windows and doors, air conditioners, security devices, various household appliances. The target device 102 may perform a corresponding operation in response to receiving the control instruction.
In addition, the system architecture 100 may further include a control device 103, where the control device 103 may be a centralized controller for managing all devices in the smart home scene, or may be a cloud server of the smart home platform. The control device 103 may determine control instructions for the target device 102 and send the control instructions to the target device 102 in response to receiving a message sent by the head mounted device 101 that includes the determined head action type and the identification of the target device 102.
It should be noted that the device control method provided in the embodiments of the present application is generally executed by the head-mounted device 101, and accordingly the device control apparatus is generally disposed in the head-mounted device 101. In addition, some steps of the device control method provided in the embodiments of the present application may also be executed by the control device 103; for example, the control device 103 may determine a control instruction for the target device based on the determined head action type and send the control instruction to the target device, so that the target device performs a corresponding operation in response to receiving it. Accordingly, some units of the device control apparatus may also be provided in the control device 103.
It should be understood that the number of head mounted devices, target devices, control devices, and networks in fig. 1 is merely illustrative. There may be any number of head mounted devices, target devices, control devices, and networks, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a device control method according to the present application is shown. The equipment control method comprises the following steps:
step 201, obtaining first motion posture information of a head wearing the head-mounted device.
In the present embodiment, an electronic device on which the device control method runs (for example, the head-mounted device 101 shown in fig. 1) may detect the motion state of the head wearing the head-mounted device through a sensor mounted on it, so as to acquire first motion posture information of the head. The sensor may be any of various sensors capable of capturing head motion information. As an example, the electronic device may acquire the motion acceleration and motion direction of the head through an accelerometer. Gyroscope data suffers from a serious drift problem, drifting even when the device is static, so compared with using a gyroscope to infer left tilts, right tilts, and back-and-forth swings of the head from tilt angles and the like, accelerometer data is more stable and effective, and the computation is simple and reliable. Head motion posture information could also be acquired through peripherals such as a camera, but that approach is more complex to implement and less accurate than acquiring it through an accelerometer.
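For illustration only, the following Python sketch shows one way this acquisition step could look. The read_accelerometer() helper, the sampling rate, and the assumption that readings are gravity-compensated linear accelerations are all hypothetical; the patent does not specify a sensor API.

```python
import time
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class MotionPosture:
    """First motion posture information: timestamped acceleration samples (t, ax, ay, az)."""
    samples: List[Tuple[float, float, float, float]] = field(default_factory=list)


def read_accelerometer() -> Tuple[float, float, float]:
    """Hypothetical driver call returning the current gravity-compensated
    acceleration (ax, ay, az) of the head-mounted device, in m/s^2."""
    return (0.0, 0.0, 0.0)  # placeholder: replace with the actual sensor driver


def acquire_first_motion_posture(duration_s: float = 1.0, rate_hz: int = 50) -> MotionPosture:
    """Sample the accelerometer for duration_s seconds at rate_hz Hz."""
    posture = MotionPosture()
    period = 1.0 / rate_hz
    start = time.monotonic()
    while time.monotonic() - start < duration_s:
        ax, ay, az = read_accelerometer()
        posture.samples.append((time.monotonic() - start, ax, ay, az))
        time.sleep(period)
    return posture
```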
Step 202, analyzing the acquired first motion attitude information, and determining a head action type matched with the first motion attitude information.
In this embodiment, the electronic device may parse the first motion posture information obtained in step 201 and determine the head action type matched with it. A preset algorithm may evaluate the motion posture information, judging whether it corresponds to any preset head action type and, if so, which one. For example, the determination may combine whether the head acceleration falls within a predetermined range with the motion direction of the head. Head action types include, but are not limited to, turning actions of the head in various directions and nodding actions of the head.
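Continuing the sketch above, a minimal classifier of this kind might look as follows. The axis convention (lateral acceleration for turns, vertical acceleration for nods) and the thresholds are assumptions made for illustration, not values from the patent.

```python
from enum import Enum


class HeadAction(Enum):
    CLOCKWISE_TURN = "head clockwise turning action"
    COUNTERCLOCKWISE_TURN = "head counterclockwise turning action"
    NOD = "nodding action"
    NONE = "no recognized action"


def classify_head_action(posture: MotionPosture,
                         turn_threshold: float = 2.0,
                         nod_threshold: float = 2.0) -> HeadAction:
    """Map first motion posture information to a head action type.
    Assumed convention: ax is the lateral (turning) axis, az is the vertical (nodding) axis."""
    if not posture.samples:
        return HeadAction.NONE
    peak_turn = max((ax for _, ax, _, _ in posture.samples), key=abs, default=0.0)
    peak_nod = max((az for _, _, _, az in posture.samples), key=abs, default=0.0)
    if abs(peak_nod) >= nod_threshold and abs(peak_nod) >= abs(peak_turn):
        return HeadAction.NOD
    if peak_turn >= turn_threshold:
        return HeadAction.CLOCKWISE_TURN
    if peak_turn <= -turn_threshold:
        return HeadAction.COUNTERCLOCKWISE_TURN
    return HeadAction.NONE
```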
Based on the determined head action type, a control instruction for the target device is determined, step 203.
In this embodiment, the electronic device may determine a control instruction for the target device based on the head action type determined in step 202. The electronic device may store a preset correspondence between head action types and control instructions for each target device, and once the head action type is determined, the control instruction for the target device can be determined from this correspondence. Common control rules may also be established, for example using one head action to change an option and another to confirm it, using a head action to switch the target device on or off, or using a head action to adjust a control quantity of the target device. The control quantity may be a volume, the brightness of a light, the temperature of an air conditioner, the degree to which a curtain is opened or closed, an operating mode of the device, and so on.
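The correspondence described here can be pictured as a simple per-device lookup table. The sketch below reuses the HeadAction enum from the previous sketch; the device identifiers and instruction names are illustrative assumptions.

```python
from typing import Optional

# Assumed per-device correspondence between head action types and control instructions.
CONTROL_RULES = {
    "living_room_light": {
        HeadAction.CLOCKWISE_TURN: "turn_on",
        HeadAction.COUNTERCLOCKWISE_TURN: "turn_off",
    },
    "air_conditioner": {
        HeadAction.CLOCKWISE_TURN: "temperature_up",
        HeadAction.COUNTERCLOCKWISE_TURN: "temperature_down",
        HeadAction.NOD: "toggle_power",
    },
}


def determine_control_instruction(target_device_id: str, action: HeadAction) -> Optional[str]:
    """Determine the control instruction for the target device based on the determined head action type."""
    return CONTROL_RULES.get(target_device_id, {}).get(action)
```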
And step 204, sending a control instruction to the target device, so that the target device can perform corresponding operation in response to the received control instruction.
In this embodiment, the electronic device may send the control instruction determined in step 203 directly to the target device through Wi-Fi (Wireless Fidelity), ZigBee, Z-Wave (a radio-frequency-based, low-cost, low-power, highly reliable short-range wireless networking technology), and the like. The control instruction determined in step 203 may also be sent to the target device through a control device, for example a home routing terminal of the smart home platform, or relayed through the cloud. The target device performs a corresponding operation in response to receiving the control instruction; the operation may be switching on or off, or an adjustment of a control quantity. After performing the operation, the target device may also send feedback information to the control device or the electronic device, so that the electronic device can display the current state of the target device in real time and achieve interactive control.
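As a transport-agnostic illustration, the sketch below delivers an instruction to a device on the local network over HTTP; the /control endpoint, the JSON payload shape, and the device address are assumptions, since the patent only names link technologies (Wi-Fi, ZigBee, Z-Wave) and not a message format.

```python
import json
import urllib.request


def send_control_instruction(device_address: str, instruction: str) -> bool:
    """Send a control instruction to the target device over the local network.
    Returns True if the device acknowledged the instruction."""
    payload = json.dumps({"instruction": instruction}).encode("utf-8")
    request = urllib.request.Request(
        f"http://{device_address}/control",  # hypothetical endpoint on the target device
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(request, timeout=2.0) as response:
            return response.status == 200
    except OSError:
        return False  # device unreachable or request failed
```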
In the method provided by the above embodiment of the application, first motion posture information of a head wearing the head-mounted device is acquired; the acquired first motion posture information is then analyzed and a head action type matched with it is determined; finally, a control instruction for the target device is determined based on the determined head action type and sent to the target device, so that the target device performs a corresponding operation in response to receiving the control instruction. The means of device control are thereby enriched.
In some optional implementations of this embodiment, the target device may be determined via: presenting an image of a controlled environment including a controlled device, the image further including tagging information for tagging the controlled device; determining the marked controlled equipment in the image according to the second motion posture information of the head wearing the head-mounted equipment; in response to receiving an instruction to determine the marked controlled device as the target device, determining the marked controlled device as the target device.
In this implementation, the controlled device in the controlled environment may be a door or window, a household appliance, or the like in a smart home scene. The image may be generated in advance or may be a real-time image: when the head-mounted device is a VR headset, the image may be an animation created in advance from the real controlled environment; when the head-mounted device is an AR or MR device, the image may be formed by superimposing additional information on the real controlled environment. The marking information may be a light spot superimposed at the display position of a controlled device, an aperture surrounding a controlled device, or the like. The controlled device that the head wearing the head-mounted device is facing can be determined from the second motion posture information; this may be the controlled device the user's line of sight falls on, and the facing device is then marked. A head action type may also be determined from the second motion posture information, such as a nodding action, a head clockwise turning action, or a head counterclockwise turning action: when a clockwise or counterclockwise turning action is determined, the marked controlled device may be changed, and when a nodding action is determined, the instruction for determining the marked controlled device as the target device may be generated. Furthermore, the target device may also be determined from user input received from another external device, such as a mobile terminal or a control pad; compared with determining the target device through an external device, however, the present approach gives the user smoother operation and higher efficiency.
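A minimal sketch of this marking flow follows, reusing the HeadAction enum from the earlier sketch. The device list, and the choice that turning moves the mark while nodding confirms it, follow the behaviour described above; how the second motion posture information is turned into actions is assumed to reuse the same classifier.

```python
class TargetDeviceSelector:
    """Sketch of target device selection: head turns move the marking between
    controlled devices shown in the image, a nod confirms the marked device."""

    def __init__(self, controlled_devices):
        # e.g. ["display", "illumination lamp", "window", "door"] in the controlled environment
        self.controlled_devices = list(controlled_devices)
        self.marked_index = 0  # which controlled device the marking information currently highlights

    def apply_head_action(self, action: HeadAction):
        """Update or confirm the marked controlled device based on a head action derived
        from the second motion posture information. Returns the target device once it
        has been confirmed, otherwise None."""
        if action is HeadAction.CLOCKWISE_TURN:
            self.marked_index = (self.marked_index + 1) % len(self.controlled_devices)
        elif action is HeadAction.COUNTERCLOCKWISE_TURN:
            self.marked_index = (self.marked_index - 1) % len(self.controlled_devices)
        elif action is HeadAction.NOD:
            return self.controlled_devices[self.marked_index]
        return None
```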
In some optional implementations of this embodiment, after the target device is determined, the electronic device may further prompt the correspondence between head action types and control instructions, for example by playing voice or outputting an image.
In some optional implementations of the embodiment, parsing the acquired first motion posture information and determining the head action type matched with it includes: determining the head action type matched with the first motion posture information in response to parsing out that the acceleration of the head-mounted device within a preset time length is smaller than a preset threshold value. The preset threshold may be set according to actual needs; as an example, the gravitational acceleration may be used as the preset threshold. Because the data collected by the sensor may drift, this preset threshold prevents the head-mounted device from misjudging the head action type and further improves the accuracy of device control.
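A sketch of this guard, continuing the earlier code: the head action type is only determined when the parsed acceleration within the preset duration stays below the threshold (the gravitational acceleration in the example above). Computing the peak acceleration magnitude from the samples is an illustrative assumption.

```python
import math

G = 9.81  # gravitational acceleration, used here as the example preset threshold (m/s^2)


def action_if_within_threshold(posture: MotionPosture, threshold: float = G) -> HeadAction:
    """Determine a head action type only when the acceleration of the head-mounted
    device within the preset duration is smaller than the preset threshold."""
    peak = max(
        (math.sqrt(ax * ax + ay * ay + az * az) for _, ax, ay, az in posture.samples),
        default=0.0,
    )
    if peak >= threshold:
        return HeadAction.NONE  # acceleration too large: do not recognize a head action
    return classify_head_action(posture)
```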
In some optional implementations of the embodiment, determining the head action type matched with the first motion posture information includes: if the first motion posture information includes information that the head-mounted device completed a clockwise acceleration within a first preset time length, determining that the head action type is a head clockwise turning action; and if the first motion posture information includes information that the head-mounted device completed a counterclockwise acceleration within a second preset time length, determining that the head action type is a head counterclockwise turning action. The clockwise turning action is the head motion as viewed from above the head; viewed from the front it may be regarded as a head right-turning action, and similarly the counterclockwise turning action may be regarded as a head left-turning action. Because the user usually returns the head to center after turning it, the head-mounted device may complete a counterclockwise acceleration within a predetermined time period after completing a clockwise acceleration; in that case the head action type is still determined to be a head clockwise turning action, the subsequent counterclockwise acceleration being ignored rather than counted as both a clockwise and a counterclockwise turning action. Similarly, when the head-mounted device completes a clockwise acceleration within the predetermined time period after completing a counterclockwise acceleration, the action is determined to be a head counterclockwise turning action.
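The direction logic, including ignoring the return-to-center acceleration, can be sketched as follows. The sign convention (positive lateral acceleration for a clockwise turn) and the numeric values are assumptions.

```python
from typing import List, Tuple


def classify_turn(lateral_samples: List[Tuple[float, float]],
                  preset_duration_s: float = 1.0,
                  turn_threshold: float = 2.0) -> HeadAction:
    """lateral_samples: [(t_seconds, a_lateral), ...] relative to the start of the gesture.
    The first above-threshold lateral acceleration completed within the preset duration
    decides the direction; the opposite-direction acceleration that follows when the head
    returns to center is never examined, so it is ignored."""
    for t, a in lateral_samples:
        if t > preset_duration_s:
            break
        if a >= turn_threshold:
            return HeadAction.CLOCKWISE_TURN
        if a <= -turn_threshold:
            return HeadAction.COUNTERCLOCKWISE_TURN
    return HeadAction.NONE
```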
In some optional implementations of this embodiment, the types of operations that the target device may perform include turning on or off; and determining control instructions for the target device based on the determined head action type, including: generating a control instruction for instructing the target device to turn on in response to the determined head action type being a head clockwise turning action, and generating a control instruction for instructing the target device to turn off in response to the determined head action type being a head counterclockwise turning action; or generating a control instruction for instructing the target device to turn off in response to the determined type of the head action being the head clockwise turning action, and generating a control instruction for instructing the target device to turn on in response to the determined type of the head action being the head counterclockwise turning action.
In some optional implementations of the embodiment, the type of operation that the target device may perform includes an adjustment to a control quantity; and determining control instructions for the target device based on the determined head action type, including: generating a control instruction for instructing the target device to increase the control amount in response to the determined head action type being a head clockwise turning action, and generating a control instruction for instructing the target device to decrease the control amount in response to the determined head action type being a head counterclockwise turning action; or generating a control instruction for instructing the target apparatus to decrease the control amount in response to the determined head action type being the head clockwise turning action, and generating a control instruction for instructing the target apparatus to increase the control amount in response to the determined head action type being the head counterclockwise turning action. When the control amount is increased or decreased, the control amount may be continuously increased or decreased during the duration of the head movement, or the control amount may be increased or decreased by a preset gradient after one complete head movement is acquired. For example, if the target device is an air conditioner, the controlled variable is temperature, the preset gradient is 1 degree celsius, and the determined head movement type is a clockwise head rotation movement, a control instruction for instructing the air conditioner to increase the temperature by 1 degree celsius is generated.
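For the control-quantity case, the air-conditioner example above reduces to a small step function; the mapping of clockwise to increase follows the first variant described in the paragraph, and the values are those of the example.

```python
def adjust_temperature(current_c: float, action: HeadAction, gradient_c: float = 1.0) -> float:
    """Air-conditioner example: one complete clockwise head turn raises the set
    temperature by the preset gradient, a counterclockwise turn lowers it."""
    if action is HeadAction.CLOCKWISE_TURN:
        return current_c + gradient_c
    if action is HeadAction.COUNTERCLOCKWISE_TURN:
        return current_c - gradient_c
    return current_c
```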
In some optional implementations of the embodiment, determining the head action type matching the first motion gesture information includes: and if the first motion posture information comprises information that the head-mounted device completes downward acceleration and/or upward acceleration within a third preset time period, determining that the head action type is nodding head action. The preset time can be set according to actual needs, and can be 1-3 seconds, for example.
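A nodding detector matching this description might look like the sketch below; the sign convention (negative vertical acceleration for the downward phase) and the threshold are assumptions.

```python
from typing import List, Tuple


def is_nod(vertical_samples: List[Tuple[float, float]],
           preset_duration_s: float = 2.0,
           nod_threshold: float = 2.0) -> bool:
    """vertical_samples: [(t_seconds, a_vertical), ...]. A downward and/or upward
    acceleration completed within the third preset duration (1 to 3 seconds in the
    text) is taken as a nodding action."""
    within = [(t, a) for t, a in vertical_samples if t <= preset_duration_s]
    went_down = any(a <= -nod_threshold for _, a in within)
    went_up = any(a >= nod_threshold for _, a in within)
    return went_down or went_up
```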
In some optional implementations of this embodiment, the types of operations that the target device may perform include turning on or off; and determining control instructions for the target device based on the determined head action type, including: and in response to the fact that the determined head action type is nodding head action, if the target device is determined to be in the on state, generating a control instruction for indicating the target device to be off, and if the target device is determined to be in the off state, generating a control instruction for indicating the target device to be on.
With continued reference to fig. 3, fig. 3 is a schematic diagram of an application scenario of the device control method according to the present embodiment. In the application scenario of fig. 3, the ring 306 may be marking information indicating the target device; here the target device is the window 304, though it could similarly be the display 302, the illumination lamp 303, or the door 305. The user's head wears the head-mounted device 301, and the target device can be controlled through head actions such as turning or nodding. For example, knowing that a nodding action generates an instruction to close the window 304, the user can close the window 304 with a nod after receiving a weather forecast that it will rain today.
Referring to fig. 4, fig. 4 is a schematic flowchart illustrating a device control method according to still another embodiment of the present embodiment.
In fig. 4, the flow 400 of the device control method includes the following steps:
step 401, obtaining first motion posture information of a head wearing the head-mounted device.
In the present embodiment, an electronic device (e.g., the head mounted device 101 shown in fig. 1) on which the device control method operates may acquire first motion posture information of a head wearing the head mounted device.
Step 402, analyzing the acquired first motion attitude information, and determining a head action type matched with the first motion attitude information.
In this embodiment, the electronic device may parse the first motion posture information obtained in step 401, either by itself or through the control device, and determine the head action type matched with the first motion posture information.
Step 403, sending a message including the determined head action type and the identification of the target device to the control device, determining, by the control device, a control instruction for the target device based on the determined head action type.
In this embodiment, the electronic device may send a message including the head action type determined in step 402 and the identifier of the target device to the control device, and the control device determines the control instruction for the target device based on the determined head action type. The control device may be a server in the cloud, or a hub that controls multiple devices, for example the central device of a smart home Internet of Things platform. The control device may connect different sensors in controlled environments such as a home or an office, so that the user can be notified of what is happening there and can control the controlled devices in those environments with various terminals, for example the head-mounted device.
The control device queries a storage device, according to the identifier, for the information associated with the target device. The storage device may store a preset correspondence between head action types and control instructions for various target devices, and may also store information such as each target device's operating state and the types of operation it can perform. From the determined head action type and the queried information, the control device can determine the control instruction for the target device. For example, the head-mounted device sends the control device a message including a nodding action and the identifier of a smart bulb; the control device finds by query that the smart bulb is currently on and that the operations it can perform include turning on, turning off, and toggling between on and off, where turning on and turning off correspond to the head clockwise and counterclockwise turning actions respectively and toggling corresponds to a nodding action; it therefore determines that the control instruction for the smart bulb is a turn-off instruction.
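The smart-bulb decision above can be sketched as a lookup against the stored information; the registry layout and names are assumptions, and HeadAction is the enum from the earlier sketches.

```python
from typing import Optional

# Hypothetical stored information associating each target device with its state and rules.
DEVICE_REGISTRY = {
    "smart_bulb_01": {
        "state": "on",
        "rules": {
            HeadAction.CLOCKWISE_TURN: "turn_on",
            HeadAction.COUNTERCLOCKWISE_TURN: "turn_off",
            HeadAction.NOD: "toggle",
        },
    },
}


def control_device_handle_message(device_id: str, head_action: HeadAction) -> Optional[str]:
    """Determine the control instruction on the control device from the received message.
    A nod on a bulb that is currently on yields a turn-off instruction."""
    entry = DEVICE_REGISTRY.get(device_id)
    if entry is None:
        return None
    instruction = entry["rules"].get(head_action)
    if instruction == "toggle":
        instruction = "turn_off" if entry["state"] == "on" else "turn_on"
    return instruction
```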
Step 404, sending a control instruction to the target device by the control device, so that the target device performs a corresponding operation in response to receiving the control instruction.
In this embodiment, the electronic device may send the control instruction determined in step 403 to the target device through the control device, so that the target device performs a corresponding operation in response to receiving the control instruction. The target device may further send feedback information to the control device after performing corresponding operations, so that the control device may update the running state of the target device recorded in the storage device for use in controlling the target device next time.
For details and technical effects of step 401 and step 402, reference may be made to the descriptions in step 201 and step 202, which are not described herein again.
As can be seen from fig. 4, compared with the embodiment corresponding to fig. 2, part of the steps of the flow 400 of the device control method in the present embodiment are performed by the control device. The scheme described in this embodiment therefore further enriches the means of device control; moreover, when there are multiple devices, the head-mounted device does not need to establish a direct communication connection with each target device, which improves device control efficiency.
With further reference to fig. 5, as an implementation of the above method, the present application provides an embodiment of a device control apparatus, which corresponds to the method embodiment shown in fig. 2, and which can be applied to various electronic devices.
As shown in fig. 5, the device control apparatus 500 of the present embodiment includes an acquisition unit 501, a parsing unit 502, a determination unit 503, and a sending unit 504. The acquisition unit 501 is configured to acquire first motion posture information of a head wearing the head-mounted device; the parsing unit 502 is configured to parse the acquired first motion posture information and determine a head action type matched with the first motion posture information; the determination unit 503 is configured to determine a control instruction for the target device based on the determined head action type; and the sending unit 504 is configured to send the control instruction to the target device, so that the target device performs a corresponding operation in response to receiving the control instruction.
In this embodiment, the specific processing of the obtaining unit 501, the analyzing unit 502, the determining unit 503, and the sending unit 504 may refer to the detailed description of step 201, step 202, step 203, and step 204 in the embodiment of fig. 2, and is not described herein again.
In some optional implementations of this embodiment, the apparatus further includes a target device determining unit, which is configured to: present an image of a controlled environment including controlled devices, the image further including marking information for marking a controlled device; determine the marked controlled device in the image according to second motion posture information of the head wearing the head-mounted device; and, in response to receiving an instruction to determine the marked controlled device as the target device, determine the marked controlled device as the target device.
In some optional implementations of this embodiment, the head-mounted device, the target device, and the control device are communicatively connected; the determination unit 503 is further configured to send a message including the determined head action type and an identifier of the target device to the control device, and the control device determines the control instruction for the target device based on the determined head action type; and the sending unit 504 is further configured to send the control instruction to the target device through the control device, so that the target device performs a corresponding operation in response to receiving the control instruction.
In some optional implementations of this embodiment, the parsing unit 502 is further configured to: and determining the head action type matched with the first motion attitude information in response to the analyzed fact that the acceleration of the head-mounted device in the preset time length is smaller than the preset threshold value.
In some optional implementations of this embodiment, the parsing unit 502 is further configured to: if the first motion posture information comprises information that the head-mounted device completes clockwise acceleration within a first preset time length, determining that the head action type is clockwise rotation action of the head; and if the first motion posture information comprises information that the head-mounted device completes anticlockwise acceleration within a second preset time length, determining that the head action type is head anticlockwise rotation action.
In some optional implementations of this embodiment, the types of operations that the target device may perform include turning on or off; and the determining unit 503 is further configured to: generating a control instruction for instructing the target device to turn on in response to the determined head action type being a head clockwise turning action, and generating a control instruction for instructing the target device to turn off in response to the determined head action type being a head counterclockwise turning action; or generating a control instruction for instructing the target device to turn off in response to the determined type of the head action being the head clockwise turning action, and generating a control instruction for instructing the target device to turn on in response to the determined type of the head action being the head counterclockwise turning action.
In some optional implementations of the embodiment, the type of operation that the target device may perform includes an adjustment to a control quantity; and the determining unit 503 is further configured to: generating a control instruction for instructing the target device to increase the control amount in response to the determined head action type being a head clockwise turning action, and generating a control instruction for instructing the target device to decrease the control amount in response to the determined head action type being a head counterclockwise turning action; or generating a control instruction for instructing the target apparatus to decrease the control amount in response to the determined head action type being the head clockwise turning action, and generating a control instruction for instructing the target apparatus to increase the control amount in response to the determined head action type being the head counterclockwise turning action.
In some optional implementations of this embodiment, the parsing unit 502 is further configured to: and if the first motion posture information comprises information that the head-mounted device completes downward acceleration and/or upward acceleration within a third preset time period, determining that the head action type is nodding head action.
In some optional implementations of this embodiment, the types of operations that the target device may perform include turning on or off; and the determining unit 503 is further configured to: and in response to the fact that the determined head action type is nodding head action, if the target device is determined to be in the on state, generating a control instruction for indicating the target device to be off, and if the target device is determined to be in the off state, generating a control instruction for indicating the target device to be on.
As can be seen from fig. 5, in the present embodiment, the device control apparatus 500 enriches the means for device control by acquiring the first motion posture information of the head wearing the head-mounted device, then parsing the acquired first motion posture information, determining the head action type matching with the first motion posture information, finally determining the control instruction for the target device based on the determined head action type, and sending the control instruction to the target device, so that the target device performs corresponding operation in response to the received control instruction.
Referring now to FIG. 6, shown is a block diagram of a computer system 600 suitable for use in implementing the apparatus of an embodiment of the present application. The apparatus shown in fig. 6 is only an example, and should not bring any limitation to the function and use range of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU)601 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM)602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the system 600 are also stored. The CPU 601, ROM 602, and RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a cathode ray tube (CRT) or liquid crystal display (LCD), and a speaker; a storage portion 608 including a hard disk and the like; and a communication portion 609 including a network interface card such as a LAN card or a modem. The communication portion 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as needed, so that a computer program read from it is installed into the storage portion 608 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601. It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a unit, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or by hardware. The described units may also be provided in a processor, which may, for example, be described as: a processor comprising an acquisition unit, a parsing unit, a determining unit, and a sending unit. In some cases, the names of these units do not constitute a limitation of the units themselves; for example, the acquisition unit may also be described as "a unit for acquiring first motion posture information of a head wearing the head-mounted device".
As another aspect, the present application also provides a non-volatile computer storage medium, which may be the non-volatile computer storage medium included in the apparatus of the above-described embodiments, or may be a non-volatile computer storage medium that exists separately and is not assembled into a server. The non-volatile computer storage medium stores one or more programs that, when executed by a device, cause the device to: acquire first motion posture information of a head wearing the head-mounted device; parse the acquired first motion posture information and determine a head action type matching the first motion posture information; determine a control instruction for the target device based on the determined head action type; and send the control instruction to the target device, so that the target device performs a corresponding operation in response to receiving the control instruction.
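By way of illustration only, the stored-program steps above can be pictured as a simple control loop. The following sketch assumes a Python-like environment in which a sampler object supplies motion posture readings, a classifier maps them to a head action type, and a transport object delivers the instruction; all of these names (sampler, classifier, instruction_map, transport) are hypothetical placeholders rather than components defined by this application.

    import time

    def control_loop(sampler, classifier, instruction_map, transport, target_device_id):
        # Acquire first motion posture information, classify the head action,
        # map it to a control instruction, and send it to the target device.
        while True:
            posture = sampler.read()                  # e.g. accelerometer/gyroscope samples
            action = classifier(posture)              # e.g. "nod", "turn_cw", "turn_ccw", or None
            if action is None:
                time.sleep(0.02)                      # no recognized head action yet
                continue
            instruction = instruction_map.get(action)
            if instruction is not None:
                transport.send(target_device_id, instruction)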
The above description is only a preferred embodiment of the present application and is illustrative of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the invention referred to in the present application is not limited to embodiments formed by the specific combination of the above-mentioned features, but also covers other embodiments formed by any combination of the above-mentioned features or their equivalents without departing from the inventive concept; for example, embodiments in which the above features are replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (20)

1. An equipment control method, characterized in that the method comprises:
acquiring first motion posture information of a head wearing the head-mounted device;
parsing the acquired first motion posture information, and determining a head action type matching the first motion posture information;
determining a control instruction for the target device based on the determined head action type;
sending the control instruction to the target device, so that the target device performs a corresponding operation in response to receiving the control instruction;
wherein the target device is determined via:
presenting an image of a controlled environment including controlled devices, the image further including marking information for marking the controlled devices, the marking information corresponding one-to-one to the controlled devices;
determining a controlled device marked in the image according to second motion posture information of the head wearing the head-mounted device;
determining the marked controlled device as the target device in response to receiving an instruction to determine the marked controlled device as the target device.
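For orientation, the target-device selection recited above can be sketched as follows. This is a minimal illustration, assuming that the second motion posture information is reduced to yaw and pitch angles that move a cursor over the presented image, and that each piece of marking information carries a bounding box; the field names, the linear angle-to-pixel mapping, and the 90-degree field of view are illustrative assumptions, not part of the claim.

    def select_marked_device(markers, yaw_deg, pitch_deg, image_w, image_h, fov_deg=90.0):
        """markers: list of dicts like {"device_id": ..., "x0": ..., "y0": ..., "x1": ..., "y1": ...}."""
        # Map the head orientation onto the presented image with a simple linear projection.
        cx = image_w * (0.5 + yaw_deg / fov_deg)
        cy = image_h * (0.5 + pitch_deg / fov_deg)
        for m in markers:
            if m["x0"] <= cx <= m["x1"] and m["y0"] <= cy <= m["y1"]:
                return m["device_id"]                 # controlled device currently marked/pointed at
        return None                                   # nothing marked; wait for further head movement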
2. The method of claim 1, wherein the head-mounted device, the target device, and a control device are communicatively connected; and
the determining a control instruction for the target device based on the determined head action type comprises:
sending a message including the determined head action type and an identification of the target device to the control device, and determining, by the control device, a control instruction for the target device based on the determined head action type; and
the sending the control instruction to the target device, so that the target device performs a corresponding operation in response to receiving the control instruction, comprises:
sending, by the control device, the control instruction to the target device, so that the target device performs a corresponding operation in response to receiving the control instruction.
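As a rough illustration of the claim 2 topology, the head-mounted device may report only the head action type and the target device identification, leaving instruction selection to the control device. The message fields, the JSON encoding, and the rules lookup table below are assumptions made for this sketch; the application does not prescribe a message format.

    import json

    def build_report(action_type, target_id):
        # Message sent from the head-mounted device to the control device.
        return json.dumps({"action_type": action_type, "target_id": target_id})

    def handle_report(report, rules, transport):
        # Control device side: determine the control instruction and forward it to the target device.
        msg = json.loads(report)
        instruction = rules.get(msg["target_id"], {}).get(msg["action_type"])
        if instruction is not None:
            transport.send(msg["target_id"], instruction)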
3. The method of claim 1, wherein the parsing the acquired first motion posture information and determining a head action type matching the first motion posture information comprises:
determining the head action type matching the first motion posture information in response to determining that an acceleration of the head-mounted device within a preset time length is smaller than a preset threshold value.
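One way to read the condition in claim 3 is as a stillness gate: classification is attempted only once the device's acceleration has stayed below a threshold for the preset duration. The sketch below assumes acceleration magnitudes with gravity removed and a fixed sample rate; the 0.3 m/s^2 threshold and 50-sample window are arbitrary illustrative values.

    def is_steady(accel_magnitudes, threshold=0.3, window=50):
        # True if the last `window` samples are all below the threshold, i.e. the
        # head-mounted device has been (nearly) still for the preset time length.
        recent = accel_magnitudes[-window:]
        return len(recent) == window and all(a < threshold for a in recent)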
4. The method of any of claims 1-3, wherein the determining a head action type matching the first motion posture information comprises:
if the first motion posture information comprises information that the head-mounted device completes a clockwise acceleration within a first preset time length, determining that the head action type is a head clockwise rotation action;
and if the first motion posture information comprises information that the head-mounted device completes a counterclockwise acceleration within a second preset time length, determining that the head action type is a head counterclockwise rotation action.
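Purely as an illustration of claim 4, the rotation direction can be estimated by integrating yaw-rate samples over the preset window and thresholding the net angle. The sign convention (negative yaw rate meaning clockwise when viewed from above) and the 30-degree threshold are assumptions of this sketch, not values taken from the application.

    def classify_rotation(yaw_rates_deg_s, dt_s, min_angle_deg=30.0):
        # Net rotation angle accumulated within the first/second preset time length.
        angle = sum(r * dt_s for r in yaw_rates_deg_s)
        if angle <= -min_angle_deg:
            return "turn_cw"       # head clockwise rotation action
        if angle >= min_angle_deg:
            return "turn_ccw"      # head counterclockwise rotation action
        return None                # no rotation action recognized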
5. The method of claim 4, wherein the types of operations that the target device can perform include turning on or off; and
the determining a control instruction for the target device based on the determined head action type comprises:
generating a control instruction for instructing the target device to turn on in response to the determined head action type being a head clockwise rotation action, and generating a control instruction for instructing the target device to turn off in response to the determined head action type being a head counterclockwise rotation action; or
generating a control instruction for instructing the target device to turn off in response to the determined head action type being a head clockwise rotation action, and generating a control instruction for instructing the target device to turn on in response to the determined head action type being a head counterclockwise rotation action.
6. The method of claim 4, wherein the types of operations that the target device can perform include adjustment of a control amount; and
the determining a control instruction for the target device based on the determined head action type comprises:
generating a control instruction for instructing the target device to increase the control amount in response to the determined head action type being a head clockwise rotation action, and generating a control instruction for instructing the target device to decrease the control amount in response to the determined head action type being a head counterclockwise rotation action; or
generating a control instruction for instructing the target device to decrease the control amount in response to the determined head action type being a head clockwise rotation action, and generating a control instruction for instructing the target device to increase the control amount in response to the determined head action type being a head counterclockwise rotation action.
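Claims 5 and 6 both reduce to a direction-to-instruction table: a rotation direction either toggles the on/off state or adjusts a control amount (volume and brightness are typical examples, though the claims name none), and either assignment of directions is permitted. The instruction strings and the supports_adjustment flag in the sketch below are placeholders.

    SWITCH_MAP = {"turn_cw": "TURN_ON", "turn_ccw": "TURN_OFF"}
    ADJUST_MAP = {"turn_cw": "INCREASE", "turn_ccw": "DECREASE"}

    def rotation_to_instruction(action_type, supports_adjustment):
        # Pick the mapping according to the kind of operation the target device supports.
        table = ADJUST_MAP if supports_adjustment else SWITCH_MAP
        return table.get(action_type)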
7. The method of any of claims 1-3, wherein the determining a head action type matching the first motion posture information comprises:
if the first motion posture information comprises information that the head-mounted device completes a downward acceleration and/or an upward acceleration within a third preset time length, determining that the head action type is a head nodding action.
8. The method of claim 7, wherein the types of operations that the target device can perform include turning on or off; and
the determining a control instruction for the target device based on the determined head action type comprises:
in response to the determined head action type being a head nodding action, generating a control instruction for instructing the target device to turn off if the target device is determined to be in an on state, and generating a control instruction for instructing the target device to turn on if the target device is determined to be in an off state.
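Claims 7 and 8 can be pictured together: a nod is recognized from a downward and/or upward acceleration within the third preset time length, and the resulting instruction depends on the target device's current state. The detection heuristic and the 1.5 m/s^2 threshold below are illustrative assumptions only.

    def detect_nod(vertical_accels, threshold=1.5):
        # Very rough nod heuristic: a strong downward spike and/or a strong upward
        # spike somewhere in the window (claim 7 allows "and/or").
        down = any(a < -threshold for a in vertical_accels)
        up = any(a > threshold for a in vertical_accels)
        return down or up

    def nod_to_instruction(device_is_on):
        # Claim 8: a nod toggles the target device relative to its current state.
        return "TURN_OFF" if device_is_on else "TURN_ON"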
9. An equipment control apparatus, characterized in that the apparatus comprises:
an acquisition unit, configured to acquire first motion posture information of a head wearing the head-mounted device;
a parsing unit, configured to parse the acquired first motion posture information and determine a head action type matching the first motion posture information;
a determining unit, configured to determine a control instruction for the target device based on the determined head action type;
a sending unit, configured to send the control instruction to the target device, so that the target device performs a corresponding operation in response to receiving the control instruction;
wherein the device further comprises a target device determining unit configured to:
presenting an image of a controlled environment including controlled devices, the image further including marking information for marking the controlled devices, the marking information corresponding one-to-one to the controlled devices;
determining a controlled device marked in the image according to second motion posture information of the head wearing the head-mounted device;
determining the marked controlled device as the target device in response to receiving an instruction to determine the marked controlled device as the target device.
10. The apparatus of claim 9, wherein the head-mounted device, the target device, and a control device are communicatively connected; and
the determining unit is further configured to:
sending a message including the determined head action type and an identification of the target device to the control device, and determining, by the control device, a control instruction for the target device based on the determined head action type; and
the sending unit is further configured to:
sending the control instruction to the target device through the control device, so that the target device performs a corresponding operation in response to receiving the control instruction.
11. The apparatus of claim 9, wherein the parsing unit is further configured to:
determining the head action type matching the first motion posture information in response to determining that an acceleration of the head-mounted device within a preset time length is smaller than a preset threshold value.
12. The apparatus according to any of claims 9-11, wherein the parsing unit is further configured to:
if the first motion posture information comprises information that the head-mounted device completes a clockwise acceleration within a first preset time length, determining that the head action type is a head clockwise rotation action;
and if the first motion posture information comprises information that the head-mounted device completes a counterclockwise acceleration within a second preset time length, determining that the head action type is a head counterclockwise rotation action.
13. The apparatus of claim 12, wherein the types of operations that the target device can perform include turning on or off; and
the determining unit is further configured to:
generating a control instruction for instructing the target device to turn on in response to the determined head action type being a head clockwise rotation action, and generating a control instruction for instructing the target device to turn off in response to the determined head action type being a head counterclockwise rotation action; or
generating a control instruction for instructing the target device to turn off in response to the determined head action type being a head clockwise rotation action, and generating a control instruction for instructing the target device to turn on in response to the determined head action type being a head counterclockwise rotation action.
14. The apparatus of claim 12, wherein the types of operations that the target device can perform include adjustment of a control amount; and
the determining unit is further configured to:
generating a control instruction for instructing the target device to increase the control amount in response to the determined head action type being a head clockwise rotation action, and generating a control instruction for instructing the target device to decrease the control amount in response to the determined head action type being a head counterclockwise rotation action; or
generating a control instruction for instructing the target device to decrease the control amount in response to the determined head action type being a head clockwise rotation action, and generating a control instruction for instructing the target device to increase the control amount in response to the determined head action type being a head counterclockwise rotation action.
15. The apparatus according to any of claims 9-11, wherein the parsing unit is further configured to:
if the first motion posture information comprises information that the head-mounted device completes a downward acceleration and/or an upward acceleration within a third preset time length, determining that the head action type is a head nodding action.
16. The apparatus of claim 15, wherein the types of operations that the target device can perform include turning on or off; and
the determining unit is further configured to:
in response to the determined head action type being a head nodding action, generating a control instruction for instructing the target device to turn off if the target device is determined to be in an on state, and generating a control instruction for instructing the target device to turn on if the target device is determined to be in an off state.
17. An equipment control system, characterized in that the system comprises: a head-mounted device and a target device, wherein the head-mounted device is in communication connection with the target device;
the head-mounted device is configured to acquire first motion posture information of a head wearing the head-mounted device, parse the acquired first motion posture information, determine a head action type matching the first motion posture information, determine a control instruction for the target device based on the determined head action type, and send the control instruction to the target device;
the target device is configured to perform a corresponding operation in response to receiving the control instruction;
wherein the target device is determined via:
presenting an image of a controlled environment including controlled devices, the image further including marking information for marking the controlled devices, the marking information corresponding one-to-one to the controlled devices;
determining a controlled device marked in the image according to second motion posture information of the head wearing the head-mounted device;
determining the marked controlled device as the target device in response to receiving an instruction to determine the marked controlled device as the target device.
18. An equipment control system, characterized in that the system comprises: a head-mounted device, a control device, and a target device, wherein the control device is in communication connection with the head-mounted device and the target device;
the head-mounted device is configured to acquire first motion posture information of a head wearing the head-mounted device, parse the acquired first motion posture information, determine a head action type matching the first motion posture information, and send a message including the determined head action type and an identification of the target device to the control device;
the control device is configured to, in response to receiving the message, determine a control instruction for the target device based on the determined head action type, and send the control instruction to the target device;
the target device is configured to perform a corresponding operation in response to receiving the control instruction;
wherein the target device is determined via:
presenting an image of a controlled environment including controlled devices, the image further including marking information for marking the controlled devices, the marking information corresponding one-to-one to the controlled devices;
determining a controlled device marked in the image according to second motion posture information of the head wearing the head-mounted device;
determining the marked controlled device as the target device in response to receiving an instruction to determine the marked controlled device as the target device.
19. An apparatus, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-8.
20. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 8.
CN201710542645.1A 2017-07-05 2017-07-05 Equipment control method and device Active CN107290972B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710542645.1A CN107290972B (en) 2017-07-05 2017-07-05 Equipment control method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710542645.1A CN107290972B (en) 2017-07-05 2017-07-05 Equipment control method and device

Publications (2)

Publication Number Publication Date
CN107290972A CN107290972A (en) 2017-10-24
CN107290972B (en) 2021-02-26

Family

ID=60100904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710542645.1A Active CN107290972B (en) 2017-07-05 2017-07-05 Equipment control method and device

Country Status (1)

Country Link
CN (1) CN107290972B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108345907B (en) * 2018-02-09 2022-02-11 广东美的制冷设备有限公司 Recognition method, augmented reality device, and storage medium
CN109144245B (en) * 2018-07-04 2021-09-14 Oppo(重庆)智能科技有限公司 Equipment control method and related product
CN110134246A (en) * 2019-05-22 2019-08-16 联想(北京)有限公司 Interaction control method, device and electronic equipment
CN110737260B (en) * 2019-08-29 2022-02-11 南京智慧光信息科技研究院有限公司 Automatic operation method based on big data and artificial intelligence and robot system
CN111338476A (en) * 2020-02-25 2020-06-26 上海唯二网络科技有限公司 Method and device for realizing human-computer interaction through head-mounted VR display equipment
CN111796682B (en) * 2020-07-09 2021-11-16 联想(北京)有限公司 Control method and device electronic equipment
CN114527864B (en) * 2020-11-19 2024-03-15 京东方科技集团股份有限公司 Augmented reality text display system, method, equipment and medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105938396A (en) * 2016-06-07 2016-09-14 陈火 Music player control system and method
CN106445156A (en) * 2016-09-29 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Method, device and terminal for intelligent home device control based on virtual reality

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106249847A (en) * 2015-07-21 2016-12-21 深圳市拓丰源电子科技有限公司 A kind of virtual augmented reality system realizing based on headset equipment remotely controlling
CN106200899A (en) * 2016-06-24 2016-12-07 北京奇思信息技术有限公司 The method and system that virtual reality is mutual are controlled according to user's headwork
CN106445176B (en) * 2016-12-06 2018-10-23 腾讯科技(深圳)有限公司 Man-machine interactive system based on virtual reality technology and exchange method
CN106647303B (en) * 2016-12-23 2019-10-01 重庆墨希科技有限公司 A kind of intelligent home furnishing control method, system and a kind of Intelligent bracelet
CN106873767B (en) * 2016-12-30 2020-06-23 深圳超多维科技有限公司 Operation control method and device for virtual reality application

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105938396A (en) * 2016-06-07 2016-09-14 陈火 Music player control system and method
CN106445156A (en) * 2016-09-29 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Method, device and terminal for intelligent home device control based on virtual reality

Also Published As

Publication number Publication date
CN107290972A (en) 2017-10-24

Similar Documents

Publication Publication Date Title
CN107290972B (en) Equipment control method and device
US20190312747A1 (en) Method, apparatus and system for controlling home device
EP3094045B1 (en) Home device control apparatus and control method using wearable devices
CN111937051A (en) Smart home device placement and installation using augmented reality visualization
CN104182051B (en) Head-wearing type intelligent equipment and the interactive system with the head-wearing type intelligent equipment
CN104731441A (en) Information processing method and electronic devices
KR20200068075A (en) Remote guidance apparatus and method capable of handling hyper-motion step based on augmented reality and machine learning
CN106569409A (en) Graph capturing based household equipment control system, device and method
KR20210062988A (en) Multilateral participation remote collaboration system based on Augmented reality sharing and method thereof
CN110601933A (en) Control method, device and equipment of Internet of things equipment and storage medium
CN112487973A (en) User image recognition model updating method and device
CN204166478U (en) Head-wearing type intelligent equipment
US11151797B2 (en) Superimposing a virtual representation of a sensor and its detection zone over an image
CN112015267B (en) System, apparatus and method for managing building automation environment
EP3669617B1 (en) Storing a preference for a light state of a light source in dependence on an attention shift
US20230033157A1 (en) Displaying a light control ui on a device upon detecting interaction with a light control device
KR101725436B1 (en) System and Method for Controlling Electronic Equipment by Folder
KR20160129143A (en) Robot control service system based on smart device and robot control service method therefor
US20240144517A1 (en) Displaying an aggregation of data in dependence on a distance to a closest device in an image
JP7059452B1 (en) Determining lighting design preferences in augmented and / or virtual reality environments
WO2022184569A1 (en) Displaying an aggregation of data in dependence on a distance to a closest device in an image
JP7126507B2 (en) Controller and method for indicating presence of virtual object via lighting device
US20220386436A1 (en) Indicating a likelihood of presence being detected via multiple indications
CN114488836A (en) Intelligent device control method and device, electronic device, cleaning system and medium
CN204166477U (en) Head-wearing type intelligent equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant