CN108375911B - Equipment control method and device, storage medium and equipment - Google Patents


Info

Publication number
CN108375911B
CN108375911B (application CN201810057973.7A)
Authority
CN
China
Prior art keywords
user
control command
preset
area
control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810057973.7A
Other languages
Chinese (zh)
Other versions
CN108375911A (en)
Inventor
陈铭进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai filed Critical Gree Electric Appliances Inc of Zhuhai
Priority to CN201810057973.7A priority Critical patent/CN108375911B/en
Publication of CN108375911A publication Critical patent/CN108375911A/en
Application granted granted Critical
Publication of CN108375911B publication Critical patent/CN108375911B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 15/00: Systems controlled by a computer
    • G05B 15/02: Systems controlled by a computer electric
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00: Programme-control systems
    • G05B 19/02: Programme-control systems electric
    • G05B 19/418: Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05B: CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00: Program-control systems
    • G05B 2219/20: Pc systems
    • G05B 2219/26: Pc applications
    • G05B 2219/2642: Domotique, domestic, home control, automation, smart house
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The invention provides a device control method, an apparatus, a storage medium, and a device. The method includes: performing motion capture, via a Kinect device, on a user within a predetermined area to identify whether the user performs a preset action, and/or receiving a voice command issued by the user; and, if the user is captured performing the preset action and/or a voice command issued by the user is received, controlling the corresponding device according to the preset action and/or the voice command. With the scheme provided by the invention, smart home devices can be controlled through preset actions or voice commands without using a control terminal, which improves the user experience.

Description

Equipment control method and device, storage medium and equipment
Technical Field
The present invention relates to the field of control, and in particular, to a device control method, apparatus, storage medium, and device.
Background
At present there are many control methods for smart homes, such as remote network control and voice control. However, smart home devices are still mainly controlled through control terminals such as mobile phones, control panels and remote controllers; the operation remains complicated and not intelligent enough.
Disclosure of Invention
The main purpose of the present invention is to overcome the above-mentioned defects in the prior art and to provide a device control method, apparatus, storage medium and device, so as to solve the problem in the prior art that controlling smart home devices is relatively complicated.
One aspect of the present invention provides a device control method, including: performing motion capture, via a Kinect device, on a user within a predetermined area to identify whether the user performs a preset action, and/or receiving a voice command issued by the user; and, if the user is captured performing the preset action and/or a voice command issued by the user is received, controlling the corresponding device according to the preset action and/or the voice command.
Optionally, the predetermined area comprises one or more sub-areas. Before motion capture is performed on a user within the predetermined area by a Kinect device, the method further comprises: detecting the position of the user in the predetermined area so as to determine, from that position, the sub-area in which the user is located. Motion capture of the user within the predetermined area by a Kinect device then comprises: performing motion capture on the user through the Kinect device corresponding to the sub-area in which the user is located.
Optionally, detecting the location of the user in the predetermined area includes: detecting the position of the user through an infrared sensor; and/or locating the user's sound source through an array microphone; and/or determining the position of the user by detecting the strength of a wireless signal sent by a device carried by the user.
Optionally, controlling the corresponding device according to the preset action includes: identifying a first control command corresponding to the preset action, and sending the first control command to the corresponding device so that it executes the corresponding operation. And/or, controlling the corresponding device according to the voice command includes: recognizing the semantics of the voice command so as to determine a corresponding second control command, and sending the second control command to the corresponding device so that it executes the corresponding operation.
Optionally, sending the first control command and/or the second control command to the corresponding device includes: sending the command through at least one of a local area network, WiFi, a router, and Bluetooth.
Optionally, the device comprises at least one of an air conditioner, a refrigerator, a washing machine, a television, a water heater, and a microwave oven.
Another aspect of the present invention provides a device control apparatus, including: a capturing unit for performing motion capture on a user within a predetermined area through a Kinect device to identify whether the user makes a preset action, and/or a receiving unit for receiving a voice command issued by the user; and a control unit for controlling the corresponding device according to the preset action and/or the voice command if the capturing unit captures the user making the preset action and/or the receiving unit receives the voice command issued by the user.
Optionally, the predetermined area comprises one or more sub-areas, and the apparatus further comprises a detecting unit for detecting the position of the user in the predetermined area before the capturing unit performs motion capture, so as to determine the sub-area in which the user is located; the capturing unit is further configured to perform motion capture on the user through the Kinect device corresponding to that sub-area.
Optionally, the detecting unit detects the location of the user in the predetermined area by: detecting the position of the user through an infrared sensor; and/or locating the user's sound source through an array microphone; and/or determining the position of the user by detecting the strength of a wireless signal sent by a device carried by the user.
Optionally, the control unit controls the corresponding device according to the preset action by identifying a first control command corresponding to the preset action and sending it to the corresponding device so that it executes the corresponding operation; and/or controls the corresponding device according to the voice command by recognizing its semantics, determining a corresponding second control command, and sending that command to the corresponding device so that it executes the corresponding operation.
Optionally, the control unit sends the first control command and/or the second control command to the corresponding device through at least one of a local area network, WiFi, a router, and Bluetooth.
Optionally, the device comprises at least one of an air conditioner, a refrigerator, a washing machine, a television, a water heater, and a microwave oven.
Yet another aspect of the invention provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of any of the methods described above.
Yet another aspect of the invention provides an apparatus comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, the processor implementing the steps of any of the methods described above when executing the program.
In a further aspect, the present invention provides a device comprising a device control apparatus as described above.
Optionally, the apparatus comprises: at least one of a gateway, a router, an air conditioner, a refrigerator, a washing machine, a television, a water heater, and a microwave oven.
According to the technical solution of the invention, a Kinect device performs motion capture on the user within the predetermined area and/or a voice command issued by the user is received; if the user is captured making the preset action and/or the voice command is received, the corresponding device is controlled accordingly. Smart home devices are thus controlled through preset actions or voice commands, without control terminals such as mobile phones, control panels and remote controllers, which improves the user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a schematic diagram of an embodiment of the device control method provided by the present invention;
FIG. 2 is an exemplary diagram of at least one Kinect device arranged within a preset area according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of another embodiment of the device control method provided by the present invention;
FIG. 4 is a schematic structural diagram of an embodiment of the device control apparatus provided by the present invention;
FIG. 5 is a schematic structural diagram of another embodiment of the device control apparatus provided by the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the specific embodiments of the present invention and the accompanying drawings. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the invention described herein are capable of operation in sequences other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Fig. 1 is a schematic diagram of an embodiment of the device control method provided by the present invention. The method may be used to control the device itself and/or other devices. The device may be a gateway, a router, or an appliance, the appliance being at least one of an air conditioner, a refrigerator, a washing machine, a television, a water heater, and a microwave oven. The device controls other devices by communicating with them, for example by networking with them through a gateway or router.
As shown in fig. 1, according to an embodiment of the present invention, the device control method includes at least step S110 and step S120.
Step S110: performing motion capture on a user within a predetermined area through a Kinect device to identify whether the user performs a preset action, and/or receiving a voice command issued by the user.
Specifically, preset actions for controlling different devices are configured in advance and may be set by the user. The motion of a user within the predetermined area is captured using Kinect motion-capture technology: at least one Kinect device (such as a Kinect camera) is arranged within the predetermined area, the user's motion is captured by the Kinect device(s), and whether the motion matches a preset action is identified. For example, as shown in fig. 2, the invention may be implemented in a gateway C that controls devices B1, B2, B3 and B4 in a living area 1, with Kinect devices A1, A2, A3 and A4 installed at the four corners of the living area. The Kinect device captures depth information, generates a three-dimensional point cloud of the action through a preset program, maps the point cloud onto a model in VR fashion to track the human motion trajectory, and uses machine-learning techniques to recognize the action.
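The patent describes action recognition via machine learning but gives no concrete algorithm. As a rough illustration only (not the patented method), the final matching step could be sketched as a nearest-template classifier over normalized joint coordinates; every template name and value below is invented for the example:

```python
import math

# Hypothetical action templates: each preset action is a vector of
# normalized 3-D joint coordinates recorded in advance (values illustrative).
ACTION_TEMPLATES = {
    "thumbs_up": [0.1, 0.9, 0.2, 0.0, 0.5, 0.1],
    "wave":      [0.8, 0.7, 0.1, 0.2, 0.6, 0.3],
}

def recognize_action(joints, threshold=0.5):
    """Return the closest preset action, or None if no template is near enough."""
    best_name, best_dist = None, float("inf")
    for name, template in ACTION_TEMPLATES.items():
        dist = math.dist(joints, template)  # Euclidean distance to the template
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```

A real system would classify sequences of skeleton frames over time rather than a single pose vector.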
The voice command issued by the user may be received through a microphone, for example through an array microphone built into the Kinect device, or through a voice Bluetooth remote controller.
Step S120: if the user is captured making the preset action and/or a voice command issued by the user is received, controlling the corresponding device according to the preset action and/or the voice command.
Specifically, the corresponding device includes the device itself and other devices. If a preset action of the user is captured, the corresponding device is controlled according to that action: a first control command corresponding to the preset action is identified and sent to the corresponding device, which then executes the corresponding operation. Preset actions and their associated first control commands are configured in advance; when a preset action is captured, the associated first control command is obtained and sent to the corresponding device, which executes the operation it specifies.
Controlling the corresponding device according to the voice command comprises: recognizing the semantics of the voice command so as to determine the corresponding second control command, and sending that command to the corresponding device so that it executes the corresponding operation. The semantics corresponding to each second control command are configured in advance; when a voice command is received, semantic recognition is performed on it, the matching second control command is determined, and the command is sent to the corresponding device, which executes the operation it specifies.
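The patent does not specify how recognized semantics map to a second control command. As a minimal keyword-based sketch of that lookup (all device names and command fields are hypothetical; a real system would use a full speech and NLU pipeline):

```python
# Hypothetical mapping from recognized keywords to second control commands.
SEMANTIC_COMMANDS = {
    ("air conditioner", "turn on"): {"device": "air_conditioner", "op": "power_on"},
    ("curtain", "open"):            {"device": "curtain", "op": "open"},
    ("television", "turn off"):     {"device": "television", "op": "power_off"},
}

def command_for_utterance(text):
    """Return the command whose keywords all appear in the utterance, else None.
    Naive substring matching, for illustration only."""
    words = text.lower()
    for keywords, command in SEMANTIC_COMMANDS.items():
        if all(k in words for k in keywords):
            return command
    return None
```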
The first control command and/or the second control command may be sent to the corresponding device through at least one of a local area network, WiFi, a router, and Bluetooth. Specifically, the device to be controlled may be connected through any of these communication modes so that the first control command and/or the second control command can be delivered to it.
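For example, a control command could be delivered over the local network as a JSON datagram. This is only a sketch under assumed conventions; the patent does not define a payload format, port, or transport protocol:

```python
import json
import socket

def send_command(command, host, port=9000):
    """Serialize a control command dict as JSON and send it over the LAN via UDP.
    The host, port, and payload format are illustrative assumptions."""
    payload = json.dumps(command).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
    return payload  # returned so callers can log or inspect what was sent
```

UDP is used here for brevity (fire-and-forget); a production system would likely use an acknowledged transport so the appliance can confirm execution.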
Fig. 3 is a schematic method diagram of another embodiment of the device control method provided by the present invention. As shown in fig. 3, based on the above-described embodiment, the apparatus control method further includes step S100.
Step S100, detecting the position of the user in the preset area, and determining the sub-area of the user in the preset area according to the position of the user.
Specifically, the predetermined area includes one or more sub-areas, and a corresponding Kinect device may be arranged in each sub-area. Before motion capture is performed on a user in the predetermined area, the position of the user is detected so as to determine the sub-area in which the user is located. In step S110 the user is then motion-captured by the Kinect device corresponding to that sub-area; that is, only the Kinect device closest to the user is turned on rather than all Kinect motion-capture devices, which reduces energy consumption.
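The sub-area lookup described above could be sketched as follows, assuming axis-aligned rectangular sub-areas and a four-device layout like fig. 2 (all coordinates and device names are illustrative):

```python
# Illustrative layout: each sub-area is a rectangle (x_min, y_min, x_max, y_max)
# with one Kinect device assigned to it, mirroring devices A1-A4 in fig. 2.
SUB_AREAS = {
    "A1": (0, 0, 5, 5),
    "A2": (5, 0, 10, 5),
    "A3": (0, 5, 5, 10),
    "A4": (5, 5, 10, 10),
}

def kinect_for_position(x, y):
    """Return the Kinect device covering the user's position, so only that
    device needs to be switched on; None if the user is outside the area."""
    for device, (x0, y0, x1, y1) in SUB_AREAS.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return device
    return None
```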
The detecting of the location of the user in the predetermined area may specifically be implemented in at least one of the following manners:
(1) and detecting the position of the user through an infrared sensor.
Since any substance radiates infrared rays as long as its temperature is above absolute zero, the position of the user can be detected by an infrared sensor.
(2) And carrying out sound source positioning on the user through the array type microphones.
The Kinect device has a built-in array microphone, i.e. at least two microphones. Sound is received through the microphone array, and the direction of the sound source can then be determined by calculating, with a preset algorithm, the angle at which the sound is incident on the array.
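The direction-angle computation can be illustrated with the standard far-field time-difference-of-arrival formula for a two-microphone pair; this is textbook acoustics, not necessarily the preset algorithm the patent alludes to:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def direction_of_arrival(delay_s, mic_spacing_m):
    """Estimate the incidence angle (degrees from broadside) of a far-field
    sound source from the time delay between two microphones."""
    ratio = SPEED_OF_SOUND * delay_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp numerical overshoot
    return math.degrees(math.asin(ratio))
```

With more than two microphones, pairwise estimates can be combined to resolve the source position rather than just its bearing.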
(3) And determining the position of the user by detecting the strength of a wireless signal sent by the equipment carried by the user.
If the user carries a device capable of sending wireless signals, such as a mobile phone, bracelet or watch, the user can be located over the network according to the strength of the wireless signal sent by that device.
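A common way to turn received signal strength into a distance estimate is the log-distance path-loss model; the sketch below uses it with illustrative parameters, since the patent does not specify a positioning model:

```python
def distance_from_rssi(rssi_dbm, tx_power_dbm=-40.0, path_loss_exp=2.0):
    """Estimate distance in metres from received signal strength using the
    log-distance path-loss model. tx_power_dbm is the assumed RSSI at 1 m
    and path_loss_exp the environment-dependent exponent (both illustrative)."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))
```

Distances from several access points could then be combined by trilateration to locate the user within a sub-area.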
According to the technical solution described above, smart home devices are controlled through specified actions or voice commands. For example, when a user wants to open a curtain, making the preset gesture (such as a thumbs-up) is enough to control the curtain to open. The technical solution of the invention therefore does not require control terminals such as mobile phones, control panels and remote controllers, and the user experience is improved.
Fig. 4 is a schematic structural diagram of an embodiment of the device control apparatus provided by the present invention. The apparatus may be used to control the device itself and/or other devices. The device may be a gateway, a router, or an appliance, the appliance being at least one of an air conditioner, a refrigerator, a washing machine, a television, a water heater, and a microwave oven. The device controls other devices by communicating with them, for example by networking with them through a gateway and/or router.
As shown in fig. 4, the device control apparatus 100 includes a capturing unit 110 and/or a receiving unit 120, together with a control unit 130.
The capturing unit 110 is used for capturing the motion of the user in the predetermined area through the Kinect device, and/or the receiving unit 120 is used for receiving the voice command sent by the user; the control unit 130 is configured to, if the capturing unit captures that the user makes a preset action and/or the receiving unit receives a voice command of the user, control the corresponding device according to the preset action and/or the voice command.
Specifically, preset actions for controlling different devices are configured in advance and may be set by the user. At least one Kinect device (such as a Kinect camera) is arranged within the predetermined area; the capturing unit 110 performs motion capture on the user through the Kinect device(s) and identifies whether the user's motion matches a preset action. For example, as shown in fig. 2, the invention may be implemented in a gateway C that controls devices B1, B2, B3 and B4 in a living area 1 (e.g., a room), with Kinect devices A1, A2, A3 and A4 installed at its four corners. The Kinect device captures depth information, generates a three-dimensional point cloud of the action through a preset program, maps the point cloud onto a model in VR fashion to track the human motion trajectory, and uses machine-learning techniques to recognize the action.
The receiving unit 120 may receive the voice command issued by the user through a microphone, for example an array microphone built into the Kinect device, or through a voice Bluetooth remote controller.
If the capturing unit 110 captures the preset action of the user and/or the receiving unit 120 receives a voice command sent by the user, the control unit 130 controls the corresponding device according to the preset action and/or the voice command.
Specifically, the corresponding device includes the device itself and other devices. If the capturing unit 110 captures a preset action made by the user, the control unit 130 controls the corresponding device according to that action: it identifies the first control command corresponding to the preset action and sends it to the corresponding device, which then executes the corresponding operation. Preset actions and their associated first control commands are configured in advance; when the capturing unit 110 captures a preset action, the control unit 130 obtains the associated first control command and sends it to the corresponding device, which executes the operation it specifies.
The control unit 130 controls the corresponding device according to the voice command by recognizing the semantics of the command, determining the corresponding second control command, and sending that command to the corresponding device so that it executes the corresponding operation. The semantics corresponding to each second control command are configured in advance; when the receiving unit 120 receives a voice command, the control unit 130 performs semantic recognition on it, determines the matching second control command, and sends the command to the corresponding device, which executes the operation it specifies.
The control unit 130 may send the first control command and/or the second control command to the corresponding device through at least one of a local area network, WiFi, a router, and Bluetooth.
Specifically, the control unit 130 may connect to a device to be controlled through at least one of local area network, WiFi, router, and bluetooth, so as to send the first control command and/or the second control command to the corresponding device.
Fig. 5 is a schematic structural diagram of another embodiment of the device control apparatus provided in the present invention. As shown in fig. 5, based on the above-described embodiment, the device control apparatus 100 further includes a detection unit 102.
The detecting unit 102 is configured to detect a location of a user in a predetermined area before the capturing unit 110 performs motion capture on the user in the predetermined area through a Kinect device, so as to perform motion capture on the user through the Kinect device corresponding to the location of the user.
Specifically, the predetermined area includes one or more sub-areas, and a corresponding Kinect device may be arranged in each sub-area. Before the capturing unit 110 performs motion capture, the detecting unit 102 detects the position of the user in the predetermined area so as to determine the sub-area in which the user is located; the capturing unit 110 then captures the user's motion through the Kinect device corresponding to that sub-area. That is, only the Kinect device closest to the user is turned on rather than all Kinect motion-capture devices, which reduces energy consumption.
The detecting unit 102 may specifically detect the location of the user in the predetermined area through at least one of the following implementation manners:
(1) and detecting the position of the user through an infrared sensor.
Since any substance radiates infrared rays as long as its temperature is above absolute zero, the position of the user can be detected by an infrared sensor.
(2) And carrying out sound source positioning on the user through the array type microphones.
The Kinect device has a built-in array microphone, i.e. at least two microphones. Sound is received through the microphone array, and the direction of the sound source can then be determined by calculating, with a preset algorithm, the angle at which the sound is incident on the array.
(3) And determining the position of the user by detecting the strength of a wireless signal sent by the equipment carried by the user.
If the user carries a device capable of sending wireless signals, such as a mobile phone, bracelet or watch, the user can be located over the network according to the strength of the wireless signal sent by that device.
The present invention also provides a computer-readable storage medium corresponding to the device control method, on which a computer program is stored which, when executed by a processor, implements the steps of any of the methods described above.
The invention also provides a device corresponding to the device control method, which comprises a processor, a memory and a computer program stored in the memory and capable of running on the processor, wherein the processor executes the program to realize the steps of any one of the methods.
The invention also provides equipment corresponding to the equipment control device, comprising any one of the equipment control devices described above. The equipment may specifically include at least one of a gateway, a router, an air conditioner, a refrigerator, a washing machine, a television, a water heater, and a microwave oven, and may control itself or other equipment through the equipment control device.
According to the scheme provided by the invention, a Kinect device captures the motion of the user in the predetermined area and/or receives a voice command issued by the user. If the user is captured making the preset action and/or a voice command is received, the corresponding device is controlled according to the preset action and/or the voice command. Smart home devices are thus controlled through preset actions or voice commands, without requiring a control terminal such as a mobile phone, control panel, or remote controller, which improves the user experience.
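The overall control flow — resolve a recognized preset action and/or voice command to a control command for the corresponding device — can be sketched as below. The gesture names, voice phrases, and command codes are hypothetical placeholders; the patent does not enumerate specific actions or commands:

```python
# Hypothetical mappings from recognized inputs to (device, command) pairs,
# corresponding to the "first control command" (gesture) and
# "second control command" (voice) described above.
GESTURE_COMMANDS = {
    "raise_hand": ("air_conditioner", "power_on"),
    "wave_left":  ("television", "channel_down"),
}
VOICE_COMMANDS = {
    "turn on the air conditioner": ("air_conditioner", "power_on"),
    "turn off the lights":         ("lights", "power_off"),
}

def dispatch(gesture=None, voice=None):
    """Resolve a gesture and/or voice input to a list of control commands.

    Unrecognized inputs are ignored; each returned (device, command) pair
    would then be sent to the device over LAN, WiFi, a router, or Bluetooth.
    """
    commands = []
    if gesture in GESTURE_COMMANDS:
        commands.append(GESTURE_COMMANDS[gesture])
    if voice in VOICE_COMMANDS:
        commands.append(VOICE_COMMANDS[voice])
    return commands

print(dispatch(gesture="raise_hand"))  # [('air_conditioner', 'power_on')]
```

Because gesture and voice paths are independent lookups, the method's "and/or" behaviour falls out naturally: either input alone, or both together, yields the corresponding command(s).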
The functions described herein may be implemented in hardware, software executed by a processor, firmware, or any combination thereof. If implemented in software executed by a processor, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Other examples and implementations are within the scope and spirit of the invention and the following claims. For example, due to the nature of software, the functions described above may be implemented using software executed by a processor, hardware, firmware, hardwired, or a combination of any of these. In addition, each functional unit may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the embodiments provided in the present application, it should be understood that the disclosed technology can be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units may be a logical division, and in actual implementation, there may be another division, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and the parts serving as the control device may or may not be physical units, may be located in one place, or may be distributed on a plurality of units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic or optical disk, and other various media capable of storing program codes.
The above description is only an example of the present invention, and is not intended to limit the present invention, and it is obvious to those skilled in the art that various modifications and variations can be made in the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (14)

1. An apparatus control method characterized by comprising:
performing motion capture on a user within a predetermined area through a Kinect device to identify whether the user makes a preset action, and/or receiving a voice command sent by the user;
if the user is captured making the preset action and/or a voice command sent by the user is received, controlling the corresponding equipment according to the preset action and/or the voice command;
the predetermined area comprises more than one sub-area; before motion capture is performed on a user in a predetermined area through a Kinect device, the method further comprises the following steps:
detecting the position of a user in the preset area so as to determine a sub-area of the user in the preset area according to the position of the user;
motion capture of a user within a predetermined area by a Kinect device, comprising:
and performing motion capture on the user through a Kinect device corresponding to the sub-area where the user is located in the preset area.
2. The method of claim 1, wherein detecting the location of the user within the predetermined area comprises:
detecting the position of the user through an infrared sensor;
and/or,
carrying out sound source positioning on the user through an array microphone;
and/or,
and determining the position of the user by detecting the strength of a wireless signal sent by the equipment carried by the user.
3. The method according to claim 1 or 2,
controlling the corresponding equipment according to the preset action, comprising:
identifying a first control command corresponding to the preset action;
sending the first control command to corresponding equipment to enable the corresponding equipment to execute corresponding operation according to the first control command;
and/or,
controlling the corresponding equipment according to the voice command, comprising:
recognizing the semantic meaning corresponding to the voice command so as to determine a second control command corresponding to the voice command according to the recognized semantic meaning;
and sending the second control command to corresponding equipment so that the corresponding equipment executes corresponding operation according to the second control command.
4. The method of claim 3, wherein sending the first control command to the respective device and/or sending the second control command to the respective device comprises:
and sending the first control command and/or the second control command to the corresponding equipment through at least one communication mode of a local area network, WiFi, a router and Bluetooth.
5. The method according to any of claims 1-4, wherein the apparatus comprises: at least one of an air conditioner, a refrigerator, a washing machine, a television, a water heater, and a microwave oven.
6. An apparatus control device, characterized by comprising:
a capturing unit and a receiving unit, wherein the capturing unit is used for performing motion capture on a user within a predetermined area through a Kinect device to identify whether the user makes a preset action, and/or the receiving unit is used for receiving a voice command sent by the user;
the control unit is used for controlling the corresponding equipment according to the preset action and/or the voice command if the capture unit captures that the user makes the preset action and/or the receiving unit receives the voice command sent by the user;
the predetermined area comprises more than one sub-area; the device further comprises:
the detection unit is used for detecting the position of a user in a preset area before the capturing unit captures the motion of the user in the preset area through a Kinect device, so as to determine a sub-area of the user in the preset area according to the position of the user;
the capture unit further to: and performing motion capture on the user through a Kinect device corresponding to the sub-area where the user is located in the preset area.
7. The apparatus of claim 6, wherein the detecting unit detects the location of the user in the predetermined area, and comprises:
detecting the position of the user through an infrared sensor;
and/or,
carrying out sound source positioning on the user through an array microphone;
and/or,
and determining the position of the user by detecting the strength of a wireless signal sent by the equipment carried by the user.
8. The apparatus according to claim 6 or 7,
the control unit controls the corresponding equipment according to the preset action, and comprises:
identifying a first control command corresponding to the preset action;
sending the first control command to corresponding equipment to enable the corresponding equipment to execute corresponding operation according to the first control command;
and/or,
the control unit controls the corresponding equipment according to the voice command, and comprises:
recognizing the semantic meaning corresponding to the voice command so as to determine a second control command corresponding to the voice command according to the recognized semantic meaning;
and sending the second control command to corresponding equipment so that the corresponding equipment executes corresponding operation according to the second control command.
9. The apparatus of claim 8, wherein the control unit to send the first control command to the corresponding device and/or to send the second control command to the corresponding device comprises:
and sending the first control command and/or the second control command to the corresponding equipment through at least one communication mode of a local area network, WiFi, a router and Bluetooth.
10. The apparatus according to any one of claims 6-9, wherein the device comprises: at least one of an air conditioner, a refrigerator, a washing machine, a television, a water heater, and a microwave oven.
11. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 5.
12. An apparatus comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, the processor implementing the steps of the method of any of claims 1-5 when executing the program.
13. An apparatus, characterized by comprising an apparatus control device according to any one of claims 6-10.
14. The apparatus according to claim 12 or 13, characterized in that it comprises: at least one of a gateway, a router, an air conditioner, a refrigerator, a washing machine, a television, a water heater, and a microwave oven.
CN201810057973.7A 2018-01-22 2018-01-22 Equipment control method and device, storage medium and equipment Active CN108375911B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810057973.7A CN108375911B (en) 2018-01-22 2018-01-22 Equipment control method and device, storage medium and equipment


Publications (2)

Publication Number Publication Date
CN108375911A CN108375911A (en) 2018-08-07
CN108375911B true CN108375911B (en) 2020-03-27

Family

ID=63015171

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810057973.7A Active CN108375911B (en) 2018-01-22 2018-01-22 Equipment control method and device, storage medium and equipment

Country Status (1)

Country Link
CN (1) CN108375911B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109839827B (en) * 2018-12-26 2021-11-30 哈尔滨拓博科技有限公司 Gesture recognition intelligent household control system based on full-space position information
CN110613457A (en) * 2019-08-23 2019-12-27 珠海格力电器股份有限公司 Detection method and device
CN111306714A (en) * 2020-03-03 2020-06-19 青岛海尔空调器有限总公司 Air conditioner and control method thereof
CN112965592A (en) * 2021-02-24 2021-06-15 中国工商银行股份有限公司 Equipment interaction method, device and system
CN114488831B (en) * 2022-01-10 2023-09-08 锋芒科技南京有限公司 Internet of things household intelligent control system and method based on man-machine interaction

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102824092A (en) * 2012-08-31 2012-12-19 华南理工大学 Intelligent gesture and voice control system of curtain and control method thereof
CN102932212A (en) * 2012-10-12 2013-02-13 华南理工大学 Intelligent household control system based on multichannel interaction manner
CN104750085A (en) * 2015-04-23 2015-07-01 谢玉章 Intelligent hardware voice body control method
CN105137771A (en) * 2015-07-27 2015-12-09 上海斐讯数据通信技术有限公司 Intelligent household appliance control system and method based on mobile terminal
CN205594339U (en) * 2016-04-13 2016-09-21 南京工业职业技术学院 Somatosensory smart home control system
CN107272902A (en) * 2017-06-23 2017-10-20 深圳市盛路物联通讯技术有限公司 Smart home service end, control system and control method based on body feeling interaction

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100786108B1 (en) * 2006-05-01 2007-12-18 김준식 Sound communication networks
CN101198030B (en) * 2007-12-18 2010-06-23 北京中星微电子有限公司 Camera locating method and locating device of video monitoring system
CN101587606B (en) * 2008-05-21 2012-05-30 上海新联纬讯科技发展有限公司 Method and system for detecting staff flow in exhibition venue
CN102004900B (en) * 2010-11-04 2016-08-24 北京理工大学 Infrared marker-point coding method applied to large-scale tracking systems


Also Published As

Publication number Publication date
CN108375911A (en) 2018-08-07

Similar Documents

Publication Publication Date Title
CN108375911B (en) Equipment control method and device, storage medium and equipment
US10334304B2 (en) Set top box automation
CN110085233B (en) Voice control method and device, electronic equipment and computer readable storage medium
EP3517849B1 (en) Household appliance control method, device and system, and intelligent air conditioner
CN105785782B (en) Intelligent home equipment control method and device
CN107991891B (en) Method and system for adjusting environmental parameters and user equipment
CN106705385A (en) Control method, control device and control system of air conditioner
US20130241830A1 (en) Gesture input apparatus, control program, computer-readable recording medium, electronic device, gesture input system, and control method of gesture input apparatus
CN107219766B (en) Control method and device of intelligent household equipment
CN104915225A (en) Method and device for controlling intelligent device
WO2017016432A1 (en) Intelligent home appliance control method and intelligent home appliance controller
CN110767225B (en) Voice interaction method, device and system
CN105045140A (en) Method and device for intelligently controlling controlled equipment
CN105045240A (en) Household appliance control method and device
CN204116902U (en) Voice control end and control terminal for voice control of household electrical appliances
CN105830397A (en) Method for changing over domestic appliances between an at-home mode and a not-at-home mode, portable operating apparatus, system and computer program product
CN107621784A (en) Intelligent home furnishing control method, apparatus and system
WO2016192458A1 (en) User terminal, home central controller and smart home control method and system
CN104216351A (en) Household appliance voice control method and system
CN105206020B (en) Remote controler matching method, apparatus and system
CN105245416A (en) Household appliance controlling method and device
CN110895934A (en) Household appliance control method and device
CN105094063A (en) Intelligent household control method and device
CN106504510B (en) Remote infrared control method and device
CN110632854A (en) Voice control method and device, voice control node and system and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant