CN112580551A - Equipment control method and device - Google Patents

Equipment control method and device

Info

Publication number
CN112580551A
CN112580551A (application CN202011556721.2A)
Authority
CN
China
Prior art keywords
target
somatosensory
determining
camera
action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011556721.2A
Other languages
Chinese (zh)
Inventor
赵睿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Haier Technology Co Ltd
Haier Smart Home Co Ltd
Original Assignee
Qingdao Haier Technology Co Ltd
Haier Smart Home Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Haier Technology Co Ltd, Haier Smart Home Co Ltd filed Critical Qingdao Haier Technology Co Ltd
Priority to CN202011556721.2A priority Critical patent/CN112580551A/en
Publication of CN112580551A publication Critical patent/CN112580551A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20 Movements or behaviour, e.g. gesture recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00 Data switching networks
    • H04L12/28 Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803 Home automation networks
    • H04L12/2816 Controlling appliance services of a home automation network by calling their functionalities
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Psychiatry (AREA)
  • Social Psychology (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a device control method and apparatus, wherein the method comprises the following steps: acquiring a target somatosensory action of a target object through a somatosensory camera; determining a target control command corresponding to the target somatosensory action; and controlling the intelligent device according to the target control command. This solves the problems in the related art that remotely controlling an intelligent device through a central control device is complex to implement and costly. Operation is convenient and fast for the user: no remote controller or mobile phone is needed, a simple gesture suffices to control the intelligent device, and no additional supporting equipment is required; contactless control of the intelligent device is achieved simply by embedding the somatosensory camera in the intelligent device.

Description

Equipment control method and device
Technical Field
The invention relates to the field of smart home, in particular to a device control method and device.
Background
At present, an air conditioner can be remotely controlled with a mobile phone or operated with a remote controller. Smart home systems with a somatosensory function are also available, in which a central control device manages the home appliances. The central control device is a robot: the user performs the corresponding limb action in front of the robot, and after recognizing the action, the central control device remotely sends the corresponding control command to the smart home appliance, thereby controlling devices such as the air conditioner. This scheme of remotely sending commands to smart home appliances through a central control device is complex to implement, costly, and requires a dedicated central control robot to fulfill the function.
No solution has yet been proposed for the problems in the related art that remotely controlling intelligent devices through a central control device is complex to implement and costly.
Disclosure of Invention
The embodiment of the invention provides a device control method and device, which are used for at least solving the problems of complex realization and high cost of remotely controlling intelligent devices through central control equipment in the related art.
According to an embodiment of the present invention, there is provided a device control method applied to an intelligent device, including: acquiring a target somatosensory action of a target object through an arranged somatosensory camera; determining a target control command corresponding to the target somatosensory action; and controlling the intelligent equipment according to the target control instruction.
In one exemplary embodiment, acquiring the target somatosensory action of the target object through the somatosensory camera comprises the following steps: acquiring a plurality of RGB (Red Green Blue) color images and a plurality of depth images collected by the somatosensory camera; performing skeleton tracking according to the RGB color images and the depth images to obtain a skeleton image; and determining the target somatosensory action according to the skeleton image.
In an exemplary embodiment, the somatosensory camera comprises an RGB color camera and a depth camera, wherein the RGB color camera is used for collecting the RGB color images, and the depth camera is used for collecting the depth images.
In an exemplary embodiment, determining a target control command corresponding to the target somatosensory action includes: and determining a target control command corresponding to the target body sensing action according to a pre-stored corresponding relation between the body sensing action and the control command.
In an exemplary embodiment, determining, according to the pre-stored correspondence between somatosensory actions and control commands, the target control command corresponding to the target somatosensory action includes at least one of the following: if the target somatosensory action is a first limb action, determining, according to the correspondence, that the target control command is a power-on command; if the target somatosensory action is a second limb action, determining, according to the correspondence, that the target control command is a power-off command; if the target somatosensory action is a third limb action, determining, according to the correspondence, that the target control command is a temperature control command; if the target somatosensory action is a fourth limb action, determining, according to the correspondence, that the target control command is a mode control command; and if the target somatosensory action is a fifth limb action, determining, according to the correspondence, that the target control command is a humidity control command.
In an exemplary embodiment, before acquiring the target somatosensory action of the target object through the somatosensory camera, the method further includes: monitoring the target object through the camera; and sending out voice information prompting the user to perform a somatosensory action.
According to another embodiment of the present invention, there is also provided an apparatus control device applied to an intelligent apparatus, including: the acquisition module is used for acquiring target somatosensory motion of the target object through the set somatosensory camera; the determining module is used for determining a target control command corresponding to the target somatosensory action; and the control module is used for controlling the intelligent equipment according to the target control instruction.
In one exemplary embodiment, the obtaining module includes: the acquisition unit is used for acquiring a plurality of RGB color images and a plurality of depth images collected by the somatosensory camera; the skeleton tracking unit is used for carrying out skeleton tracking according to the RGB color images and the depth images to obtain a skeleton image; and the determining unit is used for determining the target somatosensory motion according to the skeleton image.
In an exemplary embodiment, the somatosensory camera comprises an RGB color camera and a depth camera, wherein the RGB color camera is used for collecting the RGB color images, and the depth camera is used for collecting the depth images.
In one exemplary embodiment, the determining module further comprises: and the command determining unit is used for determining a target control command corresponding to the target body sensing action according to the corresponding relation between the pre-stored body sensing action and the control command.
In an exemplary embodiment, the command determining unit is further configured to perform at least one of:
if the target somatosensory action is a first limb action, determining, according to the correspondence, that the target control command is a power-on command;
if the target somatosensory action is a second limb action, determining, according to the correspondence, that the target control command is a power-off command;
if the target somatosensory action is a third limb action, determining, according to the correspondence, that the target control command is a temperature control command;
if the target somatosensory action is a fourth limb action, determining, according to the correspondence, that the target control command is a mode control command;
and if the target somatosensory action is a fifth limb action, determining, according to the correspondence, that the target control command is a humidity control command.
In one exemplary embodiment, further comprising: the camera detection module is used for monitoring the target object through the camera; and the voice prompt module is used for sending out voice information for prompting the somatosensory action.
According to a further embodiment of the present invention, a computer-readable storage medium is also provided, in which a computer program is stored, wherein the computer program is configured to perform the steps of any of the above-described method embodiments when executed.
According to yet another embodiment of the present invention, there is also provided an electronic device, including a memory in which a computer program is stored and a processor configured to execute the computer program to perform the steps in any of the above method embodiments.
According to the invention, the target somatosensory action of the target object is acquired through the somatosensory camera; the target control command corresponding to the target somatosensory action is determined; and the intelligent device is controlled according to the target control command. This solves the problems in the related art that remotely controlling an intelligent device through a central control device is complex to implement and costly. Operation is convenient and fast for the user: no remote controller or mobile phone is needed, a simple gesture suffices to control the intelligent device, and no additional supporting equipment is required; contactless control of the intelligent device is achieved simply by embedding the somatosensory camera.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a block diagram of a hardware configuration of a mobile terminal of an apparatus control method of an embodiment of the present invention;
FIG. 2 is a flow chart of a device control method according to an embodiment of the present invention;
FIG. 3 is a flow chart of a device control method according to an alternative embodiment of the present invention;
FIG. 4 is a flow chart illustrating a device control method according to an alternative embodiment of the present invention;
fig. 5 is a sequence diagram of a device control method according to an alternative embodiment of the present invention;
FIG. 6 is a skeletal tracking identification image in accordance with an alternative embodiment of the present invention;
fig. 7 is a block diagram of a device control apparatus according to an embodiment of the present invention.
Detailed Description
The invention will be described in detail hereinafter with reference to the accompanying drawings in conjunction with embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order.
Example 1
The method provided by the first embodiment of the present application may be executed in a mobile terminal, a computer terminal, or a similar computing device. Taking a mobile terminal as an example, fig. 1 is a hardware structure block diagram of the mobile terminal of the device control method according to the embodiment of the present invention, as shown in fig. 1, the mobile terminal may include one or more processors 102 (only one is shown in fig. 1) (the processor 102 may include, but is not limited to, a processing device such as a microprocessor MCU or a programmable logic device FPGA), and a memory 104 for storing data, and optionally, the mobile terminal may further include a transmission device 106 for a communication function and an input/output device 108. It will be understood by those skilled in the art that the structure shown in fig. 1 is only an illustration, and does not limit the structure of the mobile terminal. For example, the mobile terminal may also include more or fewer components than shown in FIG. 1, or have a different configuration than shown in FIG. 1.
The memory 104 may be used to store computer programs, for example, software programs and modules of application software, such as computer programs corresponding to the device control method in the embodiment of the present invention, and the processor 102 executes various functional applications and data processing by running the computer programs stored in the memory 104, so as to implement the method described above. The memory 104 may include high speed random access memory, and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is used for receiving or transmitting data via a network. Specific examples of the network described above may include a wireless network provided by the communication provider of the mobile terminal. In one example, the transmission device 106 includes a Network Interface Controller (NIC) that can be connected to other network devices through a base station so as to communicate with the internet. In another example, the transmission device 106 can be a Radio Frequency (RF) module, which is used to communicate with the internet wirelessly.
Based on the foregoing mobile terminal or network architecture, in this embodiment, an apparatus control method is provided, which is applied to an intelligent apparatus, and fig. 2 is a flowchart of the apparatus control method according to the embodiment of the present invention, as shown in fig. 2, the flowchart includes the following steps:
step S202, acquiring a target somatosensory action of a target object through a set somatosensory camera;
step S204, determining a target control command corresponding to the target somatosensory motion;
and step S206, controlling the intelligent equipment according to the target control instruction.
Through steps S202 to S206, the target somatosensory action of the target object is acquired through the somatosensory camera; the target control command corresponding to the target somatosensory action is determined; and the intelligent device is controlled according to the target control command. This solves the problems in the related art that remotely controlling an intelligent device through a central control device is complex to implement and costly. Operation is convenient and fast for the user: no remote controller or mobile phone is needed, a simple gesture suffices to control the intelligent device, and no additional supporting equipment is required; contactless control of the intelligent device is achieved simply by embedding the somatosensory camera.
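The three claimed steps can be sketched as a minimal control loop. All names (`capture_action`, `lookup_command`, `control_device`) and the action labels are illustrative placeholders, not part of the patent or of any real SDK:

```python
# Minimal sketch of the claimed three-step flow (S202 -> S204 -> S206).
# All names are illustrative placeholders, not a real SDK API.

ACTION_TO_COMMAND = {
    "palm_open_arm_raised": "power_on",
    "fist_arm_lowered": "power_off",
}

def capture_action(frames):
    # Stand-in for the somatosensory camera's recognition (step S202):
    # here we simply pass through a pre-recognized action label.
    return frames["recognized_action"]

def lookup_command(action):
    # Step S204: map the recognized action to a control command.
    return ACTION_TO_COMMAND.get(action)

def control_device(frames, device_state):
    # Step S206: apply the command to the smart device's state.
    command = lookup_command(capture_action(frames))
    if command == "power_on":
        device_state["power"] = True
    elif command == "power_off":
        device_state["power"] = False
    return device_state

state = control_device({"recognized_action": "palm_open_arm_raised"},
                       {"power": False})
```

In a real system, `capture_action` would be replaced by the camera's recognition pipeline; passing a pre-recognized label through lets the flow of steps S202 to S206 be followed end to end.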
Fig. 3 is a flowchart illustrating a device control method according to an alternative embodiment of the present invention (i), and as shown in fig. 3, the step S202 includes:
step S302, acquiring a plurality of RGB color images and a plurality of depth images collected by the somatosensory camera;
step S304, carrying out skeleton tracking according to the RGB color images and the depth images to obtain a skeleton image;
and S306, determining the target somatosensory motion according to the skeleton image.
That is, the somatosensory camera collects color images and depth images, skeleton tracking is then performed on these images to obtain a skeleton image, and the somatosensory action is finally determined from the skeleton image.
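A toy sketch of this pipeline under heavy simplifying assumptions: the "skeleton" is reduced to a single joint per frame (the closest point in the depth image) and the "action" to a single movement rule. Real skeleton tracking, as in the Kinect SDK, recovers a full set of named joints:

```python
def track_skeleton(rgb_frames, depth_frames):
    # Toy stand-in for skeleton tracking: for each RGB/depth frame pair,
    # take the pixel with the smallest depth value (the closest point)
    # as a single "joint". Real trackers return many named joints.
    skeleton = []
    for rgb, depth in zip(rgb_frames, depth_frames):
        best = min(
            ((x, y, d) for y, row in enumerate(depth) for x, d in enumerate(row)),
            key=lambda p: p[2],
        )
        skeleton.append(best)
    return skeleton

def classify_action(skeleton):
    # Toy rule: if the tracked joint moves upward across frames
    # (its row index shrinks), call it a "raise" action.
    ys = [joint[1] for joint in skeleton]
    return "raise" if ys[-1] < ys[0] else "unknown"

# Two 4x4 depth frames: the closest point moves from the bottom row
# to the top row, i.e. the "joint" is raised.
depth_low = [[9.0] * 4 for _ in range(4)]; depth_low[3][1] = 1.0
depth_high = [[9.0] * 4 for _ in range(4)]; depth_high[0][2] = 1.0
rgb = [[0] * 4 for _ in range(4)]  # RGB content is unused in this toy

action = classify_action(track_skeleton([rgb, rgb], [depth_low, depth_high]))
```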
In an optional embodiment, the motion sensing camera includes an RGB color camera and a depth camera, where the RGB color camera is configured to collect the RGB color images, and the depth camera is configured to collect the depth images.
In an alternative embodiment, the step S204 includes: and determining a target control command corresponding to the target body sensing action according to a pre-stored corresponding relation between the body sensing action and the control command.
That is, determining a control command from a somatosensory action requires that the correspondence between somatosensory actions and control commands be stored in advance.
In an optional embodiment, determining, according to the pre-stored correspondence between somatosensory actions and control commands, the target control command corresponding to the target somatosensory action includes at least one of the following: if the target somatosensory action is a first limb action, determining, according to the correspondence, that the target control command is a power-on command; if the target somatosensory action is a second limb action, determining, according to the correspondence, that the target control command is a power-off command; if the target somatosensory action is a third limb action, determining, according to the correspondence, that the target control command is a temperature control command; if the target somatosensory action is a fourth limb action, determining, according to the correspondence, that the target control command is a mode control command; and if the target somatosensory action is a fifth limb action, determining, according to the correspondence, that the target control command is a humidity control command. It should be noted that the first to fifth limb actions in this embodiment may be customized according to the user's usage habits; the specific implementation manner is not limited.
That is, the corresponding command is determined according to different limb movements, wherein the relationship between the limb movement and the control command is stored in advance.
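The pre-stored correspondence described above behaves like a lookup table. A minimal sketch, with key and command names invented for illustration:

```python
# Pre-stored correspondence between limb actions and control commands,
# following the first-to-fifth limb actions of the embodiment.
# Key and command names are illustrative placeholders.
CORRESPONDENCE = {
    "first_limb_action": "power_on",
    "second_limb_action": "power_off",
    "third_limb_action": "temperature_control",
    "fourth_limb_action": "mode_control",
    "fifth_limb_action": "humidity_control",
}

def target_command(somatosensory_action):
    # Returns None when the action has no stored correspondence.
    return CORRESPONDENCE.get(somatosensory_action)

cmd = target_command("third_limb_action")
```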
Fig. 4 is a schematic flow chart (ii) of a device control method according to an alternative embodiment of the present invention, and as shown in fig. 4, before the step S202, the method further includes:
step S402, monitoring the target object through the camera;
in step S404, voice information prompting the user to perform a somatosensory action is sent out.
That is, before the somatosensory action is acquired, the target object is first monitored through the camera, and a voice prompt is then issued asking the user to perform a somatosensory action.
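A hedged sketch of this pre-acquisition step; `detect_person` and `speak` are hypothetical stand-ins for the camera's person detection and the device's audio output:

```python
# Sketch of steps S402/S404: watch for a target object, then issue a
# voice prompt. Both helpers are illustrative stand-ins.

def detect_person(frame):
    # Stand-in: treat any non-empty frame as "a person is present".
    return bool(frame)

def speak(text, log):
    # Stand-in for audio output: record what would be spoken.
    log.append(text)

def await_user_and_prompt(frames):
    log = []
    for frame in frames:
        if detect_person(frame):
            speak("Please perform a somatosensory action.", log)
            break
    return log

prompts = await_user_and_prompt([[], [], ["person"]])
```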
The following describes an embodiment of the present invention in detail by taking an intelligent device as an air conditioner as an example.
Fig. 5 is a sequence diagram of a device control method according to an alternative embodiment of the present invention, as shown in fig. 5, including:
step S1, the user opens and lifts the arm;
step S2, the camera carries out somatosensory motion recognition;
step S3, sending a prompt tone or light signal to the user;
step S4, the camera sends a starting command to the air conditioner;
step S5, the air conditioner processes the starting command;
step S6, the user makes corresponding limb movement;
step S7, the camera carries out somatosensory motion recognition;
step S8, the camera sends a corresponding control command to the air conditioner;
step S9, the air conditioner processes the control command;
step S10, the user clenches the fist and puts down the arm;
step S11, the camera carries out somatosensory motion recognition;
step S12, sending a shutdown command to the air conditioner by the camera;
in step S13, the air conditioner processes the shutdown command.
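The sequence S1 to S13 can be sketched as a small state machine over recognized gestures. The gesture labels follow the figure description; the state handling itself is an illustrative assumption:

```python
# Sketch of the S1-S13 sequence as a tiny state machine over gestures.
# Gesture labels follow the figure; everything else is illustrative.

def air_conditioner_session(gestures):
    state = {"power": False, "handled": []}
    for g in gestures:
        if g == "palm_open_arm_raised":   # S1-S5: power-on gesture
            state["power"] = True
        elif g == "fist_arm_lowered":     # S10-S13: power-off gesture
            state["power"] = False
        elif state["power"]:              # S6-S9: commands while powered on
            state["handled"].append(g)
    return state

result = air_conditioner_session(
    ["palm_open_arm_raised", "palm_swing_up", "fist_arm_lowered"])
```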
Specifically, the camera is a Kinect somatosensory camera, developed on the Kinect for Windows SDK v1.8 + OpenCV 3.1.0 platform to obtain the somatosensory actions of the human body. The SDK provided with Kinect offers audio support, a tilt motor for adjusting the camera angle, and full-body skeleton tracking; it handles details such as non-standard posture detection, detection of the head, hands, feet, and clavicle, and joint occlusion carefully. Further somatosensory recognition functions can be developed on the Kinect at a later stage, so its extensibility is broad. The somatosensory camera is a 3D somatosensory camera with three lenses: the middle lens is an RGB color camera used to collect color images, while the left and right lenses are an infrared emitter and an infrared camera, respectively, which together form a 3D structured-light depth sensor used to collect depth data (the distance from objects in the scene to the camera). Functions such as real-time motion capture, image recognition, microphone input, speech recognition, and community interaction are thereby provided.
The somatosensory actions can be set as follows:
palm open with arm raised (one specific implementation of the first limb action): a prompt tone is sounded and the air conditioner is turned on;
fist clenched with arm raised (one specific implementation of the second limb action): a prompt tone is sounded and the air conditioner is turned off;
palm swung upward: raise the temperature; palm swung downward: lower the temperature, wherein the upward and downward palm swings are specific implementations of the third limb action;
palm swung leftward: switch the mode to the left; palm swung rightward: switch the mode to the right, wherein the leftward and rightward palm swings are specific implementations of the fourth limb action, without being limited thereto;
both hands raised with palms swinging downward: the air conditioner's wind direction swings up and down;
both hands raised with palms swinging left and right: the air conditioner's wind direction swings left and right.
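Written out as a lookup table, the gesture set above might look as follows; the labels and command names are illustrative, not taken from any real SDK:

```python
# The gesture-to-command table from the embodiment, as a lookup dict.
# Labels and command names are illustrative placeholders.
GESTURES = {
    "palm_open_arm_raised": "power_on",
    "fist_arm_raised": "power_off",
    "palm_swing_up": "temperature_up",
    "palm_swing_down": "temperature_down",
    "palm_swing_left": "mode_left",
    "palm_swing_right": "mode_right",
    "both_hands_palms_swing_down": "louver_vertical",
    "both_hands_palms_swing_sideways": "louver_horizontal",
}

def command_for(gesture):
    # Unrecognized gestures are ignored rather than raising an error.
    return GESTURES.get(gesture, "ignore")

cmd = command_for("palm_swing_up")
```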
Details of the somatosensory action recognition are as follows. First, the Kinect 2.0 SDK provided by Microsoft must be integrated; it contains drivers, a rich set of raw sensor data stream development interfaces, natural user interface components, installation files, and reference routines.
In the present embodiment, two functions, namely, raw sensing data stream and skeleton tracking, are mainly used.
Raw sensing data stream: developers can directly obtain the raw data streams of the depth sensor, the color camera, and the four-element microphone array. These data allow developers to build applications on the low-level data streams of the Kinect sensor.
Skeleton tracking: the SDK can track the skeleton images of one or two users within the Kinect field of view, which makes it convenient to build applications operated by somatosensory actions.
Fig. 6 is a skeleton tracking identification image according to an alternative embodiment of the invention. As shown in fig. 6, the image obtained through the above processing shows a number of skeletal points of the human body; from these points the limb action performed by the user can be identified, and the corresponding control command is issued according to the different limb actions.
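On top of skeleton-tracking output, a rule-based classifier of this kind might be built. The joint names and the open/closed-hand flag are assumptions for the sketch (image coordinates, with y growing downward):

```python
# Illustrative rule-based classifier over named skeleton joints.
# Joint names, the hand-open flag, and the rule are assumptions.

def classify_limb_action(joints):
    # joints: dict mapping joint name -> (x, y) in image coordinates,
    # plus a boolean "right_hand_open" flag from hand-state tracking.
    wrist = joints["right_wrist"]
    shoulder = joints["right_shoulder"]
    hand_open = joints.get("right_hand_open", False)
    if wrist[1] < shoulder[1]:  # wrist above shoulder: arm is raised
        return "power_on" if hand_open else "power_off"
    return "no_action"

action = classify_limb_action({
    "right_wrist": (120, 80),
    "right_shoulder": (110, 150),
    "right_hand_open": True,
})
```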
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
Example 2
In this embodiment, an apparatus control device is further provided, which is applied to an intelligent apparatus, and is used to implement the foregoing embodiments and preferred embodiments, and the description of the apparatus is omitted. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 7 is a block diagram of an apparatus control device according to an embodiment of the present invention, as shown in fig. 7, including:
the acquisition module 72 is used for acquiring a target somatosensory action of the target object through the arranged somatosensory camera;
a determining module 74, configured to determine a target control command corresponding to the target somatosensory motion;
and a control module 76 for controlling the intelligent device according to the target control instruction.
Through the above device, the target somatosensory action of the target object is acquired through the somatosensory camera; the target control command corresponding to the target somatosensory action is determined; and the intelligent device is controlled according to the target control command. This solves the problems in the related art that remotely controlling an intelligent device through a central control device is complex to implement and costly. Operation is convenient and fast for the user: no remote controller or mobile phone is needed, a simple gesture suffices to control the intelligent device, and no additional supporting equipment is required; contactless control of the intelligent device is achieved simply by embedding the somatosensory camera.
In an optional embodiment, the obtaining module includes: the acquisition unit is used for acquiring a plurality of RGB color images and a plurality of depth images collected by the somatosensory camera; the skeleton tracking unit is used for carrying out skeleton tracking according to the RGB color images and the depth images to obtain a skeleton image; and the determining unit is used for determining the target somatosensory motion according to the skeleton image.
That is, the somatosensory camera collects color images and depth images, skeleton tracking is then performed on these images to obtain a skeleton image, and the somatosensory action is finally determined from the skeleton image.
In an optional embodiment, the motion sensing camera includes an RGB color camera and a depth camera, where the RGB color camera is configured to collect the RGB color images, and the depth camera is configured to collect the depth images.
In an optional embodiment, the determining module further comprises: and the command determining unit is used for determining a target control command corresponding to the target body sensing action according to the corresponding relation between the pre-stored body sensing action and the control command.
That is, determining a control command from a somatosensory action requires storing the correspondence between somatosensory actions and control commands in advance.
In an alternative embodiment, the command determination unit is further configured to perform at least one of:
if the target somatosensory action is a first limb action, determining, according to the correspondence, that the target control command is a power-on command;
if the target somatosensory action is a second limb action, determining, according to the correspondence, that the target control command is a power-off command;
if the target somatosensory action is a third limb action, determining, according to the correspondence, that the target control command is a temperature control command;
if the target somatosensory action is a fourth limb action, determining, according to the correspondence, that the target control command is a mode control command;
and if the target somatosensory action is a fifth limb action, determining, according to the correspondence, that the target control command is a humidity control command.
That is, the corresponding command is determined according to the particular limb action, where the relationship between limb actions and control commands is stored in advance.
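A minimal sketch of such a pre-stored correspondence is a simple lookup table. The action labels and command names below are invented for illustration; the patent does not name concrete values:

```python
# Sketch of the pre-stored correspondence between somatosensory actions and
# control commands. Labels and command names are illustrative assumptions.

COMMAND_TABLE = {
    "first_limb_action":  "power_on",
    "second_limb_action": "power_off",
    "third_limb_action":  "temperature_control",
    "fourth_limb_action": "mode_control",
    "fifth_limb_action":  "humidity_control",
}

def determine_command(action):
    """Look up the target control command for a recognized action."""
    return COMMAND_TABLE.get(action)  # None if the action is not stored

print(determine_command("third_limb_action"))  # temperature_control
```

Returning `None` for an unrecognized action lets the caller simply ignore gestures that have no stored meaning.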
In an optional embodiment, the apparatus further comprises: a camera detection module configured to monitor the target object through the camera; and a voice prompt module configured to send out voice information prompting the target object to perform a somatosensory action.
That is, before the somatosensory action is acquired, the target object needs to be detected through the camera, after which a voice prompt is issued instructing the target object to perform a somatosensory action.
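This monitor-then-prompt flow might be sketched as follows, where `detect_person` and `speak` are hypothetical stand-ins for the camera detection module and the voice prompt module (the patent does not name concrete APIs):

```python
import time

def wait_and_prompt(detect_person, speak, poll_interval=0.5, timeout=10.0):
    """Poll the camera until a target object appears, then prompt by voice.

    detect_person: callable returning True when a person is in view (assumed).
    speak: callable that plays a voice prompt (assumed).
    Returns True if a prompt was issued, False if the timeout elapsed.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if detect_person():
            speak("Please perform a somatosensory action")
            return True
        time.sleep(poll_interval)
    return False

# Usage with stub callables:
print(wait_and_prompt(lambda: True, lambda msg: None))  # True
```

Using `time.monotonic()` for the deadline makes the timeout robust against wall-clock adjustments during the polling loop.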
It should be noted that the above modules may be implemented by software or hardware; for the latter, this may be achieved in, but is not limited to, the following ways: the modules are all located in the same processor, or the modules are located in different processors in any combination.
Example 3
Embodiments of the present invention also provide a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
S1, acquiring a target somatosensory action of the target object through the arranged somatosensory camera;
S2, determining a target control command corresponding to the target somatosensory action;
and S3, controlling the smart device according to the target control command.
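Steps S1 to S3 can be sketched as one control pass. The camera and device interfaces below are hypothetical stand-ins, not APIs defined by the patent:

```python
# Sketch of steps S1-S3 as a single control pass. All interfaces here are
# illustrative assumptions.

class FakeSomatosensoryCamera:
    """Stand-in for the embedded somatosensory camera (assumption)."""
    def acquire_action(self):
        return "first_limb_action"   # pretend skeleton tracking recognized this

class SmartDevice:
    """Stand-in smart device that records executed commands (assumption)."""
    def __init__(self):
        self.executed = []
    def execute(self, command):
        self.executed.append(command)

def control_pass(camera, device, command_table):
    action = camera.acquire_action()       # S1: acquire the somatosensory action
    command = command_table.get(action)    # S2: look up the control command
    if command is not None:
        device.execute(command)            # S3: control the smart device
    return command

device = SmartDevice()
table = {"first_limb_action": "power_on"}
print(control_pass(FakeSomatosensoryCamera(), device, table))  # power_on
```

In practice this pass would run inside a loop driven by the camera's frame rate, with the lookup table populated from the pre-stored correspondence described above.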
Optionally, in this embodiment, the storage medium may include, but is not limited to: a USB flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, an optical disk, or any other medium that can store a computer program.
Example 4
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured, through the computer program, to perform the following steps:
S1, acquiring a target somatosensory action of the target object through the arranged somatosensory camera;
S2, determining a target control command corresponding to the target somatosensory action;
and S3, controlling the smart device according to the target control command.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments and optional implementation manners, and this embodiment is not described herein again.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general-purpose computing device. They may be centralized on a single computing device or distributed across a network of multiple computing devices. Optionally, they may be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by a computing device; in some cases, the steps shown or described may be performed in an order different from that described herein. Alternatively, they may be separately fabricated as individual integrated circuit modules, or multiple of them may be fabricated as a single integrated circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A device control method, applied to a smart device, the method comprising:
acquiring a target somatosensory action of a target object through an arranged somatosensory camera;
determining a target control command corresponding to the target somatosensory action;
and controlling the smart device according to the target control command.
2. The method of claim 1, wherein acquiring the target somatosensory action of the target object through the arranged somatosensory camera comprises:
acquiring a plurality of RGB color images and a plurality of depth images collected by the somatosensory camera;
performing skeleton tracking according to the RGB color images and the depth images to obtain a skeleton image;
and determining the target somatosensory action according to the skeleton image.
3. The method of claim 2, wherein the somatosensory camera comprises an RGB color camera and a depth camera, the RGB color camera being configured to collect the RGB color images and the depth camera being configured to collect the depth images.
4. The method of claim 1, wherein determining the target control command corresponding to the target somatosensory action comprises:
determining the target control command corresponding to the target somatosensory action according to a pre-stored correspondence between somatosensory actions and control commands.
5. The method of claim 4, wherein determining the target control command corresponding to the target somatosensory action according to the pre-stored correspondence between somatosensory actions and control commands comprises at least one of:
if the target somatosensory action is a first limb action, determining, according to the correspondence, that the target control command is a power-on command;
if the target somatosensory action is a second limb action, determining, according to the correspondence, that the target control command is a power-off command;
if the target somatosensory action is a third limb action, determining, according to the correspondence, that the target control command is a temperature control command;
if the target somatosensory action is a fourth limb action, determining, according to the correspondence, that the target control command is a mode control command;
and if the target somatosensory action is a fifth limb action, determining, according to the correspondence, that the target control command is a humidity control command.
6. The method according to any one of claims 1 to 5, wherein before acquiring the target somatosensory action of the target object by the arranged somatosensory camera, the method further comprises:
monitoring the target object through the camera;
and sending out voice information for prompting the somatosensory action.
7. A device control apparatus, applied to a smart device, comprising:
an obtaining module configured to acquire a target somatosensory action of a target object through an arranged somatosensory camera;
a determining module configured to determine a target control command corresponding to the target somatosensory action;
and a control module configured to control the smart device according to the target control command.
8. The apparatus of claim 7, wherein the obtaining module comprises:
the acquisition unit is used for acquiring a plurality of RGB color images and a plurality of depth images collected by the somatosensory camera;
the skeleton tracking unit is used for carrying out skeleton tracking according to the RGB color images and the depth images to obtain a skeleton image;
and the determining unit is used for determining the target somatosensory motion according to the skeleton image.
9. A computer-readable storage medium, in which a computer program is stored, wherein the computer program is configured to carry out the method of any one of claims 1 to 6 when executed.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 6.
CN202011556721.2A 2020-12-24 2020-12-24 Equipment control method and device Pending CN112580551A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011556721.2A CN112580551A (en) 2020-12-24 2020-12-24 Equipment control method and device


Publications (1)

Publication Number Publication Date
CN112580551A 2021-03-30

Family

ID=75139744

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011556721.2A Pending CN112580551A (en) 2020-12-24 2020-12-24 Equipment control method and device

Country Status (1)

Country Link
CN (1) CN112580551A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103399637A (en) * 2013-07-31 2013-11-20 西北师范大学 Man-computer interaction method for intelligent human skeleton tracking control robot on basis of kinect
CN103760976A (en) * 2014-01-09 2014-04-30 华南理工大学 Kinect based gesture recognition smart home control method and Kinect based gesture recognition smart home control system


Non-Patent Citations (1)

Title
AN Yining (安一宁): "Application of Internet of Things Technology in the Smart Home Field" (《物联网技术在智能家居领域的应用》), 31 August 2020, Tianjin People's Publishing House *

Similar Documents

Publication Publication Date Title
CN107358007B Method and apparatus for controlling a smart home system, and computer-readable storage medium
CN106406119B (en) Service robot based on interactive voice, cloud and integrated intelligent Household monitor
CN106254848A (en) A kind of learning method based on augmented reality and terminal
CN110495819A (en) Control method, robot, terminal, server and the control system of robot
US11375559B2 (en) Communication connection method, terminal device and wireless communication system
CN107813306B (en) Robot and motion control method and device thereof
CN107682236B (en) Intelligent household interaction system and method based on computer image recognition
JP2020021200A (en) Communication system and communication method
CN109839827B (en) Gesture recognition intelligent household control system based on full-space position information
CN109901698A (en) A kind of intelligent interactive method, wearable device and terminal and system
CN110875944B (en) Communication connection method, device, terminal equipment and wireless communication system
CN107231476A (en) Mobile terminal and its scene mode setting method, device
CN104122999A (en) Intelligent device interaction method and system
CN108548267B (en) Air conditioner control method and user terminal
CN107168182A (en) A kind of system and method for Indoor Robot VR applications
CN114967490A (en) Household scene control method and device, intelligent door lock and control system
CN107070701A (en) The control method and device of internet of things home appliance equipment
CN112121406A (en) Object control method and device, storage medium and electronic device
CN112580551A (en) Equipment control method and device
CN109976169B (en) Internet television intelligent control method and system based on self-learning technology
CN116483199A (en) Method and device for sending control instruction, storage medium and electronic equipment
CN114143521A (en) Game projection method, projector and storage medium
CN112764349A (en) Clothes hanger control method, clothes hanger, system and storage medium
CN108351681A (en) Electronic equipment and its bracket
CN105302310B (en) A kind of gesture identifying device, system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210330
