CN108536027B - Intelligent home control method and device and server - Google Patents


Info

Publication number
CN108536027B
CN108536027B (grant of application CN201810287425.3A)
Authority
CN
China
Prior art keywords
current user
face
preset features
judging
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810287425.3A
Other languages
Chinese (zh)
Other versions
CN108536027A (en)
Inventor
杨锐
崔磊
Current Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN201810287425.3A
Publication of CN108536027A
Application granted
Publication of CN108536027B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 15/00 - Systems controlled by a computer
    • G05B 15/02 - Systems controlled by a computer electric
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 - Programme-control systems
    • G05B 19/02 - Programme-control systems electric
    • G05B 19/418 - Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/161 - Detection; Localisation; Normalisation
    • G06V 40/166 - Detection; Localisation; Normalisation using acquisition arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/168 - Feature extraction; Face representation
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 2219/00 - Program-control systems
    • G05B 2219/20 - Pc systems
    • G05B 2219/26 - Pc applications
    • G05B 2219/2642 - Domotique, domestic, home control, automation, smart house
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The invention provides an intelligent home control method, an intelligent home control device and a server. The method comprises the following steps: acquiring a facial image of the current user; extracting preset features from the facial image, and determining the current user's position and/or face orientation from the extracted features, where the preset features are obtained by training on a plurality of face pictures, including pictures that indicate a user's position and/or face orientation; and controlling the operation of the smart home according to the current user's position and/or face orientation. In the embodiments of the invention, because the user's position and/or face orientation is determined from preset features trained on many face pictures, the smart home can be controlled according to where the user is and which way the user is facing, enabling finer-grained control.

Description

Intelligent home control method and device and server
Technical Field
The invention relates to the field of the internet, and in particular to an intelligent home control method, an intelligent home control device and a server.
Background
The rapid development of technology brings ever more convenience to people's lives, and smart homes have become increasingly popular. A smart home can make life at home more comfortable and pleasant while also saving energy and protecting the environment, and is therefore more and more favored.
In the prior art, a user can be identified through a face recognition device or a fingerprint recognition device, and some control over the smart home can be exercised through a wireless network. However, prior art solutions generally only identify the user and permit control once the user is confirmed to have the corresponding authority; they do not further determine the user's position and/or face orientation, nor control the operation of the smart home according to that position and/or face orientation.
Disclosure of Invention
Embodiments of the present invention provide an intelligent home control method, an intelligent home control device and a server, which are intended to solve at least one or more of the technical problems in the prior art, or at least to provide a beneficial alternative.
In a first aspect, an embodiment of the present invention provides an intelligent home control method, including: acquiring a face image of a current user;
extracting preset features of the facial image, and judging the position and/or face orientation of the current user according to the extracted preset features; the preset features are obtained by training a plurality of face pictures, and the face pictures comprise a plurality of pictures indicating the position and/or face orientation of a user; and
controlling the operation of the smart home according to the position and/or face orientation of the current user.
With reference to the first aspect, in a first implementation manner of the first aspect, after the acquiring the face image of the current user, the method further includes:
judging whether the brightness of the current environment is smaller than a first threshold value; and
controlling a lighting device in the smart home to be turned on to reacquire the facial image of the current user when it is determined that the brightness of the current environment is less than the first threshold; and
the extracting preset features of the facial image and judging the position and/or face orientation of the current user according to the extracted preset features comprises:
extracting preset features of the re-acquired face image, and judging the position and/or face orientation of the current user according to the extracted preset features.
With reference to the first aspect, in a second implementation manner of the first aspect, the determining the position and/or the face orientation of the current user according to the extracted preset features includes:
inputting the extracted preset features into a preset classifier, and judging the position and/or face orientation of the current user; the preset classifier is obtained by training a plurality of face pictures indicating the position and/or face orientation of a user.
With reference to the first aspect, in a third implementation manner of the first aspect, the acquiring a face image of a current user includes:
acquiring at least one facial image of a current user at predetermined time intervals;
the extracting preset features of the facial image and judging the position and/or face orientation of the current user according to the extracted preset features comprises:
extracting preset features of each face image in the at least one face image, and judging the traveling speed and/or direction of the current user according to the extracted preset features; and
The controlling the operation of the smart home according to the position and/or the face orientation of the current user includes:
controlling the operation of the smart home according to the traveling speed and/or direction of the current user.
With reference to the third implementation manner of the first aspect, the controlling the operation of the smart home according to the traveling speed and/or the direction of the current user includes:
controlling the lighting equipment in the smart home to be turned on or off according to the traveling speed and/or direction of the current user; or
controlling the sound playing equipment in the smart home to be turned on or off, or its volume to be adjusted, according to the traveling speed and/or direction of the current user.
In a second aspect, an embodiment of the present invention provides an intelligent home control device, including:
the acquisition module is used for acquiring a face image of a current user;
the judging module is used for extracting preset features of the facial image and judging the position and/or face orientation of the current user according to the extracted preset features; the preset features are obtained by training a plurality of face pictures, and the face pictures comprise a plurality of pictures indicating the position and/or face orientation of a user; and
the control module is used for controlling the operation of the smart home according to the position and/or face orientation of the current user.
In a third aspect, an embodiment of the present invention provides a server, where the server includes:
one or more processors;
a storage configured to store one or more programs;
a communication interface configured to enable the processor and the storage device to communicate with an external device;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of the first aspect as described above.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium configured to store computer software instructions for the smart home control apparatus described above, where the computer software instructions include a program for causing the smart home control apparatus to execute the smart home control method of the first aspect.
One of the above technical solutions has the following advantages or beneficial effects: in the embodiments of the invention, the position and/or face orientation of the current user is determined by extracting preset features from the user's facial image, and the preset features are obtained by training on a plurality of face pictures, so that the operation of the smart home can be controlled according to the user's position and/or face orientation, achieving finer-grained control.
The foregoing summary is provided for the purpose of description only and is not intended to be limiting in any way. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features of the present invention will be readily apparent by reference to the drawings and following detailed description.
Drawings
In the drawings, like reference numerals refer to the same or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily to scale. It is appreciated that these drawings depict only some embodiments in accordance with the disclosure and are therefore not to be considered limiting of its scope.
Fig. 1 is a flowchart of a smart home control method according to an embodiment of the present invention;
fig. 2 is a flowchart of a smart home control method according to another embodiment of the present invention;
fig. 3 is a flowchart of a smart home control method according to another embodiment of the present invention;
fig. 4 is a schematic structural diagram of an intelligent home control device according to another embodiment of the invention;
fig. 5 is a schematic structural diagram of an intelligent home control device according to another embodiment of the invention;
fig. 6 is a schematic structural diagram of a server according to another embodiment of the present invention.
Detailed Description
In the following, only certain exemplary embodiments are briefly described. As those skilled in the art will recognize, the described embodiments may be modified in various different ways, all without departing from the spirit or scope of the present invention. Accordingly, the drawings and description are to be regarded as illustrative in nature, and not as restrictive.
Fig. 1 shows a flowchart of a smart home control method 100 according to an embodiment of the present invention. It should be noted that the method 100 may be applied in a standalone device, or may be embedded in a smart home device such as a smart speaker or a smart air conditioner, as long as the device has an image acquisition component and can perform face recognition. As shown in fig. 1, the method 100 may include:
s110: acquiring a face image of a current user;
s120: extracting preset features of the facial image, and judging the position and/or face orientation of the current user according to the extracted preset features;
here, the preset features may be obtained by training a plurality of face pictures, and the plurality of face pictures should include a plurality of pictures indicating the position and/or face orientation of the user. It will be appreciated that the preset features thus obtained better represent the user's position and/or face orientation.
The training mentioned in the embodiments of the present invention is a machine learning method: model parameters are obtained from a large amount of data through training. When training on a plurality of face pictures, neural network techniques may be used; in particular, the training may be performed with a convolutional neural network or, preferably, a deep convolutional neural network. Convolutional neural networks are widely used for image and video recognition, recommendation systems, and natural language processing; the embodiments of the invention use a convolutional neural network to perform face recognition on the acquired face image. A convolutional neural network mainly comprises convolutional layers, activation functions, pooling layers, and fully-connected layers. Common activation functions include sigmoid, tanh and ReLU, and can be selected according to the actual situation. The purpose of pooling is to reduce the dimensionality of the input, which in embodiments of the present invention is a facial image; commonly used pooling techniques include max-pooling, min-pooling, and average-pooling.
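As a concrete illustration of the building blocks just listed (convolution, activation, pooling), the following is a minimal pure-Python sketch. It is not part of the patent: a real system would use a deep-learning framework with learned kernels, and a much larger input image.

```python
# Toy versions of three CNN building blocks named above, operating on a
# small grayscale "image" represented as a 2-D list of floats.

def conv2d(image, kernel):
    """Valid 2-D convolution (really cross-correlation, as in most
    deep-learning libraries) of a 2-D list by a 2-D kernel."""
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    return [[sum(image[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(ow)] for i in range(oh)]

def relu(feature_map):
    """ReLU activation: clamp negative values to zero."""
    return [[max(0.0, v) for v in row] for row in feature_map]

def max_pool(feature_map, size=2):
    """Non-overlapping max-pooling, reducing each spatial dimension."""
    return [[max(feature_map[i + a][j + b]
                 for a in range(size) for b in range(size))
             for j in range(0, len(feature_map[0]) - size + 1, size)]
            for i in range(0, len(feature_map) - size + 1, size)]

# A 5x5 toy "face image" and a 2x2 edge-like kernel (made up for the demo).
image = [[float((i + j) % 4) for j in range(5)] for i in range(5)]
kernel = [[1.0, -1.0], [-1.0, 1.0]]

pooled = max_pool(relu(conv2d(image, kernel)))
print(pooled)  # a 2x2 feature map
```

A trained network would stack many such layers, followed by fully-connected layers, and learn the kernel values from the face pictures rather than fixing them by hand.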
In a specific implementation, the number of layers of the convolutional neural network can be chosen as needed; the network need not contain only the four kinds of layers listed above.
In the embodiments of the invention, the features of the user's facial image obtained through training may be called facial features. In particular, to control the smart home more conveniently, it is preferable to extract features related to the user's current position and/or face orientation; for example, the face orientation may be represented by the angle between the user's face and the camera, the angle of the user's pupils, or a physical direction. Typically, the pupil angle represents the orientation of the user's eyes and thus indicates the direction of the user's gaze. Once the user's gaze direction is determined, the smart home device lying in that direction can be controlled.
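To make the gaze-based selection concrete, here is a hypothetical sketch: given the user's position and an estimated face-orientation angle, it picks the device lying in the gaze direction. The coordinate and angle conventions (yaw 0 degrees along +x, measured counter-clockwise) and the 15-degree tolerance are assumptions for illustration, not specified by the patent.

```python
import math

def device_in_gaze(user_pos, yaw_deg, devices, tolerance_deg=15.0):
    """Return the name of the first device whose bearing from the user
    lies within `tolerance_deg` of the estimated face orientation.
    Conventions (assumed): yaw 0 deg = +x axis, counter-clockwise;
    positions are (x, y) room coordinates."""
    for name, (dx, dy) in devices.items():
        bearing = math.degrees(math.atan2(dy - user_pos[1], dx - user_pos[0]))
        # Smallest absolute angular difference, wrapped to [-180, 180].
        diff = abs((bearing - yaw_deg + 180.0) % 360.0 - 180.0)
        if diff <= tolerance_deg:
            return name
    return None

# Hypothetical device layout.
devices = {"tv": (5.0, 0.0), "speaker": (0.0, 5.0)}
print(device_in_gaze((0.0, 0.0), 90.0, devices))  # user looking along +y
```

In practice the yaw angle would come from the trained feature extractor rather than being given directly.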
The smart home control system may be trained on face pictures of preset users. A preset user may be the owner of the house, or a person authorized by the owner to control the smart home. It can be understood that, in addition to controlling the smart home, the system can also verify the identity of the current user. Specifically, the features extracted from the current user's face image may be compared with those extracted from the preset user's face image to determine their similarity; when the similarity exceeds a certain threshold, the current user is determined to be the preset user. If the current user is not a preset user, the preset user can be notified or an alarm raised.
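The identity check described above can be sketched as a feature-similarity comparison. Cosine similarity and the 0.8 threshold below are illustrative assumptions; the patent does not fix a particular metric or threshold.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def is_preset_user(current_features, preset_features, threshold=0.8):
    """Similarity above the (assumed) threshold counts as a match."""
    return cosine_similarity(current_features, preset_features) > threshold

# Made-up 3-D feature vectors; real face embeddings are much longer.
owner = [0.9, 0.1, 0.4]
visitor = [0.1, 0.9, 0.2]
if not is_preset_user(visitor, owner):
    print("unknown user: notify the preset user or raise an alarm")
```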
In a preferred embodiment of the present invention, determining the current user position and/or face orientation according to the extracted preset features may include: inputting the extracted preset features into a preset classifier, and judging the position and/or face orientation of the current user; the preset classifier is obtained by training a plurality of face pictures indicating the position and/or face orientation of a user.
Classifiers are often used in data mining. Their inputs are usually feature vectors, and their outputs are usually numerical values as well, each value indicating a different class. In particular, the classifier can also be generated using a neural network, such as a convolutional neural network or a deep convolutional neural network.
Here, the position of the user may be the user's coordinate position in the room, in which case a coordinate system must be established: for example, with the east-west direction as the abscissa, the north-south direction as the ordinate, and the southwest corner of the room as the origin. Alternatively, the position of the user may be relative to each electronic device in the smart home; in this case, the relative position of the user and the device executing the method 100 can be determined through face recognition, from which the user's position relative to the other smart home devices can be derived. Furthermore, it will be appreciated that the orientation of the user is also relative and can be set as needed.
S130: and controlling the operation of the smart home according to the position and/or face orientation of the current user.
Nowadays, a smart home includes a variety of devices, such as smart speakers, smart televisions, and smart air conditioners. In particular, traditional household equipment can be upgraded and retrofitted so that it, too, can be controlled; for example, electronics may be fitted to the rails of a window covering to control its opening and closing. The devices in a smart home can be connected through a wireless network so that control can be carried out conveniently.
Controlling the operation of the smart home according to the current user's position and/or face orientation may include: turning the lighting device closest to the user on and/or off according to the user's orientation and/or position; and turning the smart speaker on or off, or adjusting its volume, according to the user's orientation and/or position. It can be understood that the smart home can also be controlled by acquiring a whole-body image of the user and recognizing the user's expressions, actions, and the like.
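Selecting "the lighting device closest to the user" can be sketched as a nearest-neighbour lookup over device coordinates, using the room coordinate system suggested above (southwest corner as origin). Device names and positions are made up for illustration.

```python
import math

def nearest_device(user_pos, devices):
    """Pick the device closest to the user by Euclidean distance.
    Positions are (x, y) room coordinates in metres."""
    return min(devices, key=lambda name: math.dist(user_pos, devices[name]))

# Hypothetical lighting devices and their positions.
lights = {"ceiling": (2.0, 2.0), "desk_lamp": (4.5, 1.0), "floor_lamp": (0.5, 3.5)}
print(nearest_device((4.0, 1.0), lights))  # → 'desk_lamp'
```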
In some cases, for example when the light is dim, it is difficult to accurately determine the user's position and/or face orientation through face recognition. Therefore, another embodiment of the present invention provides a smart home control method 200, shown in fig. 2. The method 200 includes:
s210: acquiring a face image of a current user;
s220: judging whether the brightness of the current environment is smaller than a first threshold value;
in particular, the determination of the brightness of the current environment may also be performed using face recognition techniques. For example, features of the acquired face image may be extracted and compared with features extracted for pictures above a first threshold brightness, and if the comparison result does not match, it is determined that the brightness of the current environment is below the first threshold. Here, the extracted features are obtained by training for a plurality of face pictures taken at a luminance higher than a first threshold. Alternatively, the extracted features may be input to a luminance classifier by using the luminance classifier obtained through training, so as to obtain a classification result of luminance. In particular, the luminance may be divided into several levels as the output of the classifier. In this case, the brightness of the illumination device may be adjusted according to the brightness level output by the classifier.
S230: controlling a lighting device in the smart home to be turned on to reacquire a facial image of the current user when it is determined that the brightness of the current environment is less than a first threshold;
it is to be understood that the illumination device referred to herein may be the illumination device closest to the device performing the method 200. In particular, if the device performing the method 200 is a lighting device, the lighting device may be controlled to turn on.
S240: extracting preset features of the re-acquired face image, and judging the position and/or face orientation of the current user according to the extracted preset features;
s250: and controlling the operation of the smart home according to the position and/or face orientation of the current user.
The execution of S240 and S250 is similar to that of S120 and S130 and is not repeated here.
Nowadays smart home devices can be used in every corner of the house: they can be installed in the living room, bedroom, kitchen, bathroom, and so on. In this case, the smart home can be controlled according to the user's traveling speed and/or direction, improving the user experience.
Therefore, as shown in fig. 3, in a preferred embodiment of the present invention, a smart home control method 300 is provided, where the method 300 may include:
s310: acquiring at least one facial image of a current user at predetermined time intervals;
if the traveling speed and/or direction of the user is expected to be obtained, the acquired face image needs to be periodically updated, and in the embodiment of the invention, the face image of the current user is acquired at preset time intervals. The predetermined time interval here may be set according to actual needs, for example, may be set to one minute, and may be set to be longer or shorter. It can be understood that if the predetermined time interval is set to be shorter, the change of the position and/or direction of the user can be detected more timely, and then the traveling speed and/or direction of the user can be obtained, but the calculation amount is increased, and the response time is increased; if the predetermined time interval is set to be longer, the amount of calculation is reduced, and the response is accelerated, but the user may not be able to respond to the change in the position and/or direction of the user in time and cannot keep up with the progress of the user. Can be flexibly changed according to the requirement and can be adjusted at any time.
S320: extracting preset features of each face image in the at least one face image, and judging the traveling speed and/or direction of the current user according to the extracted preset features;
by extracting the features of each facial image, the user position and/or the face orientation corresponding to each facial image can be judged, and further the traveling speed and/or the direction of the user can be judged.
S330: and controlling the operation of the smart home according to the traveling speed and/or direction of the current user.
In one specific example, lighting devices in the smart home may be turned on or off according to the determined traveling speed and/or direction of the current user; or playback devices in the smart home may be turned on or off, have their volume adjusted, or even switch what they are playing. In particular, the smart speaker may be turned off when it is determined that the user is traveling away from it.
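The "user is traveling away from the smart speaker" test can be sketched with a dot product between the user's travel direction and the vector from the user to the device: a negative value means the user is moving away. This decision rule is an illustrative assumption, not spelled out in the patent.

```python
def moving_away(user_pos, direction, device_pos):
    """True when the user's travel direction points away from the
    device, i.e. the dot product of the travel direction with the
    user-to-device vector is negative."""
    to_dev = (device_pos[0] - user_pos[0], device_pos[1] - user_pos[1])
    return direction[0] * to_dev[0] + direction[1] * to_dev[1] < 0

# Hypothetical scene: speaker at the origin, user 2 m east, walking east.
speaker = (0.0, 0.0)
user = (2.0, 0.0)
if moving_away(user, (1.0, 0.0), speaker):
    print("turn off the smart speaker")
```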
By executing the method 300, the operation of the smart home can be controlled more flexibly, and the user experience is enhanced.
Fig. 4 shows a schematic structural diagram of an intelligent home control device 400 according to another embodiment of the present invention, and as shown in fig. 4, the device 400 includes:
an obtaining module 410, configured to obtain a face image of a current user;
a judging module 420, configured to extract preset features of the facial image, and judge a position and/or a face orientation of the current user according to the extracted preset features; the preset features are obtained by training a plurality of face pictures, and the face pictures comprise a plurality of pictures indicating the position and/or face orientation of a user; and
a control module 430, configured to control the operation of the smart home according to the current user's position and/or face orientation.
In another embodiment of the present invention, as shown in fig. 5, there is provided a smart home control apparatus 500, where the apparatus 500 may include:
an obtaining module 510, configured to obtain a face image of a current user;
a brightness judging module 520, configured to judge whether the brightness of the current environment is smaller than a first threshold;
an image updating module 530, configured to control an illumination device in the smart home to turn on and instruct the obtaining module to re-obtain the face image of the current user when the brightness judging module determines that the brightness of the current environment is smaller than a first threshold;
a determining module 540, configured to extract preset features of the re-acquired face image, and determine a position and/or a face orientation of the current user according to the extracted preset features; the preset features are obtained by training a plurality of face pictures, and the face pictures comprise a plurality of pictures indicating the position and/or face orientation of a user; and
a control module 550, configured to control the operation of the smart home according to the current user's position and/or face orientation.
Preferably, the determining module 420 or 540 is further configured to: inputting the extracted preset features into a preset position and/or face orientation classifier, and judging the position and/or face orientation of the current user; the preset position and/or face orientation classifier is obtained by training a plurality of face pictures indicating the position and/or face orientation of a user.
In a preferred embodiment of the present invention, the obtaining module 410 may be further configured to obtain at least one facial image of the current user at predetermined time intervals;
the judging module 420 may be further configured to extract a preset feature of each of the at least one facial image, and judge the traveling speed and/or direction of the current user according to the extracted preset features; and
the control module 430 may be further configured to control the operation of the smart home according to the traveling speed and/or direction of the current user.
In this embodiment, preferably, the control module 430 may be further configured to:
controlling the lighting equipment in the smart home to be turned on or off according to the traveling speed and/or direction of the current user; or
controlling the sound playing equipment in the smart home to be turned on or off, or its volume to be adjusted, according to the traveling speed and/or direction of the current user.
Fig. 6 shows a schematic structural diagram of a server according to another embodiment of the present invention. As shown in fig. 6, the server includes:
one or more processors 610;
a storage 620 configured to store one or more programs;
a communication interface 630 configured to enable the processor 610 and the storage 620 to communicate with external devices;
when the one or more programs are executed by the one or more processors 610, the one or more processors 610 are enabled to implement any one of the aforementioned smart home control methods.
According to another embodiment of the present invention, there is provided a computer-readable storage medium storing a computer program, which when executed by a processor, implements any one of the aforementioned smart home control methods.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a separate product, may also be stored in a computer readable storage medium. The storage medium may be a read-only memory, a magnetic or optical disk, or the like.
The above description is only for the specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive various changes or substitutions within the technical scope of the present invention, and these should be covered by the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (12)

1. A smart home control method, characterized by comprising:
acquiring a face image of a current user;
extracting preset features of the facial image, and judging the position and face orientation of the current user according to the extracted preset features, wherein the preset features are obtained by training on a plurality of face pictures, and the face pictures comprise a plurality of pictures indicating the position and face orientation of a user; and
controlling the operation of the smart home according to the position and the face orientation of the current user.
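The three steps of claim 1 can be sketched as a pipeline. The feature extractor and classifier below are stubs standing in for implementations the claim deliberately leaves open; the room and orientation labels are hypothetical.

```python
# Minimal sketch of the claim-1 pipeline: acquire -> extract preset features ->
# judge position and face orientation -> control. Every component is a stub
# standing in for a learned model the claim does not fix.

def extract_features(image):
    # Stand-in: a real system would compute learned facial features here.
    return [float(sum(row)) for row in image]

def judge_position_and_orientation(features):
    # Stand-in classifier: returns (position, face_orientation).
    if features and features[0] > 0:
        return ("living_room", "facing_tv")
    return ("unknown", "unknown")

def control_home(position, orientation):
    return f"activate devices near {position} for user {orientation}"

def run_pipeline(image):
    features = extract_features(image)
    position, orientation = judge_position_and_orientation(features)
    return control_home(position, orientation)
```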
2. The method of claim 1, wherein after said obtaining the image of the face of the current user, the method further comprises:
judging whether the brightness of the current environment is smaller than a first threshold value; and
controlling a lighting device in the smart home to be turned on to reacquire the facial image of the current user when it is determined that the brightness of the current environment is less than the first threshold; and
the extracting preset features of the facial image and judging the position and face orientation of the current user according to the extracted preset features comprises:
extracting preset features of the reacquired face image, and judging the position and face orientation of the current user according to the extracted preset features.
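The brightness check of claim 2 can be sketched as a guard around image acquisition. The brightness threshold and grayscale representation are assumptions; the claim only fixes "less than a first threshold".

```python
# Sketch of the claim-2 flow: if the scene is too dark, turn on a light and
# reacquire the face image before feature extraction. The threshold value is
# an assumed stand-in for the claim's unspecified "first threshold".

FIRST_THRESHOLD = 40  # assumed brightness floor on a 0-255 grayscale

def mean_brightness(gray_image):
    pixels = [p for row in gray_image for p in row]
    return sum(pixels) / len(pixels)

def acquire_usable_image(capture, turn_on_light):
    """capture() returns a grayscale image; turn_on_light() switches a lamp on."""
    image = capture()
    if mean_brightness(image) < FIRST_THRESHOLD:
        turn_on_light()
        image = capture()  # reacquire under better lighting
    return image
```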
3. The method of claim 1, wherein the determining the position and the face orientation of the current user according to the extracted preset features comprises:
inputting the extracted preset features into a preset classifier, and judging the position and face orientation of the current user; the preset classifier is obtained by training a plurality of face pictures indicating the position and face orientation of a user.
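Claim 3 leaves the form of the preset classifier open. A nearest-centroid classifier over labeled feature vectors is one minimal stand-in that illustrates the train-on-labeled-pictures / predict-on-new-features contract; the labels below are hypothetical.

```python
# Minimal stand-in for the preset classifier of claim 3: nearest-centroid over
# feature vectors labeled with a (position, face orientation) class. A deployed
# system would use a learned model; this only shows the train/predict contract.

def train_classifier(samples):
    """samples: list of (feature_vector, label). Returns label -> centroid."""
    sums, counts = {}, {}
    for vec, label in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [s / counts[label] for s in acc] for label, acc in sums.items()}

def predict(centroids, vec):
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(c, vec))
    return min(centroids, key=lambda label: dist2(centroids[label]))
```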
4. The method of claim 1, wherein the obtaining the face image of the current user comprises:
acquiring at least one facial image of a current user at predetermined time intervals;
the extracting preset features of the facial image and judging the position and face orientation of the current user according to the extracted preset features comprises:
extracting preset features of each face image in the at least one face image, and judging the traveling speed and direction of the current user according to the extracted preset features; and
the controlling the operation of the smart home according to the position and the face orientation of the current user comprises:
controlling the operation of the smart home according to the traveling speed and the direction of the current user.
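Claim 4 derives traveling speed and direction from face images captured at predetermined intervals. Given the positions judged from two successive images, one plausible estimate is the sketch below; the metric coordinates and angle convention are assumptions.

```python
# Sketch of claim 4: estimate traveling speed and direction from the user
# positions judged in two successive face images a fixed interval apart.
import math

def speed_and_direction(p0, p1, interval_s):
    """p0, p1: (x, y) positions in meters; interval_s: seconds between images."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    speed = math.hypot(dx, dy) / interval_s          # meters per second
    direction = math.degrees(math.atan2(dy, dx))     # 0 deg = +x axis, CCW positive
    return speed, direction
```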
5. The method according to claim 4, wherein the controlling the operation of the smart home according to the traveling speed and the direction of the current user comprises:
controlling the lighting device in the smart home to be turned on or off according to the traveling speed and the direction of the current user; or
controlling the sound playback device in the smart home to be turned on, turned off, or volume-adjusted according to the traveling speed and direction of the current user.
6. A smart home control apparatus, characterized by comprising:
an acquisition module configured to acquire a face image of a current user;
a judging module configured to extract preset features of the face image and judge the position and face orientation of the current user according to the extracted preset features, wherein the preset features are obtained by training on a plurality of face pictures, and the face pictures comprise a plurality of pictures indicating the position and face orientation of a user; and
a control module configured to control the operation of the smart home according to the position and the face orientation of the current user.
7. The apparatus of claim 6, further comprising:
a brightness judging module configured to judge whether the brightness of the current environment is smaller than a first threshold; and
an image updating module configured to control a lighting device in the smart home to turn on and instruct the acquisition module to reacquire the facial image of the current user when the brightness judging module determines that the brightness of the current environment is smaller than the first threshold;
wherein the judging module is further configured to extract the preset features of the reacquired face image and judge the position and face orientation of the current user according to the extracted preset features.
8. The apparatus of claim 6, wherein the determining module is further configured to: inputting the extracted preset features into a preset classifier, and judging the position and face orientation of the current user; the preset classifier is obtained by training a plurality of face pictures indicating the position and face orientation of a user.
9. The apparatus of claim 6, wherein:
the acquisition module is further configured to acquire at least one facial image of the current user at predetermined time intervals;
the judging module is further configured to extract preset features of each face image in the at least one face image and judge the traveling speed and direction of the current user according to the extracted preset features; and
the control module is further configured to control the operation of the smart home according to the traveling speed and the direction of the current user.
10. The apparatus of claim 9, wherein the control module is further configured to:
controlling the lighting device in the smart home to be turned on or off according to the traveling speed and the direction of the current user; or
controlling the sound playback device in the smart home to be turned on, turned off, or volume-adjusted according to the traveling speed and direction of the current user.
11. A server, characterized in that the server comprises:
one or more processors;
a storage configured to store one or more programs;
a communication interface configured to enable the processor and the storage to communicate with an external device;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method recited in any of claims 1-5.
12. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-5.
CN201810287425.3A 2018-03-30 2018-03-30 Intelligent home control method and device and server Active CN108536027B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810287425.3A CN108536027B (en) 2018-03-30 2018-03-30 Intelligent home control method and device and server


Publications (2)

Publication Number Publication Date
CN108536027A CN108536027A (en) 2018-09-14
CN108536027B true CN108536027B (en) 2020-11-03

Family

ID=63483010


Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109725946A (en) * 2019-01-03 2019-05-07 阿里巴巴集团控股有限公司 A kind of method, device and equipment waking up smart machine based on Face datection
CN112083795A (en) * 2019-06-12 2020-12-15 北京迈格威科技有限公司 Object control method and device, storage medium and electronic equipment
CN110941196A (en) * 2019-11-28 2020-03-31 星络智能科技有限公司 Intelligent panel, multi-level interaction method based on angle detection and storage medium
CN115484117A (en) * 2022-08-30 2022-12-16 海尔优家智能科技(北京)有限公司 Call answering method and device, storage medium and electronic device

Family Cites Families (4)

Publication number Priority date Publication date Assignee Title
CN105700363B (en) * 2016-01-19 2018-10-26 深圳创维-Rgb电子有限公司 A kind of awakening method and system of smart home device phonetic controller
CN105843050A (en) * 2016-03-18 2016-08-10 美的集团股份有限公司 Intelligent household system, intelligent household control device and method
CN106569467A (en) * 2016-10-29 2017-04-19 深圳智乐信息科技有限公司 Method for selecting scene based on mobile terminal and system
CN106569410A (en) * 2016-10-29 2017-04-19 深圳智乐信息科技有限公司 Method and system for managing smart home


Similar Documents

Publication Publication Date Title
CN108536027B (en) Intelligent home control method and device and server
CN108919669B (en) Intelligent home dynamic decision method and device and service terminal
WO2021000791A1 (en) Method and apparatus for controlling smart home appliance, control device and storage medium
US20220317641A1 (en) Device control method, conflict processing method, corresponding apparatus and electronic device
EP3857860B1 (en) System and method for disambiguation of internet-of-things devices
US20200133211A1 (en) Electronic device and method for controlling electronic device thereof
CN112053683A (en) Voice instruction processing method, device and control system
CN111128157B (en) Wake-up-free voice recognition control method for intelligent household appliance, computer readable storage medium and air conditioner
CN108509049B (en) Method and system for inputting gesture function
CN111968644B (en) Intelligent device awakening method and device and electronic device
KR102557561B1 (en) Method and system for determining depth of information of an image
US11966317B2 (en) Electronic device and method for controlling same
CN111240217B (en) State detection method and device, electronic equipment and storage medium
CN111798811A (en) Screen backlight brightness adjusting method and device, storage medium and electronic equipment
CN111801650A (en) Electronic device and method of controlling external electronic device based on usage pattern information corresponding to user
CN114253190A (en) Intelligent shower control method and device, intelligent equipment and storage medium
CN110657561B (en) Air conditioner and voice instruction recognition method, control device and readable storage medium thereof
CN114554660B (en) Light control method, device, electronic equipment and storage medium
CN111989917B (en) Electronic device and control method thereof
CN110568770A (en) method for controlling intelligent household equipment and control equipment
KR20080006981A (en) System and method for offering intelligent home service
CN112699731A (en) Air conditioner music intelligent playing method and device based on human behavior recognition and air conditioner
CN110427801A (en) Intelligent home furnishing control method and device, electronic equipment and non-transient storage media
CN112815610A (en) Control method and device for ion generator in household appliance and household appliance
CN108415572A (en) module control method, device and storage medium applied to mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant