CN108536027A - Intelligent home furnishing control method, device and server - Google Patents
- Publication number
- CN108536027A CN108536027A CN201810287425.3A CN201810287425A CN108536027A CN 108536027 A CN108536027 A CN 108536027A CN 201810287425 A CN201810287425 A CN 201810287425A CN 108536027 A CN108536027 A CN 108536027A
- Authority
- CN
- China
- Prior art keywords
- active user
- face
- image
- default feature
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B15/00—Systems controlled by a computer
- G05B15/02—Systems controlled by a computer electric
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/20—Pc systems
- G05B2219/26—Pc applications
- G05B2219/2642—Domotique, domestic, home control, automation, smart house
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
An embodiment of the present invention proposes a smart home control method, device and server. The method includes: obtaining a face image of the current user; extracting preset features of the face image, and determining the position and/or facial orientation of the current user according to the extracted preset features, wherein the preset features are obtained by training on multiple face pictures, and the multiple face pictures include pictures indicating the position and/or facial orientation of a user; and controlling the operation of the smart home according to the position and/or facial orientation of the current user. In the embodiment of the present invention, the position and/or facial orientation of the current user is determined by extracting preset features of the user's face image, and the preset features are obtained by training on multiple face pictures of the user. In this way, the operation of the smart home can be controlled according to the user's position and/or facial orientation, achieving finer-grained control.
Description
Technical field
The present invention relates to the field of Internet technology, and in particular to a smart home control method, device and server.
Background
The rapid development of technology has brought more and more convenience to people's lives. Nowadays, smart home systems have gradually spread into everyday life. A smart home can not only make a household more comfortable, but also help save energy and protect the environment, and has therefore gained increasing popularity.
In the prior art, a user can be identified by face recognition or a fingerprint identification device, and certain control over the smart home can be performed through a wireless network. However, prior-art schemes typically only identify the identity of the user and control the smart home once the user is determined to have control permission; they do not further identify the user's position and/or facial orientation, nor control the operation of the smart home according to the user's position and/or facial orientation.
Summary of the invention
Embodiments of the present invention provide a smart home control method, device and server, at least to solve one or more technical problems in the prior art, or at least to provide a beneficial alternative.
In a first aspect, an embodiment of the present invention provides a smart home control method, including:
obtaining a face image of the current user;
extracting preset features of the face image, and determining the position and/or facial orientation of the current user according to the extracted preset features, wherein the preset features are obtained by training on multiple face pictures, and the multiple face pictures include pictures indicating the position and/or facial orientation of a user; and
controlling the operation of the smart home according to the position and/or facial orientation of the current user.
With reference to the first aspect, in a first embodiment of the first aspect, after obtaining the face image of the current user, the method further includes:
determining whether the brightness of the current environment is below a first threshold; and
when the brightness of the current environment is determined to be below the first threshold, controlling a lighting device in the smart home to turn on, and reacquiring the face image of the current user;
and the extracting of the preset features of the face image, and determining the position and/or facial orientation of the current user according to the extracted preset features, includes:
extracting preset features of the reacquired face image, and determining the position and/or facial orientation of the current user according to the extracted preset features.
With reference to the first aspect, in a second embodiment of the first aspect, determining the position and/or facial orientation of the current user according to the extracted preset features includes:
inputting the extracted preset features into a preset classifier, and determining the position and/or facial orientation of the current user, wherein the preset classifier is obtained by training on multiple face pictures indicating the position and/or facial orientation of a user.
With reference to the first aspect, in a third embodiment of the first aspect, obtaining the face image of the current user includes:
obtaining at least one face image of the current user at a predetermined time interval;
the extracting of the preset features of the face image, and determining the position and/or facial orientation of the current user according to the extracted preset features, includes:
extracting preset features of each face image among the at least one face image, and determining the traveling speed and/or direction of the current user according to the extracted preset features; and
the controlling of the operation of the smart home according to the position and/or facial orientation of the current user includes:
controlling the operation of the smart home according to the traveling speed and/or direction of the current user.
With reference to the third embodiment of the first aspect, controlling the operation of the smart home according to the traveling speed and/or direction of the current user includes:
controlling a lighting device in the smart home to turn on or off according to the traveling speed and/or direction of the current user; or
controlling an audio playing device in the smart home to turn on, turn off, or adjust its volume according to the traveling speed and/or direction of the current user.
In a second aspect, an embodiment of the present invention provides a smart home control device, including:
an acquisition module, configured to obtain a face image of the current user;
a judgment module, configured to extract preset features of the face image and determine the position and/or facial orientation of the current user according to the extracted preset features, wherein the preset features are obtained by training on multiple face pictures, and the multiple face pictures include pictures indicating the position and/or facial orientation of a user; and
a control module, configured to control the operation of the smart home according to the position and/or facial orientation of the current user.
In a third aspect, an embodiment of the present invention provides a server, including:
one or more processors;
a storage device, configured to store one or more programs; and
a communication interface, configured to enable the processors and the storage device to communicate with external equipment;
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method of the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium for storing computer software instructions used by the smart home control device, including a program for executing the smart home control method of the first aspect.
One of the above technical solutions has the following advantage or beneficial effect: in the embodiment of the present invention, the position and/or facial orientation of the current user is determined by extracting preset features of the user's face image, and the preset features are obtained by training on multiple face pictures. In this way, the operation of the smart home can be controlled according to the user's position and/or facial orientation, achieving finer-grained control.
The above summary is provided for the purpose of description only and is not intended to be limiting in any way. In addition to the illustrative aspects, embodiments and features described above, further aspects, embodiments and features of the present invention will be readily apparent by reference to the accompanying drawings and the following detailed description.
Description of the drawings
In the accompanying drawings, unless otherwise specified, the same reference numerals denote the same or similar components or elements throughout the figures. The drawings are not necessarily drawn to scale. It should be understood that these drawings depict only some embodiments disclosed according to the present invention and should not be construed as limiting the scope of the present invention.
Fig. 1 is a flowchart of a smart home control method according to an embodiment of the present invention;
Fig. 2 is a flowchart of a smart home control method according to another embodiment of the present invention;
Fig. 3 is a flowchart of a smart home control method according to another embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a smart home control device according to another embodiment of the present invention;
Fig. 5 is a schematic structural diagram of a smart home control device according to another embodiment of the present invention;
Fig. 6 is a schematic structural diagram of a server according to another embodiment of the present invention.
Detailed description of embodiments
Hereinafter, only certain exemplary embodiments are briefly described. As those skilled in the art will recognize, the described embodiments may be modified in various different ways without departing from the spirit or scope of the present invention. Accordingly, the drawings and description are to be regarded as illustrative rather than restrictive.
Fig. 1 shows a flowchart of a smart home control method 100 according to an embodiment of the present invention. It should be noted that the method 100 may be applied to a standalone device, or may be embedded in certain smart home appliances, for example in devices such as a smart speaker or a smart air conditioner; as long as the device has an image acquisition apparatus, face recognition can be realized. As shown in Fig. 1, the method 100 may include:
S110: obtaining a face image of the current user;
S120: extracting preset features of the face image, and determining the position and/or facial orientation of the current user according to the extracted preset features.
Here, the preset features may be obtained by training on multiple face pictures, and the multiple face pictures should include pictures indicating the position and/or facial orientation of a user. It can be understood that preset features obtained in this way can better reflect the position and/or facial orientation of the user.
The training mentioned in the embodiments of the present invention is a machine learning method in which model parameters are obtained by training on a large amount of data. Moreover, the training on multiple face pictures can be performed using neural network techniques, in particular a convolutional neural network, or preferably a deep convolutional neural network. Convolutional neural networks are widely used in image and video recognition, recommendation systems and natural language processing. The embodiment of the present invention uses a convolutional neural network to perform face recognition on the acquired face image. A convolutional neural network mainly includes: convolutional layers, activation functions, pooling layers and fully connected layers. Commonly used activation functions include sigmoid, tanh and ReLU, which can be selected according to the actual situation. The purpose of pooling is to reduce the dimensionality of the input; in the embodiment of the present invention, the input is a face image. Common pooling techniques include max pooling, min pooling and average pooling.
In a specific implementation, the number of layers of the convolutional neural network can be selected as needed, and need not be limited to the four layer types listed above.
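The pooling step described above can be sketched in plain Python, independent of any deep-learning framework. This is an illustrative minimal example (the 4x4 patch is invented, not from the patent) showing how a 2x2 max or average pooling window halves the spatial dimensions of the input:

```python
def pool2x2(image, mode="max"):
    """Downsample a 2-D grayscale image by taking the max (or average)
    of each non-overlapping 2x2 window, halving each dimension."""
    pooled = []
    for r in range(0, len(image) - 1, 2):
        row = []
        for c in range(0, len(image[0]) - 1, 2):
            window = [image[r][c], image[r][c + 1],
                      image[r + 1][c], image[r + 1][c + 1]]
            row.append(max(window) if mode == "max" else sum(window) / 4)
        pooled.append(row)
    return pooled

# A 4x4 patch is reduced to 2x2, illustrating the dimensionality reduction
patch = [[1, 3, 2, 0],
         [4, 2, 1, 5],
         [0, 6, 3, 3],
         [7, 1, 2, 2]]
print(pool2x2(patch))            # max pooling
print(pool2x2(patch, "average")  # average pooling
      )
```

In a real network this operation would be applied per channel after each convolutional layer; frameworks implement it far more efficiently, but the semantics are the same.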
In the embodiments of the present invention, what is obtained by training are features of the face image of a user, which may be called facial features. In particular, in order to control the smart home more conveniently, it is preferable to extract features related to the current position and/or facial orientation of the user. For example, the facial orientation may be represented by the angle between the user's face and the camera, by the angle of the user's pupils, or by a geographic direction. In general, the pupil angle of a user indicates the direction of the user's eyes, that is, the direction of the user's gaze. When the gaze direction of the user is determined, the smart home appliance in that gaze direction can be controlled.
Here, the training may be performed on face pictures of a preset user. The preset user may be the owner of the house, or a person authorized by the owner to control the smart home. It can be understood that, in this way, in addition to controlling the smart home, the identity of the current user can also be verified. In particular, the features extracted from the face image of the current user may be compared with the features extracted from the face images of the preset user to determine the similarity between the two; when the similarity exceeds a certain threshold, the current user can be determined to be the preset user. If the current user is not the preset user, the preset user may be notified, or an alarm may be raised.
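The similarity comparison described above can be sketched as a cosine-similarity check between feature vectors. The 0.9 threshold, the four-dimensional vectors, and the function names are illustrative assumptions, not values specified by the patent:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def is_preset_user(current_features, preset_features, threshold=0.9):
    """Treat the current user as the preset user when similarity exceeds the threshold."""
    return cosine_similarity(current_features, preset_features) > threshold

enrolled = [0.7, 0.1, 0.7, 0.1]      # features trained from the preset user's pictures
visitor = [0.69, 0.12, 0.71, 0.09]   # features extracted from the current face image
stranger = [0.1, 0.9, 0.0, 0.4]

print(is_preset_user(visitor, enrolled))   # True  -> allow control
print(is_preset_user(stranger, enrolled))  # False -> notify the owner or alarm
```

A production system would use high-dimensional embeddings from the trained network and a threshold tuned on validation data, but the accept/reject logic follows this shape.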
In a preferred embodiment of the present invention, determining the position and/or facial orientation of the current user according to the extracted preset features may include: inputting the extracted preset features into a preset classifier, and determining the position and/or facial orientation of the current user, wherein the preset classifier is obtained by training on multiple face pictures indicating the position and/or facial orientation of a user.
Classifiers are commonly used in data mining. The input is typically a feature vector, and the output is generally also a numerical value, with each value indicating a different category. In particular, a neural network, especially a convolutional neural network or a deep convolutional neural network, can also be used to build the classifier.
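As a stand-in for the preset classifier, the following sketch uses a nearest-centroid rule: the feature vector is assigned to whichever class centroid it is closest to. The class names and centroid values are invented for illustration; the patent's classifier would be a neural network trained on labeled face pictures:

```python
import math

# Hypothetical orientation classes with example centroid feature vectors;
# in practice these would be learned from training data.
CENTROIDS = {
    "facing_camera": [1.0, 0.0],
    "facing_left":   [0.0, 1.0],
    "facing_right":  [0.0, -1.0],
}

def classify_orientation(features):
    """Assign the extracted feature vector to the nearest class centroid."""
    return min(CENTROIDS, key=lambda c: math.dist(features, CENTROIDS[c]))

print(classify_orientation([0.9, 0.1]))   # facing_camera
print(classify_orientation([0.2, -0.8]))  # facing_right
```

Whatever the classifier's internals, its interface matches the text: feature vector in, category (position and/or facial orientation class) out.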
Here, the position of the user may be a coordinate position of the user in the room. In that case a coordinate system needs to be established; for example, a coordinate system may be established with the east-west direction as the abscissa, the north-south direction as the ordinate, and the southwest corner of the room as the origin, and the position of the user obtained in it. The position of the user may also be a position relative to each electronic device in the smart home. In that case, the relative position of the user with respect to the device executing the method 100 can be determined by face recognition, and from it the relative positions of the user with respect to the other smart home appliances can be obtained. Further, it should be understood that the direction of the user is also relative and can be configured as needed.
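The room coordinate system described above can be sketched as follows. The appliance layout and metre units are invented for illustration; the point is that once the user's position is known in the room frame, the distance to every appliance falls out directly:

```python
import math

# Room frame as described in the text: origin at the southwest corner,
# x increasing eastward, y increasing northward (illustrative metres).
appliances = {
    "smart_speaker": (0.5, 2.0),
    "ceiling_light": (2.5, 2.5),
    "air_conditioner": (4.5, 0.5),
}

def relative_positions(user_xy):
    """Distance from the user to each appliance in the room frame."""
    return {name: round(math.dist(user_xy, xy), 2)
            for name, xy in appliances.items()}

print(relative_positions((2.0, 2.0)))
```

The same computation works in the device-relative variant: the device running the method estimates the user's offset from itself, then composes it with the known offsets of the other appliances.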
S130: controlling the operation of the smart home according to the position and/or facial orientation of the current user.
Nowadays, a smart home includes a variety of devices, such as a smart speaker, a smart TV, and a smart air conditioner. In particular, traditional household equipment can also be upgraded so that it can be controlled; for example, an electronic device can be installed on a curtain rail to control the opening or closing of the curtain. The devices in the smart home can be connected through a wireless network so that they can be controlled conveniently.
Controlling the operation of the smart home according to the position and/or facial orientation of the current user may include: instructing the lighting device nearest to the user to turn on and/or off according to the user's orientation and/or position; or controlling the turning on, turning off, and/or volume adjustment of a smart speaker according to the user's orientation and/or position. It can be understood that a whole-body image of the user may also be obtained, and the user's expression, movement, and so on identified to control the smart home.
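The nearest-light rule in the paragraph above can be sketched as follows. The light names and positions are hypothetical, and the orientation is reduced to a single "facing a light" flag to keep the example minimal:

```python
import math

lights = {"entry": (0.0, 0.0), "sofa": (3.0, 2.0), "desk": (5.0, 4.0)}

def nearest_light(user_xy):
    """Pick the lighting device closest to the user's position."""
    return min(lights, key=lambda name: math.dist(user_xy, lights[name]))

def control_lights(user_xy, facing_light=True):
    """Sketch of S130: turn on the nearest light when the user faces it,
    otherwise leave all lights off. Returns the resulting on/off state."""
    target = nearest_light(user_xy)
    return {name: (facing_light and name == target) for name in lights}

print(control_lights((2.5, 2.5)))          # sofa light on
print(control_lights((2.5, 2.5), False))   # user faces away: all off
```

A fuller implementation would replace the boolean with the classifier's orientation output and add smart-speaker volume rules of the same form.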
In some cases, for example when the light is dim, it is difficult to accurately determine the position and/or facial orientation of the user by face recognition. Therefore, in another embodiment of the present invention, as shown in Fig. 2, a smart home control method 200 is provided. The method 200 includes:
S210: obtaining a face image of the current user;
S220: determining whether the brightness of the current environment is below a first threshold.
In particular, the brightness of the current environment can also be judged using face recognition technology. For example, features of the acquired face image can be extracted and compared with features extracted from pictures whose brightness is above the first threshold; if the comparison does not match, the brightness of the current environment is determined to be below the first threshold. Here, the comparison features are obtained by training on multiple face pictures taken at brightness above the first threshold. Alternatively, a brightness classifier obtained by training can be used: the extracted features are input into the brightness classifier to obtain a brightness classification result. In particular, brightness can be divided into several grades that serve as the output of the classifier; in this case, the brightness of the lighting device can be adjusted according to the brightness grade output by the classifier.
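A simple pixel-statistics stand-in for the brightness grading described above is shown below. The grade thresholds (40, 90, 160 on a 0-255 scale) are invented for illustration; the patent's version would be a trained classifier rather than a fixed mean-intensity rule:

```python
def mean_brightness(gray_image):
    """Average pixel intensity (0-255) of a grayscale image given as nested lists."""
    pixels = [p for row in gray_image for p in row]
    return sum(pixels) / len(pixels)

def brightness_grade(gray_image, thresholds=(40, 90, 160)):
    """Map mean brightness to a grade 0 (darkest) .. 3 (brightest).
    Grade 0 would trigger turning on a lighting device (S230)."""
    mean = mean_brightness(gray_image)
    return sum(mean >= t for t in thresholds)

dark = [[10, 20], [15, 25]]        # mean 17.5 -> grade 0, too dark
bright = [[200, 180], [190, 210]]  # mean 195  -> grade 3
print(brightness_grade(dark), brightness_grade(bright))
```

The graded output maps naturally onto dimmable lighting: the lower the grade, the more the lighting device is brightened before reacquiring the face image.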
S230: when the brightness of the current environment is determined to be below the first threshold, controlling a lighting device in the smart home to turn on, and reacquiring the face image of the current user.
It can be understood that the lighting device mentioned here may be the lighting device nearest to the device executing the method 200. In particular, if the device executing the method 200 is itself a lighting device, it can be controlled to turn on.
S240: extracting preset features of the reacquired face image, and determining the position and/or facial orientation of the current user according to the extracted preset features;
S250: controlling the operation of the smart home according to the position and/or facial orientation of the current user.
The execution of S240 and S250 is similar to that of S120 and S130, and is not repeated here.
Today's smart home appliances can be deployed in every corner of a house; the living room, bedroom, kitchen and bathroom can all be equipped with them. In this case, the smart home can be controlled according to the traveling speed and/or direction of the user, improving the user experience.
Therefore, as shown in Fig. 3, in a preferred embodiment of the present invention, a smart home control method 300 is provided. The method 300 may include:
S310: obtaining at least one face image of the current user at a predetermined time interval.
To obtain the traveling speed and/or direction of the user, the acquired face image needs to be updated periodically; in the embodiment of the present invention, face images of the current user are obtained at a predetermined time interval. The predetermined time interval can be set according to actual needs; for example, it can be set to one minute, or longer or shorter. It can be understood that if the predetermined time interval is set shorter, changes in the user's position and/or direction, and hence the user's traveling speed and/or direction, can be detected in a more timely manner, but the amount of computation increases and with it the response time. Conversely, if the predetermined time interval is set longer, the amount of computation is reduced and the response is faster, but changes in the user's position and/or direction may not be responded to in time, failing to keep up with the user's movement. The interval can be chosen flexibly as needed and adjusted at any time.
S320: extracting preset features of each face image among the at least one face image, and determining the traveling speed and/or direction of the current user according to the extracted preset features.
By performing feature extraction on each face image, the user position and/or facial orientation corresponding to each image can be determined, and from these the traveling speed and/or direction of the user can be determined.
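Given the per-image positions from S320, the speed and direction estimate is straightforward. This sketch assumes two consecutive position fixes in room coordinates and a known sampling interval; the values are illustrative:

```python
import math

def speed_and_direction(p1, p2, interval_s):
    """Estimate traveling speed (m/s) and direction (unit vector) from two
    user positions sampled interval_s seconds apart, as in S310/S320."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    distance = math.hypot(dx, dy)
    speed = distance / interval_s
    direction = (dx / distance, dy / distance) if distance else (0.0, 0.0)
    return speed, direction

# Positions inferred from two consecutive face images, 2 seconds apart
speed, heading = speed_and_direction((1.0, 1.0), (3.0, 1.0), 2.0)
print(speed, heading)  # 1.0 (1.0, 0.0): moving east at 1 m/s
```

With more than two samples, averaging or filtering successive estimates would smooth out per-image localization noise.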
S330: controlling the operation of the smart home according to the traveling speed and/or direction of the current user.
In a specific example, a lighting device in the smart home can be turned on or off according to the determined traveling speed and/or direction of the current user; or a playback device in the smart home can be turned on or off, have its volume adjusted, or be switched to video, and so on. Specifically, when it is determined that the user is walking away from the smart speaker, the smart speaker can be turned off.
By executing the method 300, the operation of the smart home can be controlled more flexibly, enhancing the user experience.
Fig. 4 shows a schematic structural diagram of a smart home control device 400 according to another embodiment of the present invention. As shown in Fig. 4, the device 400 includes:
an acquisition module 410, configured to obtain a face image of the current user;
a judgment module 420, configured to extract preset features of the face image and determine the position and/or facial orientation of the current user according to the extracted preset features, wherein the preset features are obtained by training on multiple face pictures, and the multiple face pictures include pictures indicating the position and/or facial orientation of a user; and
a control module 430, configured to control the operation of the smart home according to the position and/or facial orientation of the current user.
In another embodiment of the present invention, as shown in Fig. 5, a smart home control device 500 is provided. The device 500 may include:
an acquisition module 510, configured to obtain a face image of the current user;
a brightness judgment module 520, configured to determine whether the brightness of the current environment is below a first threshold;
an image update module 530, configured to, when the brightness judgment module determines that the brightness of the current environment is below the first threshold, control a lighting device in the smart home to turn on and instruct the acquisition module to reacquire the face image of the current user;
a judgment module 540, configured to extract preset features of the reacquired face image and determine the position and/or facial orientation of the current user according to the extracted preset features, wherein the preset features are obtained by training on multiple face pictures, and the multiple face pictures include pictures indicating the position and/or facial orientation of a user; and
a control module 550, configured to control the operation of the smart home according to the position and/or facial orientation of the current user.
Preferably, the judgment module 420 or 540 is further configured to: input the extracted preset features into a preset position and/or facial orientation classifier, and determine the position and/or facial orientation of the current user, wherein the preset position and/or facial orientation classifier is obtained by training on multiple face pictures indicating the position and/or facial orientation of a user.
In a preferred embodiment of the present invention, the acquisition module 410 may be further configured to obtain at least one face image of the current user at a predetermined time interval;
the judgment module 420 may be further configured to extract preset features of each face image among the at least one face image, and determine the traveling speed and/or direction of the current user according to the extracted preset features; and
the control module 430 may be further configured to control the operation of the smart home according to the traveling speed and/or direction of the current user.
In this embodiment, preferably, the control module 430 may be further configured to:
control a lighting device in the smart home to turn on or off according to the traveling speed and/or direction of the current user; or
control an audio playing device in the smart home to turn on, turn off, or adjust its volume according to the traveling speed and/or direction of the current user.
Fig. 6 shows a schematic structural diagram of a server according to another embodiment of the present invention. As shown in Fig. 6, the device includes:
one or more processors 610;
a storage device 620, configured to store one or more programs; and
a communication interface 630, configured to enable the processors 610 and the storage device 620 to communicate with external equipment;
wherein, when the one or more programs are executed by the one or more processors 610, the one or more processors 610 implement any of the foregoing smart home control methods.
According to another embodiment of the present invention, a computer-readable storage medium is provided, storing a computer program which, when executed by a processor, implements any of the foregoing smart home control methods.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "an example", "a specific example" or "some examples" and the like means that a particular feature, structure, material or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. Moreover, the particular features, structures, materials or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, where no contradiction arises, those skilled in the art may combine features of the different embodiments or examples described in this specification.
In addition, the terms "first" and "second" are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "plurality" means two or more, unless otherwise explicitly and specifically defined.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process. The scope of the preferred embodiments of the present invention includes additional implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order, depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention belong.
The logic and/or steps represented in the flowcharts, or otherwise described herein, may be considered an ordered list of executable instructions for implementing logical functions, and may be embodied in any computer-readable medium for use by, or in connection with, an instruction execution system, apparatus or device (such as a computer-based system, a system including a processor, or another system that can fetch instructions from an instruction execution system, apparatus or device and execute them). For the purposes of this specification, a "computer-readable medium" may be any means that can contain, store, communicate, propagate or transport a program for use by, or in connection with, an instruction execution system, apparatus or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (an electronic device) having one or more wirings, a portable computer diskette (a magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). In addition, the computer-readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting or otherwise suitably processing it if necessary, and then stored in a computer memory.
It should be appreciated that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented with software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one of the following technologies known in the art, or a combination thereof, may be used: a discrete logic circuit having logic gate circuits for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gate circuits, a programmable gate array (PGA), a field-programmable gate array (FPGA), and the like.
Those of ordinary skill in the art will appreciate that all or part of the steps carried by the methods of the above embodiments may be completed by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, and when executed, the program performs one of the steps of the method embodiments or a combination thereof.
In addition, each functional unit in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The above integrated module may be implemented in the form of hardware, or in the form of a software functional module. If the integrated module is implemented in the form of a software functional module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. The storage medium may be a read-only memory, a magnetic disk, an optical disc, or the like.
The above description is merely of specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person familiar with the art can readily conceive of various changes or substitutions within the technical scope disclosed by the present invention, and these should be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (12)
1. A smart home control method, characterized by comprising:
obtaining a face image of a current user;
extracting preset features from the face image, and judging a position and/or a facial orientation of the current user according to the extracted preset features, wherein the preset features are obtained by training on a plurality of face pictures, and the plurality of face pictures include pictures indicating positions and/or facial orientations of users; and
controlling operation of a smart home according to the position and/or the facial orientation of the current user.
2. The method according to claim 1, characterized in that, after the obtaining a face image of a current user, the method further comprises:
judging whether the brightness of the current environment is less than a first threshold; and
when it is determined that the brightness of the current environment is less than the first threshold, controlling a lighting device in the smart home to turn on and reacquiring the face image of the current user; and
the extracting preset features from the face image and judging the position and/or facial orientation of the current user according to the extracted preset features comprises:
extracting preset features from the reacquired face image, and judging the position and/or facial orientation of the current user according to the extracted preset features.
3. The method according to claim 1, characterized in that the judging the position and/or facial orientation of the current user according to the extracted preset features comprises:
inputting the extracted preset features into a preset classifier to judge the position and/or facial orientation of the current user, wherein the preset classifier is obtained by training on a plurality of face pictures indicating positions and/or facial orientations of users.
4. The method according to claim 1, characterized in that the obtaining a face image of a current user comprises:
obtaining at least one face image of the current user at a predetermined time interval;
the extracting preset features from the face image and judging the position and/or facial orientation of the current user according to the extracted preset features comprises:
extracting preset features from each of the at least one face image, and judging a traveling speed and/or direction of the current user according to the extracted preset features; and
the controlling operation of the smart home according to the position and/or facial orientation of the current user comprises:
controlling operation of the smart home according to the traveling speed and/or direction of the current user.
5. The method according to claim 4, characterized in that the controlling operation of the smart home according to the traveling speed and/or direction of the current user comprises:
controlling a lighting device in the smart home to turn on or off according to the traveling speed and/or direction of the current user; or
controlling an audio playing device in the smart home to turn on, turn off, or adjust its volume according to the traveling speed and/or direction of the current user.
6. A smart home control device, characterized by comprising:
an acquisition module configured to obtain a face image of a current user;
a judgment module configured to extract preset features from the face image and judge a position and/or a facial orientation of the current user according to the extracted preset features, wherein the preset features are obtained by training on a plurality of face pictures, and the plurality of face pictures include pictures indicating positions and/or facial orientations of users; and
a control module configured to control operation of a smart home according to the position and/or facial orientation of the current user.
7. The device according to claim 6, characterized by further comprising:
a brightness judgment module configured to judge whether the brightness of the current environment is less than a first threshold; and
an image update module configured to, when the brightness judgment module determines that the brightness of the current environment is less than the first threshold, control a lighting device in the smart home to turn on and instruct the acquisition module to reacquire the face image of the current user;
wherein the judgment module is further configured to extract preset features from the reacquired face image and judge the position and/or facial orientation of the current user according to the extracted preset features.
8. The device according to claim 6, characterized in that the judgment module is further configured to input the extracted preset features into a preset classifier to judge the position and/or facial orientation of the current user, wherein the preset classifier is obtained by training on a plurality of face pictures indicating positions and/or facial orientations of users.
9. The device according to claim 6, characterized in that:
the acquisition module is further configured to obtain at least one face image of the current user at a predetermined time interval;
the judgment module is further configured to extract preset features from each of the at least one face image and judge a traveling speed and/or direction of the current user according to the extracted preset features; and
the control module is further configured to control operation of the smart home according to the traveling speed and/or direction of the current user.
10. The device according to claim 9, characterized in that the control module is further configured to:
control a lighting device in the smart home to turn on or off according to the traveling speed and/or direction of the current user; or
control an audio playing device in the smart home to turn on, turn off, or adjust its volume according to the traveling speed and/or direction of the current user.
11. A server, characterized in that the server comprises:
one or more processors;
a storage device configured to store one or more programs; and
a communication interface configured to enable the processors and the storage device to communicate with external devices;
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1-5.
12. A computer-readable storage medium storing a computer program, characterized in that the program, when executed by a processor, implements the method according to any one of claims 1-5.
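The control flow of claims 1-5 can be illustrated in code. The Python sketch below is purely illustrative: the names (`Observation`, `classify`, `control_smart_home`, `needs_light`), the threshold value, and the placeholder logic are all assumptions, standing in for the preset classifier trained on labeled face pictures (claim 3) and for the device control of claim 5, not for any implementation disclosed in the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Hypothetical sketch of the claimed control flow (claims 1-5).
# The "first threshold" of claim 2 is an arbitrary value here.
BRIGHTNESS_THRESHOLD = 40.0


@dataclass
class Observation:
    position: Tuple[float, float]  # estimated user position (x, y)
    orientation: float             # facial orientation in degrees


def needs_light(brightness: float) -> bool:
    """Claim 2: if ambient brightness is below the first threshold,
    turn on lighting before reacquiring the face image."""
    return brightness < BRIGHTNESS_THRESHOLD


def classify(features: List[float]) -> Observation:
    """Stand-in for the preset classifier of claim 3, trained on
    face pictures labeled with position and/or facial orientation.
    Here it trivially reads position and orientation off the features."""
    return Observation(position=(features[0], features[1]),
                       orientation=features[2])


def control_smart_home(obs_seq: List[Observation]) -> str:
    """Claims 4-5: derive traveling direction from successive
    observations and pick a device action accordingly."""
    if len(obs_seq) < 2:
        return "noop"
    (x0, y0), (x1, y1) = obs_seq[-2].position, obs_seq[-1].position
    # Approaching the device (origin) turns the light on; leaving turns it off.
    approaching = (x1 ** 2 + y1 ** 2) < (x0 ** 2 + y0 ** 2)
    return "light_on" if approaching else "light_off"
```

A traveling speed could be derived the same way, by dividing the positional displacement by the predetermined time interval between images.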
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810287425.3A CN108536027B (en) | 2018-03-30 | 2018-03-30 | Intelligent home control method and device and server |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108536027A true CN108536027A (en) | 2018-09-14 |
CN108536027B CN108536027B (en) | 2020-11-03 |
Family
ID=63483010
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810287425.3A Active CN108536027B (en) | 2018-03-30 | 2018-03-30 | Intelligent home control method and device and server |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108536027B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109725946A (en) * | 2019-01-03 | 2019-05-07 | 阿里巴巴集团控股有限公司 | A kind of method, device and equipment waking up smart machine based on Face datection |
CN110941196A (en) * | 2019-11-28 | 2020-03-31 | 星络智能科技有限公司 | Intelligent panel, multi-level interaction method based on angle detection and storage medium |
CN112083795A (en) * | 2019-06-12 | 2020-12-15 | 北京迈格威科技有限公司 | Object control method and device, storage medium and electronic equipment |
CN115484117A (en) * | 2022-08-30 | 2022-12-16 | 海尔优家智能科技(北京)有限公司 | Call answering method and device, storage medium and electronic device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105700363A (en) * | 2016-01-19 | 2016-06-22 | 深圳创维-Rgb电子有限公司 | Method and system for waking up smart home equipment voice control device |
CN105843050A (en) * | 2016-03-18 | 2016-08-10 | 美的集团股份有限公司 | Intelligent household system, intelligent household control device and method |
CN106569410A (en) * | 2016-10-29 | 2017-04-19 | 深圳智乐信息科技有限公司 | Method and system for managing smart home |
CN106569467A (en) * | 2016-10-29 | 2017-04-19 | 深圳智乐信息科技有限公司 | Method for selecting scene based on mobile terminal and system |
Also Published As
Publication number | Publication date |
---|---|
CN108536027B (en) | 2020-11-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108536027A (en) | Intelligent home furnishing control method, device and server | |
CN108050674A (en) | Control method and device, the terminal of air-conditioning equipment | |
US20220317641A1 (en) | Device control method, conflict processing method, corresponding apparatus and electronic device | |
CN101860704B (en) | Display device for automatically closing image display and realizing method thereof | |
CN109188928A (en) | Method and apparatus for controlling smart home device | |
CN108153158A (en) | Switching method, device, storage medium and the server of household scene | |
CN107272607A (en) | A kind of intelligent home control system and method | |
CN107566874A (en) | Far field speech control system based on television equipment | |
CN106250012A (en) | Screen intensity and color temperature adjusting method, device and terminal unit | |
CN109147782A (en) | Control method, device and the air-conditioning of air-conditioning | |
CN108292311A (en) | Device and method for handling metadata | |
CN106649780A (en) | Information providing method and device based on artificial intelligence | |
CN108174096A (en) | Method, apparatus, terminal and the storage medium of acquisition parameters setting | |
US11341825B1 (en) | Implementing deterrent protocols in response to detected security events | |
CN108398906B (en) | Apparatus control method, device, electric appliance, total control equipment and storage medium | |
CN107642877A (en) | Air conditioning control method, device and air conditioner | |
CN107504643A (en) | The control method and device of air-conditioning | |
CN107560062A (en) | A kind of air conditioning control device, method and air-conditioning | |
CN106910496A (en) | Intelligent electrical appliance control and device | |
CN109164713B (en) | Intelligent household control method and device | |
CN107968890A (en) | theme setting method, device, terminal device and storage medium | |
CN110276320A (en) | Guard method, device, equipment and storage medium based on recognition of face | |
CN110989390A (en) | Smart home control method and device | |
CN107438019A (en) | Smart home learning control method, device and system | |
KR20180051729A (en) | Air conditioner and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||