CN110543102B - Method and device for controlling intelligent household equipment and computer storage medium - Google Patents

Method and device for controlling intelligent household equipment and computer storage medium

Info

Publication number
CN110543102B
CN110543102B (application CN201810534044.0A)
Authority
CN
China
Prior art keywords
user
image
action
area
adjacent images
Prior art date
Legal status
Active
Application number
CN201810534044.0A
Other languages
Chinese (zh)
Other versions
CN110543102A (en)
Inventor
吴临峰
Current Assignee
Gree Electric Appliances Inc of Zhuhai
Original Assignee
Gree Electric Appliances Inc of Zhuhai
Priority date
Filing date
Publication date
Application filed by Gree Electric Appliances Inc of Zhuhai
Priority to CN201810534044.0A
Publication of CN110543102A
Application granted
Publication of CN110543102B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B15/00 Systems controlled by a computer
    • G05B15/02 Systems controlled by a computer electric
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 Programme-control systems
    • G05B19/02 Programme-control systems electric
    • G05B19/418 Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/20 Pc systems
    • G05B2219/26 Pc applications
    • G05B2219/2642 Domotique, domestic, home control, automation, smart house

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Quality & Reliability (AREA)
  • Image Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention discloses a method and a device for controlling smart home devices, and a computer storage medium, to solve the technical problem in the prior art that smart home devices are not controlled accurately. The method comprises the following steps: acquiring user images through an image collector to obtain a user atlas; generating an action path of the user based on changes in the user's position across the user atlas; predicting the user's behavior based on the action path and the user's historical behavior data; and controlling a designated smart home device to execute a corresponding action according to the predicted behavior.

Description

Method and device for controlling intelligent household equipment and computer storage medium
Technical Field
The invention relates to the field of smart homes, and in particular to a method and a device for controlling smart home devices, and a computer storage medium.
Background
With the continuous progress of science and technology, people enjoy the convenience that technological progress brings while also being troubled by the new problems it creates.
In a smart home, smart home devices can be turned on automatically at a time set by the user; for example, the curtains can be set to open automatically at 8:00 a.m. Smart appliances can also be controlled according to a scene selected by the user; for example, if the user selects the living-room scene, the lights and the television in the living room are turned on automatically. Or, when the user needs to take food from the smart refrigerator, the refrigerator can automatically start its boosting system to open the door.
Although the smart home devices in a smart home provide various conveniences for users, misoperations often occur and cause users a great deal of trouble. For example, when a user merely passes by the smart refrigerator, the refrigerator may still detect the user's presence, misjudge that the user wants to open it, and automatically start the boosting system to open the door, causing a malfunction and troubling the user.
In view of this, how to control smart home devices accurately and reduce the occurrence of misoperations has become an urgent technical problem to be solved.
Disclosure of Invention
The invention provides a method and a device for controlling smart home devices, to solve the technical problem in the prior art that smart home devices are not controlled accurately.
In a first aspect, to solve the above technical problem, a technical solution of a method for controlling an intelligent home device provided in an embodiment of the present invention is as follows:
acquiring a user image through an image acquisition device to obtain a user atlas;
generating an action path of the user based on the change of the user position in the user atlas;
predicting user behavior based on the action path and the historical behavior data of the user;
and controlling a designated smart home device to execute a corresponding action according to the user behavior.
The user's action path is generated from changes in the user's position in the collected user atlas, the user's behavior is predicted from the action path and the user's historical behavior data, and finally a designated smart home device is controlled to execute the corresponding action according to the predicted behavior. In this way, the accuracy and response speed of waking smart home devices can be effectively improved, and the devices can be controlled accurately.
With reference to the first aspect, in a first possible implementation manner of the first aspect, generating an action path of a user based on a change in a user location in the user atlas includes:
determining the action direction of the user in the two adjacent images in the user atlas based on the user atlas and the installation position of the image collector;
and generating the action path according to the acquisition time sequence of each user image and the action direction of the user in the two adjacent images.
With reference to the first possible implementation manner of the first aspect, in a second possible implementation manner of the first aspect, determining the action direction of the user in two adjacent images in the user atlas includes:
determining the area occupied by a designated user part in the corresponding user image from the first image and the second image respectively, to obtain a first area for the first image and a second area for the second image; wherein the two adjacent images comprise the first image and the second image, and the acquisition time of the second image is later than that of the first image;
and determining the action direction according to the comparison result of the second area and the first area and the installation position of the image collector.
The user's action direction can be quickly determined by comparing the areas occupied by the designated user part in the first and second of two adjacent user images.
With reference to the first or second possible implementation manner of the first aspect, in a third possible implementation manner of the first aspect, generating the action path according to the acquisition time sequence of the images and the user's action directions in all pairs of adjacent images includes:
generating the action path in acquisition-time order, based on the position of each image collector in the home map and the user's action direction in each pair of adjacent images.
By drawing all the action directions on the home map in the order of the images' acquisition times, the user's action path can be generated quickly.
With reference to the first aspect, in a fourth possible implementation manner of the first aspect, the predicting user behavior based on the action path and the historical user behavior data includes:
determining the movement direction of the user relative to the image collector based on the action path and the installation position of the image collector;
and predicting the user behavior from the historical user behavior data according to the motion direction and the current time information.
In a second aspect, an embodiment of the present invention provides an apparatus for controlling smart home devices, including:
the acquisition unit is used for acquiring a user image through the image acquisition device to obtain a user atlas;
the generating unit is used for generating an action path of the user based on the change of the user position in the user atlas;
the prediction unit is used for predicting user behavior based on the action path and the historical user behavior data;
and the control unit is used for controlling a designated smart home device to execute a corresponding action according to the user behavior.
With reference to the second aspect, in a first possible implementation manner of the second aspect, the generating unit is specifically configured to:
determining the action direction of the user in the two adjacent images in the user atlas based on the user atlas and the installation position of the image collector;
and generating the action path according to the acquisition time sequence of each user image and the action direction of the user in the two adjacent images.
With reference to the first possible implementation manner of the second aspect, in a second possible implementation manner of the second aspect, the generating unit is further configured to:
determining the area occupied by a designated user part in the corresponding user image from the first image and the second image respectively, to obtain a first area for the first image and a second area for the second image; wherein the two adjacent images comprise the first image and the second image, and the acquisition time of the second image is later than that of the first image;
and determining the action direction according to the comparison result of the second area and the first area and the installation position of the image collector.
With reference to the first or second possible implementation manner of the second aspect, in a third possible implementation manner of the second aspect, the generating unit is further configured to:
generate the action path in acquisition-time order, based on the position of each image collector in the home map and the user's action direction in each pair of adjacent images.
With reference to the third possible implementation manner of the second aspect, in a fourth possible implementation manner of the second aspect, the prediction unit is specifically configured to:
determining the movement direction of the user relative to the image collector based on the action path and the installation position of the image collector;
and predicting the user behavior from the historical user behavior data according to the motion direction and the current time information.
In a third aspect, an embodiment of the present invention further provides an apparatus for controlling an intelligent home device, including:
at least one processor, and
a memory coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor, and the at least one processor performs the method according to the first aspect by executing the instructions stored by the memory.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, including:
the computer readable storage medium stores computer instructions which, when executed on a computer, cause the computer to perform the method of the first aspect as described above.
Through the technical solutions in one or more of the above embodiments of the present invention, the embodiments of the present invention have at least the following technical effects:
in the embodiments provided by the invention, the user's action path is generated from changes in the user's position in the collected user atlas, the user's behavior is then predicted from the action path and the user's historical behavior data, and finally a designated smart home device is controlled to execute the corresponding action according to the predicted behavior. In this way, the accuracy and response speed of waking smart home devices can be effectively improved, and the devices can be controlled accurately.
Drawings
Fig. 1 is a flowchart of controlling smart home devices according to an embodiment of the present invention;
fig. 2 is a schematic diagram of the installation positions of image collectors in a user's home according to an embodiment of the present invention;
fig. 3 is a schematic diagram of determining an action direction from images of a user in a bedroom collected by image collector 7 according to an embodiment of the present invention;
fig. 4 is a diagram illustrating generation of a user's action path according to an embodiment of the present invention;
fig. 5 is a schematic diagram of controlling a designated smart home device to execute a corresponding action according to user behavior in a going-to-work scenario, according to an embodiment of the present invention;
fig. 6 is a schematic diagram of controlling a designated smart home device to execute a corresponding action according to user behavior in a child-care scenario, according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an apparatus for controlling smart home devices according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a method and a device for controlling intelligent household equipment and a computer storage medium, and aims to solve the technical problem that the intelligent household equipment cannot be controlled accurately in the prior art.
In order to solve the technical problems, the general idea of the embodiment of the present application is as follows:
the method for controlling the intelligent household equipment comprises the following steps: acquiring a user image through an image acquisition device to obtain a user atlas; generating an action path of the user based on the change of the user position in the user atlas; predicting user behavior based on the action path and the historical behavior data of the user; and controlling the appointed intelligent household equipment to execute corresponding actions according to the user behavior.
According to this scheme, the user's action path is generated from changes in the user's position in the collected user atlas, the user's behavior is predicted from the action path and the user's historical behavior data, and finally a designated smart home device is controlled to execute the corresponding action according to the predicted behavior. In this way, the accuracy and response speed of waking smart home devices can be effectively improved, and the devices can be controlled accurately.
For a better understanding of the technical solutions of the present invention, they are described in detail below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific features in the embodiments and examples are detailed descriptions of the technical solutions, not limitations of them, and that the technical features in the embodiments and examples may be combined with one another where there is no conflict.
Referring to fig. 1, an embodiment of the present invention provides a method for controlling an intelligent home device, which is applied to an intelligent home control system, and the processing procedure of the method is as follows.
Step 101: acquire user images through an image collector to obtain a user atlas.
The image collector may specifically be a camera installed on a smart home device, an always-on camera installed in a public area for security, or an infrared camera installed in a private space.
For example, referring to fig. 2, image collector 7 is installed at the bedroom doorway and image collector 8 at the kitchen doorway; because these are in private spaces, collectors 7 and 8 are infrared cameras that do not record pictures but can detect the user's direction of movement. Image collector 1 is installed in the hallway, image collector 2 in the living room, and image collector 3 in the dining room; these are public areas, so ordinary security-system cameras are used. Image collector 4 is the camera of the smart refrigerator, image collector 5 the camera of the smart television, and image collector 6 the camera at the entrance door. Together, these cameras can collect user images covering every room in the house.
It should be understood that these cameras may capture video of the user or still photos; whether the user image is a video or a photo is not limited here. User images are collected for every user, not just one; which user's behavior is currently being predicted can be determined by performing face recognition on the collected user images.
After the user image is acquired, step 102 may be performed.
Step 102: an action path of the user is generated based on a change in the user's location in the user atlas.
The specific steps for generating the action path of the user are as follows:
First, the action direction of the user in two adjacent images in the user atlas is determined based on the user atlas and the installation position of the image collector.
Specifically, determining the user's action direction in two adjacent images of the user atlas means determining the area occupied by a designated user part in each image: a first area from the first image and a second area from the second image, where the two adjacent images comprise the first image and the second image and the second image's acquisition time is later than the first's. The action direction is then determined from the comparison of the second area with the first area, together with the installation position of the image collector.
For example, please refer to fig. 3, which specifically illustrates the user leaving the bedroom.
As the user moves from position 1 to position 3, image collector 7 at the bedroom door collects user image 1 at position 1, user image 2 at position 2, and user image 3 at position 3. From the collected images it can be calculated that the user's head (assumed to be the designated user part) occupies 300 pixels in user image 1, 600 pixels in user image 2, and 900 pixels in user image 3. Since image collector 7 is installed at the bedroom doorway, it follows from the imaging principle that the closer the user is to the collector, the larger the area the same body part occupies in the collected image. The size of the user's head in each image therefore shows that the user moved from position 1 to position 2 and then to position 3.
It should be understood that the above is only an illustrative example. In actual use, if the image collector captures video, the user's head can be extracted from the video frames and its area compared frame by frame (or frames can be sampled at equal time intervals) to determine the user's direction of movement.
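The area comparison described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name and the 10% stationary tolerance are assumptions introduced for the example.

```python
def action_direction(first_area, second_area, tolerance=0.1):
    """Infer whether the user is approaching or leaving the image
    collector by comparing the pixel area a designated body part
    (e.g. the head) occupies in two adjacent images.

    first_area  -- area in pixels in the earlier image
    second_area -- area in pixels in the later image
    tolerance   -- relative change below which the user is treated
                   as stationary (the threshold is an assumption)
    """
    if first_area <= 0:
        raise ValueError("first_area must be positive")
    change = (second_area - first_area) / first_area
    if change > tolerance:
        return "approaching"   # part appears larger, so the user is closer
    if change < -tolerance:
        return "leaving"       # part appears smaller, so the user is farther
    return "stationary"

# The bedroom example from the description: 300 -> 600 -> 900 pixels.
print(action_direction(300, 600))  # approaching
print(action_direction(600, 900))  # approaching
```

Applied to each pair of adjacent images, this yields the per-collector movement directions that step 102 strings together into a path.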
Second, the action path is generated according to the acquisition time sequence of the user images and the user's action direction in each pair of adjacent images. Specifically, the action path is generated in acquisition-time order, based on the position of each image collector in the home map and the user's action direction in each pair of adjacent images.
For example, within the monitoring area of each image collector in fig. 4, the user's action direction in each pair of adjacent images can be determined in the manner of fig. 3: the user moves from the bedroom, through the hallway and the dining room, to the front of the smart refrigerator in the kitchen. All the action directions are then joined in series according to the acquisition times of the user images, forming the user's trajectory within each collector's monitoring area; the trajectories of the individual monitoring areas are then stitched together to form the user's action path on the home map, as shown in fig. 4.
It should be understood that the home map may be a floor plan such as the layout shown in fig. 2; it may be preset by the user or generated automatically by the image collectors installed in the user's home from their respective monitoring areas. How the home map is obtained is not limited here.
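The stitching step can be illustrated with a short sketch. The `Observation` record, the collector coordinates, and the function name are all hypothetical; the point is only that ordering per-collector directions by acquisition time and mapping them onto collector positions yields the path.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    timestamp: float    # acquisition time of the adjacent-image pair
    collector_id: int   # which image collector produced it
    direction: str      # "approaching" or "leaving" that collector

# Hypothetical home-map coordinates (x, y) of each collector; in a real
# system these come from the preset or auto-generated home map.
COLLECTOR_POSITIONS = {7: (1.0, 4.0), 1: (3.0, 3.0), 3: (5.0, 3.0), 8: (6.0, 2.0)}

def build_action_path(observations):
    """Sort per-collector observations by acquisition time and map them
    onto collector positions, yielding the user's action path."""
    ordered = sorted(observations, key=lambda o: o.timestamp)
    return [(COLLECTOR_POSITIONS[o.collector_id], o.direction) for o in ordered]

path = build_action_path([
    Observation(3.0, 8, "approaching"),
    Observation(1.0, 7, "leaving"),      # leaving the bedroom first
    Observation(2.0, 1, "approaching"),
])
# path starts at collector 7 (the bedroom door) regardless of input order
```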
After the action path of the user is generated, step 103 and step 104 can be executed.
Step 103: predict the user behavior based on the action path and the user's historical behavior data.
Step 104: control the designated smart home device to execute a corresponding action according to the user behavior.
Specifically, predicting the user behavior consists of determining the user's direction of movement relative to the image collector from the action path and the collector's installation position, and then predicting the user's behavior from the user's historical behavior data according to that direction of movement and the current time.
The historical behavior data are obtained from each user's past operation records and action paths. When predicting behavior, the historical behavior data of all users can be analyzed comprehensively by a specified algorithm so that, combined with the user's action path, the behavior is predicted accurately.
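The prediction step above can be sketched as a lookup over recorded history. The patent leaves the "specified algorithm" open; here a plain table of hypothetical (collector, direction, time window) records stands in for it, purely for illustration.

```python
from datetime import time

# Hypothetical history records distilled from past operation logs:
# (collector_id, direction, window_start, window_end, behavior).
HISTORY = [
    (4, "approaching", time(6, 40), time(7, 0), "open refrigerator"),
    (6, "approaching", time(7, 0), time(7, 20), "leave home"),
]

def predict_behavior(collector_id, direction, now):
    """Return the behavior historically associated with this movement
    direction and time of day, or None if nothing matches."""
    for cid, d, start, end, behavior in HISTORY:
        if cid == collector_id and d == direction and start <= now <= end:
            return behavior
    return None

print(predict_behavior(4, "approaching", time(6, 45)))  # open refrigerator
```

A real system would replace the table scan with whatever statistical model the comprehensive analysis produces, but the inputs and output are the same.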
Furthermore, when comprehensively analyzing the users' historical behavior data, habitual operations can be extracted and specified services provided accordingly. For example, when many users find it inconvenient to browse recipes while cooking, comprehensive analysis of their historical behavior data can identify the habitual hover-gesture for page turning and add it to the system, so that when a user makes the hover page-turning gesture while cooking, the electronic recipe turns pages automatically. In this way, both private customized services and improvements based on aggregated big data can be realized.
The actions the smart home devices are controlled to execute may specifically be as follows.
When none of the image collectors detects indoor activity, the security system is turned on automatically and any appliances left on are turned off. Or, when image collector 2 identifies that no one is in the living room, the sweeping robot is controlled to start working automatically. Or, when the user turns off the gas range and leaves the kitchen, the range hood is turned off two minutes later. Or, when an abnormal condition is detected while the user is asleep (for example, the balcony door is opened but no user was detected leaving the bedroom), an early warning is issued to the user.
Controlling a smart home device to execute a corresponding action based on predicted user behavior may include: opening the lower-level boosting system in advance when the user is detected bending down in front of the smart refrigerator; or, when the user is detected opening the entrance door while the smart key ring is in the living room, controlling the key ring to ring as a reminder.
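The behavior-to-action mapping described above can be sketched as a dispatch table. The table contents and names are hypothetical examples, not the patent's device protocol.

```python
# Hypothetical dispatch table mapping a predicted behavior to the
# (device, command) pairs that should be issued for it.
ACTIONS = {
    "open refrigerator": [("refrigerator", "start_boost_system")],
    "leave home":        [("door_lock", "lock"), ("security_system", "arm")],
    "watch tv":          [("living_room_tv", "power_on")],
}

def commands_for(behavior):
    """Return the device commands to issue for a predicted behavior.
    Unknown behaviors yield no commands, so a mis-prediction does not
    by itself trigger a misoperation."""
    return ACTIONS.get(behavior, [])
```

In a real control system each (device, command) pair would be sent to the corresponding smart home device; the empty default is the guard against the false-trigger problem described in the background.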
The following usage scenarios illustrate how historical behavior data are recorded, how user behavior is later predicted from them, and how designated smart home devices are controlled according to the predicted behavior.
Scene one: the user going to work
Referring to fig. 5, from the user images acquired by the image collectors, the user's action path is obtained: at 6:40 on Monday morning the user gets up, passes through the bedroom door (recorded by image collector 7), goes through the hallway to the dining area (recorded by image collectors 1 and 3), and passes through the kitchen door (recorded by image collector 8) to open the cold-storage compartment of the smart refrigerator (recorded by image collector 4).
Along this action path, the user's behavior data are: passing image collectors 7 and 8 takes about 6 seconds each; within 5 minutes the curtains are opened, the television is turned on, and so on; and the user leaves home at 7:10 (recorded by image collector 6).
Through long-term recording (for example, a week or a month), the user's historical behavior data for 6:40-7:00 on workday mornings can be summarized as: the user leaves the bedroom at 6:40, passes through the hallway and the dining room, and enters the kitchen to use the refrigerator (without bending down); passing image collectors 7 and 8 takes about 6 seconds; the curtains are then opened and the smart television turned on within 5 minutes; and the user leaves home at about 7:10.
In actual use, for example, the image collector captures the user getting up at 6:38 on Monday morning, so it is predicted that the user will carry out the 6:40-7:00 workday-morning historical behavior. As the user's action path unfolds, the designated smart home devices are made to execute the following actions according to the historical behavior data: when the user is determined to approach image collector 7 at about 6:40, decide according to the ambient light whether to turn on the hallway and dining-room lights, and turn on the water purifier; when the user is determined to approach image collector 8, turn off the hallway light and have the smart refrigerator start its boosting system; at about 7:10 (when image collector 6 determines the user has arrived at the door), the smart door lock opens from the inside and then locks; and when none of the image collectors detects any movement, all smart home devices that should be off are turned off and the security system starts automatically.
Scene two: caring for a child
Referring to fig. 6, from the user images acquired by the image collectors, the child user's weekend action path is obtained: at 8:00 the child leaves the bedroom, passes through the hallway, waits about 15 minutes in the dining area, and then spends the whole morning in the living-room area. For safety, the balcony door stays closed throughout the day and is not opened until about 7 o'clock in the evening. The child is young but has basic safety awareness, and on some weekends the parents are not at home while the child studies or rests at home.
Along this action path, the behavior data of the user (the child) on the weekend are: the user does not go out before 8:00 in the morning and receives no visitors in the morning; apart from entertainment devices, the child generally does not use household appliances.
Through long-term recording (for example, a week or a month), the historical behavior data of the user (the child) on weekends can be summarized as the behavior data described above.
In actual use, if the user (the child) gets up at 8:00 on Saturday morning, the child's Saturday action path is predicted from the historical behavior data, and the designated smart home devices are made to execute the following actions. When the child is determined to approach image collector 5 at 8:00, whether the lights need to be turned on is decided according to the ambient light, the living-room curtains are fully opened, and a message can be pushed to notify the parents that the child is up. If the child tries to go to the balcony after breakfast and image collectors 2 and 6 detect the child attempting to open the balcony door for a long time, a message is pushed to the parents, who call home to tell the child not to enter the balcony. The child then stays in the living-room area, using the smart television within range of image collector 2 for a long time. At 12:00 noon the smart television pauses its screen and reminds the child to go to the next-door neighbor's home for lunch in 15 minutes. In the afternoon, while the child is at the neighbor's home, image collector 4 detects a visitor and pushes a message to the parents, who call the neighbor for help in receiving the visitor, until the parents return home in the evening.
It should be understood that, in the above method, the identity of the user (specifically, whether the user is a mother, a father, a child, or a visitor) may be determined from collected user characteristics such as facial image, height, and frequency of appearance, or may be set by the user; the manner in which the user in the user image is identified is not limited herein.
Based on the same inventive concept, an embodiment of the present invention provides an apparatus for controlling smart home devices. For specific implementations of the control method performed by the apparatus, refer to the descriptions in the method embodiment; repeated descriptions are omitted. Referring to fig. 7, the apparatus includes:
the acquisition unit 701 is used for acquiring a user image through an image acquisition device to obtain a user atlas;
a generating unit 702, configured to generate an action path of the user based on a change in the user position in the user atlas;
a prediction unit 703, configured to predict a user behavior based on the action path and the user historical behavior data;
and the control unit 704 is configured to control the specified smart home device to execute a corresponding action according to the user behavior.
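The four units above can be sketched as a simple pipeline. The class and method names are assumptions, and the prediction unit's historical-behavior input is folded into the predictor callable for brevity.

```python
class SmartHomeController:
    """Minimal sketch of the units 701-704 in fig. 7; unit boundaries
    follow the text, but all names here are illustrative assumptions."""

    def __init__(self, collector, path_generator, predictor, executor):
        self.collector = collector            # acquisition unit 701
        self.path_generator = path_generator  # generating unit 702
        self.predictor = predictor            # prediction unit 703 (history folded in)
        self.executor = executor              # control unit 704

    def step(self):
        atlas = self.collector()           # acquire user images -> user atlas
        path = self.path_generator(atlas)  # position changes -> action path
        behavior = self.predictor(path)    # path + historical data -> predicted behavior
        self.executor(behavior)            # predicted behavior -> device actions
        return behavior
```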
Optionally, the generating unit 702 is specifically configured to:
determining the action direction of the user in the two adjacent images in the user atlas based on the user atlas and the installation position of the image collector;
and generating the action path according to the acquisition time sequence of each user image and the action direction of the user in the two adjacent images.
Optionally, the generating unit 702 is further configured to:
determining the occupied area of the appointed user part in the corresponding user image from the first image and the second image respectively to obtain a first area of the first image and a second area of the second image; wherein the two adjacent images comprise the first image and the second image, and the acquisition time of the second image is later than that of the first image;
and determining the action direction according to the comparison result of the second area and the first area and the installation position of the image collector.
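The area comparison can be sketched as follows, assuming the collector is fixed so that the specified user part (for example, the face) occupies a larger pixel area as the user approaches it; the tolerance threshold is an illustrative assumption to absorb detection noise.

```python
def action_direction(first_area: float, second_area: float,
                     tolerance: float = 0.05) -> str:
    """Infer movement relative to a fixed image collector by comparing
    the area a specified user part occupies in two adjacent images
    (second captured later than first)."""
    if second_area > first_area * (1 + tolerance):
        return "approaching"  # the part grows as the user nears the collector
    if second_area < first_area * (1 - tolerance):
        return "leaving"      # the part shrinks as the user moves away
    return "stationary"
```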
Optionally, the generating unit 702 is further configured to:
and generating the action path according to the acquisition time sequence of each image based on the position information of each image acquirer in the living map and the action direction of the user in the two adjacent images.
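A minimal sketch of path generation from a living map, assuming the map stores each collector's installation position and each observation records which collector captured the user and when; the per-pair action direction from the previous step is omitted here for brevity, and all positions are hypothetical.

```python
# Hypothetical living map: collector id -> (x, y) installation position.
LIVING_MAP = {2: (5.0, 1.0), 5: (0.0, 0.0), 6: (5.0, 4.0)}

def generate_action_path(observations):
    """observations: iterable of (capture_time, collector_id).
    Returns the action path as the sequence of living-map positions
    visited, ordered by acquisition time."""
    ordered = sorted(observations, key=lambda obs: obs[0])
    return [LIVING_MAP[collector_id] for _, collector_id in ordered]
```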
Optionally, the prediction unit 703 is specifically configured to:
determining the movement direction of the user relative to the image collector based on the action path and the installation position of the image collector;
and predicting the user behavior from the historical user behavior data according to the motion direction and the current time information.
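The prediction step can be sketched as a frequency count over historical records keyed by movement direction and current time; the record format is an assumption.

```python
from collections import Counter

def predict_behavior(history, direction: str, hour: int):
    """history: list of (hour, direction, behavior) records.
    Return the behavior most often recorded for this hour and movement
    direction, or None if nothing matches."""
    counts = Counter(behavior for h, d, behavior in history
                     if h == hour and d == direction)
    return counts.most_common(1)[0][0] if counts else None
```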
Based on the same inventive concept, the embodiment of the invention provides a device for controlling intelligent household equipment, which comprises: at least one processor, and
a memory coupled to the at least one processor;
the memory stores instructions executable by the at least one processor, and the at least one processor executes the method for controlling the smart home devices by executing the instructions stored in the memory.
Based on the same inventive concept, an embodiment of the present invention further provides a computer-readable storage medium, including:
the computer readable storage medium stores computer instructions which, when executed on a computer, cause the computer to execute the method for controlling the smart home device as described above.
In the embodiments provided by the invention, the action path of the user is generated from the change of the user position in the collected user atlas, the user behavior is then predicted from the action path and the historical behavior data of the user, and finally the specified smart home device is controlled to execute the corresponding action according to the predicted user behavior. In this way, the accuracy and response speed of waking up the smart home device can be effectively improved, and the smart home device can be controlled accurately.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (8)

1. A method for controlling smart home devices, applied to a smart home control system, characterized by comprising the following steps:
acquiring a user image through an image acquisition device to obtain a user atlas;
determining the action direction of the user in the two adjacent images in the user atlas based on the user atlas and the installation position of the image collector; generating an action path of the user according to the acquisition time sequence of each user image and the action direction of the user in the two adjacent images;
predicting user behavior based on the action path and the historical behavior data of the user;
controlling the appointed intelligent household equipment to execute corresponding actions according to the user behaviors;
wherein determining the action direction of the user in the two adjacent images in the user atlas comprises the following steps:
determining the occupied area of the appointed user part in the corresponding user image from the first image and the second image respectively to obtain a first area of the first image and a second area of the second image; wherein the two adjacent images comprise the first image and the second image, and the acquisition time of the second image is later than that of the first image;
and determining the action direction according to the comparison result of the second area and the first area and the installation position of the image collector.
2. The method of claim 1, wherein generating the action path according to the acquisition time sequence of each image and the action direction of the user in each pair of adjacent images comprises:
and generating the action path according to the acquisition time sequence of each image based on the position information of each image acquirer in the living map and the action direction of the user in the two adjacent images.
3. The method of claim 1, wherein predicting user behavior based on the action path and user historical behavior data comprises:
determining the movement direction of the user relative to the image collector based on the action path and the installation position of the image collector;
and predicting the user behavior from the historical user behavior data according to the motion direction and the current time information.
4. An apparatus for controlling smart home devices, characterized by comprising:
the acquisition unit is used for acquiring a user image through the image acquisition device to obtain a user atlas;
the generating unit is used for determining the action direction of the user in the two adjacent images in the user atlas based on the user atlas and the installation position of the image collector; generating an action path of the user according to the acquisition time sequence of each user image and the action direction of the user in the two adjacent images;
the prediction unit is used for predicting user behavior based on the action path and the historical user behavior data;
the control unit is used for controlling the appointed intelligent household equipment to execute corresponding actions according to the user behaviors;
the generating unit is further configured to:
determining the occupied area of the appointed user part in the corresponding user image from the first image and the second image respectively to obtain a first area of the first image and a second area of the second image; wherein the two adjacent images comprise the first image and the second image, and the acquisition time of the second image is later than that of the first image;
and determining the action direction according to the comparison result of the second area and the first area and the installation position of the image collector.
5. The apparatus of claim 4, wherein the generating unit is further to:
and generating the action path according to the acquisition time sequence of each image based on the position information of each image acquirer in the living map and the action direction of the user in the two adjacent images.
6. The apparatus of claim 4, wherein the prediction unit is specifically configured to:
determining the movement direction of the user relative to the image collector based on the action path and the installation position of the image collector;
and predicting the user behavior from the historical user behavior data according to the motion direction and the current time information.
7. An apparatus for controlling smart home devices, characterized by comprising:
at least one processor, and
a memory coupled to the at least one processor;
wherein the memory stores instructions executable by the at least one processor, the at least one processor performing the method of any one of claims 1-3 by executing the instructions stored by the memory.
8. A computer-readable storage medium, characterized in that:
the computer readable storage medium stores computer instructions that, when executed on a computer, cause the computer to perform the method of any of claims 1-3.
CN201810534044.0A 2018-05-29 2018-05-29 Method and device for controlling intelligent household equipment and computer storage medium Active CN110543102B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810534044.0A CN110543102B (en) 2018-05-29 2018-05-29 Method and device for controlling intelligent household equipment and computer storage medium


Publications (2)

Publication Number Publication Date
CN110543102A CN110543102A (en) 2019-12-06
CN110543102B true CN110543102B (en) 2020-12-04

Family

ID=68701103

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810534044.0A Active CN110543102B (en) 2018-05-29 2018-05-29 Method and device for controlling intelligent household equipment and computer storage medium

Country Status (1)

Country Link
CN (1) CN110543102B (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110532962A (en) * 2019-08-30 2019-12-03 上海秒针网络科技有限公司 Detection method and device, storage medium and the electronic device of track
CN111288513A (en) * 2020-01-10 2020-06-16 余姚市远望电器有限公司 Control method and system of range hood, computer storage medium and range hood
CN111427287B (en) * 2020-02-20 2021-11-16 珠海格力电器股份有限公司 Intelligent kitchen control method and device, electronic equipment and storage medium
CN111586951B (en) * 2020-04-30 2021-09-14 珠海格力电器股份有限公司 Control method, control device, system and storage medium of hotel lighting device
CN112050944B (en) * 2020-08-31 2023-12-08 深圳数联天下智能科技有限公司 Gate position determining method and related device
CN112652099B (en) * 2020-12-29 2023-05-09 深圳市欧瑞博科技股份有限公司 Intelligent control method for user authority, intelligent communication equipment and computer readable storage medium
CN112782988A (en) * 2020-12-30 2021-05-11 深圳市微网力合信息技术有限公司 Control method of intelligent household curtain based on Internet of things
CN112863126A (en) * 2020-12-31 2021-05-28 苏州圣珀软件科技有限公司 Decentralized home monitoring system
CN112838968B (en) * 2020-12-31 2022-08-05 青岛海尔科技有限公司 Equipment control method, device, system, storage medium and electronic device
CN113156829A (en) * 2021-04-23 2021-07-23 广东海火虚拟现实技术服务有限公司 Equipment starting method and system
CN113110089A (en) * 2021-04-29 2021-07-13 广东电网有限责任公司 Household electrical appliance optimized use management method, device, equipment and medium
CN113110094B (en) * 2021-05-18 2021-10-22 珠海瑞杰电子科技有限公司 Intelligent home control system based on Internet of things
CN113158990A (en) * 2021-05-19 2021-07-23 云米互联科技(广东)有限公司 HomeMap-based child nursing method and device
CN113516051A (en) * 2021-05-19 2021-10-19 云米互联科技(广东)有限公司 HomeMap-based security monitoring method and device
CN114415527A (en) * 2021-12-07 2022-04-29 珠海格力电器股份有限公司 Smart home pre-starting method and device
CN114285689B (en) * 2021-12-22 2023-02-17 珠海格力电器股份有限公司 Device control method, device, electronic device and computer-readable storage medium
CN115016311B (en) * 2022-07-06 2023-05-23 慕思健康睡眠股份有限公司 Intelligent device control method, device, equipment and storage medium
CN115512479B (en) * 2022-09-09 2024-04-09 北海市冠标智慧声谷科技有限责任公司 Method for managing reception information and back-end equipment

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
US8532839B2 (en) * 2009-06-22 2013-09-10 Johnson Controls Technology Company Systems and methods for statistical control and fault detection in a building management system
US20150241860A1 (en) * 2014-02-24 2015-08-27 Raid And Raid, Inc., D/B/A Ruminate Intelligent home and office automation system
CN104899564B (en) * 2015-05-29 2019-01-25 中国科学院上海高等研究院 A kind of human body behavior real-time identification method
CN106549833B (en) * 2015-09-21 2020-01-21 阿里巴巴集团控股有限公司 Control method and device for intelligent household equipment
CN106383450A (en) * 2016-11-10 2017-02-08 北京工商大学 Smart home user behavior analyzing system and smart home user behavior analyzing method based on big data
CN106842972A (en) * 2017-03-14 2017-06-13 上海斐讯数据通信技术有限公司 The forecast Control Algorithm and system of a kind of intelligent home device
CN107065591A (en) * 2017-06-13 2017-08-18 重庆城市管理职业学院 A kind of method and system controlled for indoor electric appliance
CN107665230B (en) * 2017-06-21 2021-06-01 海信集团有限公司 Training method and device of user behavior prediction model for intelligent home control
CN108038418B (en) * 2017-11-14 2020-08-25 珠海格力电器股份有限公司 Garbage cleaning method and device

Also Published As

Publication number Publication date
CN110543102A (en) 2019-12-06

Similar Documents

Publication Publication Date Title
CN110543102B (en) Method and device for controlling intelligent household equipment and computer storage medium
US11861750B2 (en) Unattended smart property showing
US11120559B2 (en) Computer vision based monitoring system and method
EP3316583B1 (en) Timeline-video relationship presentation for alert events
US20220122435A1 (en) Systems and Methods for Categorizing Motion Events
US9588640B1 (en) User interface for video summaries
US9805567B2 (en) Temporal video streaming and summaries
US9449229B1 (en) Systems and methods for categorizing motion event candidates
US9386281B2 (en) Image surveillance and reporting technology
KR100978011B1 (en) System and method for adapting the ambience of a local environment according to the location and personal preferences of people in the local environment
US20170078767A1 (en) Video searching for filtered and tagged motion
WO2015184700A1 (en) Device and method for automatic monitoring and autonomic response
US20170076156A1 (en) Automatically determining camera location and determining type of scene
US10706699B1 (en) Projector assisted monitoring system
US11315394B1 (en) Integrated doorbell devices
US10777057B1 (en) Premises security system with audio simulating occupancy
JP2011090408A (en) Information processor, and action estimation method and program of the same
CN111552189A (en) Method for starting scene mode, intelligent home controller and storage medium
CN109976174A (en) Home intelligent appliance control method, device and intelligent panel based on resident's habit
WO2017046704A1 (en) User interface for video summaries
EP3616095A1 (en) Computer vision based monitoring system and method
CN109932920A (en) A kind of smart home temprature control method, device and intelligent panel
US11869104B2 (en) Visitor-tailored property configuration
CN112911154B (en) Snapshot method, server and computer storage medium
US11305416B1 (en) Dynamic arrangement of motorized furniture

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant