CN115802561A - Intelligent household lighting method and device and storage medium - Google Patents

Intelligent household lighting method and device and storage medium

Info

Publication number: CN115802561A
Application number: CN202211562041.0A
Authority: CN (China)
Prior art keywords: user, area, activity, lighting, determining
Legal status: Pending
Other languages: Chinese (zh)
Inventor: 王春涛
Current assignee: Shenzhen Ledmy Co ltd
Original assignee: Shenzhen Ledmy Co ltd
Application filed by Shenzhen Ledmy Co ltd; priority to CN202211562041.0A


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

The application discloses an intelligent home lighting method, device and storage medium. The method comprises: obtaining first image information of a first preset area outside a target area; extracting person features from the first image information and determining whether the user currently located in the first preset area has permission to turn on the lighting; if so, determining the lighting devices that need to be turned on; and controlling those lighting devices to turn on. The method and device have the effect of widening the usage scenarios of smart home lighting.

Description

Intelligent household lighting method and device and storage medium
Technical Field
The present application relates to the field of smart home lighting, and in particular, to a method, an apparatus, and a storage medium for smart home lighting.
Background
Smart home lighting is a distributed system for remote measurement, remote control and remote communication, built from technologies such as computers, wireless data transmission, spread-spectrum power-line carrier communication, intelligent information processing and energy-saving appliance control, and it enables intelligent control of household lighting devices and even of other household living devices.
However, with current smart home lighting, a user must either turn the lighting on in advance through an authorized device or control it by voice command; that is, the user must operate the authorized device ahead of time or issue a voice command on the spot. When the user cannot conveniently operate the authorized device (for example, when both hands are occupied carrying items) and cannot issue voice commands (for example, a deaf-mute user or a user with an inflamed throat), the smart home lighting cannot be used, so its available usage scenarios are limited.
Disclosure of Invention
In order to widen the usage scenarios of smart home lighting, the present application provides a smart home lighting method, device and storage medium.
In a first aspect, the application provides a method for smart home lighting, which adopts the following technical scheme:
a method of smart home lighting, comprising:
acquiring first image information of a first preset area outside a target area;
extracting person features from the first image information, and determining whether the user currently located in the first preset area has permission to turn on the lighting;
if the user has the lighting turn-on permission, determining the lighting devices that need to be turned on;
and controlling the lighting devices to turn on.
By adopting the technical scheme, the target area is the area the user is about to enter, and the first preset area is a specific area set in advance. First image information of the first preset area outside the target area is obtained, and person features are extracted from it, so that it can be determined whether the user currently located in the first preset area has the permission to enter the target area, that is, the permission to turn on the lighting. When the user has the permission, the lighting devices to be turned on are determined, and those devices are then controlled to turn on. In this way the smart home lighting can actively decide whether lighting devices need to be turned on and which devices they are, so the user does not need to issue any instruction for the lighting devices to be turned on, which widens the usage scenarios of smart home lighting.
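The four steps above can be summarized in a short sketch. The following Python code is only an illustrative reading of this flow; the function and parameter names (capture_image, has_permission, select_devices, turn_on) are assumptions and not part of the application.

```python
from typing import Callable, Iterable

def smart_lighting_cycle(
    capture_image: Callable[[], bytes],
    has_permission: Callable[[bytes], bool],
    select_devices: Callable[[bytes], Iterable[str]],
    turn_on: Callable[[str], None],
) -> None:
    """One pass of the four-step flow: capture, authorize, select, switch on."""
    first_image = capture_image()               # step 1: image of the first preset area
    if not has_permission(first_image):         # step 2: person features + permission check
        return                                  # no permission: do nothing
    for device in select_devices(first_image):  # step 3: lighting devices to be turned on
        turn_on(device)                         # step 4: control those devices to turn on
```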
In another possible implementation manner, performing person feature extraction on the first image information and determining whether the user currently located in the first preset area has the lighting turn-on permission includes:
extracting person features from the first image information to obtain person feature information;
matching the identity of the person feature information against a preset person information base, wherein the preset person information base stores the person feature information of each user who has the lighting turn-on permission;
and if the matching succeeds, determining that the user has the lighting turn-on permission.
By adopting the technical scheme, person features are extracted from the first image information to obtain person feature information. The preset person information base is set in advance and stores the person feature information of every user who has the lighting turn-on permission, so by matching the extracted person feature information against this base it can be determined whether the user corresponding to the first image information has the permission to enter the target area, that is, whether the current user has the permission to turn on the lighting devices. When the matching succeeds, the user's person feature information exists in the preset person information base, so the user is determined to have the lighting turn-on permission.
In another possible implementation manner, the method further includes:
performing target identification on the first image information;
judging whether the user carries an article or not based on a target identification result;
wherein determining the lighting devices that need to be turned on comprises:
if the user carries an item, determining the type information of the item, and determining the lighting devices to be turned on based on the type information, wherein the type information corresponds to lighting devices;
if the user does not carry an item, determining identity information of the user, acquiring a historical activity track of the user based on the identity information, and determining the lighting devices to be turned on based on the historical activity track.
By adopting the technical scheme, the place a user needs to go to after entering the target area differs depending on whether the user is carrying an item, so the lighting devices to be turned on also differ. Target recognition is therefore first performed on the first image information so that, according to the target recognition result, it can be determined whether the user carries an item, which makes it easier to determine the lighting devices to be turned on. Specifically, when the user carries an item, the user will need to place the item in a specific area after entering the target area, and different type information of the item corresponds to different lighting devices; the type information of the item is therefore determined, and the lighting devices to be turned on are then determined from it. This achieves the effect of determining the lighting devices when the user carries an item. When the user does not carry an item, each user has corresponding historical activity tracks inside the target area, so the identity information of the user is determined, the historical activity tracks belonging to that user are obtained from the identity information, and the lighting devices to be turned on are determined from those historical activity tracks. This achieves the effect of determining the lighting devices when the user does not carry an item.
In another possible implementation manner, the historical activity track is a historical activity track including a second preset area; the historical activity track comprises at least two activity tracks, and each activity track corresponds to at least two activity areas;
wherein, based on the historical activity track, determining the equipment to be lighted to be turned on comprises:
determining a target track from the historical activity tracks, wherein the target track is the track with the largest occurrence frequency in the historical activity tracks;
determining an activity area contained in the target track, and determining lighting equipment corresponding to the activity area;
and determining the lighting equipment corresponding to the activity area as the equipment to be lighted needing to be turned on.
By adopting the technical scheme, the second preset area is an area set in advance, and the historical activity tracks are activity tracks that include the second preset area. Because each user may have at least two historical activity tracks inside the target area, a target track is determined from them, namely the track that occurs most frequently. Since each historical activity track corresponds to at least two activity areas, the activity areas contained in the target track are determined, the lighting devices corresponding to those activity areas are determined, and these lighting devices are taken as the lighting devices to be turned on. In this way the lighting devices to be turned on can be determined from the historical activity tracks.
In another possible implementation manner, the target area includes at least one activity area, each activity area corresponds to a lighting device, and after the lighting devices to be turned on are controlled to turn on, the method further includes:
acquiring the activity times of the user in each activity area within each preset fixed time period;
determining the activity area with the most activity times as a common activity area corresponding to each preset fixed time period;
acquiring current time and a current activity area where the user is currently located;
determining a preset fixed time period based on the current time;
if the current activity area does not belong to a target common activity area and the user is detected to be in the edge area of the current activity area, controlling lighting equipment of the target common activity area to be turned on, wherein the target common activity area is a common activity area corresponding to the preset fixed time period.
By adopting the technical scheme, the target area includes at least one activity area, each activity area corresponds to a lighting device, and the preset fixed time period is a time period set in advance. To conveniently provide lighting for the user inside the target area, the number of times the user is active in each activity area within each preset fixed time period is obtained, and the activity area with the most activity is determined as the common activity area for that period. The current time and the activity area where the user currently is are then obtained, and the preset fixed time period to which the current time belongs is determined, which makes it easy to judge whether the user is currently in the target common activity area, i.e. the common activity area corresponding to that time period. When the current activity area is not the target common activity area and the user is detected in the edge area of the current activity area, the user is likely to be about to leave it, so the lighting device of the target common activity area is controlled to turn on. This predicts the user's next activity area, makes the smart home lighting more intelligent, and improves the user's experience.
In another possible implementation manner, after controlling the lighting devices to turn on, the method further includes:
if it is detected that the user has left the activity area corresponding to a certain lighting device, calculating the duration for which the user has been away from that activity area;
and if the duration reaches a preset duration, controlling that lighting device to turn off.
By adopting the technical scheme, when the user is detected to have left the activity area corresponding to a lighting device, the absence may be short or long. The preset duration is a duration set in advance and serves as the criterion for judging whether the user has been away too long. The duration for which the user has been away from the activity area is therefore calculated, and when it reaches the preset duration, the user has been away from the activity area of that lighting device for a long time, so the lighting device is controlled to turn off, reducing wasted electricity.
In another possible implementation manner, after controlling the lighting devices to turn on, the method further includes:
acquiring second image information of the user;
performing behavior recognition on the second image information to obtain a behavior recognition result;
and adjusting the light intensity of the equipment to be illuminated based on the behavior recognition result.
By adopting the technical scheme, second image information of the user is acquired and behavior recognition is performed on it to obtain a behavior recognition result. Because different behavior states of the user require different light intensities, the light intensity of the lighting devices is adjusted according to the behavior recognition result, which intelligently meets the user's lighting needs and improves the user's experience.
In a second aspect, the present application provides a smart home lighting device, which adopts the following technical solution:
an intelligent home lighting device, comprising:
the first acquisition module is used for acquiring first image information of a first preset area outside the target area;
the first judgment module is used for extracting person features from the first image information and determining whether the user currently located in the first preset area has the lighting turn-on permission;
the first determining module is used for determining the lighting devices to be turned on when the user has the lighting turn-on permission;
and the first control module is used for controlling the lighting devices to turn on.
By adopting the technical scheme, the target area is the area the user is about to enter, and the first preset area is a specific area set in advance. The first obtaining module obtains first image information of the first preset area outside the target area; the first judging module extracts person features from it and determines whether the user currently located in the first preset area has the permission to enter the target area, that is, the permission to turn on the lighting. When the user has that permission, the first determining module determines the lighting devices to be turned on, and the first control module then controls them to turn on. In this way the smart home lighting can actively decide whether lighting devices need to be turned on and which ones, so the user does not need to issue any instruction for the household lighting devices to be turned on, which widens the usage scenarios of smart home lighting.
In another possible implementation manner, when extracting person features from the first image information and determining whether the user currently located in the first preset area has the lighting turn-on permission, the first judgment module is specifically configured to:
extract person features from the first image information to obtain person feature information;
match the identity of the person feature information against a preset person information base, wherein the preset person information base stores the person feature information of each user who has the lighting turn-on permission;
and if the matching succeeds, determine that the user has the lighting turn-on permission.
In another possible implementation manner, the apparatus further includes:
the first identification module is used for carrying out target identification on the first image information;
and the second judgment module is used for judging whether the user carries an article or not based on the target identification result.
In another possible implementation manner, when determining the device to be illuminated that needs to be turned on, the first determining module is specifically configured to:
if the user carries an article, determining the type information of the article, and determining the equipment to be illuminated to be turned on based on the type information, wherein the type information corresponds to the illumination equipment;
if the user does not carry an item, determining identity information of the user, acquiring a historical activity track of the user based on the identity information, and determining the lighting devices to be turned on based on the historical activity track.
In another possible implementation manner, when determining the to-be-illuminated device to be turned on based on the historical activity track, the first determining module is specifically configured to:
determining a target track from the historical activity tracks, wherein the target track is the track with the largest occurrence frequency in the historical activity tracks;
determining an activity area contained in the target track, and determining lighting equipment corresponding to the activity area;
and determining the lighting equipment corresponding to the activity area as the equipment to be lighted needing to be turned on.
In another possible implementation manner, the apparatus further includes:
the second acquisition module is used for acquiring the activity times of the user in each activity area within each preset fixed time period;
the second determining module is used for determining the activity area with the largest activity frequency as the common activity area corresponding to each preset fixed time period;
a third obtaining module, configured to obtain a current time and a current activity area where the user is currently located;
the third determining module is used for determining the preset fixed time period based on the current time;
and the second control module is used for controlling the lighting equipment of the target common activity area to be turned on when the current activity area does not belong to the target common activity area and the user is detected to be in the edge area of the current activity area, wherein the target common activity area is the common activity area corresponding to the preset fixed time period.
In another possible implementation manner, the apparatus further includes:
the calculating module is used for calculating the time length of the user leaving the activity area corresponding to a certain lighting device when the user is detected to leave the activity area corresponding to the certain lighting device;
and the third control module is used for controlling the certain lighting equipment to be closed when the duration reaches the preset duration.
In another possible implementation manner, the apparatus further includes:
the fourth acquisition module is used for acquiring second image information of the user;
the second identification module is used for carrying out behavior identification on the second image information to obtain a behavior identification result;
and the adjusting module is used for adjusting the light intensity of the equipment to be illuminated based on the behavior recognition result.
In a third aspect, the present application provides an electronic device, which adopts the following technical solutions:
an electronic device, comprising:
at least one processor;
a memory;
at least one application, wherein the at least one application is stored in the memory and configured to be executed by the at least one processor, the at least one application configured to: a method of performing smart home lighting according to any one of the possible implementations of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, which adopts the following technical solutions:
a computer-readable storage medium storing a computer program which, when executed on a computer, causes the computer to perform the smart home lighting method according to any one of the first aspect.
In summary, the present application includes at least one of the following beneficial technical effects:
1. The target area is the area the user is about to enter, and the first preset area is a specific area set in advance. First image information of the first preset area outside the target area is obtained and person features are extracted from it, so it can be determined whether the user currently located in the first preset area has the permission to enter the target area, that is, the lighting turn-on permission. When the user has that permission, the lighting devices to be turned on are determined and then controlled to turn on, so the smart home lighting can actively decide whether and which lighting devices need to be turned on; the user does not need to issue any instruction for the lighting devices to be turned on, which widens the usage scenarios of smart home lighting;
2. The target area includes at least one activity area, each activity area corresponds to a lighting device, and the preset fixed time period is a time period set in advance. To conveniently provide lighting for the user inside the target area, the number of times the user is active in each activity area within each preset fixed time period is obtained, and the activity area with the most activity is determined as the common activity area for that period. The current time and the user's current activity area are then obtained, and the preset fixed time period to which the current time belongs is determined, so that it can be judged whether the user is currently in the target common activity area, i.e. the common activity area corresponding to that time period. When the current activity area is not the target common activity area and the user is detected in the edge area of the current activity area, the user is likely to be about to leave it, so the lighting device of the target common activity area is controlled to turn on. This predicts the user's next activity area, makes the smart home lighting more intelligent, and improves the user's experience.
Drawings
Fig. 1 is a schematic flowchart of a method for smart home lighting in an embodiment of the present application.
Fig. 2 is a schematic structural diagram of an intelligent home lighting device in an embodiment of the present application.
Fig. 3 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
The present application is described in further detail below with reference to figures 1-3.
After reading this specification, a person skilled in the art may modify these embodiments as needed without making an inventive contribution, but such modifications are protected by patent law only within the scope of the claims of the present application.
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In addition, the term "and/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter associated objects are in an "or" relationship, unless otherwise specified.
The embodiments of the present application will be described in further detail with reference to the drawings attached hereto.
The embodiment of the application provides a smart home lighting method that is executed by an electronic device. The electronic device may be a server or a terminal device; the server may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing cloud computing services. The terminal device may be a smart phone, a tablet computer, a notebook computer, a desktop computer, or the like, but is not limited thereto. The terminal device and the server may be connected directly or indirectly by wired or wireless communication, and the embodiment of the present application is not limited in this respect. As shown in fig. 1, the method includes step S101, step S102, step S103, and step S104, wherein,
step S101, first image information of a first preset area outside a target area is obtained.
For the embodiment of the application, the first preset area is an area set in advance, and the target area is the area the user is about to enter, which is equipped with smart home lighting devices. Assume the target area is a house and the first preset area is the doorway of the house. First image information of the doorway outside the house is obtained, so that it can later be judged from this information whether the user at the doorway has the right to enter the house and whether the lighting devices in the house need to be turned on.
In the embodiment of the application, an image acquisition device such as a camera can be installed outside the target area, and the first image information is acquired through the image acquisition device such as the camera.
Step S102, extracting person features from the first image information, and judging whether the user currently in the first preset area has the lighting turn-on permission.
For the embodiment of the application, person features are extracted from the first image information to judge whether the user in the first preset area has the lighting turn-on permission, so that it is clearly known whether the lighting devices need to be turned on. Suppose first image information A is the image information of user A; person features are extracted from first image information A and it is judged whether user A has the lighting turn-on permission.
Step S103, if the user has the lighting starting authority, the equipment to be lighted needing to be started is determined.
For the embodiment of the application, continuing the example, when user A has the lighting turn-on permission, the lighting devices in the house need to be turned on. Because there are many lighting devices in the house, the specific devices to be turned on must be determined, so as to avoid turning on the lighting of the entire house, which would consume more power and waste electric energy.
And step S104, controlling the device to be illuminated to be turned on.
For the embodiment of the application, the lighting devices to be turned on are controlled to turn on, so that user A can control the lighting devices without operating an authorized device (for example, a terminal device such as a mobile phone) and without voice control. The ways of controlling the smart home lighting thus become more diverse: the lighting devices can still be controlled even when user A is deaf-mute or is carrying items with both hands, which widens the usage scenarios of smart home lighting.
In a possible implementation manner of the embodiment of the present application, when step S102 extracts person features from the first image information and judges whether the user currently located in the first preset area has the lighting turn-on permission, the method specifically includes step S1021 (not shown), step S1022 (not shown), and step S1023 (not shown), wherein,
step S1021, extracting the character feature of the first image information to obtain character feature information.
For the embodiment of the present application, taking step S102 as an example, the first image information a is image information of the user a, and the person feature extraction is performed on the first image information a to obtain person feature information a of the user a. So as to judge whether the user A has the lighting starting authority or not according to the character characteristic information A of the user A.
In the embodiment of the present application, when extracting person features from the first image information, the first image information may first be converted to grayscale to obtain grayscale image information, and edge detection may then be performed on the grayscale image to obtain contour information. The contours are filtered to determine the contour belonging to user A, and the person feature information of user A is determined from the first image information according to that contour.
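The grayscale-and-edge-detection pipeline described above could be sketched as follows. This is a minimal illustration assuming OpenCV is used; the Canny thresholds and the choice of the largest contour as the user's outline are assumptions, not details given in the application.

```python
import cv2
import numpy as np

def extract_person_contour(first_image: np.ndarray) -> np.ndarray | None:
    """Grayscale -> edge detection -> contours; return the largest contour,
    taken here (as an assumption) to be the person in the frame."""
    gray = cv2.cvtColor(first_image, cv2.COLOR_BGR2GRAY)   # grayscale processing
    edges = cv2.Canny(gray, 50, 150)                        # edge detection
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Filter contours: keep the largest one as the user's outline.
    return max(contours, key=cv2.contourArea)
```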
Step S1022, matching the identity of the person feature information against a preset person information base.
The preset person information base stores the person feature information corresponding to each user who has the lighting turn-on permission.
For the embodiment of the application, the preset person information base is set in advance and stores the person feature information of every user with the lighting turn-on permission, so whether user A has the lighting turn-on permission can be judged by matching user A's person feature information A against the preset person information base.
In step S1023, if the matching is successful, it is determined that the user has the lighting-on permission.
For the embodiment of the application, when person feature information A is matched successfully in the preset person information base, the person feature information of user A has been found there. Since the base only stores the person feature information of users with the lighting turn-on permission, a successful match indicates that user A has the lighting turn-on permission; this achieves the effect of judging whether user A has that permission.
In the embodiment of the application, the first image information may contain the person feature information of at least two users. The person feature information of each of these users can be matched against the preset person information base, and as long as one of them matches successfully, at least one of the users has the lighting turn-on permission, so the lighting devices in the target area can be controlled to turn on.
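A minimal sketch of this matching step, under the assumption that person feature information is represented as fixed-length feature vectors compared by cosine similarity against the preset person information base; the threshold value and data layout are illustrative only.

```python
import numpy as np

def any_user_authorized(feature_vectors: list[np.ndarray],
                        person_db: dict[str, np.ndarray],
                        threshold: float = 0.8) -> bool:
    """Return True if at least one extracted person feature matches a stored
    user with the lighting turn-on permission."""
    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

    for vec in feature_vectors:
        for stored in person_db.values():
            if cosine(vec, stored) >= threshold:
                return True   # matching succeeded: this user has permission
    return False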
One possible implementation manner of the embodiment of the present application further includes step S105 (not shown in the figure) and step S106 (not shown in the figure), wherein,
step S105, performs object recognition on the first image information.
For the embodiment of the application, the place a user goes to after entering the target area differs depending on whether the user is carrying an item, and therefore so do the lighting devices to be turned on. Target recognition is first performed on the first image information so that whether the user carries an item can be judged from the target recognition result, which makes it easier to accurately determine the lighting devices to be turned on later.
In the embodiment of the present application, when performing the target identification on the first image information, the hand image information of the user in the first image information may be determined first, and the target identification may be performed on the hand image information of the user.
And step S106, judging whether the user carries the article or not based on the target identification result.
For the embodiment of the application, assume the target recognition result obtained from first image information A is target recognition result A; whether user A carries an item is judged from target recognition result A, which facilitates the subsequent determination of the lighting devices to be turned on.
In this embodiment, the target recognition result may be all the feature information in user A's hand image information, and whether user A carries an item is judged by determining whether that feature information contains features other than user A's hand features.
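A sketch of this carrying-detection rule, assuming an object-recognition model has already produced labeled detections for the hand region of the first image; the label set and score threshold are assumptions.

```python
def user_carries_item(detections: list[dict]) -> bool:
    """Decide whether the user carries an item, given detections from an
    object-recognition model run on the hand region of the first image.
    Each detection is assumed to look like {"label": str, "score": float}."""
    HAND_LABELS = {"hand", "person"}   # labels counted as the user's own body
    return any(d["label"] not in HAND_LABELS and d["score"] > 0.5
               for d in detections)
```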
In a possible implementation manner of the embodiment of the present application, when determining the device to be illuminated that needs to be turned on in step S103, the method specifically includes step S1031 (not shown in the figure) and step S1032 (not shown in the figure), wherein,
and step S1031, if the user carries the article, determining the type information of the article, and determining the to-be-illuminated device to be turned on based on the type information.
Wherein the category information corresponds to the lighting device.
For this embodiment of the application, every item has type information, and every kind of type information corresponds to lighting devices. Therefore, when it is detected that the user carries an item, the type information of the item is determined, which makes it easy to determine the lighting devices the user will probably need according to the type information, achieving the effect of determining the lighting devices to be turned on.
For example, if the items carried by the user are vegetables, the type information of the items is determined to be food ingredients. When the user enters the house, the activity area the user is likely to go to is the kitchen; that is, the lighting device corresponding to food ingredients can be set as the kitchen lighting device, so the kitchen lighting device is determined to be the lighting device that currently needs to be turned on.
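The correspondence between type information and lighting devices can be expressed as a simple lookup table. The categories and device names below are illustrative assumptions, not values given in the application.

```python
# Example mapping from item type information to the lighting devices to turn on.
CATEGORY_TO_DEVICES = {
    "food_ingredients": ["hallway_light", "kitchen_light"],
    "books": ["hallway_light", "study_light"],
    "laundry": ["hallway_light", "balcony_light"],
}

def devices_for_item(category: str) -> list[str]:
    """Return the lighting devices associated with an item category;
    fall back to the hallway light alone when the category is unknown."""
    return CATEGORY_TO_DEVICES.get(category, ["hallway_light"])
```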
Step S1032, if the user does not carry an article, determining identity information of the user, obtaining a historical activity track of the user based on the identity information, and determining a device to be turned on based on the historical activity track.
For the embodiment of the application, when it is detected that the user does not carry an item, the lighting devices to be turned on can be determined by determining the identity information of the user and obtaining the user's historical activity tracks from that identity information. Assuming the identity information of user A is the wife, the historical activity tracks of user A may be hallway-kitchen, hallway-living room-master bedroom, and hallway-living room. Assuming the identity information of user B is the son, the historical activity tracks of user B may be hallway-living room-toy area and hallway-living room-children's room.
In a possible implementation manner of the embodiment of the application, the historical activity track is a historical activity track including a second preset area; the historical activity track comprises at least two activity tracks, and each activity track corresponds to at least two activity areas; step S1032 is to determine the to-be-illuminated device to be turned on based on the historical activity trace, and includes step S10321 (not shown in the figure), step S10322 (not shown in the figure), and step S10323 (not shown in the figure), wherein,
in step S10321, a target trajectory is determined from the historical activity trajectories.
The target track is the track with the largest occurrence frequency in the historical activity tracks.
For the embodiment of the present application, the second preset area is an area set in advance; assume it is the hallway. The obtained historical activity tracks are therefore activity tracks that include the hallway. Each time the user enters the house through the hallway, the activity area the user heads to may differ, so each user has at least two historical activity tracks, and each track has a starting point and an end point, so each track corresponds to at least two activity areas. The target track is the track that occurs most often among the historical activity tracks. Assuming user A produced the three tracks hallway-kitchen, hallway-living room-master bedroom, and hallway-living room 30 times, 10 times, and 15 times respectively, the target track of user A is hallway-kitchen. Determining the target track hallway-kitchen from these tracks predicts the activity track user A is currently most likely to follow, which makes it easy to judge which lighting devices currently need to be turned on.
Step S10322, determining an active area included in the target trajectory, and determining an illumination device corresponding to the active area.
For the embodiment of the application, each track is composed of at least two activity areas, so the activity areas contained in the target track can be determined. Since each activity area corresponds to a lighting device, once the activity areas contained in the target track are determined, the corresponding lighting devices can be determined, and the lighting devices to be turned on can then be determined.
Continuing the example of step S10321, the target track of user A is determined to be hallway-kitchen, so the activity areas that need lighting are the hallway and the kitchen, and the corresponding lighting devices are the hallway lighting device and the kitchen lighting device.
Step S10323, determine the lighting device corresponding to the active area as the device to be illuminated that needs to be turned on.
For the embodiment of the present application, continuing the example of step S10322, the hallway lighting device and the kitchen lighting device are determined as the lighting devices to be turned on, thereby achieving the effect of determining the lighting devices to be turned on.
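Steps S10321 to S10323 can be sketched as follows, assuming each historical activity track is stored as a tuple of activity area names; the counts 30, 10 and 15 reproduce the example above, and the device names are assumptions.

```python
from collections import Counter

def devices_from_history(history: list[tuple[str, ...]],
                         second_preset_area: str,
                         area_to_device: dict[str, str]) -> list[str]:
    """Pick the most frequent historical track that contains the second preset
    area and return the lighting devices of the areas it passes through."""
    candidate_tracks = [t for t in history if second_preset_area in t]
    if not candidate_tracks:
        return []
    target_track, _ = Counter(candidate_tracks).most_common(1)[0]
    return [area_to_device[a] for a in target_track if a in area_to_device]

# Example with the tracks from the description (30, 10 and 15 occurrences):
history = ([("hallway", "kitchen")] * 30
           + [("hallway", "living room", "master bedroom")] * 10
           + [("hallway", "living room")] * 15)
area_to_device = {"hallway": "hallway_light", "kitchen": "kitchen_light",
                  "living room": "living_room_light",
                  "master bedroom": "bedroom_light"}
print(devices_from_history(history, "hallway", area_to_device))
# -> ['hallway_light', 'kitchen_light']
```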
In a possible implementation manner of the embodiment of the present application, step S104 further includes step S107 (not shown), step S108 (not shown), step S109 (not shown), step S110 (not shown), and step S111 (not shown), wherein,
and step S107, acquiring the activity times of the user in each activity area in each preset fixed time period.
For the embodiment of the application, the preset fixed time period is a time period set in advance. The number of times the user is active in each activity area within each such period is obtained so that the common activity area corresponding to each preset fixed time period can be determined later. Assume that, within a given preset fixed time period, user A appears 1 time in the kitchen, 1 time in the living room, and 5 times in the toilet.
Step S108, the activity area with the most activity times is determined as the common activity area corresponding to each preset fixed time period.
For the embodiment of the present application, continuing the example of step S107, the toilet can be taken as user A's common activity area for that preset fixed time period. This makes it convenient to later predict, from the common activity area, the activity area the user is likely to go to and to turn on the corresponding lighting device in advance, which makes the smart home lighting more considerate and improves the user's experience.
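Steps S107 and S108 amount to counting visits per time period and taking the most frequent area. A minimal sketch, assuming the activity log is a list of (period, area) records; the record format and period names are assumptions.

```python
from collections import defaultdict, Counter

def common_areas_by_period(activity_log: list[tuple[str, str]]) -> dict[str, str]:
    """Return the most-visited activity area for each preset fixed time period."""
    per_period: dict[str, Counter] = defaultdict(Counter)
    for period, area in activity_log:
        per_period[period][area] += 1
    return {period: counts.most_common(1)[0][0]
            for period, counts in per_period.items()}

# Example from the description: within one period user A is seen once in the
# kitchen, once in the living room and five times in the toilet.
log = [("period_1", "kitchen"), ("period_1", "living room")] + \
      [("period_1", "toilet")] * 5
print(common_areas_by_period(log))   # -> {'period_1': 'toilet'}
```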
Step S109, acquiring the current time and the current activity area where the user is currently located.
For the embodiment of the application, the current time is obtained so that the activity area the user may go to can be predicted from it. The current activity area where the user is located is also obtained, so that it can be judged whether the user is currently in the common activity area the electronic device predicts for the current time. Assume that the current activity area of user A is the bedroom.
And step S110, determining the corresponding preset fixed time period based on the current time.
For the embodiment of the present application, continuing the example of step S107, the preset fixed time period to which the current time belongs is determined, so that the activity area user A is most likely to go to at the current time can be determined later.
And step S111, if the current activity area does not belong to the target common activity area and the user is detected to be in the edge area of the current activity area, controlling the lighting equipment of the target common activity area to be turned on.
And the target common activity area is a common activity area corresponding to the preset fixed time period.
For the embodiment of the application, the target common activity area is the common activity area corresponding to the preset fixed time period. Continuing the examples of steps S108 and S110, the target common activity area is the toilet, and from step S109 the current activity area of user A is the bedroom, i.e. the current activity area does not belong to the target common activity area, which means user A may be about to go to the toilet. To further determine whether user A indeed needs to go there, it is detected whether user A is located in the edge area of the bedroom; when user A is in the edge area of the bedroom, there is a high probability that user A is leaving the bedroom for the toilet, so the lighting device of the toilet is controlled to turn on. This makes it convenient for the user to go to the toilet and further improves the experience of using the smart home lighting.
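The turn-on rule of step S111 reduces to a single condition. A minimal sketch, with the edge-area detection and the area-to-device mapping assumed to be provided by other components.

```python
from typing import Callable

def predictive_turn_on(current_area: str,
                       user_in_edge_zone: bool,
                       target_common_area: str,
                       area_to_device: dict[str, str],
                       turn_on: Callable[[str], None]) -> None:
    """If the user is not in the common area predicted for the current time
    period and is standing in the edge zone of the current area, switch on
    the lighting of the predicted common area in advance."""
    if current_area != target_common_area and user_in_edge_zone:
        turn_on(area_to_device[target_common_area])
```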
In a possible implementation manner of the embodiment of the present application, step S104 further includes step S112 (not shown in the figure) and step S113 (not shown in the figure), wherein,
step S112, if it is detected that the user leaves the activity area corresponding to the certain lighting device, calculating a duration for the user to leave the activity area corresponding to the certain lighting device.
For the embodiment of the present application, assume the lighting device in question is the one in the living room, whose corresponding activity area is the living room. When user A is detected to have left the living room, the duration of the absence is calculated so that it can be judged whether user A has left the living room for a long time (for example, to go back to the bedroom to rest) or only briefly (for example, to fetch food from the kitchen).
In the embodiment of the present application, the manner of determining whether the user leaves the active area may be that a thermal imaging instrument is installed in the active area to obtain thermal imaging of the active area, and whether the thermal imaging of the active area includes thermal imaging of the user is determined, so as to determine whether the user leaves the active area. The camera can be arranged in the activity area to collect image information in the activity area, and whether the user leaves the activity area is determined by analyzing the image information. Other ways of determining whether the user has left the active area are also possible.
And step S113, if the duration reaches the preset duration, controlling a certain lighting device to be turned off.
For the embodiment of the application, the preset duration is a duration set in advance and serves as the criterion for judging whether the current user A has been away from the activity area for too long. Assuming the preset duration is 5 min, when user A has been away from the living room for 5 min, the preset duration is reached; user A has been away for a long time and is unlikely to return to the living room, so the living room lighting device is controlled to turn off, reducing the waste of electric energy.
In the embodiment of the application, the user can set the preset duration through an input device such as a keyboard, a mouse or a touch screen. Alternatively, the electronic device can record the duration of each of the user's departures from an activity area and take the average of the durations of several short departures as the preset duration for that activity area. Because the target area includes at least one activity area, each activity area can have its own preset duration, which makes the judgment of whether the user has left the area briefly or for a long time more accurate and the timing of turning the lighting device off more accurate. This reduces the extra on/off cycles caused by misjudgment, which would otherwise shorten the service life of the lighting device through frequent switching; whether the current lighting device needs to be turned off is therefore determined more accurately, and the lighting device lasts longer.
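The auto-off logic of steps S112 and S113, including the learned per-area preset duration described above, could look like the following sketch; the 5-minute default and the use of monotonic clock timestamps are assumptions.

```python
import time
from statistics import mean

class AutoOffController:
    """Turn a light off once the user has been away from its activity area
    longer than a per-area preset duration; the preset can be set manually
    or learned as the mean of the user's previous short absences."""

    def __init__(self, default_preset_s: float = 300.0):   # 5 min, as in the example
        self.preset_s: dict[str, float] = {}
        self.default_preset_s = default_preset_s
        self.left_at: dict[str, float] = {}

    def learn_preset(self, area: str, short_absences_s: list[float]) -> None:
        # Average of past short departures becomes that area's threshold.
        if short_absences_s:
            self.preset_s[area] = mean(short_absences_s)

    def user_left(self, area: str) -> None:
        self.left_at[area] = time.monotonic()

    def should_turn_off(self, area: str) -> bool:
        if area not in self.left_at:
            return False
        away = time.monotonic() - self.left_at[area]
        return away >= self.preset_s.get(area, self.default_preset_s)
```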
In a possible implementation manner of the embodiment of the present application, step S104 further includes step S114 (not shown), step S115 (not shown), and step S116 (not shown), wherein,
and step S114, acquiring second image information of the user.
For the embodiment of the application, the second image information of the user is obtained, so that the current behavior state of the user can be determined subsequently. In the embodiment of the application, an image acquisition device such as a camera may be installed in the target area, and the second image information of the user is acquired through the image acquisition device such as the camera.
And step S115, performing behavior recognition on the second image information to obtain a behavior recognition result.
For the embodiment of the application, the second image information is subjected to behavior recognition to obtain a behavior recognition result, so that the current behavior of the user can be determined conveniently. It is assumed that after the second image information of the user a is subjected to behavior recognition, a behavior recognition result obtained is that the user a watches television in a living room.
And step S116, adjusting the light intensity of the equipment to be illuminated based on the behavior recognition result.
For the embodiment of the present application, taking step S115 as an example, when the user a watches a television in a living room, the light intensity of the lighting device in the living room may be adjusted, so that the viewing environment of the user a is more comfortable, and the experience of the user is improved.
In the embodiment of the application, when adjusting the light intensity of the lighting device in the living room, the intensity can be adjusted according to the light intensity the user has commonly used when watching films in the past. Further, from the content of the film obtained over a specific time, the film genre (such as horror, thriller or romance) can be determined in order to determine the optimal viewing environment for user A. For example, if user A's history shows that level-3 light intensity was used when watching horror films, then when the detected behavior is that user A is watching a film and the film genre is horror, the light intensity is adjusted to level 3.
Furthermore, the behavior states of the user under different light intensities can be recorded; the light intensity associated with the user's current behavior state is then looked up, and the light intensity is adjusted accordingly.
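The behavior-to-intensity adjustment of steps S114 to S116 can be expressed as a lookup keyed by the recognized behavior and, for film watching, the film genre. The intensity levels below are illustrative assumptions, except that level 3 for horror films reproduces the example above.

```python
BEHAVIOR_INTENSITY = {
    "reading": 8,
    "dining": 6,
    "sleeping": 0,
}
FILM_GENRE_INTENSITY = {
    "horror": 3,     # e.g. user A historically watched horror films at level 3
    "romance": 4,
}

def target_intensity(behavior: str, film_genre: str | None = None,
                     default: int = 5) -> int:
    """Return the light-intensity level for the recognized behavior."""
    if behavior == "watching_film" and film_genre is not None:
        return FILM_GENRE_INTENSITY.get(film_genre, default)
    return BEHAVIOR_INTENSITY.get(behavior, default)
```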
The above embodiment introduces a method for smart home lighting from the perspective of a method flow, and the following embodiment introduces a device for smart home lighting from the perspective of a virtual module or a virtual unit, which is described in detail in the following embodiment.
The embodiment of the present application provides a device 20 for smart home lighting, as shown in fig. 2, the device 20 for smart home lighting specifically may include:
a first obtaining module 201, configured to obtain first image information of a first preset region outside a target region;
the first judging module 202 is configured to extract person features from the first image information and judge whether the user currently located in the first preset area has the lighting turn-on permission;
the first determining module 203 is used for determining the to-be-illuminated equipment to be turned on when the user has the lighting turning-on permission;
the first control module 204 is configured to control the device to be illuminated to be turned on.
For the embodiment of the application, the target area is the area the user is about to enter, and the first preset area is a specific area set in advance. The first obtaining module 201 obtains the first image information of the first preset area outside the target area; the first judging module 202 extracts person features from it and judges whether the user currently located in the first preset area has the permission to enter the target area, that is, the permission to turn on the lighting. When the user has that permission, the first determining module 203 determines the lighting devices to be turned on, and the first control module 204 then controls them to turn on. In this way the smart home lighting can actively judge whether lighting devices need to be turned on and which ones, so the user does not need to issue any instruction for the lighting devices to be turned on, which widens the usage scenarios of smart home lighting.
In a possible implementation manner of the embodiment of the application, the first determining module 202 is specifically configured to, when performing character feature extraction on the first image information and determining whether a user currently located in the first preset area has a lighting-on right:
extracting the character features of the first image information to obtain character feature information;
carrying out identity matching on the character feature information in a preset character information base, wherein the character feature information corresponding to each user with the lighting-on permission is stored in the preset character information base;
and if the matching is successful, determining that the user has the lighting-on permission.
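A minimal sketch of the identity-matching step is given below. The disclosure only states that the extracted character feature information is matched against a preset character information base; representing features as embedding vectors and comparing them by cosine similarity against a threshold is an assumption made here for illustration.

```python
import math
from typing import Optional

def cosine_similarity(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match_identity(features: list[float],
                   preset_database: dict[str, list[float]],
                   threshold: float = 0.9) -> Optional[str]:
    """Return the user whose stored features best match, or None if no match."""
    best_user, best_score = None, 0.0
    for user_id, stored in preset_database.items():
        score = cosine_similarity(features, stored)
        if score > best_score:
            best_user, best_score = user_id, score
    return best_user if best_score >= threshold else None

# A successful match means the user has the lighting-on permission.
database = {"user_a": [0.10, 0.90, 0.30]}
print(match_identity([0.10, 0.88, 0.31], database) is not None)  # -> True
```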
In a possible implementation manner of the embodiment of the present application, the apparatus 20 further includes:
the first identification module is used for carrying out target identification on the first image information;
and the second judgment module is used for judging whether the user carries the article or not based on the target identification result.
In a possible implementation manner of the embodiment of the present application, when determining the device to be illuminated that needs to be turned on, the first determining module 203 is specifically configured to (see the sketch following this list):
if the user carries an article, determining the type information of the article and determining the device to be illuminated that needs to be turned on based on the type information, wherein the type information corresponds to a lighting device;
and if the user does not carry an article, determining the identity information of the user, obtaining the historical activity track of the user based on the identity information, and determining the device to be illuminated that needs to be turned on based on the historical activity track.
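The following sketch illustrates this branching logic. The mapping from article types to lighting devices is hypothetical (the disclosure only says that type information corresponds to lighting equipment), and `history_lookup` stands in for the track-based prediction covered by the next sketch.

```python
from typing import Callable, Optional

# Hypothetical mapping from detected article types to lighting devices;
# the concrete entries are illustrative only.
ARTICLE_TO_DEVICE = {
    "groceries": "kitchen_light",
    "book": "study_light",
    "laundry": "balcony_light",
}

def choose_device(carried_article: Optional[str],
                  user_id: Optional[str],
                  history_lookup: Callable[[str], Optional[str]]) -> Optional[str]:
    """Pick the device to turn on from the article type or, failing that,
    from the user's historical activity track."""
    if carried_article is not None:
        return ARTICLE_TO_DEVICE.get(carried_article)
    if user_id is not None:
        # history_lookup predicts the device from the user's most frequent
        # historical activity track (see the following sketch).
        return history_lookup(user_id)
    return None

print(choose_device("groceries", None, lambda uid: None))        # -> kitchen_light
print(choose_device(None, "user_a", lambda uid: "study_light"))  # -> study_light
```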
In a possible implementation manner of the embodiment of the present application, when determining, based on the historical activity track, the device to be illuminated that needs to be turned on, the first determining module 203 is specifically configured to (see the sketch following this list):
determining a target track from the historical activity tracks, wherein the target track is the track that occurs most frequently among the historical activity tracks;
determining the activity area contained in the target track, and determining the lighting device corresponding to the activity area;
and determining the lighting device corresponding to the activity area as the device to be illuminated that needs to be turned on.
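A minimal sketch of this track-based selection follows. The mapping from activity areas to lighting devices is an assumption for illustration; the disclosure only requires that each activity area corresponds to a lighting device.

```python
from collections import Counter

# Hypothetical mapping from activity areas to their lighting devices.
AREA_TO_DEVICE = {
    "living_room": "living_room_light",
    "kitchen": "kitchen_light",
    "study": "study_light",
}

def devices_from_history(activity_tracks: list[tuple[str, ...]]) -> list[str]:
    """Pick the most frequent track and return the lights of the areas it contains."""
    if not activity_tracks:
        return []
    target_track, _ = Counter(activity_tracks).most_common(1)[0]
    return [AREA_TO_DEVICE[area] for area in target_track if area in AREA_TO_DEVICE]

tracks = [("living_room", "kitchen"), ("living_room", "kitchen"), ("study",)]
print(devices_from_history(tracks))  # -> ['living_room_light', 'kitchen_light']
```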
In a possible implementation manner of the embodiment of the present application, the apparatus 20 further includes:
the second obtaining module is used for obtaining the number of times the user is active in each activity area within each preset fixed time period;
the second determining module is used for determining the activity area with the largest number of activities as the common activity area corresponding to each preset fixed time period;
the third obtaining module is used for obtaining the current time and the current activity area where the user is currently located;
the third determining module is used for determining the preset fixed time period based on the current time;
and the second control module is used for controlling the lighting device of the target common activity area to be turned on when the current activity area does not belong to the target common activity area and the user is detected to be in the edge area of the current activity area, wherein the target common activity area is the common activity area corresponding to the preset fixed time period (see the sketch below).
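The following sketch illustrates the pre-lighting logic of the second control module under simplified assumptions: time periods are plain labels, activity is logged as (period, area) pairs, and edge detection is reduced to a boolean flag supplied by the caller.

```python
from collections import Counter
from typing import Optional

def common_area_per_period(activity_log: list[tuple[str, str]]) -> dict[str, str]:
    """activity_log holds (period, area) pairs; return the most visited area per period."""
    counts: dict[str, Counter] = {}
    for period, area in activity_log:
        counts.setdefault(period, Counter())[area] += 1
    return {period: counter.most_common(1)[0][0] for period, counter in counts.items()}

def area_to_pre_light(current_period: str, current_area: str, at_edge: bool,
                      common_areas: dict[str, str]) -> Optional[str]:
    """Return the common activity area whose light should be turned on in advance,
    or None when no pre-lighting is needed."""
    target = common_areas.get(current_period)
    if target and current_area != target and at_edge:
        return target
    return None

log = [("evening", "kitchen"), ("evening", "kitchen"), ("evening", "study")]
common = common_area_per_period(log)
print(area_to_pre_light("evening", "living_room", at_edge=True, common_areas=common))
# -> kitchen
```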
In a possible implementation manner of the embodiment of the present application, the apparatus 20 further includes:
the calculating module is used for calculating, when it is detected that the user has left the activity area corresponding to a certain lighting device, the duration for which the user has been away from that activity area;
and the third control module is used for controlling that lighting device to be turned off when the duration reaches a preset duration (see the sketch below).
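A minimal sketch of the turn-off timing follows; the five-minute default timeout is an illustrative assumption, as the disclosure only speaks of a preset duration.

```python
import time

class DepartureTimer:
    """Tracks how long the user has been away from each device's activity area
    and reports when that device may be switched off."""

    def __init__(self, preset_duration_seconds: float = 300.0):
        # 300 s is an illustrative preset duration, not a value from the disclosure.
        self.preset_duration = preset_duration_seconds
        self._left_at: dict[str, float] = {}

    def user_left(self, device_id: str) -> None:
        self._left_at[device_id] = time.monotonic()

    def user_returned(self, device_id: str) -> None:
        self._left_at.pop(device_id, None)

    def should_turn_off(self, device_id: str) -> bool:
        left_at = self._left_at.get(device_id)
        return left_at is not None and time.monotonic() - left_at >= self.preset_duration
```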
In a possible implementation manner of the embodiment of the present application, the apparatus 20 further includes:
the fourth obtaining module is used for obtaining second image information of the user;
the second identification module is used for carrying out behavior identification on the second image information to obtain a behavior identification result;
and the adjusting module is used for adjusting the light intensity of the device to be illuminated based on the behavior recognition result (see the sketch below).
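The following sketch shows one way the adjusting module could map a behavior recognition result to a light intensity level; the behavior labels and level values are illustrative assumptions.

```python
from typing import Optional

# Hypothetical mapping from recognized behaviors to intensity levels; the
# disclosure only states that the intensity is adjusted based on the behavior
# recognition result, so these labels and values are illustrative.
BEHAVIOR_TO_LEVEL = {
    "reading": 8,
    "watching_film": 3,
    "sleeping": 0,
}

def adjust_intensity(behavior: Optional[str], default_level: int = 5) -> int:
    """Return the intensity level for the recognized behavior, or a default."""
    if behavior is None:
        return default_level
    return BEHAVIOR_TO_LEVEL.get(behavior, default_level)

print(adjust_intensity("watching_film"))  # -> 3
print(adjust_intensity(None))             # -> 5
```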
It can be clearly understood by those skilled in the art that, for convenience and simplicity of description, reference may be made to the corresponding process in the foregoing method embodiment for the specific working process of the smart home lighting apparatus 20 described above, and details are not repeated herein.
In this embodiment, the first obtaining module 201, the second obtaining module, the third obtaining module, and the fourth obtaining module may be the same obtaining module, different obtaining modules, or partially the same obtaining module. The first judging module 202 and the second judging module may be the same judging module or different judging modules. The first determining module 203, the second determining module, and the third determining module may be the same determining module, different determining modules, or partially the same determining module. The first control module 204, the second control module, and the third control module may be the same control module, different control modules, or partially the same control module. The first identification module and the second identification module may be the same identification module or different identification modules.
An embodiment of the present application provides an electronic device. As shown in fig. 3, the electronic device 30 includes a processor 301 and a memory 303, wherein the processor 301 is coupled to the memory 303, for example via a bus 302. Optionally, the electronic device 30 may also include a transceiver 304. It should be noted that, in practical applications, the number of transceivers 304 is not limited to one, and the structure of the electronic device 30 does not limit the embodiments of the present application.
The processor 301 may be a CPU (Central Processing Unit), a general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field Programmable Gate Array) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein. The processor 301 may also be a combination that implements computing functions, for example, a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
The bus 302 may include a path that transfers information between the above components. The bus 302 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like, and may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in fig. 3, but this does not mean that there is only one bus or one type of bus.
The memory 303 may be a ROM (Read-Only Memory) or other type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or other type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a CD-ROM (Compact Disc Read-Only Memory) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, and the like), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
The memory 303 is used for storing application program code for executing the solution of the present application, and its execution is controlled by the processor 301. The processor 301 is configured to execute the application program code stored in the memory 303 to implement the content shown in the foregoing method embodiments.
The electronic device includes, but is not limited to, mobile terminals such as mobile phones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and in-vehicle terminals (e.g., in-vehicle navigation terminals), fixed terminals such as digital TVs and desktop computers, and servers. The electronic device shown in fig. 3 is only an example and should not impose any limitation on the functions and scope of use of the embodiments of the present application.
The embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when the computer program runs on a computer, the computer is enabled to execute the corresponding content in the foregoing method embodiments. Compared with the prior art, in the embodiment of the application, the target area is an area that the user is about to enter, and the first preset area is a specific area set in advance. The first image information of the first preset area outside the target area is obtained so as to judge whether the user currently located in the first preset area has permission to enter the target area. Character feature extraction is performed on the first image information to judge whether that user has permission to enter the target area, that is, whether the user has the lighting-on permission. When the user has the lighting-on permission, the device to be illuminated that needs to be turned on is determined and controlled to be turned on. In this way, the smart home lighting system can actively judge whether lighting devices need to be turned on and determine which specific devices to turn on, so the lighting devices can be controlled without the user issuing an instruction, which widens the use scene of smart home lighting.
It should be understood that, although the steps in the flowcharts of the figures are shown in sequence as indicated by the arrows, these steps are not necessarily performed in that sequence. Unless explicitly stated herein, there is no strict order restriction on the execution of these steps, and they may be performed in other orders. Moreover, at least some of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be performed at different times, and their execution order is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
The foregoing describes only some embodiments of the present application. It should be noted that, for those skilled in the art, various modifications and improvements can be made without departing from the principle of the present application, and these modifications and improvements shall also fall within the protection scope of the present application.

Claims (10)

1. A method for smart home lighting, comprising:
acquiring first image information of a first preset area outside a target area;
extracting character features of the first image information, and judging whether a user currently located in the first preset area has the lighting-on permission;
if the user has the lighting-on permission, determining a device to be illuminated that needs to be turned on;
and controlling the device to be illuminated to be turned on.
2. The method for smart home lighting according to claim 1, wherein the extracting character features of the first image information and judging whether the user currently located in the first preset area has the lighting-on permission comprises:
extracting the character features of the first image information to obtain character feature information;
carrying out identity matching on the character feature information in a preset character information base, wherein character feature information corresponding to each user with the lighting-on permission is stored in the preset character information base;
and if the matching is successful, determining that the user has the lighting-on permission.
3. The method of claim 1, further comprising:
performing target identification on the first image information;
judging whether the user carries an article or not based on a target identification result;
wherein the determining a device to be illuminated that needs to be turned on comprises:
if the user carries an article, determining the type information of the article, and determining the device to be illuminated that needs to be turned on based on the type information, wherein the type information corresponds to a lighting device;
if the user does not carry an article, determining identity information of the user, obtaining a historical activity track of the user based on the identity information, and determining the device to be illuminated that needs to be turned on based on the historical activity track.
4. The intelligent home lighting method according to claim 3, wherein the historical activity track is a historical activity track comprising a second preset area; the historical activity track comprises at least two activity tracks, and each activity track corresponds to at least two activity areas;
wherein the determining, based on the historical activity track, the device to be illuminated that needs to be turned on comprises:
determining a target track from the historical activity tracks, wherein the target track is the track that occurs most frequently among the historical activity tracks;
determining an activity area contained in the target track, and determining a lighting device corresponding to the activity area;
and determining the lighting device corresponding to the activity area as the device to be illuminated that needs to be turned on.
5. The method according to claim 1, wherein the target area comprises at least one activity area, each activity area corresponds to a lighting device, and after the controlling the device to be illuminated to be turned on, the method further comprises:
acquiring the number of times the user is active in each activity area within each preset fixed time period;
determining the activity area with the largest number of activities as the common activity area corresponding to each preset fixed time period;
acquiring current time and a current activity area where the user is currently located;
determining a preset fixed time period based on the current time;
if the current activity area does not belong to a target common activity area and the user is detected to be in an edge area of the current activity area, controlling a lighting device of the target common activity area to be turned on, wherein the target common activity area is the common activity area corresponding to the preset fixed time period.
6. The method according to claim 5, wherein after the device to be illuminated is controlled to be turned on, the method further comprises:
if it is detected that the user leaves an activity area corresponding to a certain lighting device, calculating the duration for which the user has been away from the activity area corresponding to the certain lighting device;
and if the duration reaches a preset duration, controlling the certain lighting device to be turned off.
7. The method according to claim 1, wherein after the controlling the device to be illuminated to be turned on, the method further comprises:
acquiring second image information of the user;
performing behavior recognition on the second image information to obtain a behavior recognition result;
and adjusting the light intensity of the equipment to be illuminated based on the behavior recognition result.
8. A device for smart home lighting, comprising:
the first acquisition module is used for acquiring first image information of a first preset area outside the target area;
the first judging module is used for extracting character features of the first image information and judging whether a user currently located in the first preset area has the lighting-on permission;
the first determining module is used for determining the device to be illuminated that needs to be turned on when the user has the lighting-on permission;
and the first control module is used for controlling the device to be illuminated to be turned on.
9. An electronic device, comprising:
at least one processor;
a memory;
at least one application, wherein the at least one application is stored in the memory and configured to be executed by the at least one processor, and the at least one application is configured to perform the method for smart home lighting according to any one of claims 1 to 7.
10. A computer-readable storage medium on which a computer program is stored, wherein, when the computer program is executed in a computer, the computer is caused to perform the method for smart home lighting according to any one of claims 1 to 7.
CN202211562041.0A 2022-12-07 2022-12-07 Intelligent household lighting method and device and storage medium Pending CN115802561A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211562041.0A CN115802561A (en) 2022-12-07 2022-12-07 Intelligent household lighting method and device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211562041.0A CN115802561A (en) 2022-12-07 2022-12-07 Intelligent household lighting method and device and storage medium

Publications (1)

Publication Number Publication Date
CN115802561A true CN115802561A (en) 2023-03-14

Family

ID=85417563

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211562041.0A Pending CN115802561A (en) 2022-12-07 2022-12-07 Intelligent household lighting method and device and storage medium

Country Status (1)

Country Link
CN (1) CN115802561A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116685033A (en) * 2023-06-21 2023-09-01 惠州兴通成机电技术有限公司 Intelligent control system for lamp
CN116685033B (en) * 2023-06-21 2024-01-12 惠州兴通成机电技术有限公司 Intelligent control system for lamp

Similar Documents

Publication Publication Date Title
CN108447480B (en) Intelligent household equipment control method, intelligent voice terminal and network equipment
CN104424484B (en) Application program switching, the method and device for adding access information
CN105446143B (en) information processing method and electronic equipment
CN107527048B (en) Fingerprint identification method and device, storage medium and mobile terminal
CN109033247B (en) Application program management method and device, storage medium and terminal
CN107085380A (en) A kind of intelligent domestic system customer location determination methods and electronic equipment
CN107943570B (en) Application management method and device, storage medium and electronic equipment
CN115802561A (en) Intelligent household lighting method and device and storage medium
CN112445410B (en) Touch event identification method and device and computer readable storage medium
CN113206774A (en) Control method and device of intelligent household equipment based on indoor positioning information
CN107871000A (en) Audio frequency playing method, device, storage medium and electronic equipment
CN114755931A (en) Control instruction prediction method and device, storage medium and electronic device
CN112654957B (en) Suspended window control method and related products
CN107545052A (en) Information recommendation method, device, mobile terminal and storage medium
CN108646908A (en) A kind of terminal screen control method, device, terminal and storage medium
CN107145221A (en) A kind of information processing method and electronic equipment
WO2021213263A1 (en) Call window control method and apparatus, mobile terminal, and readable storage medium
CN108960213A (en) Method for tracking target, device, storage medium and terminal
CN106707741B (en) Electrical equipment control method and device
CN105373720B (en) A kind of module control method and device applied to mobile terminal
CN111093059A (en) Monitoring method and related equipment
CN106557039A (en) A kind of information recommendation method and electronic equipment
CN112417197B (en) Sorting method, sorting device, machine readable medium and equipment
CN113383311B (en) Application processing method and device, storage medium and electronic equipment
CN111415191B (en) User classification method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination