CN114189969B - Lamp control method, device, electronic equipment and computer readable storage medium


Publication number
CN114189969B
Authority
CN
China
Prior art keywords
target
activity scene
user
target user
lamp
Prior art date
Legal status
Active
Application number
CN202111671534.3A
Other languages
Chinese (zh)
Other versions
CN114189969A (en)
Inventor
章勇
郑天航
张正华
孙国涛
丁冉
Current Assignee
Suzhou Op Lighting Co Ltd
Original Assignee
Suzhou Op Lighting Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Op Lighting Co Ltd filed Critical Suzhou Op Lighting Co Ltd
Priority to CN202111671534.3A
Publication of CN114189969A
Application granted
Publication of CN114189969B

Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10 Controlling the light source
    • H05B 47/105 Controlling the light source in response to determined parameters
    • H05B 47/115 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B 47/125 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10 Controlling the light source
    • H05B 47/175 Controlling the light source by remote control
    • H05B 47/19 Controlling the light source by remote control via wireless transmission
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Abstract

The application discloses a lamp control method, a lamp control device, an electronic device, and a computer readable storage medium, which relate to the technical field of automatic control and are used to solve the problem in the related art that manually adjusting the lighting parameters of a lamp is cumbersome for the user. The lamp control method comprises the following steps: acquiring a target image of a target user; determining, according to the target image, a target activity scene in which the target user is located; and controlling the lighting parameters of the lamp according to the target activity scene, so that the lighting parameters of the lamp match the target activity scene.

Description

Lamp control method, device, electronic equipment and computer readable storage medium
Technical Field
The present application relates to the field of automatic control technologies, and in particular, to a lamp control method and apparatus, an electronic device, and a computer readable storage medium.
Background
In daily life and work, when using a lamp for illumination, people generally adjust the lighting parameters of the lamp to meet the lighting requirements of different activity scenes.
In the related art, a user typically adjusts the lighting parameters of a lamp (such as brightness and color temperature) manually so that the lamp delivers a better lighting effect. For example, the user can adjust the lighting parameters flexibly by rotating a knob on the lamp, pressing buttons of different levels on the lamp, or touching a touch control area on the lamp, so as to meet the lighting requirements of different activity scenes.
However, in the related art, manually adjusting the lighting parameters of the lamp is a cumbersome operation for the user. Specifically, whenever the user's activity changes, the user needs to manually readjust the lighting parameters of the lamp to suit the new activity scene, and this repeated adjustment makes the operation cumbersome.
Disclosure of Invention
The embodiments of the present application provide a lamp control method, a lamp control device, an electronic device, and a computer readable storage medium, which solve the problem in the related art that manually adjusting the lighting parameters of a lamp is cumbersome for the user.
In a first aspect, an embodiment of the present application provides a method for controlling a lamp, including:
acquiring a target image of a target user;
determining a target activity scene where the target user is located according to the target image;
and controlling the lighting parameters of the lamp according to the target activity scene, so that the lighting parameters of the lamp are matched with the target activity scene.
Optionally, in an embodiment of the present application, controlling the lighting parameters of the lamp according to the target activity scene so that the lighting parameters of the lamp match the target activity scene includes:
determining a target illumination parameter corresponding to the target activity scene according to a pre-established correspondence between activity scenes and illumination parameters;
and setting the target illumination parameter as the current illumination parameter of the lamp, so that the current illumination parameter of the lamp matches the target activity scene.
Optionally, in an embodiment of the present application, determining, according to the target image, the target activity scene in which the target user is located includes:
determining a target user in the target image and a target object associated with the behavior of the target user in the target image;
and determining the target activity scene in which the target user is located based on the action of the target user and/or the positional relationship between the target user and the target object.
Optionally, in an embodiment of the present application, the determining the target user in the target image and the target object associated with the behavior of the target user in the target image includes:
inputting the target image into a pre-established deep learning model;
determining a first target area and a second target area in the target image through the deep learning model;
and determining a target user in the target image and a target object associated with the behavior of the target user in the target image according to the first target area and the second target area.
Optionally, in an embodiment of the present application, the target activity scene includes a first target activity scene, a second target activity scene, and a third target activity scene;
the determining, based on the action of the target user and/or the positional relationship between the target user and the target object, the target activity scene in which the target user is located includes:
determining that the target user is in a first target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is smaller than a first threshold value;
determining that the target user is in a second target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is greater than or equal to a first threshold;
determining that the target user is in a third target activity scene when the action of the target user indicates that the target user is in a relaxed state or the position relationship between the target user and the target object indicates that the target user is in a relaxed state;
wherein, in case the illumination parameter comprises a luminance value, a first target luminance value matching the first target activity scene > a second target luminance value matching the second target activity scene > a third target luminance value matching the third target activity scene;
wherein, in case the illumination parameter comprises a color temperature value, a first target color temperature value matching the first target activity scene > a second target color temperature value matching the second target activity scene > a third target color temperature value matching the third target activity scene.
In a second aspect, an embodiment of the present application provides a lamp control device, including:
the acquisition module is used for acquiring a target image of a target user;
the determining module is used for determining a target activity scene where the target user is located according to the target image;
and the control module is used for controlling the lighting parameters of the lamp according to the target activity scene so that the lighting parameters of the lamp are matched with the target activity scene.
Optionally, in an embodiment of the present application, the control module is configured to:
determining a target illumination parameter corresponding to the target activity scene according to a pre-established corresponding relation between the activity scene and the illumination parameter;
and setting the target illumination parameter as the current illumination parameter of the lamp, so that the current illumination parameter of the lamp is matched with the target activity scene.
Optionally, in an embodiment of the present application, the determining module includes:
a first determining module, configured to determine a target user in the target image and a target object associated with the behavior of the target user in the target image;
and the second determining module is used for determining a target activity scene where the target user is located based on the action of the target user and/or the position relation between the target user and the target object.
Optionally, in an embodiment of the present application, the first determining module is configured to:
inputting the target image into a pre-established deep learning model;
determining a first target area and a second target area in the target image through the deep learning model;
and determining a target user in the target image and a target object associated with the behavior of the target user in the target image according to the first target area and the second target area.
Optionally, in an embodiment of the present application, the target activity scene includes a first target activity scene, a second target activity scene, and a third target activity scene;
the second determining module is configured to:
determining that the target user is in a first target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is smaller than a first threshold value;
determining that the target user is in a second target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is greater than or equal to a first threshold;
determining that the target user is in a third target activity scene when the action of the target user indicates that the target user is in a relaxed state or the position relationship between the target user and the target object indicates that the target user is in a relaxed state;
wherein, in case the illumination parameter comprises a luminance value, a first target luminance value matching the first target activity scene > a second target luminance value matching the second target activity scene > a third target luminance value matching the third target activity scene;
wherein, in case the illumination parameter comprises a color temperature value, a first target color temperature value matching the first target activity scene > a second target color temperature value matching the second target activity scene > a third target color temperature value matching the third target activity scene.
In a third aspect, embodiments of the present application provide an electronic device, which includes the lamp control device according to the second aspect.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having stored thereon a program or instructions which, when executed by a processor, implement the steps of the method according to the first aspect.
At least one of the technical solutions adopted in the embodiments of the present application can achieve the following beneficial effects:
a target image of a target user is acquired; a target activity scene in which the target user is located is determined according to the target image; and the lighting parameters of the lamp are controlled according to the target activity scene, so that the lighting parameters of the lamp match the target activity scene. In this way, when the user engages in different activities, the target activity scene of the target user can be analyzed from the target image of the target user, and the lighting parameters of the lamp can be adjusted automatically according to the target activity scene, so that the lighting parameters match the target activity scene and a good lighting effect is achieved. Compared with the related art, the user does not need to manually adjust the lighting parameters of the lamp, and the lamp is adjusted more efficiently.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application without unduly limiting it. In the drawings:
Fig. 1 is a schematic flowchart of a lamp control method provided in an embodiment of the present application;
Fig. 2 is a schematic flowchart of another lamp control method provided in an embodiment of the present application;
Fig. 3 is a schematic flowchart of another lamp control method provided in an embodiment of the present application;
Fig. 4 is a schematic flowchart of a process for determining a target user and a target object provided in an embodiment of the present application;
Fig. 5 is a schematic flowchart of a process for determining a target activity scene in which a target user is located, provided in an embodiment of the present application;
Fig. 6 is a schematic block diagram of a lamp control device provided in an embodiment of the present application;
Fig. 7 is a schematic block diagram of an electronic device provided in an embodiment of the present application;
Fig. 8 is a schematic block diagram of a lamp control apparatus provided in an embodiment of the present application;
Fig. 9 is a schematic block diagram of another lamp control apparatus provided in an embodiment of the present application;
reference numerals illustrate:
800-a luminaire control device; 801-a luminaire; 8011-a wireless signal receiving chip; 8012-light source; 8013-power converter; 802-luminaire controller; 8021-a wireless signal transmitting chip; 8022-first connection interface; 803-image processing means; 8031-a second connection interface; 804-image acquisition device.
Detailed Description
To make the objectives, technical solutions, and advantages of the present application clearer, the technical solutions of the present application will be clearly and completely described below with reference to specific embodiments of the present application and the corresponding drawings. It is apparent that the described embodiments are only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the present disclosure without creative effort fall within the scope of the present disclosure.
The terms "first", "second", and the like in the description and claims of this application are used to distinguish between similar features and may describe or implicitly include one or more such features. In the description of the present application, unless otherwise indicated, "a plurality" means two or more. Furthermore, in the description and claims, "and/or" denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The technical solutions provided in the embodiments of the present application are specifically described below with reference to fig. 1 to 9.
Fig. 1 is a schematic flowchart of a lamp control method according to an embodiment of the present application.
As shown in fig. 1, the lamp control method provided in the embodiment of the present application may include:
step 110: acquiring a target image of a target user;
step 120: determining a target activity scene where the target user is located according to the target image;
step 130: and controlling the lighting parameters of the lamp according to the target activity scene so that the lighting parameters of the lamp are matched with the target activity scene.
In step 110, the target user may be a user located within the illumination range of the lamp. The target user may be one user or a plurality of users, which is not particularly limited in this application.
In step 110, the target image of the target user contains the whole body or part of the body of the target user. For example, the target image may contain the upper limbs, lower limbs, trunk, or head of the target user; the present application is not particularly limited.
In step 110, the target image of the target user may be acquired in different manners.
For example, in step 110, the target image of the target user may be acquired in real time by an image acquisition device, and the target image acquired in real time by the image acquisition device is then obtained. The image acquisition device may be a video camera, a still camera, a monitor, or the like, and is not particularly limited in this application.
For another example, in step 110, the target image of the target user may be acquired by the image acquisition device once every predetermined time period (e.g., 1 minute, 30 seconds, etc.), and the target image acquired by the image acquisition device during the predetermined time period is then obtained.
Of course, the target image of the target user may also be acquired by other manners, which is not particularly limited in this application.
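As an illustration of the periodic acquisition mode described above, the following is a minimal Python sketch; the OpenCV camera interface, the 30-second period, and the device index are assumptions for illustration and are not prescribed by the patent.

```python
import time

import cv2  # assumes an OpenCV-accessible camera


def capture_periodically(period_s=30, device=0):
    """Yield one target image from the camera every period_s seconds."""
    cam = cv2.VideoCapture(device)
    try:
        while True:
            ok, frame = cam.read()  # one acquisition per predetermined period
            if ok:
                yield frame
            time.sleep(period_s)
    finally:
        cam.release()
```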
In step 120, the target activity scene in which the target user is located is related to the activity behavior of the target user in the surrounding environment: different types of activity behavior correspond to different target activity scenes. For example, in daily life, according to the user's daily activity behavior, the target activity scene in which the target user is located may be an office scene, an entertainment scene, a rest scene, a learning scene, a dining scene, a cooking scene, a shower scene, or the like. Of course, other activity scenes are also possible, and the present application is not particularly limited.
In step 120, the specific determination manner of determining the target activity scene where the target user is located according to the target image may be various, which is not particularly limited in this application. The following is an example.
For example, in step 120, the limb motion of the user in the target image may be identified, and the target activity scene in which the target user is located may be determined based on the limb motion. For example, if the limb motion of the user in the target image is identified as a running motion or a warm-up motion, it may be determined that the target activity scene in which the target user is located is an exercise scene. If the limb motion of the user in the target image is identified as a lying motion, it may be determined that the target activity scene in which the target user is located is a rest scene.
For another example, in step 120, a type of target object associated with the behavior of the target user in the target image may be identified, and a target activity scene in which the target user is located may be determined based on the type of target object associated with the behavior of the target user, and a positional relationship between the target user and the target object. For example, if it is identified that the target user is standing on the treadmill or yoga mat in the target image (the type of the target object is the treadmill or yoga mat, and the target object is located below the target user), it may be determined that the target activity scene where the target user is located is an exercise scene. If it is recognized that the user holds a book or a pen in the target image (the type of the target object is a book or a pen, the target object is located in front of the target user), it can be determined that the target activity scene in which the target user is located is a learning scene, and so on.
Of course, the present application may also determine the target activity scenario where the target user is located by other manners, and the present application is not limited specifically.
In step 130, there are various specific embodiments for controlling the lighting parameters of the lamp, which are not particularly limited in this application.
For example, in step 130, the illumination parameter matched with the target activity scene may be a preset fixed value, and the illumination parameter of the lamp may be adjusted according to the preset fixed value of the illumination parameter matched with the target activity scene.
For another example, in step 130, the illumination parameters matched with the target activity scene may be illumination parameter values within a preset range, and then the illumination parameters of the lamp may be adjusted according to a preset range of illumination parameters matched with the target activity scene in a certain order (for example, from large to small, or from small to large).
Of course, the lighting parameters of the lamp can be controlled in other manners, and the application is not particularly limited.
In step 130, take as an example that the illumination parameter matching the target activity scene is a target illumination parameter: the target illumination parameter is associated with the target activity scene, and different target activity scenes have different target illumination parameters. In the target activity scene, when the lamp illuminates according to the target illumination parameter, the target user can carry out the target activity in a suitable illumination environment.
In step 130, the lighting parameters of the lamp can be adjusted automatically according to the target activity scene, so that the lighting parameters of the lamp match the target activity scene and a good lighting effect is achieved. Compared with the related art, the user does not need to manually adjust the lighting parameters of the lamp, and the lamp is adjusted more efficiently.
According to the lamp control method provided by the embodiments of the present application, a target image of a target user is acquired; a target activity scene in which the target user is located is determined according to the target image; and the lighting parameters of the lamp are controlled according to the target activity scene, so that the lighting parameters match the target activity scene. In this way, when the user engages in different activities, the target activity scene of the target user can be analyzed from the target image of the target user, and the lighting parameters of the lamp can be adjusted automatically according to the target activity scene, so that the lighting parameters match the target activity scene and a good lighting effect is achieved without any manual adjustment by the user.
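Steps 110 to 130 can be read as one control loop. A minimal sketch follows; the callable names, the `lamp.set` interface, and the scene-to-parameter table are hypothetical placeholders, since the patent does not prescribe an implementation.

```python
def control_cycle(capture_image, infer_scene, lamp, scene_params):
    """One pass of steps 110-130: image -> activity scene -> lighting parameters."""
    image = capture_image()           # step 110: acquire target image
    scene = infer_scene(image)        # step 120: determine target activity scene
    brightness, color_temp = scene_params[scene]
    lamp.set(brightness=brightness, color_temp=color_temp)  # step 130: match scene
```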
The above-mentioned step 130 refers to controlling the lighting parameters of the lamp according to the target activity scene, and a specific control manner of controlling the lighting parameters of the lamp is illustrated in fig. 2.
Fig. 2 is a schematic flowchart of another lamp control method according to an embodiment of the present application.
As shown in fig. 2, in a specific embodiment, the method for controlling a lamp provided in the embodiment of the present application may include:
step 210: acquiring a target image of a target user;
step 220: determining a target activity scene where the target user is located according to the target image;
step 230: determining a target illumination parameter corresponding to the target activity scene according to a pre-established corresponding relation between the activity scene and the illumination parameter;
step 240: and setting the target illumination parameter as the current illumination parameter of the lamp, so that the current illumination parameter of the lamp is matched with the target activity scene.
The specific content of step 210 may refer to step 110, and will not be described herein.
The specific content of step 220 may refer to step 120, and will not be described herein.
Wherein step 230 and step 240 may be sub-steps of step 130.
In step 230, a correspondence between activity scenes and lighting parameters may be pre-established. A specific establishment process may be as follows: the lighting parameters giving the best user experience in each type of activity scene are obtained through simulation tests, and the correspondence between activity scenes and lighting parameters is established accordingly. Of course, the correspondence between activity scenes and lighting parameters may also be determined in other manners, which is not particularly limited in this application.
In step 230, the illumination parameters may include color temperature and/or brightness. Accordingly, the pre-established correspondence between activity scenes and lighting parameters may include a pre-established correspondence between activity scenes and color temperature and/or a pre-established correspondence between activity scenes and brightness.
In step 230, in case the illumination parameter is a color temperature, a target color temperature corresponding to the target activity scene may be determined according to a pre-established correspondence between the activity scene and the color temperature.
In step 230, in the case where the illumination parameter is brightness, a target brightness corresponding to the target active scene may be determined according to a pre-established correspondence between the active scene and brightness.
In step 240, a current lighting parameter of the luminaire may be obtained, and in case the current lighting parameter of the luminaire is different from the target lighting parameter, the target lighting parameter is set to the current lighting parameter of the luminaire, such that the current lighting parameter matches the target activity scene.
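A minimal sketch of steps 230 and 240 follows; the scene names and parameter values are illustrative assumptions, and `lamp` is a hypothetical object exposing its current brightness and color temperature.

```python
# Hypothetical pre-established correspondence: activity scene -> illumination parameters.
SCENE_TO_PARAMS = {
    "learning": {"brightness": 90, "color_temp": 5000},  # values are illustrative
    "resting":  {"brightness": 30, "color_temp": 2700},
    "dining":   {"brightness": 60, "color_temp": 3500},
}


def apply_scene(lamp, scene):
    """Step 230: look up the target parameters; step 240: set them if they differ."""
    target = SCENE_TO_PARAMS[scene]
    current = {"brightness": lamp.brightness, "color_temp": lamp.color_temp}
    if current != target:
        lamp.brightness = target["brightness"]
        lamp.color_temp = target["color_temp"]
```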
According to the lamp control method provided by the embodiments of the present application, a target image of a target user can be acquired; a target activity scene in which the target user is located is determined according to the target image; a target illumination parameter corresponding to the target activity scene is determined according to the pre-established correspondence between activity scenes and illumination parameters; and the target illumination parameter is set as the current illumination parameter of the lamp, so that the current illumination parameter matches the target activity scene. In this way, based on the pre-established correspondence, the current illumination parameter of the lamp can be adjusted accurately to a fixed illumination parameter value, without gradual adjustment following a change trend of the lamp's illumination parameters, so the response speed of adjusting the illumination parameters is high.
The step 120 also refers to determining the target activity scene where the target user is located according to the target image of the target user, and the specific embodiment of determining the target activity scene is illustrated in fig. 3.
Fig. 3 is a schematic flowchart of another lamp control method according to an embodiment of the present application.
As shown in fig. 3, in a specific embodiment, the method for controlling a lamp provided in the embodiment of the present application may include:
step 310: acquiring a target image of a target user;
step 320: determining a target user in the target image and a target object associated with the behavior of the target user in the target image;
step 330: determining a target activity scene where the target user is located based on the action of the target user and/or the position relation between the target user and the target object;
step 340: and controlling the lighting parameters of the lamp according to the target activity scene, so that the lighting parameters of the lamp are matched with the target activity scene.
Step 310 may refer to the specific content of step 110, and will not be described herein.
Wherein step 320 and step 330 may be sub-steps of step 120.
Step 340 may refer to the specific content of step 130, which is not described herein.
In step 320, determining the target user in the target image may specifically be determining the location area of the target user in the target image. For example, determining a rectangular area of the target user in the target image may specifically mean determining the center point coordinates and size information of the rectangular area containing the target user.
In step 320, determining the target user in the target image may also be determining a behavioral action (e.g., a limb action) of the target user in the target image. For example, in daily life, the behavioral action of the target user in the target image may be determined to be working, studying, exercising, eating, and the like.
In step 320, the target object associated with the behavior of the target user may be an object associated with the behavior of the target user or other user associated with the behavior of the target user. The present application is not particularly limited.
In step 320, determining a target object in the target image that is associated with the behavior of the target user may specifically be determining the location area of the target object in the target image. For example, determining a rectangular region of the target object in the target image may specifically mean determining the center point coordinates and size information of the rectangular region containing the target object.
In step 320, determining a target object in the target image that is associated with the behavior of the target user may also be determining the category of the target object. For example, in daily life, the category of the target object in the target image may be determined to be books, portable terminals, exercise equipment, tableware, and the like.
In step 330, a target activity scene in which the target user is located may be determined based on the action of the target user. For example, if the action of the target user is reading, it may be determined that the target activity scene in which the target user is located is a reading scene.
In step 330, the positional relationship between the target user and the target object may be a spatial positional relationship between the target user and the target object. For example, if the target object is a bed and the target user is located above the bed, it may be determined that the target activity scene in which the target user is located is a rest scene.
In step 330, the positional relationship between the target user and the target object may also be a distance positional relationship between the target user and the target object. For example, if the target object is a bed, the distance between the target user and the bed is smaller than the first threshold, and it may be determined that the target activity scene in which the target user is located is a rest scene.
Further, in step 330, the target activity scene in which the target user is located may be determined based on both the action of the target user and the positional relationship between the target user and the target object, which can improve the accuracy of determining the target activity scene. For example, if the target object is a book, and the position area of the target user in the target image and the position area of the target object are in contact or partially overlap, it can be determined that the distance between the target user and the target object is relatively short and that the target object is closely related to the target activity of the target user; the target activity scene in which the target user is located can then be determined more accurately to be a reading scene.
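For the positional relationships discussed above, a minimal sketch follows, assuming detections are axis-aligned boxes in pixel coordinates (x1, y1, x2, y2); the overlap test and the center-distance measure are illustrative choices, not mandated by the patent.

```python
import math


def boxes_overlap(a, b):
    """True if two (x1, y1, x2, y2) rectangles touch or intersect."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]


def center_distance(a, b):
    """Pixel distance between the center points of two rectangles."""
    ax, ay = (a[0] + a[2]) / 2, (a[1] + a[3]) / 2
    bx, by = (b[0] + b[2]) / 2, (b[1] + b[3]) / 2
    return math.hypot(ax - bx, ay - by)
```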
According to the lamp control method provided by the embodiments of the present application, a target image of a target user is acquired; a target user in the target image and a target object associated with the behavior of the target user in the target image are determined; a target activity scene in which the target user is located is determined based on the behavior of the target user and the positional relationship between the target user and the target object; and the lighting parameters of the lamp are controlled according to the target activity scene, so that the lighting parameters of the lamp match the target activity scene. In this way, the close correlation between the target object and the activity behavior of the target user can be determined from the positional relationship between the target user and the target object, so the target activity scene in which the target user is located can be determined more accurately.
The above step 320 refers to determining the target user and the target object associated with the behavior of the target user in the target image, and the specific embodiment of step 320 is illustrated in fig. 4.
Fig. 4 is a schematic flow chart of a process for determining a target user and a target object according to an embodiment of the present application.
As shown in fig. 4, in a specific embodiment, the step 320 may include:
step 410: inputting the target image into a pre-established deep learning model;
step 420: determining a first target area and a second target area in the target image through the deep learning model;
step 430: and determining a target user in the target image and a target object associated with the behavior of the target user in the target image according to the first target area and the second target area.
Wherein step 410, step 420 and step 430 may be sub-steps of step 320.
In step 410, a deep learning model may be used to detect whether a target object is present in the target image; if a target object exists in the target image, the center point coordinates and size information of the target rectangular frame area in which the target object is located can be determined.
In step 410, the deep learning model may be pre-trained. Specifically, a large number of user image samples can be input into a neural network for supervised learning (in supervised learning, the rectangular area in which the target object is located is annotated in each user image sample) to obtain the deep learning model. The training of deep learning models is a mature technology in the field and is not repeated here.
In step 420, a first target region and a second target region in the target image may be determined by a deep learning model. The first target area comprises a first target object, and the second target area comprises a second target object.
In order to accurately determine the location of the target user and the location of the target object associated with the behavior of the target user in the target image, the location of the first target region of the first target object and the location of the second target region of the second target object may be determined in step 420. The following describes an example of the target detection algorithm.
For example, in the target detection algorithm, the loss function may be defined as:
$$L(\{p_i\},\{t_i\}) = \frac{1}{N_{cls}} \sum_i L_{cls}(p_i, p_i^*) + \lambda \frac{1}{N_{reg}} \sum_i p_i^* L_{reg}(t_i, t_i^*)$$
In the above formula, $i$ indexes the anchor points, and bounding rectangular frames can be generated with anchor point $i$ as the center point. $p_i$ is the predicted probability that the $i$-th anchor contains the target object; anchors are divided into positive and negative classes, with $p_i^* = 1$ when the anchor is positive and $p_i^* = 0$ when the anchor is negative. $t_i$ is a 4-dimensional vector $(x, y, w, h)$, where $x$, $y$, $w$, $h$ represent the center point coordinates, width, and height of the bounding rectangular frame, respectively, and $t_i^*$ represents the true value associated with a positive anchor. $N_{cls}$ and $L_{cls}$ correspond to the positive/negative classification training of the anchor points; $N_{reg}$ and $L_{reg}$ correspond to the regression processing of the anchor bounding rectangular frames. The parameter $\lambda$ is used to balance $N_{cls}$ and $N_{reg}$ when there is a large gap between them.
In this way, the rectangular frame region containing the target object in the target image can be predicted by the target detection algorithm under the condition that the loss function is the minimum value. That is, the position of the first target region of the first target object and the position of the second target region of the second target object may be determined with the loss function being the minimum.
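A NumPy sketch of a loss of this form is given below; the binary cross-entropy classification term, the smooth-L1 regression term, and the normalization by the number of positive anchors are common choices assumed for illustration, not details fixed by the patent.

```python
import numpy as np


def detection_loss(p, p_star, t, t_star, lam=10.0):
    """Classification + lambda-weighted regression loss over anchors.

    p: (N,) predicted object probabilities; p_star: (N,) labels (1 pos, 0 neg);
    t, t_star: (N, 4) predicted / ground-truth (x, y, w, h) boxes.
    """
    eps = 1e-7
    # L_cls: binary cross-entropy, averaged over all N_cls anchors
    l_cls = -(p_star * np.log(p + eps) + (1.0 - p_star) * np.log(1.0 - p + eps))
    # L_reg: smooth-L1 on box coordinates, counted only for positive anchors
    d = np.abs(t - t_star)
    l_reg = np.where(d < 1.0, 0.5 * d**2, d - 0.5).sum(axis=1)
    n_reg = max(p_star.sum(), 1.0)  # normalize by the number of positive anchors
    return l_cls.mean() + lam * (p_star * l_reg).sum() / n_reg
```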
In step 430, the first target object included in the first target area and the second target object included in the second target area may be classified, respectively, to determine one of the first target object and the second target object as a target user, and the other of the first target object and the second target object as a target object associated with a behavior of the target user. The specific way of classifying the first target object and the second target object belongs to a mature technology in the field, and is not described herein.
Therefore, the position of the target user in the target image and the position of the target object associated with the behavior of the target user can be accurately determined according to the deep learning model, and further, the position relation between the target user and the target object can be accurately obtained, and the target activity scene where the target user is located can be more accurately determined.
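As one way to realize this kind of two-region detection, the following sketch runs a pretrained torchvision detector; the patent does not name a specific model, so Faster R-CNN and the score threshold are assumptions made for illustration.

```python
import torch
from PIL import Image
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

model = fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()


def detect_regions(image_path, score_threshold=0.6):
    """Return bounding boxes and class labels for confident detections."""
    image = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        out = model([image])[0]
    keep = out["scores"] > score_threshold
    # each kept detection: an (x1, y1, x2, y2) box and a COCO class label
    return out["boxes"][keep], out["labels"][keep]
```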
The above step 330 refers to determining the target activity scene in which the target user is located based on the behavior of the target user and the positional relationship between the target user and the target object, and the specific embodiment of determining the target activity scene in which the target user is located is illustrated in fig. 5.
Fig. 5 is a schematic flowchart of a process for determining a target activity scene in which a target user is located according to an embodiment of the present application.
In another specific embodiment, the target activity scene may include a first target activity scene, a second target activity scene, and a third target activity scene, as shown in fig. 5, the step 330 may include:
step 510: determining that the target user is in a first target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is smaller than a first threshold value;
step 520: determining that the target user is in a second target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is greater than or equal to a first threshold;
step 530: and determining that the target user is in a third target activity scene when the action of the target user indicates that the target user is in a relaxed state or the position relationship between the target user and the target object indicates that the target user is in a relaxed state.
Wherein step 510, step 520 and step 530 may be sub-steps of step 330. The order of execution of step 510, step 520 and step 530 is not particularly limited.
In steps 510 to 530, the target activity scene may be divided into a first target activity scene, a second target activity scene, and a third target activity scene. Specifically, the illumination parameter matched with the first target activity scene is a first target illumination parameter, the illumination parameter matched with the second target activity scene is a second target illumination parameter, and the illumination parameter matched with the third target activity scene is a third target illumination parameter.
In daily life, a user has a higher brightness requirement in a learning state and a lower brightness requirement in a relaxed state; moreover, in the learning state, the closer the user is to the target object, the higher the brightness requirement. Thus, when the illumination parameter is a brightness value, the brightness requirement matched with the first target activity scene is the highest, and the brightness requirement matched with the third target activity scene is the lowest.
For example, in the case where the illumination parameter is a brightness value, the first target illumination parameter matching the first target activity scene is a first target brightness value, the second target illumination parameter matching the second target activity scene is a second target brightness value, and the third target illumination parameter matching the third target activity scene is a third target brightness value. The first target brightness value, the second target brightness value and the third target brightness value are ranked as follows: first target brightness value > second target brightness value > third target brightness value.
Accordingly, the specific implementation manner of the step 130 may be:
under the condition that the target activity scene is switched from the second target activity scene or the third target activity scene to the first target activity scene, the brightness of the lamp is increased according to the first target activity scene, so that the brightness of the lamp is matched with the first target activity scene;
or under the condition that the target activity scene is switched from the first target activity scene to the second target activity scene, reducing the brightness of the lamp according to the second target activity scene, so that the brightness of the lamp is matched with the second target activity scene;
or under the condition that the target activity scene is switched from a third target activity scene to a second target activity scene, the brightness of the lamp is increased according to the second target activity scene, so that the brightness of the lamp is matched with the second target activity scene;
or under the condition that the target activity scene is switched from the first target activity scene or the second target activity scene to the third target activity scene, reducing the brightness of the lamp according to the third target activity scene, so that the brightness of the lamp is matched with the third target activity scene.
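Steps 510 to 530 and the brightness switching just listed can be condensed into a small rule, sketched below; the threshold value and the brightness grades are illustrative assumptions that merely satisfy first > second > third, and `lamp` is a hypothetical object.

```python
# Illustrative brightness grades: scene 1 > scene 2 > scene 3.
TARGET_BRIGHTNESS = {1: 100, 2: 70, 3: 40}


def classify_scene(action, distance, first_threshold=0.5):
    """Map the user's action and user-object distance to a target activity scene."""
    if action == "learning":
        return 1 if distance < first_threshold else 2  # steps 510 / 520
    return 3  # step 530: relaxed state


def on_scene_switch(lamp, old_scene, new_scene):
    """Raise or lower lamp brightness whenever the target activity scene changes."""
    if old_scene != new_scene:
        lamp.brightness = TARGET_BRIGHTNESS[new_scene]
```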
In daily life, a user in a learning state may require a higher color temperature value, so that the light is cooler in tone, which helps the target user concentrate on learning; a user in a relaxed state may require a lower color temperature value, so that the light is warmer in tone, which helps the target user relax. Moreover, in the learning state, the farther the user is from the target object, the lower the color temperature requirement. Thus, when the illumination parameter is a color temperature value, the color temperature value matched with the first target activity scene is the largest, and the color temperature value matched with the third target activity scene is the smallest.
For example, in the case where the illumination parameter is a color temperature value, the first target illumination parameter matching the first target activity scene is a first target color temperature value, the second target illumination parameter matching the second target activity scene is a second target color temperature value, and the third target illumination parameter matching the third target activity scene is a third target color temperature value. The first target color temperature value, the second target color temperature value and the third target color temperature value are ranked as follows: first target color temperature value > second target color temperature value > third target color temperature value.
Accordingly, the specific implementation manner of the step 130 may be:
under the condition that the target activity scene is switched from the second target activity scene or the third target activity scene to the first target activity scene, the color temperature of the lamp is increased according to the first target activity scene, so that the color temperature of the lamp is matched with the first target activity scene;
or under the condition that the target activity scene is switched from the first target activity scene to the second target activity scene, reducing the color temperature of the lamp according to the second target activity scene, so that the color temperature of the lamp is matched with the second target activity scene;
or under the condition that the target activity scene is switched from a third target activity scene to a second target activity scene, increasing the color temperature of the lamp according to the second target activity scene, so that the color temperature of the lamp is matched with the second target activity scene;
or under the condition that the target activity scene is switched from the first target activity scene or the second target activity scene to the third target activity scene, reducing the color temperature of the lamp according to the third target activity scene, so that the color temperature of the lamp is matched with the third target activity scene.
Of course, the brightness and the color temperature of the lamp can be adjusted at the same time, and specific content can refer to the two cases, and details are not repeated here.
In this way, the target activity scene is divided into the first target activity scene, the second target activity scene and the third target activity scene. When the target activity scene in which the user is located is switched, the illumination parameters of the lamp can be adjusted within a suitable range in a determined direction of change (e.g., up or down) according to the grading of the illumination parameters among the three scenes. The adjustment of the illumination parameters of the lamp is therefore universal, applicable to various activity scenes, and has a wide application range.
In the lamp control method provided by the embodiments of the present application, the execution subject may be a lamp control device. In the embodiments of the present application, the lamp control device executing the lamp control method is taken as an example to describe the lamp control device provided in the embodiments of the present application.
As shown in fig. 6, the embodiment of the present application further provides a lamp control device 600, which may include:
an acquisition module 601, configured to acquire a target image of a target user;
a determining module 602, configured to determine, according to the target image, a target activity scene in which the target user is located;
the control module 603 is configured to control lighting parameters of a luminaire according to the target activity scene, so that the lighting parameters of the luminaire are matched with the target activity scene.
In the lamp control device provided by the embodiments of the present application, the acquisition module is used to acquire a target image of a target user; the determining module is used to determine, according to the target image, a target activity scene in which the target user is located; and the control module is used to control the lighting parameters of the lamp according to the target activity scene, so that the lighting parameters of the lamp match the target activity scene. In this way, when the user engages in different activities, the target activity scene of the target user can be analyzed from the target image of the target user, and the lighting parameters of the lamp can be adjusted automatically according to the target activity scene, so that the lighting parameters match the target activity scene and a good lighting effect is achieved. Compared with the related art, the user does not need to manually adjust the lighting parameters of the lamp, and the lamp is adjusted more efficiently.
Optionally, in an embodiment of the present application, the control module is configured to:
determining a target illumination parameter corresponding to the target activity scene according to a pre-established corresponding relation between the activity scene and the illumination parameter;
and setting the target illumination parameter as the current illumination parameter of the lamp, so that the current illumination parameter of the lamp is matched with the target activity scene.
In this way, based on the pre-established correspondence between activity scenes and lighting parameters, the current illumination parameter of the lamp can be adjusted accurately to a fixed illumination parameter value, without gradual adjustment following a change trend of the lamp's illumination parameters, so the response speed of adjusting the illumination parameters is high.
Optionally, in an embodiment of the present application, the determining module includes:
a first determining module, configured to determine a target user in the target image and a target object associated with a behavior of the target user in the target image;
and the second determining module is used for determining a target activity scene where the target user is located based on the action of the target user and/or the position relation between the target user and the target object.
In this way, the close correlation between the target object and the activity behavior of the target user can be determined from the positional relationship between the target user and the target object, so the target activity scene in which the target user is located can be determined more accurately.
Optionally, in an embodiment of the present application, the first determining module is configured to:
inputting the target image into a pre-established deep learning model;
determining a first target area and a second target area in the target image through the deep learning model;
and determining a target user in the target image and a target object associated with the behavior of the target user in the target image according to the first target area and the second target area.
Therefore, the position of the target user in the target image and the position of the target object associated with the behavior of the target user can be accurately determined according to the deep learning model, and further, the position relation between the target user and the target object can be accurately obtained, and the target activity scene where the target user is located can be more accurately determined.
Optionally, in an embodiment of the present application, the target activity scene includes a first target activity scene, a second target activity scene, and a third target activity scene;
The second determining module is configured to:
determining that the target user is in a first target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is smaller than a first threshold value;
determining that the target user is in a second target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is greater than or equal to a first threshold;
determining that the target user is in a third target activity scene when the action of the target user indicates that the target user is in a relaxed state or the position relationship between the target user and the target object indicates that the target user is in a relaxed state;
wherein, in case the illumination parameter comprises a luminance value, a first target luminance value matching the first target activity scene > a second target luminance value matching the second target activity scene > a third target luminance value matching the third target activity scene;
wherein, in case the illumination parameter comprises a color temperature value, a first target color temperature value matching the first target activity scene > a second target color temperature value matching the second target activity scene > a third target color temperature value matching the third target activity scene.
In this way, the target activity scene is divided into a first, a second and a third target activity scene. When the activity scene of the user switches, the lighting parameters of the lamp can be adjusted within an appropriate range and in a definite direction (up or down) according to the ordering of the lighting parameters across the three scenes. The adjustment of the lighting parameters is therefore universal: it suits a variety of activity scenes and has a wide application range.
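The three-way rule above can be written down directly. In the sketch below, the action label, the value of the first threshold, and the use of bounding-box centers as a proxy for the user-object distance are all assumptions; the application defines the rule structure but no concrete values, and the position-relation test for the relaxed state is simplified away here.

    FIRST_THRESHOLD = 0.5  # assumed threshold (e.g. metres after calibration)

    def box_distance(box_a, box_b) -> float:
        # Distance between the centers of two [x1, y1, x2, y2] boxes; a
        # simple stand-in for the user-object position relationship.
        (ax, ay), (bx, by) = (((b[0] + b[2]) / 2, (b[1] + b[3]) / 2)
                              for b in (box_a, box_b))
        return float(((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5)

    def classify_scene(action: str, distance: float) -> str:
        if action == "learning":
            if distance < FIRST_THRESHOLD:
                return "first_target_activity_scene"   # learning, object close
            return "second_target_activity_scene"      # learning, object far
        return "third_target_activity_scene"           # relaxed state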
Optionally, as shown in fig. 7, an embodiment of the present application further provides an electronic device 700 including a processor 701 and a memory 702, where the memory 702 stores a program or instruction executable on the processor 701. When executed by the processor 701, the program or instruction implements each step of the lamp control method embodiment and achieves the same technical effects; to avoid repetition, the details are not repeated here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
The electronic device in the embodiment of the application may include the above-mentioned lamp control device.
An embodiment of the present application further provides a computer-readable storage medium on which a program or instruction is stored. When executed by a processor, the program or instruction implements the processes of the lamp control method embodiment and achieves the same technical effects; to avoid repetition, the details are not repeated here.
Wherein the processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes computer-readable storage media such as read-only memory (ROM), random access memory (RAM), and magnetic or optical disks.
In addition, based on the same conception as the lamp control method provided by the embodiments of the present application, an embodiment of the present application further provides a lamp control device, which is described below with reference to fig. 8 and fig. 9.
Fig. 8 is a schematic block diagram of a lamp control device according to an embodiment of the present application.
As shown in fig. 8, a lamp control device 800 provided in an embodiment of the present application may include:
the lamp device comprises a lamp 801, a lamp controller 802, an image processing device 803 and an image acquisition device 804, wherein the image acquisition device 804 is connected with the image processing device 803, the image processing device 803 is connected with the lamp controller 802, and the lamp controller 802 is connected with the lamp 801.
The image processing device 803 may implement some or all of the steps of the above lamp control method embodiment, and may be any device with an image processing function, such as a computer; those steps are not repeated here.
Wherein the image acquisition device 804 is connected to the image processing device 803, and the image processing device 803 can be used for receiving the target image acquired by the image acquisition device 804. The image capturing device 804 may be a device having an image capturing function such as a color camera, a video camera, a monitor, or the like.
The image processing device 803 is connected to the lamp controller 802 and may generate a control instruction according to the steps of the lamp control method in the method embodiment, the control instruction carrying information about the lamp lighting parameters matched with the target activity scene.
Further, the image processing device 803 may send the control instruction to the lamp controller 802, which then controls the lighting parameters of the lamp 801 according to the instruction so that they match the target activity scene.
According to the method embodiment provided by the application, the image processing device can analyze the target activity scene of the target user from the target image acquired by the image acquisition device, generate a control instruction carrying the target lighting parameter information for that scene, and transmit the instruction to the lamp controller, which automatically adjusts the lighting parameters of the lamp accordingly.
The lamp control device of this embodiment thus comprises a lamp, a lamp controller, an image processing device and an image acquisition device, with the image acquisition device connected to the image processing device, the image processing device connected to the lamp controller, and the lamp controller connected to the lamp. By introducing the lamp controller, connected at one end to the image processing device and at the other end to the lamp, the control instruction travels from the image processing device to the lamp controller, which automatically adjusts the lighting parameters of the lamp according to the instruction. Compared with the related art, no manual user operation is needed to adjust the lighting parameters, so the lamp is adjusted more efficiently.
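Putting the pieces together, the data path of fig. 8 (image acquisition device to image processing device to lamp controller to lamp) can be sketched as a single control loop. The sketch reuses the illustrative helpers above; the OpenCV capture backend, the crude action heuristic, and send_to_controller are all assumptions. A Bluetooth variant (send_to_controller_ble) and a USB serial variant (send_over_usb) are sketched further below.

    # Hedged end-to-end sketch of the control loop of the device in fig. 8.
    import cv2  # assumed capture backend; any camera API would do

    def control_loop(camera_index: int = 0) -> None:
        cap = cv2.VideoCapture(camera_index)
        ok, frame = cap.read()
        cap.release()
        if not ok:
            return
        frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # detector expects RGB
        user_box, object_box = detect_user_and_object(frame)
        if user_box is None:
            return  # no target user in the target image
        if object_box is None:
            scene = "third_target_activity_scene"  # crude stand-in heuristic
        else:
            # The action classifier is elided; "learning" is assumed here.
            scene = classify_scene("learning", box_distance(user_box, object_box))
        send_to_controller(target_lighting_params(scene))  # hypothetical transport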
In a specific embodiment, in the lamp control device provided in the embodiments of the present application, the lamp controller 802 is wirelessly connected to the lamp 801, and the lamp controller 802 is electrically connected to the image processing device 803 (not shown in the figures).
The wireless connection between the lamp controller 802 and the lamp 801 may be a Bluetooth connection, a Wi-Fi connection or another wireless connection mode, which is not specifically limited in this application.
In this way, because the lamp controller 802 is wirelessly connected with the lamps 801 and a wireless connection has a large coverage area, the lamp controller 802 can be wirelessly connected with a plurality of lamps 801 and can uniformly control all lamps within the coverage area of the wireless connection.
For example, as shown in fig. 9, the lamp controller 802 has a wireless signal transmitting chip 8021, and the lamp 801 has a wireless signal receiving chip 8011, and the wireless signal transmitting chip is adapted to the wireless signal receiving chip.
Thus, through wireless communication between the wireless signal transmitting chip and the wireless signal receiving chip, the lamp 801 can be controlled remotely by the lamp controller, and a movable portable lamp can still receive the control signals of the lamp controller when it is moved flexibly within a certain spatial range.
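By way of illustration only, a Bluetooth Low Energy variant of this wireless path could look like the following, using the bleak library. The device address, the characteristic UUID, and the 3-byte payload layout are hypothetical; the application does not define a wire protocol between chips 8021 and 8011.

    import asyncio
    from bleak import BleakClient

    LAMP_ADDRESS = "AA:BB:CC:DD:EE:FF"  # hypothetical address of the lamp's chip 8011
    LIGHT_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"  # hypothetical

    async def send_to_controller_ble(params: dict) -> None:
        # Pack brightness (1 byte) and color temperature in kelvin (2 bytes,
        # big-endian); this payload layout is an assumption.
        payload = bytes([params["brightness"]]) + \
            params["color_temp_k"].to_bytes(2, "big")
        async with BleakClient(LAMP_ADDRESS) as client:
            await client.write_gatt_char(LIGHT_CHAR_UUID, payload)

    # Example: asyncio.run(send_to_controller_ble({"brightness": 80, "color_temp_k": 4500}))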
In a specific embodiment, as shown in fig. 9, in the lamp control device provided in the embodiment of the present application, the lamp 801 includes a light source 8012, and the wireless signal receiving chip 8011 is electrically connected to the light source 8012.
The light source 8012 may be a fluorescent lamp, an LED lamp, a halogen lamp, or other types of light sources, which are not particularly limited in this application.
Thus, with the wireless signal receiving chip 8011 electrically connected to the light source 8012, the lighting parameters of the light source 8012 are adjusted automatically according to the control instruction received by the wireless signal receiving chip 8011.
In a specific embodiment, in the lamp control device provided in the embodiment of the present application, the wireless signal transmitting chip 8021 is a transmitting chip with a Bluetooth function, and the wireless signal receiving chip 8011 is a receiving chip with a Bluetooth function.
Therefore, the wireless communication between the wireless signal transmitting chip and the wireless signal receiving chip is Bluetooth communication; when the lamp controller controls the lamp, the Bluetooth control signal has strong anti-interference capability and low power consumption.
In a specific embodiment, as shown in fig. 9, in the lamp control device provided in the embodiment of the present application, the lamp controller 802 has a first connection interface 8022, the image processing device 803 is provided with a second connection interface 8031, and the first connection interface of the lamp controller 802 is plugged into the second connection interface of the image processing device 803 (not shown).
In this way, with the lamp controller 802 plugged into the image processing device 803 through the first connection interface, the power required for the operation of the lamp controller can be supplied by the image processing device, and the lamp controller can quickly receive the control instructions sent by the image processing device.
In practical applications, for example, the lamp controller 802 may be a portable lamp controller, and both the first connection interface and the second connection interface are USB interfaces.
Thus, the portable lamp controller facilitates the overall assembly of the lamp control device, and the USB interfaces allow higher data transmission efficiency between the lamp controller and the image processing device.
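For this wired, plugged-in variant, a serial link over the USB interfaces is one plausible transport. The sketch below uses pyserial; the port name, baud rate, and newline-delimited JSON framing are assumptions, not a protocol defined by the application.

    import json
    import serial  # pyserial

    def send_over_usb(params: dict, port: str = "/dev/ttyUSB0") -> None:
        # Push the control instruction to the plugged-in lamp controller as
        # one JSON line; framing and baud rate are illustrative choices.
        with serial.Serial(port, baudrate=115200, timeout=1) as link:
            link.write((json.dumps(params) + "\n").encode("utf-8"))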
In a specific embodiment, in the lamp control device provided in the embodiment of the present application, the image acquisition device 804 may be a camera device and the image processing device 803 an industrial control computer; the camera device is electrically connected with the industrial control computer through a data line, or is wirelessly connected with it (not shown).
The camera device can be portable equipment such as a camera or a webcam, which facilitates the overall assembly of the lamp control device.
The housing of the industrial control computer offers high resistance to magnetic interference, dust and impact. The industrial control computer can comprise a dedicated backplane with slots into which the lamp controller is conveniently plugged, and a dedicated power supply with strong anti-interference capability. It is also capable of continuous long-duration operation, which helps it control the lamps continuously for automatic adjustment of the lighting parameters.
In a specific embodiment, as shown in fig. 9, in the lamp control device provided in the embodiment of the present application, the lamp 801 may further include a power converter 8013 electrically connected to the light source 8012. The power converter 8013 connects to an AC power source such as mains power, or to a DC power source, so that power is supplied to the light source conveniently and stably and the light source works stably.
In addition, in a specific embodiment, in the lamp control device provided in the embodiment of the present application, the lamp 801 is further provided with a controllable human-machine interface (not shown) carrying a brightness adjustment key and/or a color temperature adjustment key.
The controllable human-machine interface may instead provide a brightness adjusting knob, and/or a color temperature adjusting knob, and/or a brightness touch adjusting area, and the like, which are not particularly limited in this application.
Therefore, providing manually operated brightness and/or color temperature adjustment keys on the lamp allows the user to adjust the lighting parameters of the lamp to personal preference.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, they may also be performed substantially simultaneously or in the reverse order. For example, the described methods may be performed in an order different from that described, and steps may also be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or of course by hardware alone, though in many cases the former is preferred. Based on this understanding, the technical solution of the present application, or the part of it contributing beyond the prior art, may be embodied in the form of a computer software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disk) and comprising several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. In light of the present application, those of ordinary skill in the art may make many further forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (6)

1. A luminaire control method, characterized in that the control method comprises:
acquiring a target image of a target user;
inputting the target image into a pre-established deep learning model;
determining a first target area and a second target area in the target image through the deep learning model;
determining a target user in the target image and a target object associated with the behavior of the target user in the target image according to the first target area and the second target area;
determining a target activity scene where the target user is located based on the action of the target user and/or the position relation between the target user and the target object;
controlling lighting parameters of a lamp according to the target activity scene, so that the lighting parameters of the lamp are matched with the target activity scene;
the target activity scenes comprise a first target activity scene, a second target activity scene and a third target activity scene; the determining, based on the action of the target user and/or the positional relationship between the target user and the target object, the target activity scene in which the target user is located includes:
determining that the target user is in a first target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is smaller than a first threshold value;
determining that the target user is in a second target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is greater than or equal to a first threshold;
determining that the target user is in a third target activity scene when the action of the target user indicates that the target user is in a relaxed state or the position relationship between the target user and the target object indicates that the target user is in a relaxed state;
wherein, in case the illumination parameter comprises a luminance value, a first target luminance value matching the first target activity scene > a second target luminance value matching the second target activity scene > a third target luminance value matching the third target activity scene;
wherein, in case the illumination parameter comprises a color temperature value, a first target color temperature value matching the first target activity scene > a second target color temperature value matching the second target activity scene > a third target color temperature value matching the third target activity scene.
2. The control method according to claim 1, wherein controlling the lighting parameters of the light fixture according to the target activity scene so that the lighting parameters of the light fixture match the target activity scene comprises:
determining a target illumination parameter corresponding to the target activity scene according to a pre-established corresponding relation between the activity scene and the illumination parameter;
and setting the target illumination parameter as the current illumination parameter of the lamp, so that the current illumination parameter of the lamp is matched with the target activity scene.
3. A luminaire control device, characterized in that the control device comprises:
the acquisition module is used for acquiring a target image of a target user;
a first determining module for inputting the target image into a pre-established deep learning model; determining a first target area and a second target area in the target image through the deep learning model; determining a target user in the target image and a target object associated with the behavior of the target user in the target image according to the first target area and the second target area;
the second determining module is used for determining a target activity scene where the target user is located based on the action of the target user and/or the position relation between the target user and the target object;
the control module is used for controlling the lighting parameters of the lamp according to the target activity scene so that the lighting parameters of the lamp are matched with the target activity scene;
the target activity scenes comprise a first target activity scene, a second target activity scene and a third target activity scene; the second determining module is configured to: determining that the target user is in a first target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is smaller than a first threshold value; determining that the target user is in a second target activity scene when the action of the target user indicates that the target user is in a learning state and the distance between the target user and the target object is greater than or equal to a first threshold; determining that the target user is in a third target activity scene when the action of the target user indicates that the target user is in a relaxed state or the position relationship between the target user and the target object indicates that the target user is in a relaxed state; wherein, in case the illumination parameter comprises a luminance value, a first target luminance value matching the first target activity scene > a second target luminance value matching the second target activity scene > a third target luminance value matching the third target activity scene; wherein, in case the illumination parameter comprises a color temperature value, a first target color temperature value matching the first target activity scene > a second target color temperature value matching the second target activity scene > a third target color temperature value matching the third target activity scene.
4. A control device according to claim 3, wherein the control module is configured to:
determining a target illumination parameter corresponding to the target activity scene according to a pre-established corresponding relation between the activity scene and the illumination parameter;
and setting the target illumination parameter as the current illumination parameter of the lamp, so that the current illumination parameter of the lamp is matched with the target activity scene.
5. An electronic device, comprising: a lamp control device as claimed in any one of claims 3 to 4.
6. A computer readable storage medium, characterized in that the computer readable storage medium has stored thereon a program or instructions which, when executed by a processor, realizes the steps of the control method according to any of claims 1-2.
CN202111671534.3A 2021-12-31 2021-12-31 Lamp control method, device, electronic equipment and computer readable storage medium Active CN114189969B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111671534.3A CN114189969B (en) 2021-12-31 2021-12-31 Lamp control method, device, electronic equipment and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN114189969A CN114189969A (en) 2022-03-15
CN114189969B true CN114189969B (en) 2024-03-01


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114627435B (en) * 2022-04-04 2022-11-18 富华智能(深圳)有限公司 Intelligent light adjusting method, device, equipment and medium based on image recognition
CN117500127B (en) * 2024-01-03 2024-03-15 深圳市华电照明有限公司 Light control method and system based on wireless communication


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10375800B2 (en) * 2016-04-06 2019-08-06 Signify Holding B.V. Controlling a lighting system

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140122320A (en) * 2013-04-09 2014-10-20 재단법인 한국조명연구원 Method for lighting control and lighting control system using the same
CN107277989A (en) * 2017-06-16 2017-10-20 深圳市盛路物联通讯技术有限公司 Intelligent House Light control method and device
CN109874209A (en) * 2018-11-06 2019-06-11 中国计量大学 Commercial hotel guest room scene lighting system based on scene automatic identification
CN109587875A (en) * 2018-11-16 2019-04-05 厦门盈趣科技股份有限公司 A kind of intelligent desk lamp and its adjusting method
CN112074062A (en) * 2019-05-21 2020-12-11 广东小天才科技有限公司 Scene-based light adjusting method and intelligent lighting device
CN110167243A (en) * 2019-06-17 2019-08-23 青岛亿联客信息技术有限公司 Intelligent lamp control method, device, system and computer readable storage devices
CN111392039A (en) * 2020-03-18 2020-07-10 浙江吉利汽车研究院有限公司 Auxiliary control system and control method for car lamp
CN113329545A (en) * 2021-05-25 2021-08-31 深圳市欧瑞博科技股份有限公司 Intelligent lighting method and device, intelligent control device and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Indoor lighting autonomous control system based on Kinect motion-sensing action recognition; Wang Zhide, et al.; Modern Electronics Technique; 2021-07-29; vol. 44, no. 14; pp. 143-146 *
Multi-scene LED lighting intelligent control system; Li Qi; Journal of Wuhan Engineering Vocational and Technical College; 2015-12-15; no. 04; pp. 45-49 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant