WO2016206991A1 - Gesture based lighting control - Google Patents

Gesture based lighting control

Info

Publication number
WO2016206991A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
lighting unit
controller
illuminated area
processor
Prior art date
Application number
PCT/EP2016/063294
Other languages
French (fr)
Inventor
Sanae CHRAIBI
Dzmitry Viktorovich Aliakseyeu
Jonathan David Mason
Original Assignee
Philips Lighting Holding B.V.
Priority date
Filing date
Publication date
Priority to EP15173246.8 priority Critical
Priority to EP15173246 priority
Application filed by Philips Lighting Holding B.V. filed Critical Philips Lighting Holding B.V.
Publication of WO2016206991A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/00335 Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K 9/20 Image acquisition
    • G06K 9/2027 Illumination control
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHTING NOT OTHERWISE PROVIDED FOR
    • H05B 47/00 Circuit arrangements for operating light sources in general, i.e. where the type of the light source is not relevant
    • H05B 47/10 Controlling the light source
    • H05B 47/105 Controlling the light source in response to determined parameters
    • H05B 47/115 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
    • H05B 47/125 Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00 Energy efficient lighting technologies
    • Y02B 20/40 Control techniques providing energy savings
    • Y02B 20/44 Control techniques providing energy savings based on detection of the user

Abstract

A controller (100) for controlling the light output of a lighting unit (110) is disclosed. The controller (100) comprises an image capture device (102) arranged for capturing a plurality of images of a scene (120) comprising a user and an illuminated area (132) being illuminated by the lighting unit (110). The image capture device (102) is further arranged for providing image information representing the captured images. The controller (100) further comprises a processor (104) arranged for analyzing the image information to detect a first user gesture (130) and a second user gesture (140, 140') provided by the user. Upon receiving the first user gesture (130), the processor identifies the illuminated area (132) based on the detected first user gesture (130). From the identified illuminated area (132), the processor (104) retrieves an embedded code (112) emitted by the lighting unit (110) illuminating the illuminated area (132). The processor identifies each lighting unit (110) illuminating the illuminated area (132) based on the retrieved embedded code (112). The processor (104) further identifies a user control command based on the detected second user gesture (140, 140'), whereupon the processor (104) generates a control signal based on the identified user control command. The controller (100) further comprises a transmitter 106 arranged for transmitting (108) the control signal to the identified lighting unit (110) to control its light output, resulting in an adjustment of the illumination of the illuminated area (132).

Description

GESTURE BASED LIGHTING CONTROL

FIELD OF THE INVENTION

The invention relates to a controller and a lighting system for controlling the light output of a lighting unit. The invention further relates to a method for controlling the light output of a lighting unit, and a computer program product for performing the method.

BACKGROUND

Future and current home and professional environments will contain a large number of lighting units for the creation of ambient, atmosphere, accent or task lighting. These controllable lighting units may be controlled via remote control devices, such as smart phones or tablet PCs. These devices allow a user to control the lighting via an application running on the smart device. This requires that the user always needs the remote control device to control different parameters of the lighting unit, such as colour and brightness. It is desired to reduce the dependence on these smart remote control devices. One way to achieve this is to detect gestures made by the user to control the light output of the lighting unit. Patent application US2013120238 A1 discloses a light output control method for controlling a lighting device via a motion of an object (e.g. a hand of a user). The lighting device comprises an infrared (IR) video sensor and a light-emitting unit that emits IR light onto the object and, based on the reflected infrared light, determines a motion of the object to change an attribute of the output light if the motion of the object complies with a predetermined condition.

SUMMARY OF THE INVENTION

It is an object of the present invention to provide a lighting controller, a system and a method that enable a user to control the lighting conditions via gestures with improved control functionality.

According to a first aspect of the present invention the object is achieved by a controller for controlling the light output of a lighting unit emitting light, the light emission comprising an embedded code identifying the lighting unit, the controller comprising: an image capture device arranged for capturing a plurality of images of a scene comprising a user and an illuminated area being illuminated by the lighting unit and for providing image information representing the captured images,

a processor arranged for:

a. analyzing the image information to detect a first and second user gesture,

b. identifying the illuminated area based on the detected first user gesture,

c. retrieving the embedded code emitted by the lighting unit from the illuminated area,

d. identifying the lighting unit based on the retrieved embedded code,

e. identifying a user control command based on the detected second user gesture, and

f. generating a control signal based on the identified user control command, and

a transmitter arranged for transmitting the control signal to the identified lighting unit to control the light output of the identified lighting unit, thereby adjusting the illumination of the illuminated area.

The controller allows a user to control a lighting unit via two gestures: one to indicate the illuminated area and one to indicate how the light should be adjusted. The two gestures are derived from the plurality of images of the scene captured by the image capture device. The scene comprises at least one user providing the first and second gesture and the illuminated area indicated by the user. The order in which the gestures are made by the user and derived from the plurality of images may depend on the embodiment in which the controller is used. The processor identifies the lighting unit(s) in the illuminated area (as indicated by the first user gesture) based on the coded light emitted by the lighting unit(s) and reflected at the illuminated area (e.g. a surface) and adjusts the light emission of the identified lighting unit(s) based on the user control command retrieved from the second gesture. This provides the advantage that no smart device or other remote control device is required to control the lighting unit. Another benefit of this controller is that it allows a user to indicate an area where the light should be controlled, thereby providing the user the possibility to select the lighting unit(s) by indicating the illuminated area. A user may, for example, indicate an illuminated area in a room (e.g. an area on a desk) and control the illumination in that area, because the controller is arranged for identifying which lighting unit(s) illuminate the indicated area based on the reflected embedded code(s) retrieved from the illuminated area. The controller is further arranged for adjusting the light emission of the identified lighting unit(s) by generating and transmitting the control signal, which is based on the second user gesture.
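
For readers who prefer pseudocode, the claimed steps a-f map onto a small processing loop. The following Python sketch is purely illustrative and not part of the disclosure: the gesture detector, coded-light reader and transmitter are injected as stand-in callables, and all names (`control_cycle`, `COMMANDS`, `lamp-07`) are assumptions.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Gesture:
    kind: str                                             # e.g. "point", "raise_arm"
    region: Optional[Tuple[int, int, int, int]] = None    # (x, y, w, h) indicated in the image, if any

# Hypothetical mapping from second-gesture labels to lighting commands.
COMMANDS = {"raise_arm": ("brightness", +10), "lower_arm": ("brightness", -10)}

def control_cycle(frames, detect_gestures, read_embedded_code, transmit):
    """One pass through steps a-f of the described controller; the detectors and the
    transmitter are injected as callables because the application does not fix them."""
    first, second = detect_gestures(frames)           # a: detect first and second user gesture
    area = first.region                                # b: illuminated area from the first gesture
    unit_id = read_embedded_code(frames, area)         # c+d: code retrieved from that area identifies the unit
    command = COMMANDS[second.kind]                    # e: user control command from the second gesture
    transmit(unit_id, command)                         # f: generate and transmit the control signal
    return unit_id, command

# Trivial stand-ins to show the flow end to end:
if __name__ == "__main__":
    frames = []  # would be the plurality of captured images
    print(control_cycle(
        frames,
        detect_gestures=lambda f: (Gesture("point", (40, 60, 20, 20)), Gesture("raise_arm")),
        read_embedded_code=lambda f, area: "lamp-07",
        transmit=lambda unit, cmd: None,
    ))
```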

In an embodiment of the controller, the processor is further arranged for activating and deactivating a control mode of the controller based on a user input, the control mode being a mode of operation wherein the controller is set for controlling the light output of the lighting unit. This embodiment allows a user to indicate when he intends to adjust the light emission of the lighting unit, thereby reducing the chance that the user may accidentally adjust the light emission. The user input that activates or deactivates the control mode may, for example, be a voice/sound command, a dedicated gesture, an interaction with a further device, etc.

In an embodiment of the controller, the processor is arranged for identifying lighting information outside the identified illuminated area when insufficient lighting information is available in the identified illuminated area. This embodiment allows the controller to gather, from the image information, lighting information outside the identified illuminated area, and thereby obtain more information about the one or more lighting units whose light emission is present in the illuminated area. This may be beneficial if, for example, insufficient (reflected) light is detected, or when the embedded code cannot be retrieved from the illuminated area as indicated by the user.

In an embodiment of the controller, the processor is arranged for determining a light contribution of the identified lighting unit, and wherein the processor is further arranged for generating the control signal based on the light contribution. This embodiment may for example be desirable when multiple lighting units illuminate the illuminated area, because it allows the processor to determine which lighting unit to control in order to create the lighting effect as indicated by the user.

In an embodiment of the controller, the controller is configured to be worn by the user. The controller may be comprised in smart glasses or augmented reality glasses, wherein the image capture device may capture gestures made by the user's hands. This embodiment may be beneficial because it allows a user to look in the desired direction, and the user sees exactly what the image capture device captures. This embodiment may further reduce the chance that the image capture device is unable to capture the gestures of the user, for example when the user gestures are not within the line of sight of the image capture device. This embodiment may further be beneficial because many available smart glasses are already equipped with a camera, a processing means and a communication system arranged for (in)direct communication with lighting units, thereby removing the need for a dedicated controller device.

In an embodiment of the controller, the processor is further arranged for retrieving colour information of at least one colour from at least one of the plurality of images, the at least one colour being indicated by the second user gesture, and the processor is further arranged for adjusting the light output of the identified lighting unit based on the retrieved colour information. This embodiment is advantageous because it allows a user to 'pick' at least one colour from the environment and apply it to an identified lighting unit.

In an embodiment of the controller, the processor is further arranged for identifying a light selection area in the scene based on the second user gesture, the controller is further arranged for retrieving information about light conditions from the light selection area, and the controller is arranged for generating the control signal based on the retrieved information about the light conditions, thereby adjusting the illumination of the illuminated area based on the retrieved information about the light conditions. This embodiment is advantageous because it allows a user to copy a light setting from the light selection area to the illuminated area.

In an embodiment of the controller, the processor is further arranged for analyzing the image information to detect a third user gesture, and for retrieving in response to the detection of the third gesture embedded codes emitted by at least two lighting units, and for identifying and grouping the at least two lighting units based on the retrieved embedded codes, whereafter the grouped lighting units are arranged to be controlled as one lighting unit. This embodiment is advantageous because it allows a user to control multiple lighting units simultaneously after grouping them, even if the grouped lighting units do not share a common illuminated area.

According to a second aspect of the present invention the object is achieved by a system comprising the controller according to any one of the above-mentioned embodiments and one or more lighting units arranged for being controlled by the controller.

According to a third aspect of the present invention the object is achieved by a method of controlling the light output of a lighting unit emitting light, the light emission comprising an embedded code identifying the lighting unit, the method comprising:

- capturing a plurality of images of a scene comprising a user and an illuminated area being illuminated by the lighting unit,

- providing image information representing the captured images,

- analyzing the image information to detect a first and second user gesture,

- identifying the illuminated area based on the detected first user gesture,

- retrieving the embedded code emitted by the lighting unit from the illuminated area,

- identifying the lighting unit based on the retrieved embedded code,

- identifying a user control command based on the detected second user gesture,

- generating a control signal based on the identified user control command, and

- transmitting the control signal to the identified lighting unit to control the light output of the identified lighting unit, thereby adjusting the illumination of the illuminated area.

It should be noted that the above-mentioned steps of the method are not necessarily sequential, and that the order of the steps may differ per embodiment. For example, in a first embodiment it may be beneficial to first identify the user control command based on the second user gesture and then identify the illuminated area based on the first user gesture, while in a second embodiment these steps may be reversed.

In an embodiment, the first user gesture and/or the second user gesture are defined by a position of at least a part of the body of the user. This embodiment allows a user to provide an input by simply taking a pose in order to, for example, indicate the illuminated area or to adjust the lighting. An advantage of this embodiment is that the number of images to be analyzed by the processor may be reduced. Determining the position of the at least a part of the body of a user may reduce the level of complexity of the image processing algorithms of the processor.

In an alternative embodiment, the first user gesture and/or the second user gesture are defined by a movement of at least a part of the body of the user. This embodiment is advantageous because it allows a user to indicate the illuminated area via movement. The user may, for example, demarcate an area with his arms, thereby indicating the illuminated area, and provide a rotational movement with his arms to scroll through the colour setting of the lighting unit(s) whose light emission was detected in the illuminated area. Lighting control based on user movement may be beneficial because it allows the user to control the lighting in an intuitive way.

In an embodiment, the embedded code is comprised in visible light emitted by the lighting unit. The code embedded in the visible light may be imperceptible for a user. This embodiment provides the advantage that the visible light emission is used to emit the embedded code, thereby removing the requirement for a dedicated light source for emitting the code. In an alternative embodiment, the embedded code is comprised in invisible light (e.g. infrared light) emitted by the lighting unit. This embodiment may be beneficial, because it provides the possibility for a user to demarcate the illuminated area when the lighting unit(s) are turned off. In a further embodiment, visible light code emission and invisible light code emission may be combined.
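
How the embedded code is recovered from the images is not specified in this application; coded-light systems typically demodulate an intensity pattern in the reflected light over successive frames or exploit rolling-shutter effects. The snippet below is a deliberately toy decoder, assuming one bit per frame taken from the mean brightness of the indicated region, only to illustrate the idea of reading an identifier out of the illuminated area.

```python
import numpy as np

def decode_region_code(frames, region, n_bits=8):
    """Toy decoder: one bit per frame from the mean brightness of `region`.

    frames: list of 2-D numpy arrays (grayscale images)
    region: (x, y, w, h) of the illuminated area indicated by the user
    """
    x, y, w, h = region
    levels = np.array([f[y:y + h, x:x + w].mean() for f in frames[:n_bits]])
    bits = (levels > levels.mean()).astype(int)       # threshold around the average level
    return int("".join(map(str, bits)), 2)            # pack the bits into an identifier

# Example: synthesize 8 frames whose region brightness encodes 0b10110010.
rng = np.random.default_rng(0)
frames = []
for bit in [1, 0, 1, 1, 0, 0, 1, 0]:
    img = rng.uniform(0, 50, size=(120, 160))
    img[30:60, 40:80] += 100 if bit else 10           # modulate the illuminated area
    frames.append(img)

print(decode_region_code(frames, (40, 30, 40, 30)))   # -> 178 (0b10110010)
```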

According to a fourth aspect of the present invention the object is achieved by a computer program product comprising computer program code to perform the method according to the invention when the computer program product is run on a processing unit of a computing device.

BRIEF DESCRIPTION OF THE DRAWINGS

The above, as well as additional objects, features and advantages of the disclosed controller, system and methods, will be better understood through the following illustrative and non-limiting detailed description of embodiments of devices and methods, with reference to the appended drawings, in which:

Fig. 1 shows schematically an embodiment of a controller according to the invention for controlling the light output of a lighting unit and a user controlling the light output of the lighting unit by indicating an illuminated area via a first user gesture and by providing a user control command via a second gesture;

Fig. 2 shows schematically an embodiment of a controller according to the invention for controlling the light output of a lighting unit and a user indicating an illuminated area via a first user gesture;

Fig. 3 shows schematically an embodiment of a controller according to the invention, wherein the controller determines a light contribution of two identified lighting units;

Fig. 4 shows schematically an embodiment of a controller according to the invention for controlling the light output of a first lighting unit, wherein the controller is arranged to be worn by a user;

Fig. 5 shows schematically an embodiment of a controller according to the invention for controlling the light output of a first lighting unit based on the light output of a second lighting unit and a user indicating an illuminated area via a first user gesture and indicating a light selection area via a second gesture;

Fig. 6 shows schematically an embodiment of a controller according to the invention for detecting a third user gesture for grouping at least two lighting units.

All the figures are schematic, not necessarily to scale, and generally only show parts which are necessary in order to elucidate the invention; other parts may be omitted or merely suggested.

DETAILED DESCRIPTION OF EMBODIMENTS

Fig. 1 shows schematically an embodiment of a controller 100 according to the invention for controlling the light output of a lighting unit 110. The controller 100 comprises an image capture device 102 arranged for capturing a plurality of images of a scene 120 comprising a user and an illuminated area 132 being illuminated by the lighting unit 110. The image capture device provides image information representing the captured images. The controller 100 further comprises a processor 104 arranged for analyzing the image information to detect a first user gesture 130 and a second user gesture 140, 140' provided by the user. Upon the detection of the first user gesture 130, the processor identifies the illuminated area 132 based on the detected first user gesture 130. From the identified illuminated area 132, the processor 104 retrieves an embedded code 112 emitted by the lighting unit 110 illuminating the illuminated area 132. The processor identifies each lighting unit 110 illuminating the illuminated area 132 based on the retrieved embedded code 112. The processor 104 further identifies a user control command (i.e. a lighting control command) based on the detected second user gesture 140, 140', whereupon the processor 104 generates a control signal based on the identified user control command. The controller 100 further comprises a transmitter 106 arranged for transmitting 108 the control signal to the identified lighting unit 110 to control its light output, resulting in an adjustment of the illumination of the illuminated area 132.

The controller 100 may be any type of device arranged for capturing images and for transmitting control signals 108 to lighting unit(s) (the controller 100 may for example be comprised in a smart device, smart glasses, a laptop, a tablet pc, a home automation system, a camera system, etc.). The transmitter 106 comprised in the controller 100 may communicate with the lighting unit(s) 110 via any type of communication technology. Various wired and wireless communication technologies that are known in the art may be used, for example Ethernet, DMX, DALI, Bluetooth, 4G, Wi-Fi or ZigBee. A specific communication technology may be selected based on the communication capabilities of the controller 100 and the lighting unit(s) 110, the power consumption of the communication driver for the (wireless) communication technology and/or the communication range of the signals. The controller 100 and the lighting unit(s) 110 may be connected to the same (home) network, thereby improving the communication between both. Additionally or alternatively, the controller 100 may communicate with the lighting unit(s) 110 through an intermediate communication device such as a communication hub, a bridge or a router. In a further embodiment, the controller 100 may be arranged for receiving information from the lighting unit(s) 110 in order to determine how to control the light output of the lighting unit(s) 110. The controller 100 may, for example, receive information from the lighting unit(s) 110 about the current light setting of the lighting unit(s) 110, or it may receive properties of the lighting unit(s) 110 (such as colour range, dimming range, colour temperature range, etc.). The controller 100 may receive this information directly from the lighting unit(s) 110 via network communication, or via the code 112 embedded in the light emission of the lighting unit(s) 110.
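
The transport used by the transmitter 106 is left open (Ethernet, DMX, DALI, Bluetooth, 4G, Wi-Fi or ZigBee are all mentioned as options). Purely as an illustration, the sketch below serializes a control command as JSON and sends it as a UDP datagram to a bridge; the address, port and message format are assumptions and not part of any of the listed protocols.

```python
import json
import socket

def transmit_control_signal(unit_id, command, bridge=("192.0.2.10", 4000)):
    """Send one control command to a (hypothetical) lighting bridge as a JSON datagram."""
    message = json.dumps({"unit": unit_id, "command": command}).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(message, bridge)

# Example: ask the unit identified from the embedded code to dim to 40%.
transmit_control_signal("lamp-07", {"brightness": 40})
```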

The image capture device 102 may be any type of device arranged for capturing images. The image capture device may be for example a digital camera, a depth camera, an IR camera, an RGB camera, etc., or a combination of cameras to provide (3D) motion/position capture. The processor 104 is arranged for identifying the first 130 and second user gesture 140, 140' in order to determine the illuminated area 132 (and therewith to identify the lighting unit(s) 110 that illuminate the illuminated area 132) and to determine how the light output should be controlled based on the second user gesture 140, 140'. The processor 104 may use digital image processing techniques to extract the first 130 and second gesture 140, 140' from the image information provided by the image capture device 102. The processor 104 may further use digital image processing techniques to identify the lighting unit(s) 110 illuminating the illuminated area 132 based on the reflected embedded code(s) 112 retrieved from the illuminated area 132. Upon identifying a lighting unit 110, the processor 104 may determine the type of lighting unit 110 and/or the properties of the lighting unit 110. The processor 104 may, for example, determine that a lighting unit 110 is arranged for emitting white light only, for emitting RGB light, for emitting different colour temperatures, etc. This information may further be used to control the lighting unit 110.

The first user gesture 130 made by the user indicates the illuminated area 132. The illuminated area 132 is an area illuminated by one or more lighting units 110. In a first example, as illustrated in Fig. 1, the user may select a lighting unit 110 by, for example, pointing at the illuminated area 132. The pointing gesture is captured by the image capture device 102, whereafter the processor 104 identifies the illuminated area 132 and retrieves the embedded code 112 emitted by the lighting unit 110 from the illuminated area 132. In a second example, the user may select a plurality of lighting units (not shown) by indicating an area with a movement of, for example, his arm. The user may 'draw' a shape in order to demarcate the illuminated area 132. The movement (i.e. the first user gesture) is captured by the image capture device 102, whereafter the processor 104 identifies the illuminated area 132 (i.e. the area indicated by the user) and retrieves the embedded code emitted by the lighting units that illuminate the illuminated area. In a third example, as illustrated in Fig. 2, the user may indicate an area 232 (e.g. the floor, a part of the room, etc.) and/or an object (e.g. a plant, a desk, a painting, etc.) that may be illuminated by one or more lighting units 200. The user may, for example, use two arms 230 to indicate the illuminated area 232.

Alternatively, the user may, for example, use one arm to demarcate the illuminated area by 'drawing' a shape around the area or object. The demarcation (i.e. the first user gesture 230) is captured by the image capture device 102, whereafter the processor 104 identifies the illuminated area 232 (i.e. the demarcated area) and retrieves the embedded code 202 present in the illuminated area 232. One or more codes 202 may be present in the illuminated area 232, allowing the processor 104 to determine which lighting units 200 illuminate the illuminated area 232. An advantage of indicating an illuminated area is that it is not required that the one or more lighting units 200 are in the line of sight 210 of the image capture device.
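
The application does not prescribe how a pointing or demarcation gesture is converted into an image region. One simple, hypothetical reading is sketched below: a two-arm demarcation becomes a bounding box spanned by the two hand positions, and a pointing gesture becomes a small box where the shoulder-to-hand ray leaves the frame. All keypoint names and margins are illustrative assumptions.

```python
def area_from_two_arms(left_hand, right_hand, margin=20):
    """Bounding box spanned by two hand positions (x, y) plus a margin, as one
    possible reading of a two-arm demarcation gesture."""
    x0, x1 = sorted((left_hand[0], right_hand[0]))
    y0, y1 = sorted((left_hand[1], right_hand[1]))
    return (x0 - margin, y0 - margin, (x1 - x0) + 2 * margin, (y1 - y0) + 2 * margin)

def area_from_pointing(shoulder, hand, image_h, box=40):
    """Extend the shoulder-to-hand direction until it leaves the frame and place a
    small box there, as one possible reading of a pointing gesture."""
    dx, dy = hand[0] - shoulder[0], hand[1] - shoulder[1]
    # Step along the pointing ray; a real system would intersect with the scene geometry.
    t = (image_h - hand[1]) / dy if dy > 0 else 1.0
    px, py = hand[0] + t * dx, hand[1] + t * dy
    return (int(px - box / 2), int(py - box / 2), box, box)

print(area_from_two_arms((100, 200), (260, 240)))        # -> (80, 180, 200, 80)
print(area_from_pointing((160, 80), (200, 120), 240))    # -> (300, 220, 40, 40)
```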

The second user gesture 140, 140' made by the user indicates the user control command. The user control command is representative of a lighting control command. In a first example, as illustrated in Fig. 1, the user may move an arm upwards 140, 140' in order to increase the saturation of the colour of the light output of the lighting unit(s) 110 illuminating the illuminated area 132. The user may further use his other arm to increase/decrease the intensity of the light output of the lighting unit(s) 110 by moving the other arm upwards/downwards (not shown). The user may further use one arm to scroll through light scene settings of the lighting devices by, for example, rotating one arm perpendicular to his torso, or rotate his wrist to make slight alterations to the colour. Alternatively, a static (non-moving) second user gesture may be made by the user. The user may, for example, take a pose (such as standing with open arms) that indicates that the light intensity should increase. In another example, the user may hug himself, which pose may indicate a 'cozy' light setting. It should be noted that the above-mentioned ways of lighting control via static or dynamic gestures are examples, and that a person skilled in the art is capable of designing many alternative gestures for lighting control.
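
One straightforward way to realize the mapping from recognized second gestures to lighting control commands is a lookup table, as sketched below; the gesture labels, parameters and step sizes are illustrative only and not taken from the application.

```python
# Illustrative gesture-to-command table; a deployed system would define its own vocabulary.
GESTURE_COMMANDS = {
    "arm_up": {"param": "saturation", "delta": +10},
    "arm_down": {"param": "saturation", "delta": -10},
    "other_arm_up": {"param": "brightness", "delta": +10},
    "other_arm_down": {"param": "brightness", "delta": -10},
    "rotate_arm": {"param": "scene", "delta": +1},      # scroll through light scene settings
    "open_arms": {"param": "brightness", "set": 100},   # static pose: increase intensity
    "self_hug": {"param": "scene", "set": "cozy"},      # static pose: 'cozy' light setting
}

def command_from_gesture(label):
    """Translate a recognized second-gesture label into a lighting control command."""
    try:
        return GESTURE_COMMANDS[label]
    except KeyError:
        raise ValueError(f"unrecognized second gesture: {label}")

print(command_from_gesture("arm_up"))   # {'param': 'saturation', 'delta': 10}
```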

The controller 100 may further comprise or be connected to a database (not shown) arranged for storing first user gestures and second user gestures. The stored first user gestures are related to the indication of the illuminated area, and the stored second user gestures are related to the user control commands. The processor 104 may be arranged for comparing the detected first 130 and second user gestures 140, 140' with the stored first and second user gestures. If the detected first user gesture 130 has sufficient similarities with a stored first user gesture, the processor 104 may determine to identify the illuminated area 132 based on the stored first user gesture. If the detected second user gesture 140, 140' has sufficient similarities with a stored second user gesture, the processor 104 may determine to identify the user control command based on the stored second user gesture.
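
The 'sufficient similarity' test between detected and stored gestures is left open. As one possible interpretation, the sketch below resamples 2-D hand trajectories and compares them by mean Euclidean distance against stored templates; the threshold and the template gestures are assumptions for illustration.

```python
import numpy as np

def resample(traj, n=16):
    """Resample a 2-D trajectory (list of (x, y) points) to n points by linear interpolation."""
    traj = np.asarray(traj, dtype=float)
    d = np.r_[0, np.cumsum(np.linalg.norm(np.diff(traj, axis=0), axis=1))]
    t = np.linspace(0, d[-1], n)
    return np.c_[np.interp(t, d, traj[:, 0]), np.interp(t, d, traj[:, 1])]

def best_match(detected, templates, threshold=30.0):
    """Return the stored gesture whose trajectory is closest to the detected one,
    or None if nothing is sufficiently similar."""
    det = resample(detected)
    scores = {name: np.mean(np.linalg.norm(det - resample(tmpl), axis=1))
              for name, tmpl in templates.items()}
    name = min(scores, key=scores.get)
    return name if scores[name] < threshold else None

templates = {
    "swipe_up": [(0, 100), (0, 50), (0, 0)],
    "swipe_right": [(0, 0), (50, 0), (100, 0)],
}
print(best_match([(2, 98), (1, 40), (3, 5)], templates))   # -> 'swipe_up'
```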

The one or more lighting units 110 are arranged for emitting light comprising an embedded code 112 which, upon being detected by the controller 100, identifies each lighting unit 110. The embedded coded information (e.g., packets) in the light emission of each lighting unit may be comprised in the visible light emitted by the lighting unit 110. It may be desired that the coding is imperceptible for users that are interested in the illumination function of the lighting unit(s) 110. Additionally or alternatively, the coding of the light may be comprised in invisible light (e.g. infrared, near-infrared, ultraviolet, etc.) and be detected by an image capture device 102 arranged for capturing invisible light (e.g. an IR camera), which may be beneficial when a lighting unit 110 is switched off. Each lighting unit 110 comprises at least one light source, for example an LED light source, for emitting the coded light and for lighting the environment. A lighting unit 110 may be arranged for providing task lighting, ambient lighting, atmosphere lighting, accent lighting, etc. A lighting unit 110 may be installed in a luminaire or in a lighting fixture. Alternatively, a lighting unit 110 may be a portable lighting unit (e.g. a hand-sized device, such as an LED cube, an LED sphere, etc.) or a wearable lighting unit (e.g. a light bracelet, a light necklace, etc.).

The processor 104 may be further arranged for activating and deactivating a control mode of the controller 100 based on a user input. The control mode is a mode of operation wherein the controller 100 is set for controlling the light output of the lighting unit 110. While the controller 100 is set to control mode, the controller 100 may capture the plurality of images in order to retrieve the user gestures that comprise an indication of the illuminated area 132 and the user control command. The controller 100 may be set to a standby mode, wherein the controller 100 is arranged for detecting a specific user gesture, the specific user gesture (such as pointing towards the image capture device, pointing towards a lighting unit for a predefined period of time, making a circular movement with both arms, etc.) being the user input to activate/deactivate the control mode. Additionally or alternatively, the control mode may be activated via a sound command generated by the user. The controller 100 may be set to a standby mode, wherein the controller 100 comprises a sound detecting element (e.g. a microphone) which is arranged for detecting a specific sound, the specific sound (e.g. a voice command, a hand-clapping sound, etc.) being the user input to activate/deactivate the control mode. Additionally or alternatively, the control mode may be activated upon the detection of a further device in the vicinity of the controller 100. The controller 100 may be set to a standby mode, wherein the controller 100 comprises a receiver arranged for receiving a signal from the further device (e.g. a Bluetooth signal, an IR signal, etc.), the signal being indicative of the user input to activate/deactivate the control mode. Additionally or alternatively, the controller 100 may comprise a user interface arranged for receiving the user input. The user interface may comprise a touch-sensitive device such as a touchpad or a touchscreen, a motion sensor such as an accelerometer, magnetometer and/or a gyroscope for detecting gestures and/or one or more buttons for receiving the user input.
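
The control-mode behaviour described above reduces to a small standby/control state machine driven by whichever input channel is used (a dedicated gesture, a sound, a signal from a further device, or a user interface event). A minimal sketch, with hypothetical trigger labels:

```python
class ControllerModes:
    """Toy standby/control state machine for the controller."""

    # Hypothetical trigger labels; any of the input channels mentioned above could emit them.
    ACTIVATE = {"point_at_camera", "double_clap", "voice_lights", "ble_beacon"}
    DEACTIVATE = {"circular_arms", "voice_done", "timeout"}

    def __init__(self):
        self.mode = "standby"

    def on_user_input(self, trigger):
        if self.mode == "standby" and trigger in self.ACTIVATE:
            self.mode = "control"        # start interpreting first/second gestures
        elif self.mode == "control" and trigger in self.DEACTIVATE:
            self.mode = "standby"        # ignore gestures until re-activated
        return self.mode

modes = ControllerModes()
print(modes.on_user_input("double_clap"))     # control
print(modes.on_user_input("circular_arms"))   # standby
```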

The processor 104 may be further arranged for identifying lighting information outside the identified illuminated area 132 when insufficient lighting information is available in the identified illuminated area 132. This allows the controller 100 to gather more information from the one or more lighting units 110 whose light emission may be present in the illuminated area 132. For example, if a user indicates an illuminated area 132 near a sofa, and a lighting unit 110 illuminates this area insufficiently to derive the embedded code 112, the processor 104 may determine to use coded light information surrounding the illuminated area 132 in order to determine which lighting unit 110 illuminates the illuminated area 132.

Fig. 3 shows schematically an embodiment of a controller 100 according to the invention, wherein the processor 104 is further arranged for determining a light contribution of the identified lighting units 310, 320. The processor 104 may for example determine, based on the analysis of the image information of the plurality of images, three locations 300, 314, 324 in the image: a location of the illuminated area 300, a location 314 of a first detected code 312 emitted by a first lighting unit 310 and a location 324 of a second detected code 322 emitted by a second lighting unit 320. Based on these locations, the processor may determine that the first lighting unit 310 illuminates the illuminated area 300 for 30%, and that the second lighting unit 320 illuminates the illuminated area 300 for 80%. The processor 104 is further arranged for generating the control signal(s) based on the light contribution. In the above-mentioned example, the processor 104 may determine to only adjust the light output of the second lighting unit 320 if a user wants to increase the brightness of the illuminated area 300, because adjusting the light output of the first lighting unit 310 may influence the illumination of another area, which may be undesirable. In a second example (not shown in Fig. 3), the processor may determine that a first lighting unit illuminates the illuminated area for 20%, that a second lighting unit illuminates the illuminated area for 20%, that a third lighting unit illuminates the illuminated area for 20% and that a fourth lighting unit illuminates the illuminated area for 100%. If the user provides a second user gesture to indicate a dynamic light effect (e.g. a disco effect) in the illuminated area, the processor may determine to adjust the colour output of the first, second and third lighting units, and turn off the fourth lighting unit in order to execute the dynamic light effect.
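
Reading this embodiment as a per-unit weighting, the controller can restrict the control signal to units whose contribution to the indicated area dominates, so that other areas are not disturbed. The sketch below replays the two worked examples from the text; the 50% dominance threshold and the exact contribution metric are assumptions for illustration.

```python
def units_to_adjust(contributions, dominance=0.5):
    """Pick the lighting units whose light mainly lands in the indicated area.

    contributions: {unit_id: contribution of that unit to the area, 0..1}
    (the exact metric is left open in the text). Units below the dominance
    threshold also light other areas, so leave them alone.
    """
    return [unit for unit, frac in contributions.items() if frac >= dominance]

# First example from the text: 30% vs 80% -> only the second unit is adjusted.
print(units_to_adjust({"unit_310": 0.3, "unit_320": 0.8}))      # ['unit_320']

# Second example: a dynamic effect is spread over the minor contributors, and the
# dominant unit is switched off so the effect remains visible.
contrib = {"u1": 0.2, "u2": 0.2, "u3": 0.2, "u4": 1.0}
dynamic = [u for u, f in contrib.items() if f < 0.5]            # colour-cycling units
switch_off = [u for u, f in contrib.items() if f >= 0.5]        # turned off for contrast
print(dynamic, switch_off)                                      # ['u1', 'u2', 'u3'] ['u4']
```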

Fig. 4 shows schematically an embodiment of a controller 400 according to the invention for controlling the light output of a first lighting unit 410, wherein the controller 400 is arranged to be worn by a user. The controller 400 may for example be comprised in smart glasses. A user may wear the smart glasses and the camera 406 of the smart glasses may capture the plurality of images of, for example, the hands of the user. The user may use his hands to indicate the illuminated area 432 via the first user gesture 430 and to provide the user control command via the second user gesture 440, 440' (e.g. a downward movement to reduce the brightness of the lighting unit). The controller identifies the lighting unit 410 based on the embedded code 412 emitted by the lighting unit 410, whereafter it generates and transmits the control signal.

Fig. 5 shows schematically an embodiment of a controller 100 according to the invention for controlling the light output of a first lighting unit 500 based on the light output of a second lighting unit 510. The user indicates the illuminated area 532 via the first user gesture 530 and indicates a light selection area 542 via a second gesture 540. In this embodiment, the controller 100 is further arranged for retrieving information about light conditions of the light selection area 542 from the plurality of images captured by the image capture device 102, and for generating the control signal based on the information about the light conditions of the light selection area 542. The controller 100 may determine the light conditions by retrieving an embedded code 512 emitted by the second lighting unit 510 from the light selection area 542 in order to identify the second lighting unit 510, whereafter the controller 100 may determine the light output of the second lighting unit 510, for example via a communication link 550 between the controller 100 and the second lighting unit 510.

Alternatively, the controller 100 may determine the light output of the second lighting unit 510 based on information embedded in the coded light 512 emitted by the second lighting unit 510. Alternatively, the controller 100 may use image processing techniques to determine the light conditions in the light selection area 542. This embodiment allows a user to copy the light conditions from the light selection area 542 to the illuminated area 532.
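
When no communication link or decodable setting is available, the light conditions of the selection area can be approximated directly from its pixels. The sketch below estimates only an average colour and a relative brightness and turns them into a command for the target unit; a real system would also have to account for surface reflectance, so this is an illustration of the idea rather than an implementation.

```python
import numpy as np

def estimate_light_conditions(image, region):
    """Average RGB colour and relative brightness of `region` ((x, y, w, h)) in `image`."""
    x, y, w, h = region
    patch = image[y:y + h, x:x + w].reshape(-1, 3).astype(float)
    rgb = patch.mean(axis=0)
    brightness = rgb.mean() / 255.0
    return {"rgb": tuple(int(v) for v in np.round(rgb)), "brightness": round(brightness, 2)}

def copy_setting(image, selection_area, target_unit_id):
    """Build a control command that reproduces the selection area's conditions."""
    conditions = estimate_light_conditions(image, selection_area)
    return {"unit": target_unit_id, "set_rgb": conditions["rgb"],
            "set_brightness": conditions["brightness"]}

img = np.zeros((100, 100, 3), dtype=np.uint8)
img[10:40, 10:40] = (250, 180, 90)                       # warm light in the selection area
print(copy_setting(img, (10, 10, 30, 30), "unit_500"))
```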

In an embodiment, the processor 104 is further arranged for retrieving colour information of at least one colour from at least one of the plurality of images. The at least one colour may be indicated by the user via the second user gesture 140, 140'. The user may, for example, point to a colour in the environment (e.g. to a green plant, to a blue painting, to a white wall) as the second user gesture 140, 140' in order to apply that colour to the illuminated area 132. The processor 104 is further arranged for adjusting the light output of the identified lighting unit 110 based on the retrieved colour information. Alternatively, the processor 104 may retrieve a plurality of colours over time. The user may, for example, point at the fire in a fireplace as the second user gesture 140, 140', whereupon the processor 104 retrieves a plurality of colours (e.g. red, orange and yellow) from the plurality of images. The processor 104 is further arranged for applying a dynamic light effect (i.e. a plurality of light settings over time) to the identified lighting unit 110 based on the retrieved colour information of the plurality of colours.
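
Picking one colour, or a sequence of colours over time, from the location indicated by the second gesture amounts to sampling pixel values around that point. A minimal illustrative sketch (the helper names and the synthetic 'firelight' frames are assumptions):

```python
import numpy as np

def pick_colour(image, point, radius=5):
    """Average colour in a small neighbourhood around the indicated point (x, y)."""
    x, y = point
    patch = image[max(y - radius, 0):y + radius, max(x - radius, 0):x + radius]
    return tuple(int(c) for c in patch.reshape(-1, 3).mean(axis=0))

def dynamic_palette(frames, point, radius=5):
    """Sample the indicated point over time (e.g. firelight) to build a colour sequence."""
    return [pick_colour(f, point, radius) for f in frames]

# Example: three synthetic frames shifting from red to orange to yellow at the pointed spot.
frames = []
for colour in [(200, 30, 10), (220, 120, 20), (230, 200, 40)]:
    f = np.zeros((60, 60, 3), dtype=np.uint8)
    f[20:40, 20:40] = colour
    frames.append(f)

print(dynamic_palette(frames, (30, 30)))   # -> [(200, 30, 10), (220, 120, 20), (230, 200, 40)]
```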

Fig. 6 shows schematically an embodiment of a controller 100 according to the invention for detecting a third user gesture 630 for grouping at least two lighting units 610, 620. In this embodiment, the processor 104 is further arranged for analyzing the image information of the plurality of images to detect a third user gesture 630. The third gesture 630 is a gesture related to a user control command for (un)grouping a plurality of lighting units 610, 620. The user may indicate which lighting units 610, 620 should be grouped (e.g. by demarcating an area 632 with two arms, the area comprising the area illuminated by the plurality of lighting units 610, 620), whereafter the user may provide the gesture for grouping the plurality of devices (e.g. by moving his hands towards each other). The processor 104 retrieves the embedded codes 612, 622 emitted by the plurality of lighting units 610, 620 from the illuminated area 632 in response to the detection of the third gesture 630 to identify the lighting units 610, 620, whereafter the processor 104 stores the plurality of lighting units 610, 620 as a group. This allows the user to control the grouped lighting units 610, 620 as one lighting unit. In a further embodiment, the processor 104 may be arranged for detecting a further user gesture which is related to ungrouping the lighting units 610, 620.
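
Grouping can be realized by storing the identified unit codes under a single handle and fanning every subsequent command out to the members. A minimal sketch, with all identifiers illustrative:

```python
class LightingGroups:
    """Keep track of grouped lighting units and fan control commands out to the members."""

    def __init__(self, transmit):
        self.groups = {}          # group name -> set of unit identifiers
        self.transmit = transmit  # callable(unit_id, command)

    def group(self, name, unit_ids):
        self.groups.setdefault(name, set()).update(unit_ids)   # third gesture: group units

    def ungroup(self, name):
        self.groups.pop(name, None)                             # further gesture: ungroup

    def send(self, name, command):
        for unit in self.groups.get(name, ()):                  # control the group as one unit
            self.transmit(unit, command)

groups = LightingGroups(transmit=lambda u, c: print(f"{u} <- {c}"))
groups.group("reading_corner", {"unit_610", "unit_620"})
groups.send("reading_corner", {"brightness": 40})
```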

In an embodiment, the processor 104 may be arranged for communicating to a user that a lighting unit 110 has been identified. The processor 104 may, for example, generate a control signal that turns the identified lighting unit(s) 110 on or off, or that makes them flicker briefly, in order to indicate which lighting unit(s) 110 may be controlled by the user. Alternatively, the user may first provide the second gesture 140, 140' indicative of the user control command, whereafter the processor 104 may communicate to the user which lighting unit(s) 110 may be adjusted. The processor may, for example, generate a control signal that briefly flickers all lighting units 110 that may execute the user control command. After the lighting units 110 have flickered, the user may indicate one or more illuminated areas 132 in order to adjust their light output.

It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design many alternative embodiments without departing from the scope of the appended claims.

In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. Use of the verb "comprise" and its conjugations does not exclude the presence of elements or steps other than those stated in a claim. The article "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer or processing unit. In the device claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims

CLAIMS:
1. A controller (100) for controlling the light output of a lighting unit (110) emitting light, the light emission comprising an embedded code (112) identifying the lighting unit (110), the controller (100) comprising:
an image capture device (102) arranged for capturing a plurality of images of a scene (120) comprising a user and an illuminated area (132) being illuminated by the lighting unit (110) and for providing image information representing the captured images,
a processor (104) arranged for:
a. analyzing the image information to detect a first (130) and second user gesture (140, 140'),
b. identifying the illuminated area (132) based on the detected first user gesture (130),
c. retrieving the embedded code (112) emitted by the lighting unit (110) from the illuminated area (132),
d. identifying the lighting unit (110) based on the retrieved embedded code (112),
e. identifying a user control command based on the detected second user gesture (140, 140'), and
f. generating a control signal based on the identified user control command, and
- a transmitter (106) arranged for transmitting (108) the control signal to the identified lighting unit (110) to control the light output of the identified lighting unit (110), thereby adjusting the illumination of the illuminated area (132).
2. The controller (100) of claim 1, wherein the processor (104) is further arranged for activating and deactivating a control mode of the controller (100) based on a user input, the control mode being a mode of operation wherein the controller (100) is set for controlling the light output of the lighting unit (110).
3. The controller (100) of any one of the preceding claims, wherein the processor (104) is arranged for identifying lighting information outside the identified illuminated area (132) when insufficient lighting information is available in the identified illuminated area (132).
4. The controller (100) of any one of the preceding claims, wherein the processor (104) is arranged for determining a light contribution of the identified lighting unit (110), and wherein the processor (104) is further arranged for generating the control signal based on the light contribution.
5. The controller (100) of any one of the preceding claims, wherein the controller (100) is configured to be worn by the user.
6. The controller (100) of any one of the preceding claims, wherein the processor (104) is further arranged for retrieving colour information of at least one colour from at least one of the plurality of images, the at least one colour being indicated by the second user gesture (140, 140'), and wherein the processor (104) is further arranged for adjusting the light output of the identified lighting unit (110) based on the retrieved colour information.
7. The controller (100) of any one of the preceding claims, wherein the processor (104) is further arranged for identifying a light selection area in the scene (120) based on the second user gesture (140, 140'), and wherein the controller (100) is further arranged for retrieving information about light conditions from the identified light selection area, and wherein the controller (100) is arranged for generating the control signal based on the retrieved information about the light conditions, thereby adjusting the illumination of the illuminated area (132) based on the retrieved information about the light conditions.
8. The controller (100) of any one of the preceding claims, wherein the processor (104) is further arranged for analyzing image information to detect a third user gesture, and for retrieving in response to the detection of the third gesture embedded codes (112) emitted by at least two lighting units (110), and for identifying and grouping the at least two lighting units (110) based on the retrieved embedded codes (112), whereafter the grouped lighting units (110) are arranged to be controlled as one lighting unit (110).
9. A lighting system comprising the controller (100) as claimed in any one of the claims 1 to 8 and one or more lighting units (110) arranged for being controlled by the controller (100).
10. A method of controlling the light output of a lighting unit (110) emitting light, the light emission comprising an embedded code (112) identifying the lighting unit (110), the method comprising:
- capturing a plurality of images of a scene (120) comprising a user and an illuminated area (132) being illuminated by the lighting unit (110),
- providing image information representing the captured images,
- analyzing the image information to detect a first (130) and second user gesture (140, 140'),
- identifying the illuminated area (132) based on the detected first user gesture (130),
- retrieving the embedded code (112) emitted by the lighting unit (110) from the illuminated area (132),
- identifying the lighting unit (110) based on the retrieved embedded code (112),
- identifying a user control command based on the detected second user gesture (140, 140'),
- generating a control signal based on the identified user control command, and
- transmitting (108) the control signal to the identified lighting unit (110) to control the light output of the identified lighting unit (110), thereby adjusting the illumination of the illuminated area (132).
11. The method of claim 10, wherein the first user gesture (130) and/or the second user gesture (140, 140') are defined by a position of at least a part of the body of the user.
12. The method of claim 10, wherein the first user gesture (130) and/or the second user gesture (140, 140') are defined by a movement of at least a part of the body of the user.
13. The method of claim 10, 11 or 12, wherein the embedded code (112) is comprised in visible light emitted by the lighting unit (110).
14. The method of claim 10, 11 or 12, wherein the embedded code (112) is comprised in invisible light emitted by the lighting unit (110).
15. A computer program product for a computing device, the computer program product comprising computer program code to perform the method of claim 10 when the computer program product is run on a processing unit of the computing device.
PCT/EP2016/063294 2015-06-23 2016-06-10 Gesture based lighting control WO2016206991A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP15173246.8 2015-06-23
EP15173246 2015-06-23

Publications (1)

Publication Number Publication Date
WO2016206991A1 true WO2016206991A1 (en) 2016-12-29

Family

ID=53488205

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/063294 WO2016206991A1 (en) 2015-06-23 2016-06-10 Gesture based lighting control

Country Status (1)

Country Link
WO (1) WO2016206991A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018219900A1 (en) * 2017-06-01 2018-12-06 Philips Lighting Holding B.V. A system for rendering virtual objects and a method thereof
WO2018219962A1 (en) * 2017-06-01 2018-12-06 Philips Lighting Holding B.V. A system for rendering virtual objects and a method thereof

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009095833A1 (en) * 2008-01-30 2009-08-06 Philips Intellectual Property & Standards Gmbh Lighting system and method for operating a lighting system
WO2011086501A1 (en) * 2010-01-15 2011-07-21 Koninklijke Philips Electronics N.V. Method and system for 2d detection of localized light contributions
WO2013054221A1 (en) * 2011-10-14 2013-04-18 Koninklijke Philips Electronics N.V. Coded light detector
US20130120238A1 (en) 2011-11-11 2013-05-16 Osram Sylvania Inc. Light control method and lighting device using the same
US20130127712A1 (en) * 2011-11-18 2013-05-23 Koji Matsubayashi Gesture and voice recognition for control of a device
WO2013085600A2 (en) * 2011-12-05 2013-06-13 Greenwave Reality, Pte Ltd. Gesture based lighting control
US20150023019A1 (en) * 2013-07-16 2015-01-22 Chia Ming Chen Light control systems and methods


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16732505

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16732505

Country of ref document: EP

Kind code of ref document: A1