US20090273287A1 - Semantic light - Google Patents

Semantic light

Info

Publication number
US20090273287A1
US20090273287A1 (application US12/503,217)
Authority
US
United States
Prior art keywords
subject
semantic
lighting
lighting system
light source
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/503,217
Inventor
Zary Segall
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US12/503,217 priority Critical patent/US20090273287A1/en
Publication of US20090273287A1 publication Critical patent/US20090273287A1/en
Abandoned legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/30 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B 47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B 47/10 Controlling the light source
    • H05B 47/165 Controlling the light source following a pre-assigned programmed sequence; Logic control [LC]

Landscapes

  • Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

A lighting system for delivering dynamic, fully customized, and automatic illumination to a subject. The lighting system comprises a programmable light unit for emitting a programmed pattern and spectrum of illumination, a sensor pod comprising an array of sensors for detecting ambient lighting conditions and subject characteristics, a control unit for allowing a user to program the lighting system, and a processing unit for analyzing data from the sensor pod and control unit to construct an optimal lighting profile in accordance therewith. The lighting system generates light in accordance with the lighting profile, which is fully optimized in spectrum, intensity, color, contrast, temperature, angle, and focus for any given environment, subject, and task.

Description

    CROSS-REFERENCE TO RELATED APPLICATION(S)
  • This application is a Continuation of U.S. patent application Ser. No. 11/800,213 filed May 4, 2007, now U.S. Pat. No. 7,564,368, which claims benefit of U.S. Provisional Application No. 60/797,711 filed May 4, 2006.
  • BACKGROUND OF THE INVENTION
  • (1) Field of the Invention
  • The present invention relates to light delivery systems and, more particularly, to a semantic lighting system that delivers appropriate light (spectrum, intensity, color, contrast, temperature, angle, focus, data) to a subject by analyzing the properties of the subject (nature, dimensions, shape, texture, contrast, reflectivity, transparency, temperature, etc.), the existing illumination, the eye characteristics of the human user, and the relative position of the subject with respect to the source of light and the user. In addition, the semantic light is dynamic, changing in time and in sync with the semantics of the task requiring illumination.
  • (2) Description of Prior Art
  • Conventional lighting devices deliver static light and are agnostic to the user, the subject, and the environment. Such lighting devices are designed to illuminate a predefined, average illumination scene for an average user. This overlooks the fact that light is perceived differently by different people and that the lighting requirements differ for each particular task. Furthermore, the lighting requirements depend on the visual qualities of the subject being illuminated. It is well known that appropriate light can enhance virtually any human experience and make the task at hand easier to perform: bright light is better for reading, soft warm light for resting, and so on. Interior designers recognize this and carefully assess the quality of ambient lighting in a room before installing a lighting system. Along with color, many aspects of a light source help establish task-suitable lighting; intensity, direction and angle, number of lights, and shadows all play a major role in defining the lighting quality of a scene. Lighting is a key element in human performance and productivity, so good interior designers consider all aspects of the light needed to properly illuminate a room, including intensity, spectrum, directionality, etc. Unfortunately, once the lights are installed they are relatively static. Despite changing seasons, daylight hours, moving occupants of the house, rearranged furniture, etc., conventional lighting does not adapt.
  • It would be greatly advantageous to provide a dynamic light system (changing in time and in sync with the task performed) and a semantic lighting system (adapting to the visual properties of the illuminated subject) that is both personalized (adapting to the eye characteristics of the user) and task specific (adapting to the requirements of a particular task), for delivering appropriate light to a subject by controlling a range of variables (spectrum, intensity, color, contrast, temperature, angle, focus, data).
  • SUMMARY OF THE INVENTION
  • It is therefore, an object of this invention to provide a dynamic light system that changes its illumination in time and in sync with the task performed.
  • It is another object to provide a semantic lighting system that adapts to an illuminated subject's visual properties.
  • It is another object to provide a semantic lighting system that is both personalized (adapting to the eye characteristic of the user) and task specific (adapting to the requirement of a particular task).
  • It is still another object to provide a semantic lighting system that delivers an appropriate light to a subject by controlling a range of variables (spectrum, intensity, color, contrast, temperature, angle, focus, data).
  • These and other objects are accomplished herein by a dynamic light system that automatically analyzes a range of properties in order to control a range of variables. The properties analyzed include the properties of a subject (nature, dimensions, shape, texture, contrast, reflectivity, transparency, temperature, etc.), plus the properties of the environment (such as existing illumination), plus human properties including eye characteristics and the relative position of the subject with respect to the source of light.
  • The foregoing properties are automatically analyzed to control a range of variables (spectrum, intensity, color, contrast, temperature, angle, focus, data) in order to project the optimal lighting conditions for any given environment, user, subject and situation. Moreover, the invention disclosed herein provides dynamic light that changes over time to adapt to changing environments and changing requirements of the task requiring illumination.
  • Other variations and advantages are described in the detailed description below.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Other objects, features, and advantages of the present invention will become more apparent from the following detailed description of the preferred embodiments and certain modifications thereof when taken together with the accompanying drawings in which:
  • FIG. 1 is a perspective view of the semantic lighting system 2 according to the present invention.
  • FIG. 2 is a block diagram of the primary components of the semantic lighting system 2 of FIG. 1.
  • FIG. 3 is a flow diagram illustrating the general operation of the semantic lighting system according to the present invention.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present invention is an active lighting system that analyzes a subject (by measuring a range of its properties, including nature, dimensions, shape, texture, contrast, reflectivity, transparency, temperature, etc.), analyzes the environment (by measuring properties such as existing illumination), incorporates knowledge of the eye characteristics of the user, understands the task the user is performing, and provides the most appropriate lighting conditions for any given environment and situation by automatic control of a range of variables (spectrum, intensity, color, contrast, temperature, angle, focus, data). Since the user is human, the analyzed properties specifically include the user's eye characteristics, the relative position of the subject with respect to the source of light, etc.
  • Not only does the system provide lighting best suited to the environment and task, but it also provides dynamic lighting that is adjusted over time to adapt to changing environmental and task requirements.
  • FIG. 1 is a perspective view, and FIG. 2 is a block diagram of the primary components of the semantic lighting system 2 according to the present invention. The system 2 generally includes a Programmable Light Unit 40 (PLU), Sensor Pod 50 (SP) and Control Unit 30 (CU), all of which are available for sensing and controlling the appropriate lighting conditions in a given area for a given subject.
  • In addition, the system 2 includes a Processing Unit 60 (PU) and Network Unit 70 (NU). The Subject 80 is the target of illumination such as a magazine or a book.
  • The Sensor Pod (SP) 50 incorporates a variety of sensors including, but not limited to, a visual spectrum camera 52, infrared spectrum camera 54, range sensors 56, a sensor for tracking eye movement 58, and other possible sensors.
  • The Programmable Light Unit 40 (PLU) preferably includes one or more digital programmable light sources such as conventional DLP or LCD projectors, one or more high-intensity programmable LED clusters, one or more conventional incandescent or fluorescent light sources including halogen, or any combination of the foregoing. The PLU 40 also contains conventional means of focusing and directing light to a particular area of the subject.
  • The Control Unit 30 (CU) includes a user interface with controls for controlling the PU 60, and may be configured as a conventional IR remote controller. The Control Unit 30 (CU) is used for user input of a task requirement and for personalization depending on the particular task or environment that the user desires lighting for. The task requirement may be a categorical choice of task such as reading lighting, writing lighting, surgery lighting, working, etc.
  • The Network Unit (NU) 70 may be any conventional network interface for wired or wireless connection to other remote-programmed devices, including but not limited to other semantic lighting systems, the Internet, or any other programmable devices and wireless devices. Network Unit 70 (NU) provides networking capability with other remote systems or accessories having a like networking capability.
  • The Processing Unit 60 (PU) includes one or more on-board processors with memory and peripheral communications interfaces for receiving inputs from the Sensor Pod 50 (SP), Control Unit 30 (CU) and Network Unit 70 (NU), and for delivering appropriate outputs to the Programmable Light Unit 40 (PLU), Control Unit 30 (CU) and Network Unit 70 (NU). Thus, the Processing Unit 60 (PU) also includes one or more outputs as appropriate for coupling to the Programmable Light Unit 40 (PLU), including, for example, a standard data output (USB, serial, parallel, etc.).
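  • To make these component relationships concrete, the following is a minimal sketch (in Python; not part of the patent) of how the units above could be represented in software. All class, field, and method names are illustrative assumptions.

```python
# Hypothetical wiring of the units described above; names are illustrative only.
from dataclasses import dataclass


@dataclass
class SensorReading:
    visible_frame: list        # image from the visual spectrum camera 52
    infrared_frame: list       # image from the infrared spectrum camera 54
    subject_range_m: float     # distance reported by the range sensors 56
    gaze_point: tuple          # (x, y) from the eye-movement sensor 58


@dataclass
class LightCommand:
    spectrum: str              # e.g. "full" or "warm-white"
    intensity: float           # normalized 0.0 .. 1.0
    color_temp_k: int          # correlated color temperature in kelvin
    angle_deg: float           # beam direction
    focus: float               # 0.0 (flood) .. 1.0 (spot)
    overlay_text: str = ""     # optional projected text or image content


class ProgrammableLightUnit:
    """PLU 40: applies a LightCommand to the physical light sources."""
    def apply(self, command: LightCommand) -> None:
        print("PLU applying:", command)


class SensorPod:
    """SP 50: samples the cameras, range sensors, and eye tracker."""
    def read(self) -> SensorReading:
        return SensorReading([], [], 0.6, (0.0, 0.0))   # placeholder values


class ControlUnit:
    """CU 30: returns the categorical task chosen by the user."""
    def task_requirement(self) -> str:
        return "reading"


class NetworkUnit:
    """NU 70: forwards commands to other networked semantic lights."""
    def publish(self, command: LightCommand) -> None:
        pass
```

  • In such a sketch the Processing Unit 60 (PU) would own one instance of each of these classes and mediate all data flow between them, as the following paragraphs describe.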
  • Software resident on the Processing Unit 60 (PU) creates an array of models, including a user eye model, a task model, and a subject model. The user eye model is constructed using specific user physiological eye parameters such as light perception, color perception, age, eye injury, and lens prescription. The data necessary to construct the user eye model may be pre-programmed or input by the user via the Control Unit 30 (CU). The task model is built using a specific categorical task description such as reading, writing, or a specific manufacturing task. The data necessary to construct the task model is typically input by the user via the Control Unit (CU). The subject model is a 2D/3D model of the subject including the existing illumination, the shape of the subject, contrast, temperature, color, transparency, reflection and texture.
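  • Purely as an illustration, the three models might be captured as simple records whose fields mirror the parameters listed above; the field names and types are assumptions, not definitions from the patent.

```python
from dataclasses import dataclass


@dataclass
class UserEyeModel:
    light_perception: float    # relative sensitivity to brightness
    color_perception: float    # relative color discrimination
    age: int
    eye_injury: bool
    lens_prescription: float   # diopters


@dataclass
class TaskModel:
    category: str              # "reading", "writing", "surgery", "manufacturing", ...


@dataclass
class SubjectModel:
    existing_illumination_lux: float
    shape: str                 # 2D/3D geometry descriptor
    contrast: float
    temperature_c: float
    color: tuple               # dominant RGB color of the subject
    transparency: float
    reflectivity: float
    texture: str
```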
  • Given the three completed models, the software resident in the Processing Unit 60 (PU) executes a suite of algorithms that analyze data inputs from the Sensor Pod 50 (SP) (and, optionally, Control Unit 30 (CU) data and Network Unit 70 (NU) data) in accordance with the user eye model, the task model, and the subject model, to create a Subject Illumination Profile comprising a set of instructions that control the Programmable Light Unit 40 (PLU) so as to produce, at any given time, light with a specified spectrum, intensity, focus, color, contrast, temperature and angle. In addition, if the task model so requires, the PLU 40 will project text and images for aesthetic or task-oriented value. The Processing Unit 60 (PU) may also deliver outputs to the Control Unit 30 (CU) for user feedback, and to the Network Unit 70 (NU) for remote control of other networked systems or accessories.
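  • Continuing the sketch above, a Subject Illumination Profile could be derived by a single function that maps the three models to a PLU instruction. The specific heuristics below (for example, raising intensity with user age) are invented for illustration and are not prescribed by the patent.

```python
def build_illumination_profile(eye: UserEyeModel,
                               task: TaskModel,
                               subject: SubjectModel) -> LightCommand:
    """Combine the user eye, task, and subject models into one PLU instruction."""
    # Assumed heuristic: older eyes need more light, reflective subjects need less.
    base = 0.5 + 0.005 * max(eye.age - 40, 0)
    intensity = min(1.0, base * (1.0 - 0.3 * subject.reflectivity))

    # Assumed heuristic: cooler light for focused tasks, warmer light otherwise.
    if task.category in ("reading", "writing", "surgery", "manufacturing"):
        color_temp_k = 5000
    else:
        color_temp_k = 2700

    return LightCommand(
        spectrum="full",
        intensity=intensity,
        color_temp_k=color_temp_k,
        angle_deg=30.0,
        focus=1.0 if subject.existing_illumination_lux < 100 else 0.5,
    )
```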
  • FIG. 3 is a flow diagram illustrating the general operation of the semantic lighting system 2 according to the present invention. In operation, generally designated by input 100 in FIG. 3, the Processing Unit 60 (PU) receives programming inputs from the Control Unit 30 (CU), plus sensor inputs from the Sensor Pod 50 (SP), which may comprise spectral analyses from the visual spectrum camera 52, infrared spectral analyses from the infrared spectrum camera 54, subject range data from the range sensors 56, eye-tracking information from the tracking device 58, etc. The Processing Unit 60 (PU) may additionally receive sensor inputs from the Network Unit 70 (NU), and deliver appropriate outputs to the foregoing devices. The Processing Unit 60 (PU) executes its internal algorithms as appropriate on the data from the Sensor Pod 50 (SP), and determines the most appropriate lighting conditions and/or text to display based on the user eye model 110, the task requirement 120 (which may be programmed at the Control Unit 30 (CU)), and the subject model 80/130 (including visual spectrum, infrared spectrum, range, etc.). Additionally, Network Unit 70 (NU) data may be considered.
  • The Processing Unit 60 (PU) algorithms analyze the combined data, generally designated by the subject illumination model 140 in FIG. 3, generate the most appropriate Lighting Profile, and output control signals to the Programmable Light Unit 40 (PLU) as necessary to control the stated lighting variables (spectrum, intensity, color, contrast, temperature, angle, etc.), generally designated by outputs 150. In addition, it is envisioned that the use of DLP projector(s) will allow text-projecting capabilities such as projection of recipes in a kitchen, directions for repair, etc. Like outputs may be delivered to the Control Unit 30 (CU) for visual confirmation, and to the Network Unit (NU) 70 for remote control of networked systems and accessories.
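  • The sense-analyze-illuminate cycle of FIG. 3 can be summarized as a periodic loop, sketched below under the same illustrative assumptions; a real subject model would be reconstructed from the sensor frames rather than hard-coded.

```python
import time


def run_semantic_light(plu: ProgrammableLightUnit, sp: SensorPod,
                       cu: ControlUnit, nu: NetworkUnit,
                       eye: UserEyeModel, period_s: float = 0.5) -> None:
    """Continuously re-derive the lighting profile from fresh inputs (FIG. 3)."""
    while True:
        reading = sp.read()                                # input 100
        task = TaskModel(category=cu.task_requirement())   # task requirement 120
        subject = SubjectModel(                            # subject model 130
            # In a real system these values would be estimated from `reading`.
            existing_illumination_lux=50.0,
            shape="flat", contrast=0.7, temperature_c=22.0,
            color=(255, 255, 240), transparency=0.0,
            reflectivity=0.2, texture="paper",
        )
        command = build_illumination_profile(eye, task, subject)  # model 140
        plu.apply(command)                                 # outputs 150 to the PLU
        nu.publish(command)                                # optional networked outputs
        time.sleep(period_s)
```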
  • To further illustrate the various embodiments of the present invention, the following are examples of certain applications, though the list is not exhaustive:
  • Task Light (as Shown in FIG. 1):
  • In this application the semantic lighting system 2 delivers light that has the most comfortable color, spectrum, intensity and temperature for a user who is reading, writing and/or manipulating objects related to a task, all collocated on a desk. The subject 80 is a collection of reading/writing materials and objects that are on a work desk. The user is the person using the desk. The semantic lighting system 2 takes into consideration the particular user's eye performance parameters, the subject position, angle, contrast, texture, color, reflection and the distribution of objects on the desk. Based on the above considerations the Processing Unit 60 (PU) analyzes the data, builds models for the subject and the user, and instructs the Programmable Light Unit 40 (PLU) to deliver appropriate light to each part of the subject. Since in one implementation the Programmable Light Unit 40 (PLU) uses a data projector, in this case the semantic light is a projected image over the subject that may include text used to communicate with the user. Because the Sensor Pod 50 (SP) continuously analyzes the subject, user movements, changes in lighting conditions, changes of subject and certain user gestures (such as writing, hand movement, moving objects on the desk, or pointing to different parts of the subject) are registered and employed as inputs to adjust the Programmable Light Unit (PLU) image.
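  • As a further sketch of this continuous adjustment, the handler below reacts to hypothetical change events reported by the Sensor Pod; the event names are assumptions made here for illustration only.

```python
def on_sensor_event(event: str, command: LightCommand,
                    plu: ProgrammableLightUnit) -> LightCommand:
    """Adjust the projected image when the Sensor Pod reports a change (illustrative)."""
    if event == "object_moved":
        command.angle_deg += 5.0                      # re-aim toward the new position
    elif event == "ambient_light_changed":
        command.intensity = min(1.0, command.intensity + 0.1)
    elif event == "pointing_gesture":
        command.overlay_text = "highlighting selected region"
    plu.apply(command)
    return command
```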
  • Reading Light:
  • In this application the semantic lighting system delivers light that has the most comfortable color, spectrum, intensity and temperature for a reading user. The subject is an instance of reading material (book, magazine, etc.). The semantic lighting system 2 takes into consideration the particular user's eye performance parameters, the subject position, angle, contrast, texture, color, reflection, the distribution of text/pictures, and the character set. Based on the above considerations the Processing Unit 60 (PU) analyzes the data, builds models for the subject and the user, and instructs the Programmable Light Unit 40 (PLU) to deliver appropriate light to each part of the subject. Since in one implementation the Programmable Light Unit 40 (PLU) uses a data projector unit, in this case the semantic light is a projected image over the subject that could also include the text to be read, or any other text for communicating with the user. The system continuously analyzes the subject, and user movements, changes in lighting conditions, changes of subject and certain user gestures (such as pointing to different parts of the subject) are registered and used as inputs to adjust the PLU image.
  • Dining Room Light:
  • In this application the semantic lighting system 2 delivers light that has the most comfortable color, spectrum, intensity and temperature for users who are collocated around a dining table. The subject 80 is a collection of objects that are on a dining table. The users are the people using the dining table. The semantic lighting system 2 takes into consideration the subject position, angle, contrast, texture, color, reflection and the distribution of objects on the dining table. Based on the above considerations the Processing Unit 60 (PU) analyzes the data, builds models for the subject, and instructs the Programmable Light Unit 40 (PLU) to deliver appropriate light to each part of the subject. Since in one implementation the Programmable Light Unit 40 (PLU) uses a data projector unit, in this case the semantic light is a projected image over the subject that may also include text. Because the semantic lighting system continuously analyzes the subject, user movements, changes in lighting conditions and changes of subject are registered and employed as inputs to adjust the Programmable Light Unit (PLU) image.
  • Office/Room Semantic Light:
  • In this application the semantic lighting system 2 delivers light that has the most comfortable color, spectrum, intensity and temperature for a room. The subject 80 is a collection of objects that are in a room. The users are the people using the room. The semantic lighting system 2 takes into consideration the subject position, angle, contrast, texture, color, reflection and the distribution of objects in the room. Based on the above considerations the Processing Unit 60 (PU) analyzes the data, builds models for the subject, builds models for the expected user task in the room, and instructs the Programmable Light Unit 40 (PLU) to deliver appropriate light to each part of the subject. Since in one implementation the Programmable Light Unit 40 (PLU) uses a collection of clusters of high-intensity LED lights that can be programmed in terms of color, temperature and spectrum, the semantic light is differentiated for each part of the subject. Because the Sensor Pod (SP) continuously analyzes the subject, user movements, changes in lighting conditions and changes of subject are registered and employed as inputs to adjust the PLU output.
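  • When the Programmable Light Unit is built from LED clusters rather than a projector, differentiating the light for each part of the subject amounts to issuing one command per cluster zone, as sketched below; the zone dictionary is an assumed representation, not part of the patent.

```python
def light_per_zone(zones: dict, eye: UserEyeModel, task: TaskModel) -> dict:
    """Return one LightCommand per LED-cluster zone (illustrative sketch)."""
    # zones maps a cluster name to the SubjectModel of the area it covers,
    # e.g. {"desk": desk_model, "bookshelf": shelf_model}.
    return {name: build_illumination_profile(eye, task, subject)
            for name, subject in zones.items()}
```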
  • Surgery Theater Illumination:
  • In this application the semantic lighting system 2 delivers light that has the most effective color, spectrum, intensity and temperature for a physician who is performing surgery in an operating room. The subject 80 is the human body undergoing the medical procedure. The semantic lighting system 2 takes into consideration the particular user's eye performance parameters and the particular body part or organ position, range, angle, contrast, texture, color, reflection and other visual properties that are related to the task. Based on the above considerations the Processing Unit 60 (PU) analyzes the data, builds models for the subject and the user, and instructs the Programmable Light Unit 40 (PLU) to deliver appropriate light to each part of the subject. Since in one implementation the Programmable Light Unit 40 (PLU) uses a data projector unit, in this case the semantic light is a projected image over the subject that could also include text used to communicate with the user, for example by projecting physiological data directly on the subject. Because the semantic lighting system 2 continuously analyzes the subject, user movements, changes in lighting conditions, changes of subject and certain user gestures (such as pointing to different parts of the subject) are registered and used as inputs to adjust the Programmable Light Unit 40 (PLU) light. In addition, a wearable semantic light unit may be mounted on the forehead of the user so that the Programmable Light Unit 40 (PLU) delivers appropriate light that is controlled by user head movement.
  • In addition to the foregoing, the semantic lighting system lends itself to specific medical procedures. Using the same principles as in the surgery theater illumination, the semantic lighting system may be adapted for specific medical procedures. The same is true for specific manufacturing jobs. This application is similar to that of task lighting (above) but may also include a manufacturing task model. For example, the algorithms may employ additional models for a lathe, for the manufactured part, for the manufacturing process, etc. Moreover, the Programmable Light Unit (PLU) may project text on the manufactured part to indicate current dimensions or the like. Employing semantic lighting in the workplace is likely to have a substantial impact on worker comfort and productivity.
  • Vehicle Semantic Headlight:
  • In this application the semantic lighting system delivers light that has the most effective and comfortable color, spectrum, intensity, focus, range and temperature for night driving. The subject 80 is the road ahead of the driver and any object that is in the path of the vehicle. The semantic lighting system 2 takes into consideration the particular driver's night-time eye performance parameters, plus the subject temperature, infrared image, position, range, angle, contrast, texture, color, reflection and other visual properties that are related to driving. Based on the above considerations the Processing Unit 60 (PU) analyzes the data, builds models for the subject and the user, and instructs the Programmable Light Unit 40 (PLU) to deliver appropriate light to each part of the subject, while taking into consideration the current regulations for headlight range, color and intensity. Since in one implementation the Programmable Light Unit 40 (PLU) uses a combination of halogen, high-intensity LED and data-projection engines, the semantic light is obtained by a real-time programmed combination of all three light sources. The Programmable Light Unit 40 (PLU) also has the capability to project data directly on the subject. Since the Sensor Pod 50 (SP) continuously analyzes (in both the visual and infrared spectrum) changes in lighting conditions and changes of subject (such as a new object in the path), the Programmable Light Unit 40 (PLU) light can change to focus light on an object of relevance (such as a deer in the path of the vehicle).
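  • A hedged sketch of the headlight case follows: the output is split across the three source types and, when the infrared channel reports an object in the vehicle's path, the beam is narrowed and steered toward it. The weights and thresholds are invented for illustration and are not taken from the patent.

```python
from typing import Optional, Tuple


def headlight_profile(ir_hotspot: Optional[Tuple[float, float]],
                      ambient_lux: float) -> dict:
    """Blend halogen, LED, and projector engines; spotlight a detected object."""
    weights = {"halogen": 0.5, "led": 0.4, "projector": 0.1}
    command = LightCommand(spectrum="cool-white", intensity=1.0,
                           color_temp_k=4300, angle_deg=0.0, focus=0.3)
    if ir_hotspot is not None:               # e.g. a deer detected in the path
        x, _y = ir_hotspot                   # normalized horizontal position, -1 .. 1
        command.angle_deg = 30.0 * x         # steer toward the object of relevance
        command.focus = 1.0                  # narrow the beam on it
        command.overlay_text = "HAZARD"      # optional projected warning
        weights = {"halogen": 0.3, "led": 0.3, "projector": 0.4}
    if ambient_lux > 200:                    # dusk rather than full darkness
        command.intensity = 0.7
    return {"weights": weights, "command": command}
```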
  • It should now be apparent that the foregoing semantic lighting system provides a dynamic, fully customized, and automatic lighting profile to a subject, controlling light that is optimized in spectrum, intensity, color, contrast, temperature, angle, focus, etc., for any given environment, subject and task. Moreover, the invention disclosed herein provides dynamic light that changes over time to adapt to changing environments and to the changing requirements of the task requiring illumination. Having now fully set forth the preferred embodiments and certain modifications of the concept underlying the present invention, various other embodiments as well as certain variations and modifications thereto may obviously occur to those skilled in the art upon becoming familiar with the underlying concept. It is to be understood, therefore, that the invention may be practiced otherwise than as specifically set forth herein.

Claims (20)

1. A semantic lighting system for illuminating a subject for a user, comprising:
a light source, said light source capable of emitting a spectra of illumination on said subject;
a sensor, said sensor detecting ambient lighting conditions of said subject; and
a processor, said processor analyzing said ambient lighting conditions detected by said sensor, constructing an optimal lighting profile therefrom, and directing said light source to illuminate said subject pursuant to said optimal lighting profile,
whereby said light source illuminates said subject for said user.
2. The semantic lighting system according to claim 1, wherein said sensor detects physical characteristics of said user, said processor analyzing said physical characteristics.
3. The semantic lighting system according to claim 2, wherein said user is a human, said processor constructing said optimal lighting profile pursuant to a plurality of human characteristics.
4. The semantic lighting system according to claim 3, wherein said human characteristics are selected from the group consisting of: eye parameters and relative position of said subject to said light source.
5. The semantic lighting system according to claim 4, wherein said eye parameters are selected from the group consisting of: light perception, color perception, age, eye injury, night eye performance, infrared, position, range, lens prescription, subject position, angle, contrast, texture, color, reflection, distribution of objects on a surface, and combinations thereof.
6. The semantic lighting system according to claim 1, wherein said processor constructs said optimal lighting profile pursuant to a plurality of subject properties, selected from the group consisting of: nature, dimensions, shape, texture, contrast, reflectivity, transparence, temperature and combinations thereof.
7. The semantic lighting system according to claim 1, wherein said optimal lighting profile instructs said light source pursuant to variables selected from the group consisting of: spectrum, intensity, color, contrast, temperature, angle, focus, data, user movements, change of lighting condition, change of subject, user gestures, writings, movement of objects on a surface pointing to different portion of subject, and combinations thereof.
8. The semantic lighting system according to claim 1, further comprising:
a control unit, said control unit, in communication with said processor, includes a user interface,
whereby said user may dynamically adjust said optimal lighting profile.
9. The semantic lighting system according to claim 8, wherein said user uses said user interface to adjust said optimal lighting profile pursuant to a task.
10. The semantic lighting system according to claim 8, wherein said control unit is a remote-control device.
11. The semantic lighting system according to claim 1, wherein said light source is a plurality of light sources, said processor directing each light source.
12. The semantic lighting system according to claim 1, wherein said semantic lighting system is portable.
13. The semantic lighting system according to claim 12, wherein said semantic lighting system is worn, said processor directing said light source by body movement.
14. The semantic lighting system according to claim 1, wherein said semantic lighting system is a reading light system.
15. The semantic lighting system according to claim 1, wherein said semantic lighting system is a room light system.
16. The semantic lighting system according to claim 1, wherein said semantic lighting system is a surgical light system.
17. The semantic lighting system according to claim 1, wherein said semantic lighting system is a vehicle light system.
18. The semantic lighting system according to claim 17, wherein said processor, pursuant to a change in subject, focuses said light source on an object of relevance.
19. A method for illuminating a subject for a user using semantic light, comprising:
analyzing a plurality of dynamic properties pertaining to said subject;
analyzing a plurality of dynamic properties pertaining to the environment of said subject;
constructing, from said subject and environment dynamic properties, an optimal lighting profile;
dynamically controlling a light source, said light source capable of emitting a spectra of illumination on said subject, said light source illuminating said subject pursuant to said optimal lighting profile.
20. A semantic lighting apparatus, comprising:
a light source, said light source capable of emitting a spectra of illumination on said subject;
a sensor, said sensor detecting ambient lighting conditions of said subject; and
processor means for analyzing said ambient lighting conditions detected by said sensor, constructing an optimal lighting profile therefrom, and directing said light source to illuminate said subject pursuant to said optimal lighting profile,
whereby said light source illuminates said subject for said user.
US12/503,217 2006-05-04 2009-07-15 Semantic light Abandoned US20090273287A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US12/503,217 US20090273287A1 (en) 2006-05-04 2009-07-15 Semantic light

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US79771106P 2006-05-04 2006-05-04
US11/800,213 US7564368B2 (en) 2006-05-04 2007-05-04 Semantic light
US12/503,217 US20090273287A1 (en) 2006-05-04 2009-07-15 Semantic light

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/800,213 Continuation US7564368B2 (en) 2006-05-04 2007-05-04 Semantic light

Publications (1)

Publication Number Publication Date
US20090273287A1 true US20090273287A1 (en) 2009-11-05

Family

ID=38660998

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/800,213 Expired - Fee Related US7564368B2 (en) 2006-05-04 2007-05-04 Semantic light
US12/503,217 Abandoned US20090273287A1 (en) 2006-05-04 2009-07-15 Semantic light

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/800,213 Expired - Fee Related US7564368B2 (en) 2006-05-04 2007-05-04 Semantic light

Country Status (1)

Country Link
US (2) US7564368B2 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090072945A1 (en) * 2007-09-13 2009-03-19 Meng-Shiuan Pan Automatic Lighting Control System And Method
US20130093334A1 (en) * 2011-10-17 2013-04-18 Lextar Electronics Corporation Lamps and illuminating system
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US9374867B2 (en) 2010-12-31 2016-06-21 Koninklijkle Philips Electronics N.V. Illumination apparatus and method
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
US9655191B2 (en) 2013-01-25 2017-05-16 Philips Lighting Holding B.V. Lighting device and lighting system
CN111511064A (en) * 2020-04-14 2020-08-07 佛山市艾温特智能科技有限公司 Intelligent desk lamp control method and system and intelligent desk lamp
CN111981439A (en) * 2019-05-21 2020-11-24 广东小天才科技有限公司 Illumination angle adjusting method and device, intelligent desk lamp and storage medium
EP3842689A4 (en) * 2018-07-06 2021-11-03 Nanjing Mindray Bio-Medical Electronics Co., Ltd. Method for adjusting surgical light parameters, surgical lighting device, and readable storage medium

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7564368B2 (en) * 2006-05-04 2009-07-21 Zary Segall Semantic light
US20110216546A1 (en) * 2010-03-03 2011-09-08 Leviton Manufacturing Co., Inc. Lampholder with occupancy sensor
US8917905B1 (en) * 2010-04-15 2014-12-23 Don K. Dill Vision-2-vision control system
US9357613B2 (en) * 2010-06-17 2016-05-31 Koninklijke Philips N.V. Display and lighting arrangement for a fitting room
EP2591641A2 (en) 2010-07-06 2013-05-15 Koninklijke Philips Electronics N.V. Method and apparatus for illuminating
CN102972098B (en) * 2010-07-06 2016-01-20 皇家飞利浦电子股份有限公司 For the method and apparatus thrown light on
WO2014006525A2 (en) 2012-07-05 2014-01-09 Koninklijke Philips N.V. Lighting system for workstations.
US8974077B2 (en) 2012-07-30 2015-03-10 Ultravision Technologies, Llc Heat sink for LED light source
EP2912924B1 (en) * 2012-10-24 2020-02-05 Signify Holding B.V. Assisting a user in selecting a lighting device design
PL2939504T5 (en) * 2012-10-24 2022-10-24 Signify Holding B.V. Assisting a user in selecting a lighting device design
EP2912925B1 (en) 2012-10-26 2018-12-19 Philips Lighting Holding B.V. Lighting methods for providing personalized lighting to users positioned proximal to one another
US20140286011A1 (en) * 2013-03-14 2014-09-25 Aliphcom Combination speaker and light source powered using light socket
EP3025493B1 (en) 2013-07-22 2019-04-03 Google LLC Method, system, and medium for projecting light to indicate a device status
JP6304618B2 (en) * 2013-11-05 2018-04-04 パナソニックIpマネジメント株式会社 Lighting device
DE102014104174A1 (en) * 2014-03-26 2015-10-01 Steinel Gmbh Controlled lighting device
CN106664773B (en) * 2014-06-05 2019-12-24 飞利浦灯具控股公司 Light scene creation or modification by means of lighting device usage data
CN104482449B (en) * 2014-12-04 2017-07-28 京东方科技集团股份有限公司 Eye-protecting lamp and its light intensity regulating method
FR3034226A1 (en) * 2015-03-27 2016-09-30 Avant-Gout Studios INCREASED READING LAMP
DE102015106519A1 (en) * 2015-04-28 2016-11-03 Berchtold Holding Gmbh Method and device for controlling a surgical light
EP3466206B1 (en) * 2016-05-30 2020-07-15 Signify Holding B.V. Illumination control
WO2018035488A1 (en) * 2016-08-18 2018-02-22 Rohinni, LLC Backlighting color temperature control apparatus
JP2018056012A (en) * 2016-09-29 2018-04-05 パナソニック株式会社 Environment control system, environment control method and program
US10667366B2 (en) 2018-06-29 2020-05-26 Osram Sylvania Inc. Lighting devices with automatic lighting adjustment
US10997402B2 (en) * 2018-07-03 2021-05-04 Fuji Xerox Co., Ltd. Systems and methods for real-time end-to-end capturing of ink strokes from video
DE102019202070A1 (en) * 2019-02-15 2020-08-20 Osram Gmbh SYSTEM FOR MONITORING THE RADIATION OF AN OBJECT WITH LIGHT
JP2022540391A (en) * 2019-07-12 2022-09-15 ブレインリット・アーベー Exposure monitoring system
WO2021058191A1 (en) * 2019-09-25 2021-04-01 Osram Gmbh Methods of illuminating an artwork
US20210298863A1 (en) * 2020-03-27 2021-09-30 Trumpf Medizin Systeme GmbH & Co. KG. Augmented reality for a surgical system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060077193A1 (en) * 2004-10-07 2006-04-13 Robbie Thielemans Intelligent lighting module and method of operation of such an intelligent lighting module
US20060149607A1 (en) * 2004-12-30 2006-07-06 Solarone Solutions, Llc LED lighting system
US20060227123A1 (en) * 2005-04-11 2006-10-12 M-Systems Flash Disk Pioneers, Ltd. Storage device with illuminated panel
US7139716B1 (en) * 2002-08-09 2006-11-21 Neil Gaziz Electronic automation system
US7339471B1 (en) * 2004-12-30 2008-03-04 Cordelia Lighting, Inc. Nighttime-controlled lighting system
US7564368B2 (en) * 2006-05-04 2009-07-21 Zary Segall Semantic light

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7139716B1 (en) * 2002-08-09 2006-11-21 Neil Gaziz Electronic automation system
US20060077193A1 (en) * 2004-10-07 2006-04-13 Robbie Thielemans Intelligent lighting module and method of operation of such an intelligent lighting module
US20060149607A1 (en) * 2004-12-30 2006-07-06 Solarone Solutions, Llc LED lighting system
US7339471B1 (en) * 2004-12-30 2008-03-04 Cordelia Lighting, Inc. Nighttime-controlled lighting system
US20060227123A1 (en) * 2005-04-11 2006-10-12 M-Systems Flash Disk Pioneers, Ltd. Storage device with illuminated panel
US7564368B2 (en) * 2006-05-04 2009-07-21 Zary Segall Semantic light

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090072945A1 (en) * 2007-09-13 2009-03-19 Meng-Shiuan Pan Automatic Lighting Control System And Method
US7843353B2 (en) * 2007-09-13 2010-11-30 Industrial Technology Reseacrh Institute Automatic lighting control system and method
US9374867B2 (en) 2010-12-31 2016-06-21 Koninklijkle Philips Electronics N.V. Illumination apparatus and method
US20130093334A1 (en) * 2011-10-17 2013-04-18 Lextar Electronics Corporation Lamps and illuminating system
US9265458B2 (en) 2012-12-04 2016-02-23 Sync-Think, Inc. Application of smooth pursuit cognitive testing paradigms to clinical drug development
US9655191B2 (en) 2013-01-25 2017-05-16 Philips Lighting Holding B.V. Lighting device and lighting system
US9380976B2 (en) 2013-03-11 2016-07-05 Sync-Think, Inc. Optical neuroinformatics
EP3842689A4 (en) * 2018-07-06 2021-11-03 Nanjing Mindray Bio-Medical Electronics Co., Ltd. Method for adjusting surgical light parameters, surgical lighting device, and readable storage medium
CN111981439A (en) * 2019-05-21 2020-11-24 广东小天才科技有限公司 Illumination angle adjusting method and device, intelligent desk lamp and storage medium
CN111511064A (en) * 2020-04-14 2020-08-07 佛山市艾温特智能科技有限公司 Intelligent desk lamp control method and system and intelligent desk lamp

Also Published As

Publication number Publication date
US20070258243A1 (en) 2007-11-08
US7564368B2 (en) 2009-07-21

Similar Documents

Publication Publication Date Title
US7564368B2 (en) Semantic light
TWI682689B (en) Smart lighting device and control method thereof
CN108713949A (en) Vanity mirror
CN110353449A (en) The dressing glass of acoustic control
US11098889B2 (en) Lighting system and method for operating lighting system
CN113767314A (en) Cosmetic mirror
JP6601709B2 (en) Lighting control apparatus and lighting control method
CN106817822B (en) Illumination control device, illumination system, and illumination control method
JP7286815B2 (en) Illumination system and method for object tracking
CN108966440A (en) It follows spot lamp control system
JP6611038B2 (en) Lighting control apparatus and lighting system
KR20220036712A (en) Smart mirror, controlling method thereof and system for purchasing a cosmetic
CN110657385A (en) Lamp and electronic equipment
CN112153788A (en) Illumination adjusting method and system
CN112181130A (en) Method and system for automatically controlling light source
US20190331936A1 (en) Illuminated Lens Frame
CN103438381B (en) A kind of desk lamp
US12033358B2 (en) Luminance distribution determination
WO2006100644A2 (en) Orientation and position adaptation for immersive experiences
CN112399677B (en) Light adjusting method and intelligent device
JP2018028560A (en) Illumination device
KR101729008B1 (en) Lighting system and the operating method
KR101664114B1 (en) Illumination controlling system
WO2016104190A1 (en) Illumination device
CN112150369A (en) Method and system for eliminating lighting shadow

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION