WO2012148385A1 - Sensing and adjusting features of an environment - Google Patents
Sensing and adjusting features of an environment
- Publication number
- WO2012148385A1 (application PCT/US2011/033924)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- environment
- source
- target
- ambiance
- illumination
- Prior art date
Classifications
- H—ELECTRICITY
- H05—ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
- H05B—ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
- H05B47/00—Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
- H05B47/10—Controlling the light source
- H05B47/105—Controlling the light source in response to determined parameters
- H05B47/115—Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings
- H05B47/125—Controlling the light source in response to determined parameters by determining the presence or movement of objects or living beings by using cameras
- H05B47/175—Controlling the light source by remote control
- H05B47/19—Controlling the light source by remote control via wireless transmission
- H05B47/196—Controlling the light source by remote control characterised by user interface arrangements
- H05B47/1965—Controlling the light source by remote control characterised by user interface arrangements using handheld communication devices
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02B—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
- Y02B20/00—Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
- Y02B20/40—Control techniques providing energy savings, e.g. smart controller or presence detection
Definitions
- the present application relates generally to sensing and adjusting features of an environment and specifically to utilizing a computing device to determine features of a first environment for utilization in a second environment.
- a user may enter a first environment, such as a house, room, restaurant, hotel, office, etc., and find the ambience of that environment desirable.
- the features of the ambience may include the lighting, sound, temperature, humidity, air quality, scent, etc.
- the user may then enter a second environment and desire to replicate the ambience from the first environment in that second environment.
- to do so, the user may be forced to manually adjust one or more different settings in the second environment.
- when adjusting the settings, the user may be forced to rely only on his or her memory to implement the settings from the first environment.
- because the second environment may include different light sources, heating systems, air conditioning systems, audio systems, etc., a user's attempt to manually replicate the ambience from the first environment is often difficult, if not futile.
- Some embodiments of a method for sensing and adjusting features of an environment are configured for receiving an ambience feature of a source environment, determining, from the ambience feature, a source output provided by a source device in the source environment, and determining an ambience capability for a target environment. Some embodiments include determining, based on the ambience capability, a target output for a target device in the target environment and communicating with the target device to model the ambience feature from the source environment into the target environment by altering the target output provided by the target device.
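The capture-map-apply flow described in this summary can be sketched in a few lines of Python. The data-structure and function names below (`SourceOutput`, `Capability`, and the normalized 0-1 intensity scale) are illustrative assumptions of this sketch, not terminology from the patent itself.

```python
from dataclasses import dataclass

@dataclass
class SourceOutput:
    """Output observed from one source device (e.g., a lamp)."""
    device_id: str
    intensity: float  # normalized 0.0-1.0

@dataclass
class Capability:
    """What a target device is able to produce."""
    device_id: str
    max_intensity: float

def determine_source_output(ambience_feature):
    # In practice this would be derived from sensor data; here we simply
    # read per-device intensities out of a captured-feature dictionary.
    return [SourceOutput(d, i) for d, i in ambience_feature["intensities"].items()]

def determine_target_outputs(sources, capabilities):
    """Map each source output onto a target device, clamped to its capability."""
    targets = {}
    for src, cap in zip(sources, capabilities):
        targets[cap.device_id] = min(src.intensity, cap.max_intensity)
    return targets

# Example: two source lamps modeled onto two target lamps.
feature = {"intensities": {"lamp_a": 0.8, "lamp_b": 0.4}}
sources = determine_source_output(feature)
caps = [Capability("target_1", 0.6), Capability("target_2", 1.0)]
targets = determine_target_outputs(sources, caps)
print(targets)  # {'target_1': 0.6, 'target_2': 0.4}
```

Note how the first target lamp cannot reach the 0.8 level observed in the source environment, so its output is clamped to its 0.6 capability, which is exactly the kind of difference the later feedback embodiments are designed to compensate for.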
- Some embodiments of the system include an image capture device for receiving an illumination signal for a source environment and a memory component that stores logic that causes the system to receive the illumination signal from the image capture device and determine, from the illumination signal, an illumination ambiance in the source environment.
- the logic further causes the system to determine a characteristic of the source environment, and determine an illumination capability for a target environment.
- the logic causes the system to determine, based on the illumination capability, a target output for a light source in the target environment and communicate with the light source to model the illumination ambiance from the source environment into the target environment by altering the target output provided by the light source.
- A non-transitory computer-readable medium includes logic that causes a computing device to receive an illumination signal, determine, from the illumination signal, an illumination ambience in a source environment, and determine a characteristic of the source environment.
- the logic further causes the computing device to determine an illumination capability for a target environment, determine, based on the illumination capability, a target output for a light source in the target environment, and communicate with the light source to model the illumination ambiance from the source environment into the target environment by altering the target output provided by the light source.
- the logic causes the computing device to receive an updated lighting characteristic of the target environment, determine whether the updated lighting characteristic substantially models the illumination ambience from the source environment, and, in response to determining that the updated lighting characteristic does not substantially model the illumination ambience from the source environment, alter the target output provided by the light source.
- FIG. 1 depicts a plurality of environments from which an ambience may be sensed and adjusted, according to embodiments disclosed herein;
- FIG. 2 depicts a user computing device that may be utilized for sensing and adjusting features in an environment, according to embodiments disclosed herein;
- FIG. 3 depicts a user interface that provides options to model an environment ambience and apply a stored model, according to embodiments disclosed herein;
- FIG. 4 depicts a user interface for determining a type of ambiance feature to capture in an environment, according to embodiments disclosed herein;
- FIG. 5 depicts a user interface for receiving data from a source environment, according to embodiments disclosed herein;
- FIG. 6 depicts a user interface for modeling the source environment, according to embodiments disclosed herein;
- FIG. 7 depicts a user interface for storing a received ambience, according to embodiments disclosed herein;
- FIG. 8 depicts a user interface for receiving a theme from an environment, according to embodiments disclosed herein;
- FIG. 9 depicts a user interface for applying a stored ambience to a target environment, according to embodiments disclosed herein;
- FIG. 10 depicts a user interface for receiving an ambience capability for a target environment, according to embodiments disclosed herein;
- FIG. 11 depicts a user interface for providing a suggestion to more accurately model the target environment according to the source environment, according to embodiments disclosed herein;
- FIG. 12 depicts a user interface for providing options to apply additional ambience features to the target environment, according to embodiments disclosed herein;
- FIG. 13 depicts a flowchart for modeling an ambience feature in a target environment, according to embodiments disclosed herein;
- FIG. 14 depicts a flowchart for determining whether an ambience feature has previously been stored, according to embodiments disclosed herein; and
- FIG. 15 depicts a flowchart for determining whether an applied ambience feature substantially matches a theme, according to embodiments disclosed herein.
- Embodiments disclosed herein include systems and methods for sensing and adjusting features in an environment. More specifically, in some embodiments, a user may enter a source environment, such as a house, room, office, hotel, restaurant, etc. and realize that the ambiance is pleasing.
- the ambiance may include the lighting, the sound, the scent, the climate, and/or other features of the source environment.
- the user may utilize a user computing device, such as a mobile phone, personal digital assistant (PDA), laptop computer, tablet computer, etc. to capture an ambiance feature of the source environment.
- the user computing device may include (or be coupled to a device that includes) an image capture device, a microphone, a gyroscope, an accelerometer, a positioning system, a thermometer, a humidity sensor, an air quality sensor, and/or other sensors for determining the ambiance features of the source environment.
- the user may select an option on the user computing device that activates the image capture device.
- the image capture device may capture lighting characteristics of the source environment. The lighting characteristics may include a light intensity, a light frequency, a light distribution, etc., as well as dynamic changes over time thereof.
- the user computing device can determine a source output, which (for lighting) may include a number of light sources, a light output of those sources, whether the light is diffuse, columnar, direct, or reflected, a color temperature of the light, an overall brightness, etc.
- the user computing device may also determine a characteristic of the source environment, such as size, coloring, acoustics, and/or other characteristics. Once the user computing device has determined the source output, this data may be stored locally and/or sent to a remote computing device for storage.
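As a concrete illustration of deriving a source output from captured image data, an overall brightness and a rough warm/cool color balance can be computed directly from pixel values. This sketch operates on an in-memory list of RGB tuples rather than a real camera frame, and the 0.2126/0.7152/0.0722 weights are the standard Rec. 709 luma coefficients; the warm/cool ratio is a simplification assumed for illustration.

```python
def analyze_lighting(pixels):
    """Estimate overall brightness and warm/cool balance from RGB pixels.

    pixels: iterable of (r, g, b) tuples with components in 0-255.
    Returns (brightness, warmth), where brightness is normalized 0.0-1.0
    and warmth > 1.0 suggests warm (reddish) light, < 1.0 cool (bluish).
    """
    total_luma = 0.0
    total_r = total_b = 0
    n = 0
    for r, g, b in pixels:
        # Rec. 709 luma approximation of perceived brightness.
        total_luma += 0.2126 * r + 0.7152 * g + 0.0722 * b
        total_r += r
        total_b += b
        n += 1
    brightness = total_luma / (n * 255.0)
    warmth = total_r / total_b if total_b else float("inf")
    return brightness, warmth

# A dim, warm scene: mostly low-intensity reddish pixels plus a few highlights.
scene = [(120, 80, 40)] * 90 + [(255, 220, 180)] * 10
brightness, warmth = analyze_lighting(scene)
print(round(brightness, 2), round(warmth, 2))
```

A real implementation would also need the spatial information mentioned above (where in the frame the bright regions are) to locate individual light sources; this sketch only reduces the frame to two scalar ambience features.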
- the user device may implement the ambiance from the source environment into a target environment.
- the user may utilize the image capture device (and/or other components, such as the positioning system, gyroscope, accelerometer, etc.) to determine an ambience capability (such as an illumination capability in the lighting context or an audio capability, a scent capability, a climate capability, etc. in other contexts) of the target environment.
- the ambiance capability may be determined from a number and position of target devices (such as light sources or other output devices), windows, furniture, and/or other components. Other features of the target environment may also be determined, such as size, global position, coloring, etc.
- the user computing device can determine alterations to make to the light sources in the target environment to substantially model the ambiance feature from the source environment. This determination may be made by comparing the location and position of the output sources in the source environment, as well as the light actually realized from those output sources with the determined ambiance capability of the target environment. As an example, if the source environment is substantially similar to the target environment, the user computing device can determine that the output (such as lighting effects) provided by the light sources should be approximately the same. If there are differences between the source environment and the target environment, those differences may be factored into the analysis. More specifically, when the source environment and target environment are different, the combination of light output and room dynamics adds up to the visual feeling of the environment.
- embodiments disclosed herein may shape the light output such that the ambiance "felt" by the image capture device would be similar.
- some embodiments may utilize a feedback loop configuration to dynamically assess the source environment and/or target environment and dynamically adjust the settings and ensure accuracy.
- the user computing device can communicate with the output sources directly and/or with a network component that controls the output sources.
- the user computing device may additionally reexamine the target environment to determine whether the adjustments made substantially model the ambiance feature from the source environment. If not, further alterations may be made. If the alterations are acceptable, the settings for this ambiance may be stored.
- the remote computing device may receive the source output data and create an application to send to the user computing device for implementing the ambiance into a target environment. This may be accomplished such that the ambiance may be implemented in any environment (with user input on parameters of the target environment).
- the user computing device may additionally send environmental characteristics data (such as size, shape, position, etc. of an environment), such that the remote computing device can create an application to implement the ambiance in the particular target environment.
- some embodiments may be configured with a feedback loop for continuous and/or repeated monitoring and adjustment of settings in the target environment.
- the user computing device may be configured to take a plurality of measurements of the source environment to determine a current ambiance. Similarly, when modeling the current ambiance into the target environment, the user computing device can send data related to the current ambiance to a target device. Additionally, once the adjustments to the target environment are implemented, the user computing device can monitor the ambiance, calculate adjustments, and send those adjustments to achieve a desired target ambiance. This may continue a predetermined number of iterations or until accuracy is achieved within a predetermined threshold.
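The iterative measure-and-adjust process described above can be sketched as a simple closed loop. The `apply_and_measure` callback stands in for commanding the target device and re-reading the sensor, and the damped proportional correction is an assumption of this sketch; the patent only specifies iterating until a threshold or iteration budget is reached.

```python
def converge(target_level, apply_and_measure, max_iters=10, tol=0.02):
    """Iteratively command an output, measure the resulting ambience, and
    apply a damped correction until the measurement is within `tol` of the
    desired level or the iteration budget is exhausted."""
    output = target_level  # initial guess: command the desired level directly
    for _ in range(max_iters):
        measured = apply_and_measure(output)
        error = target_level - measured
        if abs(error) <= tol:
            break
        output += 0.5 * error  # damped proportional correction
    return output

# Simulated room whose fixtures deliver only 80% of the commanded level.
room = lambda commanded: 0.8 * commanded
final = converge(0.5, room)
print(round(room(final), 2))  # measured level ends up within 0.02 of the 0.5 target
```

The damping factor trades convergence speed for stability; without it, a noisy sensor could cause the loop to oscillate around the target level.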
- a light source may include any component that provides a visible form of light, including a lamp, an overhead light, a television, a component light, sunlight, a fire, an external light source, a candle, etc.
- light sources may take many shapes, sizes, and forms and, since the inception of electric lighting, have matured to include many types of emission sources.
- Incandescence, electroluminescence, and gas discharge have each been used in various lighting apparatus and, in each, the primary emitting elements (e.g., incandescent filaments, light-emitting diodes, gas, plasma, etc.) may be configured in any number of ways according to the intended application.
- Many embodiments of light sources described herein are susceptible to use with almost any type of emission source, as will be understood by a person of ordinary skill in the art upon reading the following described embodiments.
- certain embodiments may include light-emitting diodes (LEDs), LED light sources, lighted sheets, and the like.
- LED lighting arrays come in many forms including, for instance, arrays of individually packaged LEDs arranged to form generally planar shapes (i.e., shapes having a thickness small relative to their width and length).
- LED arrays may also be formed on a single substrate or on multiple substrates, and may include one or more circuits (i.e., to illuminate different LEDs), various colors of LEDs, etc. Additionally, LED arrays may be formed by any suitable semiconductor technology including, by way of example and not limitation, metallic semiconductor material and organic semiconductor material. In any event, for embodiments utilizing an LED material or a planar illuminated sheet, any suitable technology known presently or later invented may be employed in cooperation with other elements without departing from the spirit of the disclosure.
- FIG. 1 depicts a plurality of environments from which an ambience may be sensed and adjusted, according to embodiments disclosed herein.
- a network 100 may include a wide area network, such as the Internet, a local area network (LAN), a mobile communications network, a public switched telephone network (PSTN), and/or other networks, and may be coupled to a user computing device 102, a remote computing device 104, and a target environment 110b.
- a source environment 110a may include one or more output devices 112a - 112d, which in FIG. 1 are depicted as light sources.
- a light source may include any component that provides a visible form of light, including a lamp, an overhead light, a television, a component light, sunlight, a fire, an external light source, a candle, etc.
- the target environment 110b may also include one or more output devices 114a - 114c. While the output devices 112 and 114 are illustrated as light sources in FIG. 1 that provide an illumination ambience, other sources may also be considered within the scope of this disclosure, including an audio source, a scent source, climate source (such as a temperature source, a humidity source, an air quality source, wind source, etc.) and/or other sources. As illustrated, in some embodiments, the source environment 110a and target environment 110b may each be coupled to the network 100, such as via a network device. The network device may include any local area and/or wide area device for controlling an output device in an environment. Such network devices may be part of a "smart home" and/or other intelligent system.
- the network connection may provide the user computing device 102 with a mechanism for receiving an ambience theme and/or other data related to the source environment 110a.
- the target environment 110b may provide the user computing device 102 with a mechanism for controlling one or more of the output devices 114.
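A minimal sketch of what such a control exchange might look like, assuming a JSON message format of our own invention (the disclosure does not specify a wire protocol, and `lamp_114a` is a hypothetical device identifier keyed to the output devices of FIG. 1):

```python
import json

def make_control_message(device_id, output_level):
    """Build a command the user computing device could send to a network
    device (or directly to an output device) to set a light level."""
    if not 0.0 <= output_level <= 1.0:
        raise ValueError("output_level must be normalized to 0.0-1.0")
    return json.dumps({"device": device_id, "set_output": output_level})

msg = make_control_message("lamp_114a", 0.65)
print(msg)  # {"device": "lamp_114a", "set_output": 0.65}
```

In a "smart home" deployment, such a message would be delivered over the LAN to the network device controlling the output devices 114, which would translate it into whatever native protocol the fixtures actually speak.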
- the user computing device 102 may include a memory component 140 that stores source environment logic 144a for functionality related to determining characteristics of the source environment 110a.
- the memory component 140 also stores target environment logic 144b for modeling the ambience features from the source environment 110a and applying those ambience features into the target environment 110b.
- while the user computing device 102 and the remote computing device 104 are depicted as a mobile computing device and a server, respectively, these are merely examples. More specifically, in some embodiments any type of computing device (e.g., mobile computing device, personal computer, server, etc.) may be utilized for either of these components. Additionally, while each of these computing devices 102, 104 is illustrated in FIG. 1 as a single piece of hardware, this is also an example. More specifically, each of the computing devices 102, 104 depicted in FIG. 1 may represent a plurality of computers, servers, databases, etc.
- the source environment logic 144a and the target environment logic 144b are depicted in the user computing device 102, this is also just an example.
- the user computing device 102 and/or the remote computing device 104 may include this and/or similar logical components.
- FIG. 1 depicts embodiments in the lighting context, other contexts are included within the scope of this disclosure.
- a scent sensor may be included in an air freshener (or other external device) that is located in the source environment 110a and is in communication with the user computing device 102.
- the air freshener may determine an aroma in the source environment 110a and may communicate data related to that aroma to the user computing device 102.
- the air freshener may be set to produce an aroma and may send data related to the settings for producing that aroma.
- another air freshener may be in communication with the user computing device 102 for providing the aroma data received from the source environment 110a. With this information, the air freshener may implement the aroma to model the ambience from the source environment 110a.
- FIG. 2 depicts a user computing device 102 that may be utilized for sensing and adjusting features in an environment, according to embodiments disclosed herein.
- the user computing device 102 includes at least one processor 230, input/output hardware 232, network interface hardware 234, a data storage component 236 (which includes product data 238a, user data 238b, and/or other data), and the memory component 140.
- the memory component 140 may be configured as volatile and/or nonvolatile memory and, as such, may include random access memory (including SRAM, DRAM, and/or other types of RAM), flash memory, secure digital (SD) memory, registers, compact discs (CD), digital video discs (DVD), and/or other types of non-transitory computer-readable mediums. Depending on the particular embodiment, these non-transitory computer-readable mediums may reside within the user computing device 102 and/or external to the user computing device 102.
- the memory component 140 may be configured to store operating logic 242, the source environment logic 144a, and the target environment logic 144b.
- the operating logic 242 may include an operating system, basic input output system (BIOS), and/or other hardware, software, and/or firmware for operating the user computing device 102.
- the source environment logic 144a and the target environment logic 144b may each include a plurality of different pieces of logic, each of which may be embodied as a computer program, firmware, and/or hardware, as an example.
- a local interface 246 is also included in FIG. 2 and may be implemented as a bus or other interface to facilitate communication among the components of the user computing device 102.
- the processor 230 may include any processing component operable to receive and execute instructions (such as from the data storage component 236 and/or memory component 140).
- the input/output hardware 232 may include and/or be configured to interface with a monitor, positioning system, keyboard, mouse, printer, image capture device, microphone, speaker, gyroscope, accelerometer, compass, thermometer, humidity sensor, air quality sensor and/or other device for receiving, sending, and/or presenting data.
- the network interface hardware 234 may include and/or be configured for communicating with any wired or wireless networking hardware, including an antenna, a modem, LAN port, wireless fidelity (Wi-Fi) card, WiMax card, mobile communications hardware, and/or other hardware for communicating with other networks and/or devices. From this connection, communication may be facilitated between the user computing device 102 and other computing devices.
- the processor 230 may also include and/or be coupled to a graphics processing unit (GPU).
- the components illustrated in FIG. 2 are merely exemplary and are not intended to limit the scope of this disclosure. As an example, while the components in FIG. 2 are illustrated as residing within the user computing device 102, this is merely an example. In some embodiments, one or more of the components may reside external to the user computing device 102. It should also be understood that, while the user computing device 102 in FIG. 2 is illustrated as a single device, this is also merely an example. In some embodiments, the source environment logic 144a and the target environment logic 144b may reside on different devices. Additionally, while the user computing device 102 is illustrated with the source environment logic 144a and the target environment logic 144b as separate logical components, this is also an example. In some embodiments, a single piece of logic may perform the described functionality.
- FIG. 3 depicts a user interface 300 that provides options to model an environment ambiance and apply a stored model, according to embodiments disclosed herein.
- the user computing device 102 may include a sensor device 318 and an application that provides the user interface 300.
- the sensor device 318 depicted in FIG. 3 represents any sensor device that may be integral to and/or coupled with the user computing device 102. More specifically, the sensor device 318 may be configured as an image capture device, a microphone, a scent sensor, a humidity sensor, a temperature sensor, an air quality sensor, wind sensor, etc.
- the user interface 300 may include a model environment option 320 and an apply stored model option 322.
- the model environment option 320 may be selected to facilitate capture of ambience data from a source environment 110a.
- the apply stored model option 322 may be selected to retrieve stored ambience data from the source environment 110a and apply that data to the target environment 110b.
- FIG. 4 depicts a user interface 400 for determining a type of ambiance feature to capture in an environment, according to embodiments disclosed herein.
- the user interface 400 may be provided with a lighting option 420, a sound option 422, a scent option 424, and a climate option 428. More specifically, the user may select one or more of the options 420 - 428 to capture the corresponding data from the source environment 110a.
- in response to selection of the lighting option 420, the user computing device 102 may acquire lighting data via the sensor device 318, which may be embodied as an image capture device.
- audio signals may be captured by the sensor device 318, which may be embodied as a microphone.
- the user computing device 102 may capture scents via the sensor device 318, which may be embodied as a scent sensor.
- the user computing device 102 may capture a temperature signal, a humidity signal, an air quality signal, a wind signal, etc. via the sensor device 318, which may be embodied as a thermometer, humidity sensor, air quality sensor, etc.
- FIG. 5 depicts a user interface 500 for receiving data from the source environment 110a, according to embodiments disclosed herein.
- the image capture device may be utilized to capture lighting data from the source environment 110a and display at least a portion of that data in the user interface 500.
- the image capture device may capture an image of the source environment 110a. While FIG. 5 depicts that the image data is a photographic image of the environment and source devices, this is merely an example.
- the user interface 500 may simply provide a graphical representation of light intensity (such as a color representation).
- the user computing device 102 may utilize the received ambiance feature (which in this case is lighting data) to determine source output data, such as the location, number, and intensity of light sources in the source environment 110a. Other determinations may also be made, such as size and color of the environment, whether the light sources are internal light sources (such as lamps, overhead lights, televisions, electronic components, etc.) or external light sources (such as the sun, moon, stars, street lamps, automobiles, etc.).
- the user interface 500 of FIG. 5 depicts the source environment 110a in the context of determining the lighting ambiance, this is merely an example. More specifically, if the sound option 422 (from FIG. 4) is selected, a microphone may be utilized to capture audio data from the source environment 110a. The user may direct the user computing device 102 across the environment. From the received audio data, the user computing device 102 can determine the source, intensity, frequency, etc. of the audio from the environment.
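For the audio case, a rough intensity and pitch estimate can be computed directly from raw samples. The zero-crossing approach below is a simplification assumed for illustration (a real implementation would more likely use an FFT to recover the full frequency content the text mentions):

```python
import math

def analyze_audio(samples, sample_rate):
    """Estimate loudness (RMS) and a crude dominant frequency from a
    mono sample buffer, using the zero-crossing rate as a pitch proxy."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    # Each full cycle of a tone produces two zero crossings.
    duration = len(samples) / sample_rate
    frequency = crossings / (2 * duration)
    return rms, frequency

# One second of a 440 Hz sine tone sampled at 8 kHz.
rate = 8000
tone = [math.sin(2 * math.pi * 440 * t / rate) for t in range(rate)]
rms, freq = analyze_audio(tone, rate)
print(round(rms, 2), round(freq))
```

The RMS level would drive the target volume setting, while the frequency estimate hints at the character of the source audio; for real room audio with many overlapping sources, only the RMS figure from this sketch remains meaningful.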
- the user computing device 102 may receive scent data from a scent sensor.
- the scent sensor may be integral with or coupled to the user computing device 102.
- the user computing device 102 may receive climate related data from the source environment 110a, such as via a temperature sensor, a humidity sensor, an air quality sensor, etc. With this data, the user computing device 102 can determine a climate ambience for the source environment 110a.
- FIG. 6 depicts a user interface 600 for modeling the source environment 110a, according to embodiments disclosed herein.
- the user interface 600 includes an indication of the number of output sources that were located in the source environment 110a, as well as features of the source environment 110a itself. This determination may be made based on an intensity analysis of the output from the output source.
- a graphical representation 620 of the source environment 110a may also be provided. If the user computing device is incorrect regarding the environment and/or output sources, the user may alter the graphical representation 620 to add, move, delete, or otherwise change the graphical representation 620. Additionally, a correct option 622 is also included for indicating when the ambiance features of the source environment 110a are accurately determined.
- FIG. 7 depicts a user interface 700 for storing a received ambiance, according to embodiments disclosed herein.
- the user interface 700 includes a keyboard for entering a name for the output source data and source environment data from FIG. 6.
- FIG. 8 depicts a user interface 800 for receiving a theme from an environment, according to embodiments disclosed herein.
- the user interface 800 may be provided in response to a determination by the user computing device 102 that a source environment 110a is broadcasting a theme or other ambiance data. More specifically, the embodiments discussed with reference to FIGS. 3 - 7 address the situation where the user computing device 102 actively determines the ambiance characteristics of the source environment 110a. However, in FIG. 8, the user computing device 102 need not make this determination because the source environment 110a is broadcasting the ambiance characteristics (e.g., the source output data, the environment characteristics data and/or other data), such as via a wireless local area network. Accordingly, in response to receiving the ambiance characteristics, the user interface 800 may be provided with options for storing the received data.
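The broadcast could take many forms; a minimal sketch, assuming a JSON payload delivered over the wireless network (the field names `theme`, `sources`, and `environment` are hypothetical, as the disclosure does not fix a wire format):

```python
import json

def parse_ambiance_broadcast(payload: bytes):
    """Decode a hypothetical JSON ambiance broadcast into the pieces a
    receiving device would store: a theme name, the source output data,
    and the environment characteristics data."""
    data = json.loads(payload.decode("utf-8"))
    return {
        "theme": data.get("theme", "unnamed"),
        "sources": data.get("sources", []),
        "environment": data.get("environment", {}),
    }
```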
- the user may scan a 1-dimensional or 2-dimensional bar code to receive information pertaining to the source environment 110a.
- the information may be sent to the user computing device 102 via a text message, email message, and/or other messaging.
- a theme store may be accessible over a wide area network and/or local area network for receiving any number of different themes. In the theme store, users may be provided with options to purchase, upload, and/or download themes for use in a target environment.
- some embodiments may be configured to upload and/or download ambiance characteristics to and/or from a website, such as a social media website, a mapping website, etc.
- a restaurant or other source environment controller may provide the ambiance characteristics on a page dedicated to that restaurant. Thus, when users visit that page, they may download the ambiance.
- the social media website may provide a link to that restaurant that may also include a link to download the ambiance characteristics.
- a user can upload ambiance characteristics to the mapping website, such that when a map, satellite image, or other image of that environment is provided, a link to download the ambiance may also be provided.
- FIG. 9 depicts a user interface 900 for applying a stored ambiance to the target environment 110b, according to embodiments disclosed herein.
- the user interface 900 may be provided in response to selection of the apply stored model option 324, from FIG. 3.
- the user interface 900 may provide a "dad's house" option 920, a "sis' kitchen" option 922, a "fav eatery" option 924, and a "beach" option 926.
- the user computing device 102 can apply the stored ambiance to the target environment 110b.
- FIG. 10 depicts a user interface 1000 for receiving an ambiance capability for the target environment 110b, according to embodiments disclosed herein.
- the user interface 1000 may be configured to capture imagery and/or other data from the target environment 110b and utilize that data to determine an ambiance capability of the target environment 110b.
- the ambiance capability may be portrayed in a graphical representation 1002, which may be provided as a photographic image, video image, altered image, etc.
- an apply option 1022 and an amend option 1024 may also be provided. More specifically, by selecting the amend option 1024, the user may add, edit, move, and/or otherwise change the output sources that are provided in the user interface 1000.
- FIG. 11 depicts a user interface 1100 for providing a suggestion to more accurately model the target environment 110b according to the source environment 110a, according to embodiments disclosed herein.
- the user interface 1100 is similar to the user interface 1000 from FIG. 10, except that the user computing device 102 has determined that changes to the target environment 110b would allow a greater accuracy in modeling the ambiance from the source environment 110a.
- the user interface 1100 may provide a graphical representation 1120, which illustrates a change and a location of that change.
- An option 1122 may be provided to navigate away from the user interface 1100.
- FIG. 12 depicts a user interface 1200 for providing options to apply additional ambiance features to the target environment 110b, according to embodiments disclosed herein.
- the user interface 1200 may be provided in response to selection of the apply option 1022 from FIG. 10.
- when the apply option 1022 is selected, the selected ambiance may be applied to the target environment 110b. More specifically, with regard to FIGS. 9 - 11, determinations regarding the target environment 110b have been made for more accurately customizing the desired ambiance to that target environment 110b.
- the user computing device 102 may communicate with one or more of the output devices to implement the desired changes. The communication may be directly with the output devices, if the output devices are so configured.
- FIG. 13 depicts a flowchart for modeling an ambiance feature in a target environment, according to embodiments disclosed herein.
- an ambiance feature of a source environment may be received.
- the ambiance feature may include those features of the source environment that may be detected by the sensor device 318, such as light (e.g., an illumination signal), an audio signal, a scent signal, and a climate signal (such as temperature, humidity, air quality, etc.) and/or other features.
- a determination may be made from the ambiance feature regarding a source output provided by a source device in the source environment. More specifically, the determination may include determining a type of source device (such as a type of illumination device or other output device), where the type of illumination device includes a lamp, an overhead light, a television, a component light, sunlight, a fire, an external light source, a candle, etc.
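The type-of-illumination-device determination might be approximated with simple rules over measurable properties of the light, such as color temperature and flicker rate. The thresholds and property choices below are rough illustrative assumptions, not values from the disclosure:

```python
def classify_light_source(color_temp_k, flicker_hz):
    """Very rough rule-of-thumb classification of an illumination
    source from its colour temperature (Kelvin) and flicker rate (Hz).
    All thresholds are hypothetical."""
    if flicker_hz and flicker_hz > 5:
        return "television"   # rapidly varying output suggests a screen
    if color_temp_k >= 5000:
        return "daylight"     # sun or other external light source
    if color_temp_k <= 2000:
        return "candle"       # very warm, low colour temperature
    return "lamp"             # default: ordinary interior lighting
```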
- a determination may be made regarding an ambiance capability for a target environment.
- a determination may be made based on the ambiance capability of the target environment, regarding a target output for the target device in the target environment.
- the target device may include an output device, such as a light source, audio source, climate source, etc.
- a communication may be facilitated with the target device to model the ambiance feature from the source environment into the target environment by altering the target output provided by the target device.
- modeling the ambiance feature from the source environment into the target environment includes determining a number of target devices in the target environment, a location of the target device in the target environment, a type of target device in the target environment (such as a type of light source), etc.
- the communication may include sending a command to the target device.
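The command step might be sketched as building one command per target device whose current output differs from the desired output. The device-id keys and the command dict layout are illustrative assumptions; the disclosure leaves the command format open:

```python
def make_commands(target_devices, desired_outputs):
    """Build per-device commands to move each target device toward its
    desired output level.  `target_devices` maps device id -> current
    output; `desired_outputs` maps device id -> desired output."""
    commands = []
    for dev_id, current in target_devices.items():
        desired = desired_outputs.get(dev_id)
        if desired is not None and desired != current:
            commands.append({"device": dev_id, "set_output": desired})
    return commands
```

Devices already at their desired output receive no command, which keeps the communication minimal.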
- FIG. 14 depicts a flowchart for determining whether an ambience feature has previously been stored, according to embodiments disclosed herein.
- the user computing device 102 may enter a target environment.
- a determination may be made regarding whether an ambiance setting is currently stored. If an ambiance setting is not currently stored, the user computing device 102 may be taken to a source environment and the process may proceed to block 1330 in FIG. 13. If an ambiance setting is currently stored, at block 1436 the stored settings may be retrieved.
- the user computing device 102 can communicate with the target environment to alter target devices to match the stored settings.
- FIG. 15 depicts a flowchart for determining whether an applied ambiance feature substantially matches a theme, according to embodiments disclosed herein.
- a theme ambiance may be received.
- a request to apply the theme to the target environment may be received.
- the user computing device 102 may communicate with the target environment to alter the target devices to match the theme.
- an ambiance feature may be received from the target environment.
- a determination may be made regarding whether the ambiance feature substantially matches the theme. This determination may be based on a predetermined threshold for accuracy. If the ambiance feature does substantially match, at block 1542, the settings of the target devices may be stored. If the ambiance feature does not substantially match, the user computing device 102 can alter the target devices to provide an updated ambiance feature (such as an updated lighting characteristic) to more accurately model the theme.
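The check-and-adjust cycle above is a feedback loop: measure the ambiance, compare it against the theme within a threshold, and nudge the target devices until they substantially match. A minimal sketch, assuming a single scalar ambiance value and caller-supplied `measure` and `adjust` callables (both hypothetical):

```python
def converge_on_theme(measure, adjust, theme, threshold=0.05, max_iter=10):
    """Iteratively nudge target devices until the measured ambiance is
    within `threshold` of the theme value, mirroring the feedback loop
    of FIG. 15.  `measure()` returns the current ambiance feature;
    `adjust(delta)` pushes the target output toward the theme."""
    for _ in range(max_iter):
        current = measure()
        if abs(current - theme) <= threshold:
            return current        # substantially matches: settings may be stored
        adjust(theme - current)   # provide an updated ambiance feature
    return measure()              # best effort after max_iter attempts
```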
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Circuit Arrangement For Electric Light Sources In General (AREA)
Abstract
Embodiments of the invention relate to sensing and adjusting features of an environment. Some embodiments relate to a system and/or method for receiving an ambiance feature of a source environment, determining, from the ambiance feature, a source output provided by a source device in the source environment, and determining an ambiance capability for a target environment. Some embodiments include determining, based on the ambiance capability, a target output for a target device in the target environment, and communicating with the target device to model the ambiance feature from the source environment into the target environment by altering the target output provided by the target device.
Priority Applications (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2011/033924 WO2012148385A1 (fr) | 2011-04-26 | 2011-04-26 | Détection et réglage des caractéristiques d'un environnement |
CA2834217A CA2834217C (fr) | 2011-04-26 | 2011-04-26 | Detection et reglage des caracteristiques d'un environnement |
EP11864309.7A EP2702528A4 (fr) | 2011-04-26 | 2011-04-26 | Détection et réglage des caractéristiques d'un environnement |
US14/063,006 US9504099B2 (en) | 2011-04-26 | 2013-10-25 | Lighting system with flexible lighting sheet and intelligent light bulb base |
US14/063,030 US20140052278A1 (en) | 2011-04-26 | 2013-10-25 | Sensing and adjusting features of an environment |
US14/062,990 US20140049972A1 (en) | 2011-04-26 | 2013-10-25 | Stemmed lighting assembly with disk-shaped illumination element |
US14/062,961 US9500350B2 (en) | 2011-04-26 | 2013-10-25 | Methods and apparatus for providing modular functionality in a lighting assembly |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2011/033924 WO2012148385A1 (fr) | 2011-04-26 | 2011-04-26 | Détection et réglage des caractéristiques d'un environnement |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012148385A1 true WO2012148385A1 (fr) | 2012-11-01 |
Family
ID=47072625
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2011/033924 WO2012148385A1 (fr) | 2011-04-26 | 2011-04-26 | Détection et réglage des caractéristiques d'un environnement |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP2702528A4 (fr) |
CA (1) | CA2834217C (fr) |
WO (1) | WO2012148385A1 (fr) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015104650A3 (fr) * | 2014-01-08 | 2015-11-26 | Koninklijke Philips N.V. | Système de partage et/ou de synchronisation d'attributs d'une lumière émise entre des systèmes d'éclairage |
WO2016050539A1 (fr) * | 2014-10-02 | 2016-04-07 | Philips Lighting Holding B.V. | Système et procédé d'éclairage pour générer des scénarios d'éclairage |
EP3035208A1 (fr) * | 2014-12-19 | 2016-06-22 | Koninklijke KPN N.V. | Amélioration de la sélection et du contrôle de fichiers de contenu |
EP3247177A1 (fr) * | 2016-05-16 | 2017-11-22 | BrainLit AB | Système de contrôle |
WO2018113084A1 (fr) * | 2016-12-20 | 2018-06-28 | Taolight Company Limited | Dispositif, système et procédé de commande de fonctionnement d'unités d'éclairage |
WO2018127378A1 (fr) * | 2017-01-04 | 2018-07-12 | Philips Lighting Holding B.V. | Commande d'éclairage |
EP3360393A4 (fr) * | 2016-12-12 | 2018-08-15 | Taolight Company Limited | Dispositif, système et procédé de commande du fonctionnement d'unités d'éclairage |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104596929B (zh) | 2013-10-31 | 2017-06-23 | 国际商业机器公司 | 确定空气质量的方法及设备 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060071605A1 (en) * | 2002-11-22 | 2006-04-06 | Koninklijke Philips Electronics N.V. | System for and method of controlling a light source and lighting arrangement |
US20100257187A1 (en) * | 2007-12-11 | 2010-10-07 | Koninklijke Philips Electronics N.V. | Method of annotating a recording of at least one media signal |
US7840567B2 (en) * | 2006-01-17 | 2010-11-23 | International Business Machines Corporation | Method and apparatus for deriving optimal physical space and ambiance conditions |
US7856152B2 (en) * | 2005-03-23 | 2010-12-21 | Koninklijke Philips Electronics N.V. | Light condition recorder system and method |
US20110066412A1 (en) * | 2008-05-09 | 2011-03-17 | Koninklijke Philips Electronics N.V. | System and method for processing application logic of a virtual and a real-world ambient intelligence environment |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040230486A1 (en) * | 2003-05-15 | 2004-11-18 | Greenlee Garrett M. | System and method for creating a dynamic and interactive simulated remote-locale atmosphere |
WO2006046190A2 (fr) * | 2004-10-25 | 2006-05-04 | Koninklijke Philips Electronics, N.V. | Procede et systeme de trames d'image a regulation de la lumiere |
US8314569B2 (en) * | 2006-11-17 | 2012-11-20 | Koninklijke Philips Electronic N.V. | Light wand for lighting control |
CN101569240B (zh) * | 2006-12-22 | 2017-04-19 | 飞利浦灯具控股公司 | 用于自动验证根据抽象描述呈现照明氛围的可能性的方法和系统 |
KR20100017584A (ko) * | 2007-05-03 | 2010-02-16 | 코닌클리즈케 필립스 일렉트로닉스 엔.브이. | 추상적 기술로부터 조명 분위기를 렌더링할 가능성을 자동으로 검증하기 위한 방법 및 시스템 |
-
2011
- 2011-04-26 CA CA2834217A patent/CA2834217C/fr active Active
- 2011-04-26 EP EP11864309.7A patent/EP2702528A4/fr not_active Withdrawn
- 2011-04-26 WO PCT/US2011/033924 patent/WO2012148385A1/fr active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060071605A1 (en) * | 2002-11-22 | 2006-04-06 | Koninklijke Philips Electronics N.V. | System for and method of controlling a light source and lighting arrangement |
US7856152B2 (en) * | 2005-03-23 | 2010-12-21 | Koninklijke Philips Electronics N.V. | Light condition recorder system and method |
US7840567B2 (en) * | 2006-01-17 | 2010-11-23 | International Business Machines Corporation | Method and apparatus for deriving optimal physical space and ambiance conditions |
US20100257187A1 (en) * | 2007-12-11 | 2010-10-07 | Koninklijke Philips Electronics N.V. | Method of annotating a recording of at least one media signal |
US20110066412A1 (en) * | 2008-05-09 | 2011-03-17 | Koninklijke Philips Electronics N.V. | System and method for processing application logic of a virtual and a real-world ambient intelligence environment |
Non-Patent Citations (1)
Title |
---|
See also references of EP2702528A4 * |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015104650A3 (fr) * | 2014-01-08 | 2015-11-26 | Koninklijke Philips N.V. | Système de partage et/ou de synchronisation d'attributs d'une lumière émise entre des systèmes d'éclairage |
US9769910B2 (en) | 2014-01-08 | 2017-09-19 | Philips Lighting Holding B.V. | System for sharing and/or synchronizing attributes of emitted light among lighting systems |
WO2016050539A1 (fr) * | 2014-10-02 | 2016-04-07 | Philips Lighting Holding B.V. | Système et procédé d'éclairage pour générer des scénarios d'éclairage |
EP3035208A1 (fr) * | 2014-12-19 | 2016-06-22 | Koninklijke KPN N.V. | Amélioration de la sélection et du contrôle de fichiers de contenu |
EP3247177A1 (fr) * | 2016-05-16 | 2017-11-22 | BrainLit AB | Système de contrôle |
EP3360393A4 (fr) * | 2016-12-12 | 2018-08-15 | Taolight Company Limited | Dispositif, système et procédé de commande du fonctionnement d'unités d'éclairage |
WO2018113084A1 (fr) * | 2016-12-20 | 2018-06-28 | Taolight Company Limited | Dispositif, système et procédé de commande de fonctionnement d'unités d'éclairage |
EP3456154A4 (fr) * | 2016-12-20 | 2019-03-20 | Wizconnected Company Limited | Dispositif, système et procédé de commande de fonctionnement d'unités d'éclairage |
US20190297700A1 (en) * | 2016-12-20 | 2019-09-26 | Taolight Company Limited | Device, system and method for controlling operation of lighting units |
WO2018127378A1 (fr) * | 2017-01-04 | 2018-07-12 | Philips Lighting Holding B.V. | Commande d'éclairage |
US10736202B2 (en) | 2017-01-04 | 2020-08-04 | Signify Holding B.V. | Lighting control |
Also Published As
Publication number | Publication date |
---|---|
CA2834217A1 (fr) | 2012-11-01 |
EP2702528A1 (fr) | 2014-03-05 |
EP2702528A4 (fr) | 2014-11-05 |
CA2834217C (fr) | 2018-06-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2834217C (fr) | Detection et reglage des caracteristiques d'un environnement | |
US20140052278A1 (en) | Sensing and adjusting features of an environment | |
US9585229B2 (en) | Anticipatory lighting from device screens based on user profile | |
EP3152981B1 (fr) | Création ou modification de scène lumineuse au moyen de données d'utilisation de dispositif d'éclairage | |
CN110603901B (zh) | 使用语音识别来控制实用程序的方法和控制系统 | |
JP6821820B2 (ja) | 照明システム用の推薦エンジン | |
JP7266537B2 (ja) | コネクテッド照明システムの使用方法 | |
JP6839103B2 (ja) | 照明システム内の装置を設定するための方法 | |
JP2023533431A (ja) | 照明デバイスの複数のパラメータを構成する方法 | |
EP3928594B1 (fr) | Amélioration de la reconnaissance d'un utilisateur d'une scène lumineuse | |
EP3607521B1 (fr) | Procédé et appareil pour surveiller l'utilisation d'un système d'éclairage | |
WO2021165173A1 (fr) | Détermination d'une direction de sortie de lumière réglée imitant la lumière du jour | |
WO2024046782A1 (fr) | Procédé pour distinguer une rétroaction d'utilisateur sur une image | |
WO2020254227A1 (fr) | Dispositif d'éclairage pour éclairer un environnement et procédé de commande d'un dispositif d'éclairage |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 11864309 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011864309 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2834217 Country of ref document: CA |
|
NENP | Non-entry into the national phase |
Ref country code: DE |