WO2021224262A2 - Augmented reality based design - Google Patents

Augmented reality based design

Info

Publication number
WO2021224262A2
WO2021224262A2 (PCT/EP2021/061732)
Authority
WO
WIPO (PCT)
Prior art keywords
augmented reality
measured
display
reality device
measurement
Prior art date
Application number
PCT/EP2021/061732
Other languages
English (en)
Other versions
WO2021224262A3 (fr)
Inventor
Tarun Dass MATHUR
Ajay Laxman GOLE
Nam Chin Cho
Parth JOSHI
Original Assignee
Signify Holding B.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Signify Holding B.V. filed Critical Signify Holding B.V.
Publication of WO2021224262A2
Publication of WO2021224262A3

Classifications

    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/105 Controlling the light source in response to determined parameters
    • H05B47/11 Controlling the light source in response to determined parameters by determining the brightness or colour temperature of ambient light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/20 Administration of product repair or maintenance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]
    • G06Q30/0631 Item recommendations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce
    • G06Q30/06 Buying, selling or leasing transactions
    • G06Q30/0601 Electronic shopping [e-shopping]
    • G06Q30/0641 Shopping interfaces
    • G06Q30/0643 Graphical representation of items or shoppers
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/06 Energy or water supply
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects
    • G06T15/506 Illumination models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/105 Controlling the light source in response to determined parameters
    • H ELECTRICITY
    • H05 ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05B ELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00 Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10 Controlling the light source
    • H05B47/175 Controlling the light source by remote control
    • H05B47/196 Controlling the light source by remote control characterised by user interface arrangements
    • H05B47/1965 Controlling the light source by remote control characterised by user interface arrangements using handheld communication devices
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00 Photometry, e.g. photographic exposure meter
    • G01J1/02 Details
    • G01J1/0219 Electrical interface; User interface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 Indexing scheme for editing of 3D models
    • G06T2219/2012 Colour editing, changing, or manipulating; Use of colour codes
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00 Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40 Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the present disclosure relates generally to lighting, and more particularly to augmented reality based lighting design.
  • over time, the light intensity levels of the lights provided by the luminaires decrease. The light intensity may be reduced to a level such that the replacement of the luminaire or a light engine of the luminaire may be required.
  • a new luminaire may be added in the space.
  • to assess the lighting of a space, the light intensity level at different locations along a work plane (e.g., at the height of desks in the space) may first be determined. However, reliably determining the light intensity level at different locations along a work plane to decide whether to replace or add a luminaire may be challenging.
  • the present disclosure relates generally to augmented reality and more particularly to the operation of an augmented reality-based design system.
  • the augmented reality-based design system includes an augmented reality device, where the augmented reality device has a display and camera.
  • the system further includes a measurement device, in communication with the augmented reality device, where measured values of a parameter measured by the measurement device are provided to the augmented reality device.
  • the augmented reality device is capable of determining locations of the augmented reality device, each location associated with a measured value of the parameter taken by the measurement device at that location.
  • the augmented reality device is further capable of providing, via the display, a spatial mapping of the locations within the target area, and of displaying the measured values overlaid on a real-time image of a target physical area on the display, each measured value displayed near the corresponding location where it was measured.
  • the measurement device is attached to the augmented reality device, where the measurement device is moved along with the augmented reality device during the measuring of the measured values.
  • the measurement device may be a lux measurement device and the parameter is the illuminance value detected by the lux measurement device.
  • the lux measurement device is capable of measuring ultraviolet light.
  • the augmented reality device may be further capable of determining a recommended device for installation within the target physical area based on the measured values and displaying a three dimensional model of the recommended device.
  • the augmented reality device may be further capable of determining an updated parameter value at a location based on parameter data associated with the recommended device and displaying the recommended device on the display with the updated parameter value displayed at the location.
  • the augmented reality device displays multiple recommended devices for selection by a user.
  • the recommended device may be a luminaire.
  • displaying measured values overlaid on a real-time image of a target physical area may include color coding the measured values on the display.
  • the measurement device is one of several measurement devices, each being remotely located from the augmented reality device and located in or mounted to a ceiling. In some example embodiments, each location may be associated with at least one measurement device.
  • the measurement devices are sensors, where each sensor is included in or on a luminaire and the measured values displayed on the display are the measured values measured by each sensor.
  • the measurement devices are air quality sensors and the measured values displayed are associated with an air quality parameter.
  • the measurement devices measure air flow.
  • the augmented reality device is further capable of generating an air flow image indicating air flow values and direction of air flow based on the measurement devices and displaying the air flow image overlaid on the real time image of the target display area.
  • FIGS. 1A and 1B illustrate an augmented reality device for lighting design and internet of things (IoT) design according to an example embodiment
  • FIG. 2 illustrates a block diagram of the augmented reality device of FIG. 1A according to an example embodiment
  • FIG. 3 illustrates a lighting design system including the augmented reality device of FIG. 1A for improving the lighting of an area according to an example embodiment
  • FIG. 4 illustrates the lighting design system of FIG. 3 showing a 3-D model of a lighting fixture overlaid on a real-time image of a target area according to an example embodiment
  • FIG. 5 illustrates an air flow measurement and display system including the augmented reality device of FIG. 1A according to an example embodiment
  • FIG. 6 illustrates an air humidity measurement and display system including the augmented reality device of FIG. 1A according to an example embodiment
  • FIG. 7 illustrates an air quality measurement and display system including the augmented reality device of FIG. 1A according to an example embodiment
  • FIG. 8 illustrates the augmented reality device of FIG. 1A being used for ultraviolet lighting design according to an example embodiment
  • FIG. 9 illustrates a 3-D model of a lighting fixture that emits an ultraviolet light and ultraviolet light intensity values determined based on parameter data associated with the 3-D model according to an example embodiment
  • FIG. 10 illustrates a 3-D model of a lighting fixture that emits an ultraviolet light and ultraviolet light intensity values overlaid on a real-time image of a target area according to an example embodiment
  • FIG. 11 illustrates a method of augmented reality-based lighting design to improve a lighting of an area according to an example embodiment
  • FIG. 12 illustrates a method of augmented reality-based lighting design for ultraviolet light lighting fixtures according to an example embodiment
  • FIG. 13 illustrates a method of augmented reality-based lighting design for ultraviolet light lighting fixtures according to an example embodiment.
  • a measurement device is attached to an augmented reality device, and measurements are made in an area by the measurement device.
  • Parameters that are measured by the measurement device may include, for example, illuminance (i.e., light intensity).
  • the augmented reality device associates measured values with locations of measurements and displays the measured values overlaid on the real-time image of the area captured by the camera of the augmented reality device.
  • the augmented reality device may display the measured value in one or more formats including displaying the values and using different colors that represent ranges of values.
  • the augmented reality device may also process the measured values and associated location information to determine if changes are needed at some locations of the area.
  • the augmented reality device may determine that the illuminance values measured by a light measurement device attached to the augmented reality device are too low at some locations of the area and may indicate such locations.
  • a user may indicate to the augmented reality device, based on the displayed measured values, locations that have illuminance below a desired level.
  • the augmented reality device may recommend products, such as lighting fixtures, that can be used to improve measured values at some locations indicated by a user or determined by the augmented reality device as having low illuminance.
  • FIGS. 1A and 1B illustrate an augmented reality device 100 for lighting design according to an example embodiment.
  • FIG. 1A illustrates a back side of the augmented reality device 100
  • FIG. 1B illustrates the front side of the augmented reality device 100.
  • the augmented reality device 100 may be a tablet, a smartphone, etc.
  • the augmented reality (AR) device 100 may include a back-facing camera 102 on a back side of the augmented reality device 100.
  • the AR device 100 may also include a viewport/display screen 106 on a front side of the augmented reality device 100.
  • the AR device 100 may also include a front-facing camera 104, a user input area 108, an ambient light sensor 110, accelerometers, or other sensors useful in determining orientation or real-time feedback from the physical space in which the AR device 100 is located, for use in interpreting and displaying the AR on the display 106 of the AR device 100.
  • the viewport 106 may be used to display images as seen by the cameras 102, 104 as well as to display objects (e.g., icons, text, etc.) stored, received, and/or generated by the AR device 100.
  • the viewport 106 may also be used as a user input interface for the AR device 100.
  • the viewport 106 may be a touch sensitive display screen.
  • the viewport 106 may contain a number of pixels in the vertical and horizontal directions (known as display resolution).
  • the viewport 106 may have a display resolution of 2048 x 1536.
  • Each pixel may contain subpixels, where each subpixel typically represents red, green, and blue colors.
  • an image of a physical/real area in front of the AR device 100 may be displayed on the viewport 106 in real time as viewed by the camera 102.
  • the AR device 100 may include a lighting design AR application that activates the camera 102 such that a real-time image of the physical space viewed by the camera 102 is displayed on the viewport 106.
  • the camera 102 may be enabled/activated to display a real-time image of the physical space before or after the lighting design AR application is started.
  • the real-time image of the physical space may be displayed on the viewport 106 with a slight delay.
  • the AR device 100 may include an artificial intelligence application and/or components that can automatically suggest/provide recommended types of lighting fixtures.
  • the AR device 100 may also suggest location, orientation, and/or an appropriate number of lighting fixtures based on characteristics associated with the light fixtures (e.g., glare, intensity, available color temperatures or colors, available optics or accessories that change the beam angle or distribution produced by the light fixture, etc.), measured illuminance, etc.
  • the artificial intelligence software application and/or component may identify or suggest the right location for a certain fixture in the observed space, which results in requiring minimal input, interaction, and decision making by a user in achieving lighting design of a physical space/area.
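The fixture-suggestion step described above could, in principle, follow a simple rule: estimate the lux deficit at an identified location and pick the smallest fixture whose photometric output can close that gap at the installation height. A minimal sketch under that assumption, using a point-source approximation and hypothetical catalog entries (the model names and candela values are illustrative, not from the patent):

```python
# Illustrative sketch: pick a fixture that can raise a dark spot to a target
# illuminance, using the point-source approximation E = I / d^2
# (E in lux, I in candela, d in metres). Catalog values are hypothetical.

CATALOG = [  # (model, center-beam candela)
    ("DL-900", 900.0),
    ("DL-2400", 2400.0),
    ("HB-6000", 6000.0),
]

def recommend_fixture(measured_lux, target_lux, mount_height_m):
    """Return the smallest catalog fixture whose added illuminance directly
    below it closes the gap between measured and target lux, or None."""
    deficit = target_lux - measured_lux
    if deficit <= 0:
        return None  # location already meets the target
    for model, candela in sorted(CATALOG, key=lambda f: f[1]):
        added_lux = candela / mount_height_m ** 2  # inverse-square law
        if added_lux >= deficit:
            return model
    return None  # no single catalog fixture suffices
```

For example, a spot measured at 120 lux with a 500 lux target and a 3 m mounting height has a 380 lux deficit, which only the largest hypothetical fixture can supply.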
  • FIG. 2 illustrates a block diagram of the augmented reality device 100 of FIG. 1A according to an example embodiment.
  • the AR device 100 includes a controller 202, a camera component 204, a display component 206, an input interface 208, a memory device 212, and a communication interface 214.
  • the camera component 204 may correspond to or may be part of the cameras 102, 104.
  • the display component 206 may correspond to or may be part of the viewport/display screen 106 and may include circuitry that enables or performs displaying of information (e.g., images, text, etc.) on the viewport 106.
  • the pixels of the viewport may be set/adjusted to display the image as viewed by the camera 102 or 104.
  • the input interface 208 may include the user input area 108 and/or the user input capability of viewport 106.
  • the display component 206 and the input interface 208 may make up or may be part of the viewport 106, where the viewport 106 is, for example, a touch-sensitive display screen.
  • the communication interface 214 may be used for communication, wirelessly or via a wired connection, by the AR device 100.
  • the communication interface 214 may include a USB port that can be used to connect an external device (e.g., a measurement device) to the AR device 100.
  • the controller 202 may include one or more microprocessors and/or microcontrollers that can execute software code stored in the memory device 212.
  • the software code of the lighting design AR application may be stored in the memory device 212 or retrievable from a remote storage location (e.g., cloud service or remotely located server or database) via the communication interface 214 or other communication means.
  • Other executable software codes used in the operation of the AR device 100 may also be stored in the memory device 212 or in another memory device of the AR device 100.
  • artificial intelligence lighting and/or other software may be stored in the memory device 212 as part of the AR application or along with the AR application and may be executed by the controller 202.
  • the controller 202 may execute the artificial intelligence application or another software code to identify locations that have low illuminance and automatically suggest/provide recommended type(s) of lighting fixtures along with additional information such as suggested location, orientation, and/or an appropriate number of lighting fixtures.
  • the one or more microprocessors and/or microcontrollers of the controller 202 execute software code stored in the memory device 212 or in another device to implement the operations of the AR device 100 described herein.
  • the memory device 212 may include a non-volatile memory device and volatile memory device.
  • data that is used in or generated during the execution of the lighting design AR application and other code may also be retrieved and/or stored in the memory device 212 or in another memory device of the AR device 100 or retrieved from a remote storage location (e.g., cloud service or remotely located server or database) via the communication interface 214 or other communication means.
  • 3-D models of lighting fixtures and photometric data files (e.g., IES files, ultraviolet light parameter files) associated with the lighting fixture models may be stored in the memory device 212, or retrieved from storage on a remote cloud-based service, and may be retrieved during execution of the lighting design AR application.
  • 3-D models of other devices such as sensors, cameras, microphones, speakers, emitters/detectors, wireless devices such as Bluetooth or Wi-Fi repeaters, etc. and parameter data associated with the devices may be stored in the memory device 212, or stored in and retrieved from storage on a remote cloud-based service, and may be retrieved during execution of AR applications on the AR device 100.
  • the data stored and/or retrieved may include information such as range, viewing angle, resolution, or similar operational information that may be visualized through the AR device.
  • the data may contain the information necessary to estimate one or more view angles and the range produced by a sensor (e.g., motion, light, temperature, humidity, sound, or other type of sensor) or an accessory device, such as a camera, microphone, speaker, emitter/detector, wireless device such as a Bluetooth or Wi-Fi repeater, etc., within a three dimensional space.
  • the files may also include other information about the light emitted by the sensor or the accessory device.
  • the lighting design AR application stored in the memory device 212 may incorporate or interface with an augmented reality application/software, such as ARKit, ARCore, etc., that may also be stored in the memory device 212 or called upon from or provided via a remote storage location (e.g., cloud service or remotely located server or database) via the communication interface 214 or other communication means.
  • the controller 202 may communicate with the different components of the AR device 100, such as the camera component 204, etc., and may execute relevant code, for example, to display a real-time image as viewed by the camera 102 and/or 104 as well as other image objects on the viewport 106.
  • FIG. 3 illustrates a lighting design system 300 including the augmented reality device 100 of FIG. 1 A for improving the lighting of a target area 304 according to an example embodiment.
  • the system 300 includes the AR device 100 and a lux measurement device 302.
  • the lux measurement device 302 is attached to the AR device 100.
  • the lux measurement device 302 is attached to the AR device 100 via a cable as shown in FIG. 3 or may be directly plugged into a port of the AR device 100.
  • the lux measurement device 302 measures light intensity levels. As the light intensity level changes, the light intensity level measured and indicated by the lux measurement device 302 also changes.
  • the lux measurement device 302 may measure and indicate different light intensity values.
  • the lux measurement device 302 may be moved at a particular elevation above the floor of the area 304.
  • the lux measurement device 302 may be moved around to make measurements at a work plane level, for example, to determine light intensity levels at an elevation where people may sit to work.
  • the lux measurement device 302 may be moved to make measurements at the floor or close to the floor level.
  • the lux measurement device may provide the light intensity values to the AR device 100 on a real time basis as the light intensity is being measured by the lux measurement device 302.
  • the AR device 100 may determine the location of the AR device 100 (thus, effectively the location of the lux measurement device 302) and associate particular locations with the measured light intensity level values provided by the lux measurement device 302.
  • the AR device 100 may determine the location of the AR device 100 in the area 304 based on an indoor positioning system (IPS), a global positioning system (GPS), or other means.
  • the AR device 100 may execute one or more ARKit 3.0 modules to perform position tracking.
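The measure-and-tag loop described above can be sketched as stamping each lux reading from the attached meter with the AR device's current tracked position, producing the location-to-value mapping that is later overlaid on the camera image. This is an illustrative data-structure sketch; the position and lux inputs stand in for the AR framework's tracking output and the meter's readings:

```python
# Sketch: associate each measured lux value with the tracked position at
# which it was measured, then look values back up by location for display.
# Inputs are stand-ins for the AR tracking and lux-meter interfaces.

def collect_samples(readings):
    """readings: iterable of ((x, y, z) tracked position, lux value).
    Returns a list of position-tagged measurements."""
    samples = []
    for position, lux in readings:
        samples.append({"position": position, "lux": lux})
    return samples

def lux_near(samples, point, radius=0.5):
    """Return the lux values measured within `radius` metres of `point`,
    e.g. to pick which value to draw near a given overlay location."""
    def dist(a, b):
        return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
    return [s["lux"] for s in samples if dist(s["position"], point) <= radius]
```

A renderer would iterate the tagged samples and draw each value (or its color code) at the screen projection of its stored position.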
  • the AR device 100 may display a real time image 312 of the area 304 on the viewport 106 of the AR device 100.
  • the real time image 312 may be captured by the camera 102 of the AR device 100.
  • the AR device 100 may augment the real time image 312 of the area 304 with light intensity (illuminance) information 306.
  • the AR device 100 may display the light intensity information 306 overlaid on the real time image 312.
  • the light intensity information 306 may include light intensity (i.e., illuminance) values as measured by the lux measurement device 302 and provided to the AR device 100. Because the measured light intensity values are associated by the AR device 100 with locations in the area 304, the AR device 100 may display the illuminance values in association with respective locations in the area 304.
  • the AR device 100 may display the light intensity information 306 using color coding instead of or in addition to illuminance values (e.g., in footcandle).
  • the AR device 100 may display the lighting intensity information as a “heat-map,” where locations at the floor level or a work plane that are associated with different illuminance values are shown in different colors.
  • locations that are associated with higher illuminance values may be shown with more reddish colors
  • locations associated with lower illuminance values may be shown with more bluish colors
  • locations associated with mid-range illuminance values may be shown with greenish colors.
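The reddish/greenish/bluish banding described above amounts to binning each illuminance value into a display color. A minimal sketch of that mapping, where the band edges in lux are arbitrary example values, not taken from the patent:

```python
# Illustrative illuminance-to-color binning for the "heat-map" overlay:
# lower readings render bluish, mid-range greenish, higher readings reddish.
# The band edges (200 and 500 lux) are example values only.

def heatmap_color(lux, low=200.0, high=500.0):
    if lux < low:
        return "blue"   # lower illuminance: more bluish colors
    if lux <= high:
        return "green"  # mid-range illuminance: greenish colors
    return "red"        # higher illuminance: more reddish colors
```

A smoother overlay could interpolate between the band colors rather than switching at hard edges; the binning above is the simplest form of the scheme.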
  • a user may provide an input to the AR device 100 to identify one or more sections of the area 304 that have undesirably low illuminance. For example, the user may identify a section 308 as a dark spot location (i.e., an area with a lower illuminance than desired).
  • the AR device 100 may execute code to process the measured light intensity level values provided by the lux measurement device 302 to identify one or more dark spots such as the section 308.
  • the AR device 100 may identify locations that have illuminance values below a threshold level as dark spots.
  • the AR device 100 may also display indicators (e.g., a particular shape icon) on the viewport 106 to indicate such dark spots.
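The dark-spot identification just described is a threshold test over the position-tagged measurements: any location whose lux value falls below the chosen level is flagged so an indicator icon can be drawn there. A minimal sketch, with an illustrative threshold:

```python
# Sketch of dark-spot identification: flag every tagged measurement whose
# lux value falls below a threshold, so indicator icons can be drawn at
# those locations. The 300 lux default is an example value only.

def find_dark_spots(samples, threshold_lux=300.0):
    """samples: list of (position, lux) pairs.
    Returns the positions whose measured illuminance is below threshold."""
    return [pos for pos, lux in samples if lux < threshold_lux]
```

The returned positions are exactly the places where the device would draw its dark-spot indicators or anchor a fixture recommendation.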
  • the AR device 100 may execute code to recommend a particular lighting fixture that can be used to remedy one or more dark spots such as the dark spot 308. For example, the AR device 100 may automatically display a 3-D model of the recommended lighting fixture overlaid on the real time image 312 as shown in FIG. 4.
  • the AR device 100 may display identification information (e.g., model number, serial number, etc.) of a recommended lighting fixture to remedy a dark spot, and a user may select a 3-D model of the recommended lighting fixture from a menu 304 and place the 3-D model at a desired location in the real time image 312 such that the 3-D model is overlaid on the real time image 312.
  • the AR device 100 may not recommend a particular lighting fixture. Instead, a user may select a 3-D model of a desired lighting fixture from the menu 304 and place the 3-D model at a desired location in the real time image 312 such that the 3-D model is overlaid on the real time image 312.
  • FIG. 4 illustrates the lighting design system 300 of FIG. 3 showing a 3-D model 402 of lighting fixture overlaid on a real-time image 312 of the target area 304 according to an example embodiment.
  • the 3-D model 402 may correspond to a lighting fixture recommended by the AR device 100 based on the analysis of the illuminance values measured by the lux measurement device 302.
  • the 3-D model 402 may be displayed automatically by the AR device 100 as a recommendation of the lighting fixture represented by the 3-D model 402 to remedy a dark spot such as the dark spot 308.
  • the 3-D model 402 may be selected and placed at a particular location by a user by providing input (e.g., selecting and moving using a finger) to the AR device 100.
  • the AR device 100 may calculate/determine updated illuminance values for the area 304, for example, at a work plane or floor level, and display the light intensity information 404 that is based on the measured light intensity values and calculated illuminance values associated with the 3-D model 402.
  • the illuminance values associated with the 3-D model 402 and, thus, with the lighting fixture represented by the 3-D model 402 may be determined using parameter data (e.g., parameter data in an IES file) for various locations in the area 304 and for a particular installation height of the lighting fixture as represented by the location of the 3-D model 402 in the real time image 312 in a similar manner as described in U.S. Patent Application No.
  • the illuminance values determined with respect to the 3-D model 402 may be combined with the measured light intensity values to produce the light intensity information 404 that may include illuminance values and/or color coded heat map as described with respect to FIG. 3.
  • the light intensity information 404 may be updated if additional 3-D models of lighting fixtures are added to the real-time image 312.
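Because illuminance from independent sources is additive, the update described above can be sketched as computing the placed fixture's contribution at each location from its photometric data and adding it to the measured value. The sketch below approximates the fixture as a point source with a single candela value standing in for a full IES candela table (an assumption for illustration; the patent relies on the parameter data itself):

```python
import math

# Sketch of combining measured illuminance with the contribution of a
# placed 3-D fixture model. Point-source approximation:
# E = I * cos(theta) / d^2, with I a single candela value standing in
# for the fixture's photometric (IES) data.

def fixture_lux_at(fixture_pos, candela, point):
    """Illuminance contributed at `point` by a downward-facing fixture
    mounted at `fixture_pos` (both (x, y, z) in metres)."""
    dx = point[0] - fixture_pos[0]
    dy = point[1] - fixture_pos[1]
    dz = fixture_pos[2] - point[2]      # fixture is above the point
    d2 = dx * dx + dy * dy + dz * dz
    cos_theta = dz / math.sqrt(d2)      # angle from straight down
    return candela * cos_theta / d2

def combined_lux(measured_lux, fixture_pos, candela, point):
    """Illuminance is additive: updated value = measured + computed."""
    return measured_lux + fixture_lux_at(fixture_pos, candela, point)
```

Directly below a 900 cd fixture mounted 3 m above the work plane, the model adds 100 lux to whatever was measured at that location.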
  • FIG. 5 illustrates an air flow measurement and display system 500 including the augmented reality device 100 of FIG. 1A according to an example embodiment.
  • the system 500 includes the AR device 100 and lighting fixtures including lighting fixtures 504, 506.
  • the AR device 100 may display a real time image 502 of an area 516.
  • the lighting fixtures 504, 506 and other lighting fixtures may be installed in the area 516.
  • the lighting fixture 504 may include an air flow sensor 508 that may be integrated with or attached to the lighting fixture 504, and the lighting fixture 506 may include an air flow sensor 510 that may be integrated with or attached to the lighting fixture 506.
  • the air flow sensors 508, 510 may measure air flows in the area 516 and wirelessly (e.g., using Wi-Fi, BLE, ZigBee, etc. connections) transmit the measured air flow values (e.g., in cfm or other units) to the AR device 100.
  • the sensors 508, 510 may transmit the air flow values to the AR device 100 using the wireless communication components of the lighting fixtures 504, 506.
  • the AR device 100 may display the air flow values (e.g., an air flow value 514) overlaid on the real time image 502 of the area 516.
  • the AR device 100 may display the air flow values in association with particular locations of the area 516.
  • each sensor 508, 510 or respective lighting fixture 504, 506 may transmit the location of the sensor 508, 510 or the lighting fixture 504, 506 to the AR device 100.
  • the AR device 100 may display the air flow values overlaid on the real time image 502 in association with the locations corresponding to the locations of the sensors 508, 510.
  • the AR device 100 may display air flow values received from a sensor regardless of whether the sensor is shown in the real time image 502.
  • the AR device 100 may generate an image 512 from the air flow values received from sensors (e.g., the sensors 508, 510) of lighting fixtures (e.g., the lighting fixtures 504, 506) in the area 516.
  • the AR device 100 may display the image 512 overlaid on the real time image 502 of the area 516.
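The receive-and-overlay behavior described for FIG. 5 can be outlined as follows. This is a hypothetical sketch: the message fields, class names, and label format are assumptions for illustration, not the claimed implementation, and the actual wireless transport (Wi-Fi, BLE, ZigBee, etc.) is abstracted away as a callback.

```python
# Sketch (assumed): keep the latest air flow reading per sensor and produce
# labels to draw at each sensor's reported location. Readings are kept even
# for sensors that are not currently visible in the viewport, as described.

from dataclasses import dataclass

@dataclass
class AirFlowReading:
    sensor_id: str
    cfm: float        # measured air flow, e.g., in cubic feet per minute
    location: tuple   # (x, y, z) reported by the sensor or its fixture

class AirFlowOverlay:
    def __init__(self):
        self.latest = {}  # sensor_id -> most recent AirFlowReading

    def on_message(self, reading):
        """Invoked when a sensor wirelessly transmits a measured value."""
        self.latest[reading.sensor_id] = reading

    def labels(self):
        """(location, text) pairs to overlay on the real time image."""
        return [(r.location, f"{r.cfm:.0f} cfm") for r in self.latest.values()]

overlay = AirFlowOverlay()
overlay.on_message(AirFlowReading("508", 410.0, (2.0, 3.0, 2.8)))
overlay.on_message(AirFlowReading("510", 395.0, (6.0, 3.0, 2.8)))
overlay.on_message(AirFlowReading("508", 420.0, (2.0, 3.0, 2.8)))  # newer value replaces older
```

Keeping only the latest reading per sensor mirrors a live display: each transmitted update simply replaces the value shown at that sensor's location.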
  • FIG. 6 illustrates an air humidity measurement and display system 600 including the augmented reality device 100 of FIG. 1A according to an example embodiment.
  • the system 600 includes AR device 100 and lighting fixtures including lighting fixtures 604, 606.
  • the AR device 100 may display a real time image 602 of an area 616.
  • the lighting fixtures 604, 606 may each include one or more sensors.
  • the lighting fixture 604 may include a humidity sensor 608 that may be integrated with or attached to the lighting fixture 604, and the lighting fixture 606 may include a humidity sensor 610 that may be integrated with or attached to the lighting fixture 606.
  • the air humidity sensors 608, 610 may measure humidity in the area 616 and wirelessly (e.g., using Wi-Fi, BLE, ZigBee, etc. connections) transmit the measured humidity values to the AR device 100.
  • the sensors 608, 610 may transmit the humidity values to the AR device 100 using the wireless communication components of the lighting fixtures 604, 606.
  • the AR device 100 may receive humidity values from different sensors, including the sensors 608, 610, and calculate an average humidity value 612.
  • the AR device 100 may display the calculated average humidity value 612 overlaid on the real time image 602 of the area 616.
  • the AR device 100 may also display an icon 614 overlaid on the real time image 602 of the area 616.
  • the icon 614 graphically illustrates the calculated humidity value 612.
  • the AR device 100 may calculate the average humidity value 612 based on humidity values received from sensors that may not be shown in the real time image 602. For example, when the AR device 100 is turned to view a different section of the area 616, the sensor 610 may not be displayed in the viewport 106, but the humidity information overlaid on the displayed real time image may still include the humidity value provided by the sensor 610.
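The averaging described for FIG. 6 is straightforward; a minimal sketch follows. The dictionary of readings and the function name are assumptions, and, consistent with the description above, the average includes sensors that are not visible in the viewport.

```python
# Sketch (assumed): average relative-humidity values (%) from every sensor
# that has reported, whether or not the sensor is shown on screen.

def average_humidity(readings):
    """readings: dict of sensor_id -> latest humidity value (%).
    Returns None when no sensor has reported yet."""
    values = list(readings.values())
    if not values:
        return None
    return sum(values) / len(values)

readings = {"608": 42.0, "610": 48.0}  # sensor 610 may be off-screen
avg = average_humidity(readings)
```

The resulting value would be drawn as the overlay 612, optionally alongside an icon such as the icon 614.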
  • FIG. 7 illustrates an air quality measurement and display system 700 including the augmented reality device 100 of FIG. 1A according to an example embodiment.
  • the system 700 includes AR device 100 and lighting fixtures including lighting fixtures 704, 706.
  • the AR device 100 may display a real time image 702 of an area 716.
  • the lighting fixtures 704, 706 may be installed in the area 716.
  • the lighting fixtures 704, 706 may each include one or more sensors.
  • the lighting fixture 704 may include an air quality sensor 708 that may be integrated with or attached to the lighting fixture 704, and the lighting fixture 706 may include an air quality sensor 710 that may be integrated with or attached to the lighting fixture 706.
  • the air quality sensors 708, 710 may measure air quality in the area 716 and wirelessly (e.g., using Wi-Fi, BLE, ZigBee, etc. connections) transmit the measured quality values to the AR device 100.
  • the sensors 708, 710 may transmit air quality values for different matters.
  • the sensors 708, 710 may transmit the air quality values to the AR device 100 using the wireless communication components of the lighting fixtures 704, 706.
  • the AR device 100 may receive air quality values from different sensors, including the sensors 708, 710, and calculate an average air quality value 712, 718 for different matters.
  • the AR device 100 may display the calculated average air quality values 712, 718 (e.g., as percentages of maximum acceptable level) overlaid on the real time image 702 of the area 716.
  • the AR device 100 may also display an icon 714 overlaid on the real time image 702 of the area 716 and associated with the air quality value 712.
  • the icon 714 may be a graphical illustration of a particular matter that is identified in a legend 722.
  • the AR device 100 may also display an icon 720 overlaid on the real time image 702 of the area 716 and associated with the air quality value 718.
  • the icon 720 may be a graphical illustration of another matter that is identified in the legend 722.
  • the AR device 100 may calculate the average air quality values 712, 718 based on air quality values received from sensors that may not be shown in the real time image 702.
  • the highest or lowest air quality value with respect to a particular matter may be shown without departing from the scope of this disclosure.
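The per-matter averaging and percentage display described for FIG. 7 can be sketched as below. The maximum acceptable levels, units, and matter names are illustrative assumptions only; the application does not specify them.

```python
# Sketch (assumed): average readings per matter across all sensors, then
# express each average as a percentage of an assumed maximum acceptable
# level, as in the displayed values 712, 718.

MAX_ACCEPTABLE = {"PM2.5": 35.0, "CO2": 1000.0}  # assumed: ug/m3, ppm

def average_per_matter(readings):
    """readings: list of (matter, value) tuples from all reporting sensors."""
    sums, counts = {}, {}
    for matter, value in readings:
        sums[matter] = sums.get(matter, 0.0) + value
        counts[matter] = counts.get(matter, 0) + 1
    return {m: sums[m] / counts[m] for m in sums}

def percent_of_max(averages):
    """Each matter's average as a percentage of its acceptable maximum."""
    return {m: 100.0 * v / MAX_ACCEPTABLE[m] for m, v in averages.items()}

readings = [("PM2.5", 14.0), ("PM2.5", 21.0), ("CO2", 600.0), ("CO2", 800.0)]
averages = average_per_matter(readings)
percent = percent_of_max(averages)
```

Showing the highest or lowest reading instead of the average, as the description permits, would only change the reduction step (e.g., `max` or `min` in place of the mean).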
  • FIG. 8 illustrates the augmented reality device 100 of FIG. 1 A used for ultraviolet lighting design according to an example embodiment.
  • a real time image 802 of a target area 804 may be displayed on the viewport 106 of the AR device 100.
  • the real time image 802 may be captured by the camera 102 of the AR device 100.
  • the menu 310 of lighting fixtures may also be displayed in the viewport 106.
  • the target area 804 may include surfaces such as the floor surface 806 and the table surface 808.
  • the AR device 100 may perform a spatial mapping of at least a portion of the target physical area 804.
  • the AR device 100 may execute software code, such as modules of ARKit 3.0 or modules of HoloToolkit, to identify surfaces.
  • a user may place a 3-D model of a lighting fixture in the viewport 106 overlaid on the real time image 802 as shown in FIG. 10, and the AR device 100 may determine intensity levels of an ultraviolet light that the lighting fixture emits as described with respect to FIG. 9.
  • FIG. 9 illustrates a 3-D model 902 of a lighting fixture that emits an ultraviolet light, and ultraviolet light intensity values determined based on parameter data associated with the 3-D model 902 according to an example embodiment.
  • the 3-D model 902 may correspond to the 3-D model 1004 shown in FIG. 10.
  • the photometric data associated with the 3-D model 902 may include or may be used to determine a light distribution shape and intensity levels of an ultraviolet light that the lighting fixture represented by the 3-D model 902 emits.
  • the intensity levels of the ultraviolet light may be determined at the different areas of the particular surface.
  • the AR device 100 may determine the distribution shape and the intensity levels of the ultraviolet light in a similar manner as described in U.S. Patent Application No. 16/195,581.
  • FIG. 10 illustrates a 3-D model 1004 of a lighting fixture that emits an ultraviolet light, and ultraviolet light intensity values overlaid on a real-time image of a target area according to an example embodiment.
  • the 3-D model 1004 is displayed at a particular location in the area 802. For example, a user may select the 3-D model 1004 from the menu 310 and place the 3-D model 1004 at a desired location in the real time image 802.
  • by executing AR software code, such as ARKit 3.0 modules, the AR device 100 associates the 3-D model 1004 with a particular location (including a particular installation height) in the area 802, as can be readily understood by those of ordinary skill in the art with the benefit of this disclosure.
  • the AR device 100 determines the intensity levels of the ultraviolet light that gets emitted by the lighting fixture represented by the 3-D model 1004 in a similar manner as described with respect to FIG. 9 and U.S. Patent Application No. 16/195,581.
  • the AR device 100 may display the calculated intensity level values of the ultraviolet light at different locations of the surfaces 806, 808.
  • the AR device 100 may recalculate the intensity levels of the ultraviolet light if the user moves the 3-D model 1004 to a different location in the real time image 802 as displayed in the viewport 106, if the user selects a different 3-D model from the menu 310, if the user places another 3-D model of a lighting fixture in the real time image 802, etc.
  • the AR device 100 may display the intensity levels of the ultraviolet light as color coded (e.g., as a heat map).
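The intensity calculation described for FIGS. 9 and 10 relies on the fixture's photometric distribution together with geometry. The fragment below is a hypothetical sketch with a toy cosine distribution standing in for real IES-style parameter data; the function names, units, and distribution are assumptions, not the method of U.S. Patent Application No. 16/195,581.

```python
import math

# Sketch (assumed): irradiance on a horizontal surface point from a
# downward-facing fixture, via the inverse-square law and the cosine of
# the angle of incidence. A real implementation would interpolate the
# candela table from the fixture's parameter data instead of the toy
# distribution used here.

def intensity_toward(angle_deg):
    """Toy luminous/radiant intensity: peak straight down, cosine falloff."""
    return 100.0 * max(0.0, math.cos(math.radians(angle_deg)))

def irradiance_at(fixture_pos, point):
    """Irradiance at `point` on a horizontal surface, fixture at `fixture_pos`."""
    dx = point[0] - fixture_pos[0]
    dy = point[1] - fixture_pos[1]
    dz = fixture_pos[2] - point[2]           # mounting height above the point
    d = math.sqrt(dx * dx + dy * dy + dz * dz)
    angle = math.degrees(math.acos(dz / d))  # angle from straight down
    # intensity toward the point, cosine of incidence (dz/d), inverse square
    return intensity_toward(angle) * (dz / d) / (d * d)

# Directly below a fixture mounted 2 m above a table surface:
e_below = irradiance_at((0.0, 0.0, 2.8), (0.0, 0.0, 0.8))
# Offset 2 m to the side of the same fixture:
e_offset = irradiance_at((0.0, 0.0, 2.8), (2.0, 0.0, 0.8))
```

Evaluating `irradiance_at` over a grid of surface points yields the per-location values that the AR device would overlay (and could color code as a heat map) on the real time image.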
  • FIG. 11 illustrates a method 1100 of augmented reality-based lighting design to improve lighting of an area according to an example embodiment.
  • the method 1100 includes attaching a measurement device to an augmented reality device.
  • the method 1100 includes measuring illuminance values in an area using the measurement device, wherein the illuminance values are provided to the augmented reality device.
  • the method 1100 includes determining locations of the augmented reality device while the illuminance values are being measured by the measurement device.
  • the method 1100 includes displaying the illuminance values overlaid on a real-time image of the area at locations corresponding to where the values were measured.
  • the method 1100 includes displaying a lighting fixture model overlaid on the real-time image of the area, wherein the lighting fixture model is selected based on illuminance values at a portion of the area.
  • the method 1100 may include more or fewer steps than described without departing from the scope of this disclosure. In some alternative embodiments, the steps of the method 1100 may be performed in a different order than shown without departing from the scope of this disclosure.
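The steps of method 1100 amount to pairing each measured value with the device's location at measurement time, then flagging under-lit portions of the area. The sketch below is a hypothetical outline; the data shapes, the 300 lux target, and the function names are assumptions for illustration.

```python
# Sketch (assumed) of the method-1100 data flow: record (location, lux)
# samples as the AR device moves, then identify locations whose measured
# illuminance falls below a target level - candidates for placing a
# lighting fixture model.

def record_measurements(samples):
    """samples: iterable of (device_location, lux) taken as the measurement
    device reports values. Returns overlay entries, one per sample."""
    return [{"location": loc, "lux": lux} for loc, lux in samples]

def needs_improvement(entries, target_lux=300.0):
    """Entries below the target illuminance (target value is assumed)."""
    return [e for e in entries if e["lux"] < target_lux]

entries = record_measurements([((0, 0), 350.0), ((1, 0), 180.0), ((2, 0), 90.0)])
dim_spots = needs_improvement(entries)
```

Each entry in `entries` would be displayed near its recorded location, and `dim_spots` identifies the portion of the area for which a fixture model might be selected.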
  • FIG. 12 illustrates a method 1200 of augmented reality-based lighting design to improve lighting of an area according to an example embodiment.
  • the method 1200 includes displaying, by the augmented reality device 100, a real-time image of a target physical area on a display screen of the augmented reality device.
  • the method 1200 includes displaying, by the augmented reality device 100, a 3-D model 1004 of a lighting fixture in response to a user input, where the 3-D model 1004 is overlaid on the real-time image of the target physical area.
  • the lighting fixture provides an ultraviolet light.
  • the method 1200 includes displaying on the display screen, by the augmented reality device 100, ultraviolet light intensity level values overlaid on the real-time image of the target physical area.
  • the method 1200 may include more or fewer steps than described without departing from the scope of this disclosure. In some alternative embodiments, the steps of the method 1200 may be performed in a different order than shown without departing from the scope of this disclosure.
  • FIG. 13 illustrates a method 1300 of augmented reality-based lighting design for ultraviolet light lighting fixtures according to an example embodiment.
  • the method 1300 includes displaying, by the augmented reality device 100, a real-time image of a target physical area on a display screen of the augmented reality device.
  • the method 1300 includes displaying, by the augmented reality device, a lighting fixture 3-D model on the display screen in response to a user input, where the lighting fixture 3-D model is overlaid on the real-time image of the target physical area.
  • the method 1300 includes performing, by the augmented reality device, a spatial mapping of at least a portion of the target physical area.
  • the method 1300 includes determining, by the augmented reality device, ultraviolet light intensity level values of an ultraviolet light on one or more surfaces of the target physical area based on at least parameter data associated with the lighting fixture 3-D model.
  • the method 1300 includes displaying, by the augmented reality device, ultraviolet light intensity level values on the display screen overlaid on the real-time image of the target physical area.
  • the method 1300 may include more or fewer steps than described without departing from the scope of this disclosure. In some alternative embodiments, the steps of the method 1300 may be performed in a different order than shown without departing from the scope of this disclosure.
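Method 1300 ties the spatial mapping step to the intensity calculation: surfaces found by the mapping are sampled, and each sample point receives a computed value for overlay. The following end-to-end sketch is hypothetical; the rectangular-surface representation, grid sampling, and constant placeholder intensity function are assumptions standing in for the parameter-data-based calculation described above.

```python
# Sketch (assumed): given horizontal surfaces identified by spatial mapping,
# sample grid points on each surface and attach an ultraviolet intensity
# value to each point for display over the real-time image.

def sample_surface(corner, width, depth, step):
    """Grid of sample points on a horizontal rectangular surface whose
    lower-left corner is `corner` = (x, y, z)."""
    x0, y0, z = corner
    nx = int(width / step) + 1
    ny = int(depth / step) + 1
    return [(x0 + i * step, y0 + j * step, z)
            for i in range(nx) for j in range(ny)]

def overlay_values(surfaces, intensity_fn):
    """One (point, value) entry per sampled point across all surfaces.
    `intensity_fn` stands in for the parameter-data calculation."""
    return [(p, intensity_fn(p))
            for corner, width, depth in surfaces
            for p in sample_surface(corner, width, depth, step=1.0)]

surfaces = [((0.0, 0.0, 0.0), 2.0, 2.0)]           # e.g., a mapped floor patch
values = overlay_values(surfaces, lambda p: 10.0)  # constant placeholder value
```

Re-running `overlay_values` whenever the fixture model is moved or replaced corresponds to the recalculation behavior described with respect to FIG. 10.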

Abstract

An augmented reality design system includes an augmented reality device having a display and a camera. The system further includes a measurement device in communication with the augmented reality device, where measured values of a parameter measured by the measurement device are provided to the augmented reality device. The augmented reality device can determine locations of the augmented reality device, each of which is associated with a measured value of the parameter taken by the measurement device at that location. The augmented reality device can further provide, via the display, a spatial mapping of the locations within the target area, and can display the measured values overlaid on a real-time image of a target physical area on the display, each measured value being displayed near the corresponding location where it was measured.
PCT/EP2021/061732 2020-05-07 2021-05-04 Augmented reality-based design WO2021224262A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063021217P 2020-05-07 2020-05-07
US63021217 2020-05-07

Publications (2)

Publication Number Publication Date
WO2021224262A2 true WO2021224262A2 (fr) 2021-11-11
WO2021224262A3 WO2021224262A3 (fr) 2022-01-06

Family

ID=75801608

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/061732 WO2021224262A2 (fr) 2020-05-07 2021-05-04 Conception basée sur réalité augmentée

Country Status (1)

Country Link
WO (1) WO2021224262A2 (fr)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8830267B2 (en) * 2009-11-16 2014-09-09 Alliance For Sustainable Energy, Llc Augmented reality building operations tool
FR2970791B1 (fr) * 2011-01-24 2013-02-08 Sens Innov Procede de restitution de donnees representatives de la qualite de l'air a l'interieur d'un lieu clos ou semi-clos, systeme et produit programme d'ordinateur correspondants
US11232502B2 (en) * 2017-12-20 2022-01-25 Signify Holding B.V. Lighting and internet of things design using augmented reality
EP3810211A4 (fr) * 2018-06-12 2022-03-09 Phonesoap LLC Systèmes et procédés de gestion de désinfection

Also Published As

Publication number Publication date
WO2021224262A3 (fr) 2022-01-06

Similar Documents

Publication Publication Date Title
US11847677B2 (en) Lighting and internet of things design using augmented reality
US10937245B2 (en) Lighting and internet of things design using augmented reality
JP5819431B2 Method and user interaction system for controlling a lighting system, portable electronic device, and computer program
US9317959B2 (en) System and method for visualizing virtual objects on a mobile device
RU2666770C2 Lighting control device
RU2557084C2 System and method for interactive lighting control
JP6480012B2 Color picker
US20180197339A1 (en) Augmented reality device for visualizing luminaire fixtures
EP3338516A1 (fr) Procédé de visualisation d'une forme d'un dispositif d'éclairage linéaire
US11985748B2 (en) Method of configuring a plurality of parameters of a lighting device
WO2021224262A2 Augmented reality-based design
US20190043398A1 (en) Device management apparatus, device managing method, and program
RU2574586C2 Method and user interaction system for controlling a lighting system, portable electronic device, and computer program product
JP2021056127A Information processing system
JP2021056849A Information processing device and information processing method
JP2021056848A Information processing device and information processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21723721

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21723721

Country of ref document: EP

Kind code of ref document: A2