WO2017084940A1 - Glare reduction - Google Patents

Glare reduction

Info

Publication number
WO2017084940A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
glare
light source
control unit
environment
Application number
PCT/EP2016/077117
Other languages
French (fr)
Inventor
Vincentius Paulus Buil
Lucas Jacobus Franciscus Geurts
Original Assignee
Koninklijke Philips N.V.
Application filed by Koninklijke Philips N.V.
Publication of WO2017084940A1

Classifications

    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C7/00 - Optical parts
    • G02C7/10 - Filters, e.g. for facilitating adaptation of the eyes to the dark; Sunglasses
    • G02C7/101 - Filters, e.g. for facilitating adaptation of the eyes to the dark; Sunglasses having an electro-optical light valve
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60J - WINDOWS, WINDSCREENS, NON-FIXED ROOFS, DOORS, OR SIMILAR DEVICES FOR VEHICLES; REMOVABLE EXTERNAL PROTECTIVE COVERINGS SPECIALLY ADAPTED FOR VEHICLES
    • B60J3/00 - Antiglare equipment associated with windows or windscreens; Sun visors for vehicles
    • B60J3/04 - Antiglare equipment associated with windows or windscreens; Sun visors for vehicles adjustable in transparency
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J1/00 - Photometry, e.g. photographic exposure meter
    • G01J1/02 - Details
    • G01J1/0266 - Field-of-view determination; Aiming or pointing of a photometer; Adjusting alignment; Encoding angular position; Size of the measurement area; Position tracking; Photodetection involving different fields of view for a single detector
    • G - PHYSICS
    • G02 - OPTICS
    • G02C - SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C11/00 - Non-optical adjuncts; Attachment thereof
    • G02C11/10 - Electronic devices other than hearing aids

Abstract

A method of reducing glare perceived by a user, the method comprising: receiving at least one sensor output signal; detecting a position and orientation of the user using at least one or more of the at least one sensor output signal; plotting the position and orientation of the user on a predetermined map of an environment of the user, the map including a location of one, or any combination of: one or more controllable artificial light source in the environment of the user; and one or more indirect light source in the environment of the user; detecting that glare is perceived by the user and identifying a light source in the environment of the user causing the glare using the map; and based on said detection, adjusting light coming from said light source to reduce the glare.

Description

Glare reduction
TECHNICAL FIELD
The present disclosure relates to reducing glare that is perceived by a human.
BACKGROUND
Glare is a phenomenon wherein a human has difficulty seeing because of a very bright light.
Glare may be caused by light emitted by the sun or from an artificial light source. The light causing the glare may be directly incident into a human's eyes or reflect off of a surface and thus be indirectly incident into a human's eyes. Well known situations are reflections on displays or shiny surfaces, windows, (car) mirrors, and so on.
Glare can cause discomfort, and can hinder or even temporarily blind people to important things around them, which may result in dangerous situations.
Known approaches to circumvent glare include application of diffusers, filters, polarization, or blocking the light emitted by the light source which causes the glare.
Dynamic Eye, Inc. has developed sunglasses which comprise a small sensor attached to the bridge of the glasses which monitors the wearer's line of vision. The sensor chip is designed to detect glare and determine the direction it is coming from. When brightness exceeds a certain threshold, the sensor triggers an adjacent microcontroller to darken specific pixels on the lens. A 4-by-6-millimeter rectangle is created that moves around each lens to block blinding light as the wearer moves his or her head.
WO2014195821 discloses a glare control system for drivers that interferes with the transmission path of light towards the driver.
US7651220B1 discloses a glare control system for drivers that interferes with the transmission path of light using glasses that block glare in the view of the user of these glasses.
US20090204291 discloses a glare control system that also interferes with the transmission path of light towards the driver, in the form of an apparatus and method of controlling the transparency of a window in a vehicle. The window includes one or more multisegment light filtering elements, each segment having a controllable light transmissive characteristic. WO93/21624 discloses an electro-optical system which comprises an electro-optical element having an optically transparent surface with electrically addressable pixels. Each pixel has a light transmittance which is controllable by a controller that is operably associated with a computer system, and the system thus likewise only interferes with the transmission path of light.
SUMMARY
Automatic blind systems aim to control a building's indoor climate such that engagement of air-conditioning systems can be minimized. Automatic blind systems engage and disengage the blinds in relation to the amount of sunlight present over time. This may conflict with people near windows who experience glare, directly from the sun or indirectly via reflections off surfaces inside or outside the building (e.g. windows or a lake). In these situations people will often manually override the automatic behavior of the blind system. This is distracting and often annoying.
A similar problem is encountered with indoor lighting installations. Indoor lighting installations can be controlled manually or automatically. In case of glare that is perceived by a person, the person has to manually take measures to counter the glare, by adjusting a light source causing the glare or placing something in the line of sight between the light source and his eyes.
For indoor situations, the sunglasses developed by Dynamic Eye, Inc. are undesirable in that the sunglasses create an uncomfortable dark spot in the wearer's visual field, which also disturbs the wearer's esthetic appearance to others.
Embodiments of the present disclosure address the source of glare, via automatic control of appropriate blinds or adjusting the artificial light sources inside a building that are causing the glare.
According to one aspect of the present invention there is provided a method of reducing glare perceived by a user, the method comprising: receiving at least one sensor output signal; detecting a position and orientation of the user using at least one or more of the at least one sensor output signal; plotting the position and orientation of the user on a predetermined map of an environment of the user, the map including a location of one, or any combination of: one or more controllable artificial light source in the environment of the user; and one or more indirect light source in the environment of the user; detecting that glare is perceived by the user and identifying a light source in the environment of the user causing the glare using the map; and based on said detection, adjusting said light source to reduce the glare.
The at least one sensor output signal may comprise a sensor output signal that is output from an outward facing sensor that is positioned on glasses worn by the user.
The method may comprise detecting one or more of the position and orientation of the user using the sensor output signal output from the outward facing sensor.
Detecting that glare is perceived by the user may be based on the sensor output signal that is output from the outward facing sensor.
In embodiments whereby the at least one sensor output signal comprises the sensor output signal that is output from the outward facing sensor, the method may comprise detecting an intensity of light that is directed towards the user using the sensor output signal that is output from the outward facing sensor, and detecting that glare is perceived by the user based on said intensity.
The at least one sensor output signal may comprise a sensor output signal that is output from an inward facing sensor that is positioned on glasses worn by the user, and detecting that glare is perceived by the user may be based on monitoring the user's eyes using the sensor output signal that is output from the inward facing sensor.
The method may comprise detecting that glare is perceived by the user based on one or more of: a pupil size of the user's eyes; and whether a reflection image is visible on the cornea of the user's eyes.
The method may comprise determining an angle of incidence of the light causing the glare, and using said angle and the map to identify said light source.
The method may comprise determining an orientation of the user's head using the at least one sensor output signal; and using the determined head orientation and the map to detect that glare is perceived by the user and identify the light source in the environment of the user causing the glare.
Detecting the position of the user may comprise measuring signals transmitted between glasses worn by the user and a plurality of wireless reference nodes of a location network.
The method may comprise detecting the orientation of the user based on a sensor output signal output from an inertial sensor that is positioned on glasses worn by the user.
The at least one sensor output signal may comprise a sensor output signal that is output from an image sensor of a computer device in the environment of the user. In these embodiments the method may comprise: determining an orientation of the user's head using the sensor output signal output from the image sensor of the computer device; receiving position and orientation information of the computer device from the computer device; and detecting the position and orientation of the user using the determined head orientation and the received position and orientation information.
Detecting that glare is perceived by the user may be based on the sensor output signal output from the image sensor of the computer device.
The method may further comprise adapting light coming from one or more of: (i) at least one controllable artificial light source of the one or more controllable artificial light source in the environment of the user, and (ii) at least one indirect light source of the one or more indirect light source in the environment of the user, to compensate for the adjustment of light coming from said light source.
When the light source identified as causing the glare is a controllable artificial light source, the method may comprise transmitting a lighting command to the controllable artificial light source to control the light emitted by the controllable artificial light source to reduce the glare.
When the light source identified as causing the glare is an indirect light source, the method may comprise transmitting a lighting command to a controllable window blind unit to reduce an intensity of the light coming from said indirect light source to reduce the glare.
When the light source identified as causing the glare is a window having electrically controllable transmission and/or reflection properties, the method may comprise transmitting a control signal to the window to control the transmission and/or reflection properties of the window to reduce an intensity of the light passing through said window to reduce the glare.
According to another aspect of the present disclosure there is provided a computer program product comprising code embodied on a computer-readable medium and being configured so as when executed on a processor of a control unit to perform any of the methods disclosed herein.
According to another aspect of the present disclosure there is provided a control unit for reducing glare perceived by a user, the control unit comprising an input for receiving at least one sensor output signal and configured to: detect a position and orientation of the user using at least one or more of the at least one sensor output signal; plot the position and orientation of the user on a predetermined map of an environment of the user that is stored in a memory coupled to the control unit, the map including a location of one, or any combination of: one or more controllable artificial light source in the environment of the user; and one or more indirect light source in the environment of the user; detect that glare is perceived by the user and identify a light source in the environment of the user causing the glare using the map; and based on said detection, adjust said light source to reduce the glare.
Where the light source identified as causing the glare is a controllable artificial light source (308) coupled to the control unit, the control unit may be configured to transmit a lighting command to the controllable artificial light source to control the light emitted by the controllable artificial light source to reduce the glare.
Where the light source identified as causing the glare is an indirect light source, the control unit may be configured to transmit a lighting command to a controllable window blind unit coupled to the control unit to reduce an intensity of the light coming from said indirect light source to reduce the glare.
Where the light source identified as causing the glare is a window having electrically controllable transmission and/or reflection properties coupled to the control unit, the control unit may be configured to transmit a control signal to the window to control the transmission and/or reflection properties of the window to reduce an intensity of the light passing through said window to reduce the glare.
These and other aspects will be apparent from the embodiments described in the following. The scope of the present disclosure is not intended to be limited by this summary nor to implementations that necessarily solve any or all of the disadvantages noted.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the present disclosure and to show how embodiments may be put into effect, reference is made to the accompanying drawings in which:
Figure 1 illustrates a scenario whereby glare is perceived by a person due to light from the sun;
Figure 2 illustrates wearable glasses in accordance with embodiments of the present disclosure;
Figure 3 is a schematic block diagram of a control system for reducing glare perceived by a person;
Figure 4 is a flow chart of a method implemented by a control unit of the control system to reduce glare perceived by a person; and
Figure 5 illustrates how the control unit may control controllable window blinds to reduce the glare perceived by a person due to light from the sun.
DETAILED DESCRIPTION
Embodiments will now be described by way of example only.
Figure 1 illustrates a person 100 present in an indoor environment comprising a window 106. As shown in Figure 1 a beam of light 104 emitted by the sun may pass through the window 106 and be perceived as glare by the person 100. The light beam causing the glare may be directly incident into the eyes of the person or reflect off of a surface in the indoor environment. The term "indirect light source" is used herein to refer to an object through which light (artificial or natural) may pass through or may reflect off of, and which itself does not generate and emit light. Examples of an indirect light source include a window or a reflective surface in the environment of the user.
Whilst Figure 1 illustrates one example scenario, additionally or alternatively, glare may be caused by artificial light emitted by one or more controllable artificial light sources that are present in the indoor environment. A controllable artificial light source is a device for emitting illumination for illuminating an environment, comprising at least one light source such as an LED, an LED string or array, a gas discharge lamp or a filament bulb, plus any associated socket, housing and/or support. A controllable artificial light source may be installed at a fixed location within the environment, e.g. in the ceiling and/or walls, and/or on light poles fixed to the floor or ground.
Embodiments of the present disclosure address this glare by the use of glasses that are wearable by the person 100. Figure 2 illustrates glasses 200 that are wearable by person 100 (otherwise referred to herein as a user).
The glasses 200 comprise one or more lenses and a frame having a central portion that fits over the nose bridge of a wearer, and left and right supporting extensions which are intended to fit over a user's ears. It will be appreciated that the form of the glasses 200 shown in Figure 2 is merely an example; for instance, although the supporting extensions are shown to be substantially straight, they could terminate with curved parts to fit more comfortably over the ears in the manner of conventional spectacles.
The glasses 200 comprise one or more sensor for detecting glare and monitoring glare reduction that is implemented in accordance with embodiments of the present disclosure. The glasses 200 may comprise an outward facing sensor 202 which faces away from the user's body such that the indoor environment is in the field of view of the outward facing sensor 202. The outward facing sensor 202 is an image sensor (e.g. a camera) comprising a two-dimensional array of light detecting elements (photodiode, phototransistor or other suitable device) also called pixels. The outward facing sensor 202 is configured to output image data indicative of the intensity of the light impinging upon each light detecting element. Thus, images captured by the outward facing sensor 202 can be represented as a two-dimensional array of intensity values (pixel data). Due to being outward facing, the sensor 202 is operable to detect the intensity of light that is directed towards the user's eyes.
Alternatively or additionally, the glasses 200 may comprise an inward facing sensor 204 which faces towards the user's body such that one of the user's eyes is in the field of view of the inward facing sensor 204. The inward facing sensor 204 is an image sensor (example of which has been described above with reference to the outward facing sensor 202) that is positioned to monitor the user's pupil response to incoming light. The inward facing sensor 204 may be mounted at any suitable position on the glasses 200 provided that it is able to provide such a function. For example, the inward facing sensor may be mounted on a supporting extension or other portion of the frame, or on a lens of the glasses 200.
The iris of the human eye functions like the diaphragm of a camera, controlling the amount of light reaching the back of the eye by automatically adjusting the size of the pupil (aperture). A greater intensity of light causes the iris to adjust its size which causes the size of the pupil to constrict thus allowing less light to be incident on the retina, whereas a lower intensity of light causes the iris to adjust its size which causes the size of pupil to increase (dilate) thus allowing more light to be incident on the retina.
The inward facing sensor 204 is used to monitor a user's pupil response to glare; it does this by monitoring the size (e.g. diameter) of the user's pupil.
The glasses 200 further comprises a wireless network interface 206 which is coupled to the outward facing sensor 202 and inward facing sensor 204. The wireless network interface is used to transmit image data output from the outward facing sensor 202 and inward facing sensor 204 to an external control unit 302. The wireless network interface may for example be a Wi-Fi, Zigbee, Bluetooth or other short-range radio frequency (RF) wireless access technology interface.
The glasses 200 and control unit 302 form part of a control system 300 which is illustrated in Figure 3. Operation of the control unit 302 will now be described with reference to Figures 3 and 4. Figure 4 is a flow chart of a process 400 implemented by the control unit 302 to reduce glare perceived by user 100.
At step S402 the control unit 302 receives sensor output signal(s) output from one or more sensors. The control unit 302 comprises a suitable network interface (wired or wireless) to facilitate receipt of data transmitted by the one or more sensors. The types of sensor output signals that the control unit 302 may receive at step S402 are described in more detail below.
At step S404, the control unit 302 determines the position and orientation of the user 100.
In terms of determining the orientation of the user 100, the glasses 200 may comprise one or more orientation sensor that is configured to transmit data to the control unit 302 via the wireless network interface 206.
The orientation sensor integrated into the glasses 200 may comprise an inertial sensor. For example, the orientation sensor may comprise a gyroscope and/or an
accelerometer and/or a magnetometer. The orientation sensor may provide a single or multi axis measurement(s). That is, the orientation sensor may provide a measurement pertaining to the rotation about one or more axes. In this embodiment, the control unit 302 is configured to determine the orientation of the user 100 based on the measurements received from the inertial sensor.
In other embodiments, an image sensor integrated into the glasses 200 may be configured to transmit image data to the control unit 302. In this embodiment, the control unit 302 is configured to determine the orientation and/or position of the user 100 based on performing image processing on images captured by the image sensor. In particular, the control unit 302 may perform known computer vision techniques to recognize objects in the user's view and determine their position in three dimensions. This information can then be used by the control unit 302 to determine the user's position and/or orientation. The image sensor providing image data for use in orientation sensing may be a dedicated sensor that is provided in addition to the outward facing sensor 202 and/or inward facing sensor 204.
Alternatively, image data output by the outward facing sensor 202 may be used by the control unit 302 to determine the position and/or orientation of the user 100.
As shown in Figure 3, a computer device (e.g. a phone, tablet, laptop etc.) comprising an image sensor (camera) 306 which is positioned in the environment of the user 100, may be coupled to the control unit 302 via a wired or wireless connection. In this embodiment, the camera 306 is configured to capture images of the user 100 and transmit image data to the control unit 302. The control unit 302 may be configured to determine the orientation of the user 100 based on performing image processing on images received from the image sensor 306. In particular, the control unit 302 may determine the head orientation of the user 100 by executing one or more face tracking algorithms. In combination with position and orientation information of the computer device which is received from the computer device via the wired or wireless connection, the control unit 302 may then determine the position and orientation of the user 100.
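As a minimal illustration of combining the two pose sources just described, the head yaw reported by a face tracker (relative to the device camera's optical axis) can be folded into the room frame using the device's reported heading. The function name and angle convention here are assumptions for the sketch, not taken from the patent:

```python
# Hypothetical sketch: combine the device's reported heading with the
# head yaw measured by a face tracker in the camera frame, yielding an
# absolute head yaw in the room frame. Angles are in degrees.
def absolute_head_yaw(device_heading_deg, relative_head_yaw_deg):
    """Fold the camera-relative head yaw into the room frame."""
    return (device_heading_deg + relative_head_yaw_deg) % 360

# Device camera faces heading 90 deg; the face tracker sees the head
# turned 20 deg towards the camera's left.
yaw = absolute_head_yaw(90.0, -20.0)
```

The same modular addition applies to any number of chained frames, provided each source reports a consistent sign convention.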
In other embodiments, the user's position in the environment may be determined by the control unit 302 using a localization technique (described in more detail below), and this information may then be used to assist the control unit 302 in determining the orientation of the user 100 via the captured image data output by an image sensor (integrated into glasses 200 or a remote device).
In terms of determining the position of the user 100, the control unit 302 may determine the location of the glasses 200 (and thus the location of the user 100 wearing the glasses 200) using a localization technique.
The control unit 302 may determine the location of the glasses 200 with respect to a location network 304 comprising a plurality of wireless reference nodes, in some cases also referred to as anchor nodes. These anchors are wireless nodes whose locations are known a priori, typically being recorded in a location database (not shown in Figure 3) which can be queried by the control unit 302 to look up the location of a node. The anchor nodes thus act as reference nodes for localization. Measurements are taken of the signals transmitted between the glasses 200 and a plurality of anchor nodes, for instance the RSSI (received signal strength indicator), ToA (time of arrival) and/or AoA (angle of arrival) of the respective signal. Given such a measurement from three or more nodes, the location of the glasses 200 may then be determined by the control unit 302 relative to the location network 304 using techniques such as trilateration, multilateration or triangulation. Given the relative location of the glasses 200 and the known locations of the anchor nodes, this in turn allows the location of the glasses 200 (and thus the user 100) to be determined in more absolute terms, e.g. relative to a map or floorplan.
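As a sketch of the trilateration step, given three anchor positions and range estimates derived from RSSI/ToA measurements, the 2-D position of the glasses can be solved by linearising the circle equations into a small linear system. All coordinates and distances below are invented for illustration:

```python
# 2-D trilateration sketch. Anchor coordinates (metres) and ranges are
# invented; the ranges here were fabricated from a true position of (3, 4).
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 8.0)]
distances = [5.0, 8.0623, 5.0]  # range estimate from glasses to each anchor

def trilaterate(anchors, distances):
    """Solve (x, y) by subtracting circle equations pairwise, which
    cancels the quadratic terms and leaves a 2x2 linear system."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # zero only if the three anchors are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

x, y = trilaterate(anchors, distances)  # close to (3.0, 4.0)
```

In practice range estimates are noisy, so a real system would use more than three anchors and a least-squares fit, but the geometry is the same.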
The control unit 302 may determine the location of the glasses 200 based on a "fingerprint" of a known environment of the user 100. The fingerprint comprises a set of data points each corresponding to a respective one of a plurality of locations throughout the environment in question. Each data point is generated during a training phase by placing the glasses 200 at the respective location, taking a measurement of the signals received from or by any reference nodes within range at the respective location (e.g. a measure of signal strength such as RSSI), and storing these measurements in a database along with the coordinates of the respective location. The data point is stored along with other such data points in order to build up a fingerprint of the signal measurements as experienced at various locations within the environment. Once deployed, the control unit 302 is configured to compare the signal measurements stored in the fingerprint with signal measurements currently experienced by the glasses 200 whose location is desired to be known, in order to estimate the location of the glasses 200 (and thus the user 100) relative to the corresponding coordinates of the points in the fingerprint. For example this may be done by approximating that the glasses 200 are located at the coordinates of the data point having the closest matching signal measurements, or by interpolating between the coordinates of a subset of the data points having signal measurements most closely matching those currently experienced by the device. The fingerprint can be pre-trained in a dedicated training phase before the fingerprint is deployed, by systematically placing a test device at various different locations in the environment. Alternatively or additionally, the fingerprint can be built up dynamically by receiving submissions of signal measurements experienced by the actual devices of actual users in an ongoing training phase.
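The fingerprint lookup described above amounts to a nearest-neighbour search over stored RSSI vectors. In this sketch the anchor identifiers, RSSI values (dBm) and coordinates are all invented for illustration:

```python
# Fingerprint database: coordinates (metres) -> RSSI per anchor (dBm).
# All values are invented for illustration.
fingerprint = {
    (0.0, 0.0): {"anchor_a": -40, "anchor_b": -70, "anchor_c": -65},
    (5.0, 0.0): {"anchor_a": -55, "anchor_b": -50, "anchor_c": -60},
    (5.0, 5.0): {"anchor_a": -68, "anchor_b": -45, "anchor_c": -42},
}

def locate(current, fingerprint):
    """Return the stored coordinates whose RSSI vector is closest
    (in squared-error terms) to the currently measured one."""
    def error(stored):
        return sum((stored[k] - current[k]) ** 2 for k in current)
    return min(fingerprint, key=lambda coords: error(fingerprint[coords]))

position = locate({"anchor_a": -54, "anchor_b": -49, "anchor_c": -61}, fingerprint)
```

The interpolating variant mentioned in the text would instead average the coordinates of the k best-matching data points, weighted by inverse error.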
The reference signals transmitted between the reference nodes and the glasses 200 are the signals whose measurements are used to determine the location of the glasses 200.
In a device-centric approach the reference nodes each broadcast a signal and the glasses 200 listen, detecting one or more of those that are currently found in range and taking a respective signal measurement of each. Each reference node may be configured to broadcast its reference signal repeatedly. The respective measurement taken of the respective reference signal from each detected anchor node may for example comprise a measurement of signal strength (e.g. RSSI), time of flight (ToF), angle of arrival (AoA), and/or any other property that varies with distance or location. In this approach, the control unit 302 receives signal measurements for computing the location of the user 100 that are transmitted from the glasses 200 via the wireless network interface 206.
In a network-centric approach on the other hand, the glasses 200 broadcast a reference signal and the reference nodes listen, detecting an instance of the signal at one or more of those nodes that are currently in range. In this case the glasses 200 may broadcast their reference signal repeatedly. The respective measurement taken of each instance of the reference signal from the glasses 200 may comprise a measure of signal strength (e.g. RSSI), time of flight (ToF), angle of arrival (AoA), and/or any other property that varies with distance or location. In this approach, the control unit 302 receives signal measurements for computing the location of the user 100 that are transmitted from the one or more of those nodes that are currently in range of the glasses 200.
The reference signals referred to above may be RF signals (for example using Wi-Fi, ZigBee or Bluetooth, or other such short-range RF technology) or ultrasound signals.
Once the control unit 302 determines the position and orientation of the user 100 at step S404, the process 400 proceeds to step S406.
At step S406, the control unit 302 plots the position and orientation of the user 100 determined at step S404 on a map of the space that the user 100 is currently in; the map may for example be a floorplan of a room, building or complex. This map is stored in memory (not shown in Figure 3) coupled to the control unit 302. This map includes information about the location of light sources in the environment of the user 100, including the location of any controllable artificial light sources coupled to the control unit 302 and any indirect light sources in the environment of the user 100 (e.g. windows through which light from an uncontrollable light source (e.g. the sun) may pass, and/or reflective surface(s) in the environment of the user 100 which could potentially cause glare).
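A minimal sketch of how such a map might be used at the identification step: given the user's plotted position and gaze direction, the light source whose bearing most closely matches the gaze is a natural glare candidate. The record layout, identifiers and numbers are all assumptions for illustration:

```python
import math

# Toy floorplan "map": light-source records with invented ids and
# coordinates (metres), mixing controllable and indirect sources.
light_sources = [
    {"id": "ceiling_lamp_1", "kind": "artificial", "pos": (2.0, 6.0)},
    {"id": "south_window",   "kind": "indirect",   "pos": (8.0, 1.0)},
]

def identify_glare_source(user_pos, gaze_deg, sources):
    """Return the source with the smallest angular offset from the gaze."""
    def offset(src):
        dx = src["pos"][0] - user_pos[0]
        dy = src["pos"][1] - user_pos[1]
        bearing = math.degrees(math.atan2(dy, dx))
        # Wrap the difference into [-180, 180) before taking its magnitude.
        return abs((bearing - gaze_deg + 180) % 360 - 180)
    return min(sources, key=offset)

culprit = identify_glare_source((4.0, 2.0), gaze_deg=-14.0, sources=light_sources)
```

The `kind` field would then decide the remedy: a lighting command for an artificial source, or a blind/window command for an indirect one.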
At step S408, the control unit 302 detects that glare is experienced by the user 100 and identifies the light source in the environment causing the glare using the map.
In some embodiments, step S408 is performed by sensing of the glare based on the image data received from one or more of the outward facing sensor 202 and the inward facing sensor 204 of the glasses 200.
The control unit 302 may perform this detection based on detecting an intensity of light that is directed towards the user 100 using the image data output from the outward facing sensor 202, and detecting that glare is perceived by the user based on this detected intensity. For example, the control unit 302 may detect that an intensity value output by one or more pixels of the pixel array of the outward facing sensor 202 exceeds a predetermined threshold. Alternatively, the control unit 302 may be configured to determine an average of the plurality of intensity values that define a captured image, and perform this detection based on detecting that the average intensity of a captured image exceeds a predetermined threshold.
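The intensity-based test just described can be sketched as follows; the frame is a toy grid of 8-bit pixel values and the threshold is an assumed figure, not one taken from the patent:

```python
# Assumed cutoff: average 8-bit intensity above which glare is flagged.
GLARE_THRESHOLD = 200

# Toy 4x4 frame of 8-bit intensities from the outward facing sensor.
frame = [
    [250, 255, 240, 235],
    [245, 255, 250, 230],
    [210, 225, 240, 236],
    [200, 215, 230, 228],
]

def glare_detected(frame, threshold=GLARE_THRESHOLD):
    """Flag glare if any pixel saturates or the mean intensity is too high,
    mirroring the per-pixel and average-intensity variants in the text."""
    pixels = [p for row in frame for p in row]
    return max(pixels) >= 255 or sum(pixels) / len(pixels) > threshold

detected = glare_detected(frame)
```

A deployed system would calibrate the threshold against the sensor's exposure settings rather than hard-code it.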
In one scenario, the user 100 may be using a computing device (e.g. tablet, laptop, television, mobile telephone etc.) comprising a display. The control unit 302 may detect that glare is perceived by the user 100 by observing reflections that a light source can have on the display at which the user 100 is looking, using the image data output by the outward facing sensor 202.
Using the image data output from the outward facing sensor 202, the control unit 302 is able to determine an angle Θ of incidence of the light causing the glare.
Alternatively or additionally, the control unit 302 may perform this detection based on monitoring the response of the user's eyes to light in the environment, based on image data received from the inward facing sensor 204.
The inward facing sensor 204 can see changes in the user's pupil size in response to light changes. The control unit 302 may maintain predetermined advisable pupil size information for each of a plurality of light levels. The control unit 302 determines an average of the plurality of intensity values that define an image captured by the outward facing sensor 202, and if the user's pupil size (determined using image data output by the inward facing sensor 204) is smaller than advisable for the average light level measured by the outward facing sensor 202, then the control unit 302 is configured to detect that glare is perceived by the user 100.
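As an illustrative sketch only (the lookup table of advisable pupil diameters and its values are assumptions introduced for illustration, not data from the embodiment), the pupil-size comparison may look like:

```python
# Assumed lookup: smallest advisable pupil diameter (mm) for a handful
# of ambient light levels (lux). Values are illustrative only.
ADVISABLE_PUPIL_MM = {50: 5.5, 200: 4.5, 500: 3.5, 1000: 2.5}

def glare_from_pupil(avg_light_lux, pupil_mm):
    """Infer glare when the pupil measured by the inward facing sensor
    is smaller than advisable for the average light level measured by
    the outward facing sensor."""
    # Pick the table entry for the nearest stored light level.
    nearest_level = min(ADVISABLE_PUPIL_MM,
                        key=lambda lvl: abs(lvl - avg_light_lux))
    return pupil_mm < ADVISABLE_PUPIL_MM[nearest_level]
```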
The inward facing sensor 204 can also monitor the user's eyes to see whether a reflection image is visible on the cornea of one of the user's eyes; this is known as corneal reflection. For example, if the user 100 stands near a window in an indoor environment, a reflection image of the window will be visible in the user's eye(s). The control unit 302 may be configured to process the image data output by the inward facing sensor 204 to detect whether a high luminance reflection image is visible on the cornea of one of the user's eyes, and to detect that glare is perceived by the user 100 in response to detecting that a high luminance reflection image is visible. It is not necessary for the control unit 302 to perform image recognition of the light source in the processing of the image data (although it could do so). It is sufficient for the control unit 302 to detect pixels with above-threshold luminance levels in the image from the inward facing sensor 204 in order to classify them as glare (and to determine its angle in relation to the user's head, as described in more detail below).
The control unit 302 is also able to determine an angle Θ of incidence of the light causing the glare, based on the position of the reflection image that is visible on the cornea of one of the user's eyes. For example, if the visible reflection image is in the centre of the cornea, the control unit 302 can deduce that the source of the light causing the glare is straight in front of the user 100 (independently of where the user 100 is looking). If the visible reflection image is at the edge of the cornea, the control unit 302 can deduce that the source of the light causing the glare is to the side of the user 100. It will be appreciated that in this example the outward facing sensor 202 is not required for glare detection (although it might still be used to assist with user position detection in relation to the map referred to above).
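The centre-versus-edge reasoning above can be sketched as a simple linear mapping (illustrative only; the linear model, the maximum angle and the pixel-based cornea measurements are assumptions, not part of the embodiment):

```python
def reflection_angle_deg(reflection_x, cornea_centre_x,
                         cornea_radius_px, max_angle_deg=60.0):
    """Map the horizontal position of a high-luminance corneal
    reflection to an approximate angle of the glare source relative
    to the user's head: a reflection at the centre of the cornea maps
    to 0 degrees (source straight ahead), a reflection at the edge to
    +/- max_angle_deg (source to the side)."""
    offset = (reflection_x - cornea_centre_x) / cornea_radius_px
    offset = max(-1.0, min(1.0, offset))  # clamp to the cornea extent
    return offset * max_angle_deg
```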
The glare may also be detected at step S408 based on the image data received from the computer device comprising the image sensor (camera) 306 that is positioned in the environment of the user 100. For example, image data captured by the camera of a smartphone can be used by the control unit 302 to observe the effect of the glare on the face of the user 100 by spotting high intensity light on the face of the user 100. Using the image data output from the image sensor (camera) 306, the control unit 302 is also able to determine an angle Θ of incidence of the light causing the glare.
Once the control unit 302 has determined an angle Θ of incidence of the light causing the glare using one of the methods described above, the control unit 302 performs geometric calculations comparing the user's position, orientation and the angle Θ in relation to the map, to trace the glare back to a light source (e.g. a controllable artificial light source or an indirect light source in the environment of the user 100) and thereby identify the light source responsible for causing the glare.
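One minimal way to realise this geometric trace (an illustrative sketch under assumptions: a 2-D floorplan, bearings measured in degrees, and a fixed angular tolerance — none of which are specified by the embodiment) is to compare the incoming-light bearing against the bearing of each mapped source:

```python
import math

def trace_glare_source(user_xy, user_heading_deg, incidence_deg,
                       mapped_sources, tolerance_deg=5.0):
    """Trace the glare back along the incoming-light bearing and return
    the mapped light source (controllable or indirect) whose bearing
    from the user best matches it, or None if nothing is close enough.
    mapped_sources: {source_id: (x, y)} positions on the floorplan."""
    bearing = user_heading_deg + incidence_deg  # world-frame direction of source
    best_id, best_err = None, tolerance_deg
    for source_id, (sx, sy) in mapped_sources.items():
        source_bearing = math.degrees(
            math.atan2(sy - user_xy[1], sx - user_xy[0]))
        # Smallest signed angular difference, wrapped to [-180, 180).
        err = abs((source_bearing - bearing + 180.0) % 360.0 - 180.0)
        if err <= best_err:
            best_id, best_err = source_id, err
    return best_id
```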
The control unit 302 may use the time of day and/or sensed weather conditions as further inputs into the geometric calculations described above. For example, if it is a cloudy day or night time, it is less likely that the glare is caused by the sun, so the control unit 302 can focus on the artificial light sources as potential sources of the glare rather than on the indirect light sources.
Whilst the embodiments described above comprise sensing the glare, in other embodiments step S408 is performed by predicting that glare will be perceived by the user 100, without sensing the glare.
In these other embodiments, step S408 is performed by the control unit 302 using an orientation of the user's head determined at step S404, and the user's position plotted on the map (performed at step S406) to deduce that the user will likely suffer from glare and identify the light source in the environment of the user causing the glare. That is, by knowing the position of light sources (controllable artificial light sources and/or indirect light sources), and the position and orientation of the user's eyes, the control unit can predict if light from a light source will fall onto the user's face.
The control unit 302 may use its knowledge of which of the controllable artificial light sources are turned on, and the intensity of the light emitted from the controllable artificial light sources that are turned on, in order to make this prediction. The control unit 302 may also use image data received from an image sensor 306 positioned in the environment of the user 100 (e.g. positioned in or located near to window blinds, work stations, displays and reflective surfaces) in order to make this prediction. For example, an image sensor 306 may be positioned in or located near to window blinds, and the control unit 302 determines the angle at which light is passing through the window based on image data output by the image sensor 306. The control unit 302 knows where the blinds are located, and where the user is and how he is oriented, by using the map referred to above. The control unit 302 can then deduce whether sunlight passing through the window will fall on the user's face, thus causing glare.
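A bare-bones sketch of this prediction (illustrative assumptions: a 2-D floorplan, a simple forward field-of-view model for the face, and a boolean on/off state per source — none specified by the embodiment) checks whether a lit source lies within the user's forward field of view:

```python
import math

def predict_glare(user_xy, head_heading_deg, source_xy,
                  source_lit=True, fov_deg=180.0):
    """Predict, without sensing, whether light from a mapped source
    will fall on the user's face: the source must be emitting light
    and must lie within the forward field of view implied by the
    user's head orientation."""
    if not source_lit:
        return False
    bearing = math.degrees(math.atan2(source_xy[1] - user_xy[1],
                                      source_xy[0] - user_xy[0]))
    # Angle between the head direction and the source, wrapped to [0, 180].
    relative = abs((bearing - head_heading_deg + 180.0) % 360.0 - 180.0)
    return relative <= fov_deg / 2.0
```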
The control unit 302 may use the time of day and/or sensed weather conditions as further inputs in order to make this prediction.
As shown in Figure 3, the control unit 302 may be coupled to one or more controllable light sources 308. The control unit 302 may be coupled to the controllable light source(s) 308 by way of a wired or wireless connection. The light source(s) 308 are controllable in that characteristics of the light emitted by the light source(s) 308 may be controlled by the control unit 302. The control unit 302 is configured to control the light emitted by the light source(s) 308 by supplying lighting commands to the light source(s) 308. If at step S408 the control unit 302 traces the glare back to one of the controllable light sources 308, at step S410 the control unit 302 is configured to transmit one or more lighting commands to the controllable light source 308 to adjust the light emitted by the controllable light source 308 to reduce the glare perceived by the user 100. In this scenario, the control unit 302 may perform step S410 by transmitting a lighting command to the controllable light source 308 to reduce the intensity of light emitted by the controllable light source 308. This causes the controllable light source 308 either to turn off, such that no light is emitted by the controllable light source 308, or to emit light at a reduced (dimmed) illumination level. Alternatively or additionally, the control unit 302 may perform step S410 by transmitting a lighting command to the controllable light source 308 to adjust the direction of light emitted by the controllable light source 308 away from the user 100.
As shown in Figure 3, the control unit 302 may be coupled to one or more controllable window blind units 310, each associated with a respective window in the environment of the user 100. The control unit 302 may be coupled to the controllable window blind unit(s) 310 by way of a wired or wireless connection. The control unit 302 is configured to control the controllable window blind unit(s) 310 by supplying control commands to the controllable window blind unit(s) 310. If at step S408 the control unit 302 traces the glare back to a window in the environment of the user 100 through which light from an uncontrollable light source (e.g. the sun) passes, at step S410 the control unit 302 is configured to transmit one or more control commands to the controllable window blind unit 310 associated with the window to reduce the amount of light passing through the window.
The controllable window blind unit 310 may comprise physical blinds which are controllable to extend vertically along the height of the window, or extend horizontally along the width of the window. In this example, at step S410 the control unit 302 transmits one or more control command to the controllable window blind unit 310 to extend the physical blinds to cover a portion of or the whole window to reduce the amount of light passing through the window.
In other embodiments, the controllable window blind unit 310 may be a transparent electronic display (e.g. a transparent LCD display) that is positioned over a window in the environment of the user 100 or that replaces a conventional window completely. In this example, at step S410 the control unit 302 transmits one or more control commands to the transparent electronic display to control pixels of the transparent electronic display to reduce the amount of light passing through the transparent electronic display by way of virtual blinds. This is illustrated in Figure 5, which shows the glasses 200 worn by user 100 being used to control a transparent electronic display 500 that has replaced a conventional window in a building. As shown in Figure 5, the control unit 302 may transmit one or more control commands to the transparent electronic display to control pixels 502 to reduce the intensity of light 504 that is directed towards the user (or to block it completely). The control unit 302 may control all of the pixels of the transparent electronic display to shade the whole display (i.e. the window), or control only a subset of the total pixels of the transparent electronic display to provide local shading (as shown in Figure 5).
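Local shading of the kind shown in Figure 5 can be sketched geometrically (illustrative only; the coordinate frame, the fixed patch size and the pixel pitch are assumptions): the pixels to darken are those around the point where the line from the light source through the user's eye crosses the display plane.

```python
def virtual_blind_patch(eye_mm, light_dir, display_z_mm,
                        pixel_pitch_mm=1.0, patch_mm=120.0):
    """Find where the line from the light source through the user's eye
    crosses the display plane (z = display_z_mm) and return the pixel
    rectangle (x0, x1, y0, y1) to darken around that point.
    light_dir points FROM the source TOWARDS the user."""
    ex, ey, ez = eye_mm
    dx, dy, dz = light_dir
    s = (ez - display_z_mm) / dz  # distance back along the ray to the display
    hit_x, hit_y = ex - s * dx, ey - s * dy
    half = patch_mm / 2.0
    return (int((hit_x - half) / pixel_pitch_mm),
            int((hit_x + half) / pixel_pitch_mm),
            int((hit_y - half) / pixel_pitch_mm),
            int((hit_y + half) / pixel_pitch_mm))
```

With light arriving head-on the patch is centred on the eye's projection; an oblique sun shifts the shaded patch sideways across the display.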
In other embodiments, windows in the environment of the user 100 may have electrically controllable transmission and/or reflection properties. If at step S408 the control unit 302 traces the glare back to a window in the environment of the user 100 (having electrically controllable transmission and/or reflection properties) through which light from an uncontrollable light source (e.g. the sun) passes, at step S410 the control unit 302 is configured to transmit one or more control signals to the window to reduce the amount of light passing through the window. This may be done by applying diffusion or filters, or by controlling the polarization of light passing through the window. This may be achieved through the use of an electrically controllable film on the surface of the window, or using a window which can change its surface structure (e.g. flat, curved, ribbed etc.) using electro-active polymers.
Although not shown in Figure 3, the control unit 302 may be coupled to one or more controllable surfaces in the environment of the user 100. As described above, the light beam causing the glare may reflect off a surface in the environment of the user 100. If the control unit 302 traces the light causing the glare back to a controllable surface, at step S410 the control unit 302 may transmit one or more control signals to the controllable surface to resolve the glare. For example, the control unit 302 may change the structure of the reflective surface (e.g. flat, curved, ribbed etc.) via electro-active polymers to reduce the glare.
Alternatively, the control unit 302 may change the orientation of the reflective surface e.g. tilting a desktop surface or display surface, to change the direction at which light reflects off of the reflective surface.
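The tilt needed to redirect such a reflection follows directly from the law of reflection (a sketch under the assumption of a planar surface and bearings in degrees; the "safe bearing" is any direction that misses the user):

```python
def tilt_to_redirect(current_reflection_deg, safe_deg):
    """By the law of reflection, tilting a reflective surface by delta
    rotates the reflected ray by 2*delta, so the tilt needed to move
    the reflection from its current bearing onto a safe bearing is
    half the angular difference."""
    return (safe_deg - current_reflection_deg) / 2.0
```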
In accordance with embodiments described above, the control system 300 provides automatic control of a building system (window, blinds, lighting, surface) to correct for glare perceived by a user without disturbing the visual field of the user.
In one scenario, the user 100 may be using a computing device (e.g. tablet, laptop, television, mobile telephone etc.) comprising a display. The control unit 302 may detect that glare is perceived by the user 100 in accordance with any of the embodiments described above. In this scenario, the control unit 302 may receive an intensity (brightness) value of the light emitted by the display of the computing device. This intensity information may be communicated from the computing device to the control unit 302 by way of a wireless or wired connection. By having heuristic knowledge (e.g. a mathematical function) on typical pupil sizes in relation to intensity (brightness) levels, the control unit 302 is able to calculate a desired pupil diameter or diameter range based on the received intensity value. In accordance with embodiments of the present disclosure described above, the control unit 302 operates to reduce the glare. Image data received from the inward facing sensor 204 can then be used to monitor the effectiveness of the glare reduction, by monitoring whether the user's pupil diameter has increased to the calculated desired pupil diameter, to determine whether further steps need to be taken to resolve the glare (e.g. further reducing the intensity of light emitted by a controllable light source 308).
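A possible shape for such a heuristic and the effectiveness check is sketched below. The logarithmic model and all constants are illustrative assumptions — they are not physiological data and not values given by the embodiment:

```python
import math

def desired_pupil_mm(display_brightness_lux):
    """Assumed heuristic relating display brightness to a desired pupil
    diameter: diameter shrinks roughly with the log of brightness,
    clamped to a plausible minimum. Constants are illustrative."""
    return max(2.0, 7.0 - 1.2 * math.log10(max(display_brightness_lux, 1.0)))

def glare_resolved(measured_pupil_mm, display_brightness_lux,
                   margin_mm=0.3):
    """Judge the glare reduction effective once the pupil measured by
    the inward facing sensor has recovered to (near) the desired
    diameter for the display the user is looking at."""
    return measured_pupil_mm >= desired_pupil_mm(display_brightness_lux) - margin_mm
```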
In the embodiments described above whereby the control unit 302 controls the light coming from a controllable artificial light source or an indirect light source to reduce the glare, the control unit 302 may additionally take steps to compensate by adapting the light emitted by other artificial light sources (e.g. increasing their intensity) and/or adapting the light coming from an indirect light source (e.g. raising the blinds of other windows to compensate for lowering the blinds of the window through which the light causing the glare passes). This ensures that the amount of light in the environment of the user 100 stays the same when, for example, certain blinds are adjusted to reduce glare, so that there is an equal amount of light for the user 100 to perform their tasks.
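One simple compensation policy (an illustrative sketch; the greedy allocation, the lumen bookkeeping and the data shapes are assumptions, not part of the embodiment) spreads the removed light over the remaining controllable sources:

```python
def compensate_lost_light(lost_lumens, other_lamps):
    """Spread the light removed by a glare adjustment (e.g. a dimmed
    lamp or a lowered blind) over the other controllable light sources
    so the total light in the environment stays roughly constant.
    other_lamps: {lamp_id: (current_level_0_to_1, lumens_at_full)}.
    Returns the new dim levels and any shortfall that could not be
    compensated (all lamps already at full output)."""
    new_levels, remaining = {}, lost_lumens
    for lamp_id, (level, lumens_at_full) in other_lamps.items():
        headroom_lm = (1.0 - level) * lumens_at_full
        add_lm = min(headroom_lm, remaining)
        new_levels[lamp_id] = level + add_lm / lumens_at_full
        remaining -= add_lm
    return new_levels, remaining
```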
It will be appreciated that the above embodiments have been described only by way of example.
In the embodiments described above, the time of day may be determined locally on the control unit 302 using a timer or clock; alternatively, the time of day may be communicated to the control unit 302 from a remote device. Similarly, weather conditions may be sensed locally by the control unit 302 using a weather sensor; alternatively, the weather conditions may be communicated to the control unit 302 from a remote device.
The functionality of the control unit 302 that is described herein may be implemented in code (software) stored on a memory comprising one or more storage media, and arranged for execution on a processor comprising one or more processing units. The code is configured so as, when fetched from the memory and executed on the processor, to perform operations in line with the embodiments discussed above. Alternatively, it is not excluded that some or all of the functionality of the control unit 302 is implemented in dedicated hardware circuitry, or configurable hardware circuitry like a field-programmable gate array (FPGA).
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. The terms "controller" and "module" as used herein generally represent software, firmware, hardware, or a combination thereof. In the case of a software implementation, the controller or module represents program code that performs specified tasks when executed on a processor (e.g. CPU or CPUs). The program code can be stored in one or more computer readable memory devices. A single processor or other unit may fulfil the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems. Any reference signs in the claims should not be construed as limiting the scope.

Claims

CLAIMS:
1. A method of reducing glare perceived by a user (100), the method comprising:
receiving at least one sensor output signal (S402);
detecting a position and orientation of the user using at least one or more of the at least one sensor output signal (S404);
plotting the position and orientation of the user on a predetermined map of an environment of the user (S406), the map including a location of one, or any combination of: one or more controllable artificial light source in the environment of the user; and one or more indirect light source in the environment of the user;
detecting that glare is perceived by the user and identifying a light source in the environment of the user causing the glare using the map (S408);
characterized in that the method comprises the step of
based on said detection, adjusting said light source to reduce the glare (S410).
2. The method of claim 1, wherein the at least one sensor output signal comprises a sensor output signal that is output from an outward facing sensor (202) that is positioned on glasses (200) worn by the user (100).
3. The method of claim 2, wherein the method comprises detecting one or more of the position and orientation of the user (100) using the sensor output signal output from the outward facing sensor (202).
4. The method of claim 2 or 3, wherein detecting that glare is perceived by the user (100) is based on the sensor output signal output from the outward facing sensor (202).
5. The method of any of claims 2 to 4, wherein the at least one sensor output signal comprises a sensor output signal that is output from an inward facing sensor (204) that is positioned on glasses (200) worn by the user (100), and detecting that glare is perceived by the user (100) is based on monitoring the user's eyes using the sensor output signal that is output from the inward facing sensor (204).
6. The method of any of claims 4 or 5, wherein the method comprises
determining an angle of incidence of the light causing the glare, and using said angle and the map to identify said light source.
7. The method of any of claims 1 to 3, wherein the method comprises
determining an orientation of the user's head using the at least one sensor output signal; and using the determined head orientation and the map to detect that glare is perceived by the user (100) and identify the light source in the environment of the user (100) causing the glare.
8. The method of any preceding claim, wherein the at least one sensor output signal comprises a sensor output signal that is output from an image sensor of a computer device in the environment of the user (100).
9. The method of claim 8, wherein the method comprises:
determining an orientation of the user's head using the sensor output signal output from the image sensor of the computer device;
receiving position and orientation information of the computer device from the computer device; and
detecting the position and orientation of the user using the determined head orientation and the received position and orientation information.
10. The method of any preceding claim, wherein the method further comprises adapting light coming from one or more of: (i) at least one controllable artificial light source of the one or more controllable artificial light source in the environment of the user (100), and (ii) at least one indirect light source of the one or more indirect light source in the environment of the user (100), to compensate for the adjustment of light coming from said light source.
11. A computer program product comprising code embodied on a computer-readable medium and being configured so as when executed on a processor of a control unit (302) to perform the method of any of claims 1 to 10.
12. A control unit (302) for reducing glare perceived by a user (100), the control unit (302) comprising an input for receiving at least one sensor output signal and configured to:
detect a position and orientation of the user (100) using at least one or more of the at least one sensor output signal;
plot the position and orientation of the user (100) on a predetermined map of an environment of the user that is stored in a memory coupled to the control unit, the map including a location of one, or any combination of: one or more controllable artificial light source (308) in the environment of the user (100); and one or more indirect light source in the environment of the user (100);
detect that glare is perceived by the user (100) and identify a light source in the environment of the user (100) causing the glare using the map;
characterized in that the control unit (302) is arranged to,
based on said detection, adjust said light source (308) to reduce the glare.
13. The control unit (302) of claim 12, wherein the light source identified as causing the glare is a controllable artificial light source (308) coupled to the control unit (302), and the control unit (302) is configured to transmit a lighting command to the controllable artificial light source (308) to control the light emitted by the controllable artificial light source (308) to reduce the glare.
PCT/EP2016/077117 2015-11-16 2016-11-09 Glare reduction WO2017084940A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP15194671.2 2015-11-16
EP15194671 2015-11-16

Publications (1)

Publication Number Publication Date
WO2017084940A1 true WO2017084940A1 (en) 2017-05-26

Family

ID=54705371

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/077117 WO2017084940A1 (en) 2015-11-16 2016-11-09 Glare reduction

Country Status (1)

Country Link
WO (1) WO2017084940A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1993021624A1 (en) 1992-04-15 1993-10-28 Reveo, Inc. Intelligent electro-optical system and method for automatic glare reduction
US20090204291A1 (en) 2008-02-13 2009-08-13 Cernasov Nathalie Grace Automatic glare reduction system for vehicles
US7651220B1 (en) 2005-11-07 2010-01-26 Ram Pattikonda Selective system for blocking glare in a specific location of a user's field of vision
WO2014195821A1 (en) 2013-06-04 2014-12-11 Koninklijke Philips N.V. A light monitoring system, a glare prevention system, a vehicle and a method of monitoring glare


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10586316B2 (en) 2017-08-07 2020-03-10 Morphotrust Usa, Llc Reduction of glare in imaging documents
US10733469B2 (en) 2017-12-29 2020-08-04 Idemia Identity & Security USA LLC Capturing digital images of documents
US11210542B2 (en) 2017-12-29 2021-12-28 Idemia Identity & Security USA LLC Capturing digital images of documents

Similar Documents

Publication Publication Date Title
US10948983B2 (en) System and method for utilizing gaze tracking and focal point tracking
EP3048949B1 (en) Gaze tracking variations using dynamic lighting position
US9913340B2 (en) Method and device for controlling illumination levels
JP6652251B2 (en) Display device, display method, and display program
EP2713359B1 (en) Method and apparatus for controlling screen brightness corresponding to variation of illumination
CN107111373B (en) Dynamic camera or lamp operation
US20210020141A1 (en) Information processing apparatus, information processing method, and recording medium
CN107731179B (en) Display control method and device, storage medium and air conditioner
US10783835B2 (en) Automatic control of display brightness
EP3804286B1 (en) Electronic device and operating method of controlling brightness of light source
US11400862B2 (en) Vision-based interactive control apparatus and method of controlling rear-view mirror for vehicle
EP3521978A1 (en) Apparatus and method for tracking a focal point in a head mounted display system
EP3851875A1 (en) Brightness adjustment method and device, and storage medium
WO2017084940A1 (en) Glare reduction
WO2018206691A1 (en) Method for controlling a display parameter of a mobile device and computer program product
CA3166085A1 (en) A method, device, terminal, and computer-readable storage medium for adjusting font size
CN114365077B (en) Viewer synchronized illumination sensing
US20200121506A1 (en) Visibility enhancing eyewear
WO2018098992A1 (en) Method and device for screen control and computer storage medium
US20220308663A1 (en) Image display device, display control device, and display control method, and program and recording medium
US20240085695A1 (en) Systems and methods for improved quality of experience in augmented reality displays using light intensity measurements
US20240087248A1 (en) Systems and methods for improved quality of experience in augmented reality displays using light intensity measurements
CA3212219A1 (en) Systems and methods for improved quality of experience in augmented reality displays using light intensity measurements
CN117677995A (en) Presenting visual distraction to a user looking at a display for a long period of time
CN102346533A (en) Electronic device with power-saving mode and method for controlling electronic device to enter power-saving mode

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16798108

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16798108

Country of ref document: EP

Kind code of ref document: A1