EP3559900A1 - System and method for compensation of reflection on a display device - Google Patents
System and method for compensation of reflection on a display device
- Publication number
- EP3559900A1 (application EP17883609.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- display device
- reflection
- scene
- images
- luminance values
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000000034 method Methods 0.000 title claims abstract description 70
- 230000001939 inductive effect Effects 0.000 claims abstract description 62
- 230000000694 effects Effects 0.000 claims abstract description 40
- 230000033001 locomotion Effects 0.000 claims description 42
- 238000013500 data storage Methods 0.000 claims description 4
- 230000001965 increasing effect Effects 0.000 claims description 4
- 235000019557 luminance Nutrition 0.000 description 28
- 210000003128 head Anatomy 0.000 description 13
- 230000000875 corresponding effect Effects 0.000 description 10
- 230000004313 glare Effects 0.000 description 6
- 230000000116 mitigating effect Effects 0.000 description 5
- 238000012360 testing method Methods 0.000 description 5
- 230000008859 change Effects 0.000 description 4
- 238000012545 processing Methods 0.000 description 4
- 230000005540 biological transmission Effects 0.000 description 3
- 238000004590 computer program Methods 0.000 description 3
- 238000010586 diagram Methods 0.000 description 3
- 230000008901 benefit Effects 0.000 description 2
- 230000002457 bidirectional effect Effects 0.000 description 2
- 238000000576 coating method Methods 0.000 description 2
- 238000005315 distribution function Methods 0.000 description 2
- 210000001061 forehead Anatomy 0.000 description 2
- 230000006870 function Effects 0.000 description 2
- 238000005286 illumination Methods 0.000 description 2
- 238000010606 normalization Methods 0.000 description 2
- 230000008569 process Effects 0.000 description 2
- 230000006978 adaptation Effects 0.000 description 1
- 238000013459 approach Methods 0.000 description 1
- 230000015556 catabolic process Effects 0.000 description 1
- 230000001413 cellular effect Effects 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 238000001816 cooling Methods 0.000 description 1
- 230000002596 correlated effect Effects 0.000 description 1
- 230000003247 decreasing effect Effects 0.000 description 1
- 238000006731 degradation reaction Methods 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 230000000916 dilatatory effect Effects 0.000 description 1
- 238000005516 engineering process Methods 0.000 description 1
- 238000013213 extrapolation Methods 0.000 description 1
- 238000009501 film coating Methods 0.000 description 1
- 239000011521 glass Substances 0.000 description 1
- 238000009499 grossing Methods 0.000 description 1
- 230000010354 integration Effects 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
- 239000000463 material Substances 0.000 description 1
- 238000005259 measurement Methods 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 210000001747 pupil Anatomy 0.000 description 1
- 239000002096 quantum dot Substances 0.000 description 1
- 230000004043 responsiveness Effects 0.000 description 1
- 238000000926 separation method Methods 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
- 238000010998 test method Methods 0.000 description 1
- 239000010409 thin film Substances 0.000 description 1
- water Substances 0.000 description 1
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/10—Intensity circuits
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/94—Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/514—Depth or shape recovery from specularities
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/55—Depth or shape recovery from multiple images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
- G09G5/363—Graphics controllers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/066—Adjustment of display parameters for control of contrast
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/068—Adjustment of display parameters for control of viewing angle adjustment
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0686—Adjustment of display parameters with two or more screen areas displaying information with different brightness or colours
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/08—Arrangements within a display terminal for setting, manually or automatically, display parameters of the display terminal
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/144—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2380/00—Specific applications
- G09G2380/10—Automotive applications
Definitions
- the technical field generally relates to performing digital image processing to compensate for one or more specular reflections within an ambient environment.
- the light may be reflected in multiple ways.
- reflectance can be quantified in terms of diffuseness of reflection, varying between fully diffuse to fully specular.
- shining a spot light on a perfect mirror will provide only specular reflection, such that the spot light will only be visible in the reflected image when the mirror is viewed from an angle complementary to the angle of the spot light. That is, if the spot light is located to the left of center of the mirror and shone onto the mirror from 45 degrees, the spot light will only be visible to an observer located 45 degrees to the right of the mirror.
- diffuse surfaces, by contrast, accept light and reflect it in all directions. The geometry of specular reflection follows from the law of reflection, as sketched below.
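A minimal sketch of the law of reflection behind the mirror example above (the vector form r = d − 2(d·n)n; the types and values here are illustrative, not from the patent):

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

static double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Reflect an incoming light direction d about a unit surface normal n.
static Vec3 reflect(const Vec3& d, const Vec3& n) {
    double k = 2.0 * dot(d, n);
    return { d.x - k * n.x, d.y - k * n.y, d.z - k * n.z };
}

int main() {
    // Light arriving at 45 degrees onto a mirror lying in the x-y plane:
    Vec3 d = { std::sqrt(0.5), 0.0, -std::sqrt(0.5) };  // incoming direction
    Vec3 n = { 0.0, 0.0, 1.0 };                         // surface normal
    Vec3 r = reflect(d, n);
    // The ray leaves at 45 degrees on the opposite side, so only an observer
    // on that side sees the specular highlight.
    std::printf("reflected: (%.3f, %.3f, %.3f)\n", r.x, r.y, r.z);
    return 0;
}
```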
- a commonly applied method to mitigate the effect of ambient light involves raising the black level, but this only further undermines contrast and does little to compensate for the confound between displayed content and reflected background information.
- Compensation for low-light conditions can be achieved by manipulating color and tone-mapping appropriately.
- bright ambient environments pose a number of difficulties, including screen reflections, surrounding glare, and limited display brightness.
- the first two problems can be solved by eliminating the third one, i.e., creating a brighter display to overcome the ambient environment.
- in a mobile device the unacceptable price is power consumption, but for automotive it is mostly a question of technological limits: displays can only get so bright before special cooling is required, even with the latest LED-backlit LCD or OLED panels.
- a method for compensating for reflection on a display device includes capturing one or more images of a scene facing the display device, identifying from the captured images one or more reflection-inducing zones located within the scene, determining the specular reflection effect on the display device caused by the reflection-inducing zones, and adjusting a target image to be displayed on the display device based on the determined reflection effect. A high-level sketch of this pipeline follows.
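A minimal end-to-end sketch, under stated assumptions: the struct layout, function names, and the uniform `scale` factor standing in for the full viewer-perspective reprojection are all illustrative, not the patent's code. It strings together the three processing steps described below (116, 124, 132).

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Image { int w = 0, h = 0; std::vector<float> y; };          // luminance
struct Mask  { int w = 0, h = 0; std::vector<unsigned char> on; }; // 1 = zone

// Step 116: bright capture pixels become reflection-inducing zones.
static Mask identifyReflectionZones(const Image& cap, float threshold) {
    Mask m{cap.w, cap.h, std::vector<unsigned char>(cap.y.size(), 0)};
    for (std::size_t i = 0; i < cap.y.size(); ++i)
        m.on[i] = cap.y[i] > threshold ? 1 : 0;
    return m;
}

// Step 124: estimate the veil the viewer perceives (reprojection omitted).
static Image estimateVeil(const Mask& zones, const Image& cap, float scale) {
    Image veil{cap.w, cap.h, std::vector<float>(cap.y.size(), 0.f)};
    for (std::size_t i = 0; i < cap.y.size(); ++i)
        if (zones.on[i]) veil.y[i] = cap.y[i] * scale;
    return veil;
}

// Step 132: subtract the estimated veil from the target image.
static Image adjustTarget(const Image& target, const Image& veil) {
    Image out = target;
    for (std::size_t i = 0; i < out.y.size(); ++i)
        out.y[i] = std::max(0.f, out.y[i] - veil.y[i]);
    return out;
}
```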
- a computer-implemented system includes at least one data storage device; and at least one processor operably coupled to the at least one storage device, the at least one processor being configured for performing the methods described herein according to various aspects.
- a computer-readable storage medium includes computer executable instructions for performing the methods described herein according to various aspects.
- Figure 1a illustrates a first sample image showing the reflection effect on the display device caused by a window in the scene facing the display device;
- Figure 1b illustrates a second sample image showing the reflection effect on the display device caused by a window in the scene facing the display device;
- Figure 2 illustrates a flowchart of the operational steps of an example method for compensating for reflection on a display device
- Figure 3 illustrates a flowchart of the operational steps of an example method for identifying one or more reflection-inducing zones
- Figure 4 illustrates a flowchart of the operational steps of an example method for determining distance of light generating objects in the scene from the display device
- Figure 5 illustrates a flowchart of the operational steps of an example method for determining the reflection effect on the display device
- Figure 6 illustrates a schematic diagram showing the relative positions of a viewer, the display device, and an image capture device
- Figure 7 illustrates a flowchart of the operational steps of an example method for adjusting the target image to be displayed based on the reflection effect
- Figure 8a illustrates the first sample image showing reflection effect on the display device caused by a window in the scene facing the display device in which compensation for reflection is mismatched
- Figure 8b illustrates the second sample image showing reflection effect on the display device caused by a window in the scene facing the display device in which compensation for reflection is mismatched
- Figure 8c illustrates the first sample image showing reflection effect on the display device caused by a window in the scene facing the display device in which reflection compensation according to various exemplary embodiments described herein has been applied;
- Figure 8d illustrates the second sample image showing reflection effect on the display device caused by a window in the scene facing the display device in which reflection compensation according to various exemplary embodiments described herein has been applied;
- Figure 8e illustrates the first sample image showing reflection effect on the display device caused by a window in the scene facing the display device in which reflection compensation according to various exemplary embodiments described herein has been applied;
- Figure 8f illustrates the second sample image showing reflection effect on the display device caused by a window in the scene facing the display device in which reflection compensation according to various exemplary embodiments described herein has been applied;
- Figure 9a is an image captured of the scene facing the display device;
- Figure 9b is a mask showing reflection zones (in white) determined from the image of the scene facing the display device;
- Figure 10a is a third sample image prior to processing;
- Figure 10b illustrates the third sample image after adjustment for reflection compensation;
- Figure 10c illustrates the displayed third sample image showing reflection effect and without reflection compensation
- Figure 10d illustrates the displayed third sample image in which reflection compensation according to various exemplary embodiments described herein has been applied.
- various example embodiments described herein provide for a system and method for compensating for reflections caused by light-generating objects in the scene facing a display device by capturing images of the scene, identifying in the images reflection-inducing zones corresponding to the light-generating objects, estimating the reflection effect on the display device from the reflection-inducing zones, and adjusting a target image to be displayed based on the estimated reflection effect.
- the reflection-inducing zones may be zones that cause specular reflection and the estimating estimates the specular reflection effect on the display device.
- One or more reflection compensation systems and methods described herein may be implemented in computer programs executing on programmable computers, each comprising at least one processor, a data storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
- the programmable computer may be a programmable logic unit, a mainframe computer, a server, a personal computer, a cloud-based program or system, a laptop, a personal digital assistant, a cellular telephone, a smartphone, a wearable device, a tablet device, a virtual reality device, a smart display device (ex: Smart TV), a set-top box, a video game console, or a portable video game device.
- Each program is preferably implemented in a high-level procedural or object-oriented programming and/or scripting language to communicate with a computer system.
- the programs can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language.
- Each such computer program is preferably stored on a storage media or a device readable by a general or special purpose programmable computer for configuring and operating the computer when the storage media or device is read by the computer to perform the procedures described herein.
- the systems may be embedded within an operating system running on the programmable computer.
- the system may be implemented in hardware, such as within a CPU or video card (GPU).
- the systems, processes and methods of the described embodiments are capable of being distributed in a computer program product comprising a computer readable medium that bears computer-usable instructions for one or more processors.
- the medium may be provided in various forms including one or more diskettes, compact disks, tapes, chips, wireline transmissions, satellite transmissions, internet transmissions or downloads, magnetic and electronic storage media, digital and analog signals, and the like.
- the computer-usable instructions may also be in various forms including compiled and non-compiled code.
- the one or more reflection compensation systems and methods described herein are applied where an image or video (hereinafter referred to as a "target image") is to be displayed on an electronic display device.
- the electronic display device may be a computer monitor, a screen of a mobile device (ex: tablet, smartphone, laptop, wearable device), a screen of a video game console, a TV, etc.
- the display device may be implemented using display technologies such as OLED, LCD, quantum dot display, laser projector, CRT, etc.
- Figures 1a and 1b show a first and second sample image representing a common reflection problem: light-generating objects in the area facing the display device cause reflections that appear as whitish highlights on the display device.
- Referring now to Figure 2, therein illustrated is a flowchart of the operational steps of an example method 100 for compensating for specular reflection on a display device.
- one or more images of a scene facing the display device are captured.
- the display device is to be used to display one or more target images.
- the scene facing the display device corresponds to the environment in front of the display device.
- any object in the scene that emits light that causes specular reflection on the display device is referred to herein generally as a "light-generating object". It will be understood that the light-generating object may directly emit light that causes specular reflections, or it may reflect light from an external source, with the reflected light in turn causing specular reflections on the display device.
- the one or more images may be captured sequentially, such as in a video. Accordingly, the scene being captured in the images may change over the sequence, such as due to a change in orientation of the image capture device or changes to objects in the scene.
- the one or more images of the scene can be captured by an image capture device.
- the image capture device may be positioned to be offset by a known distance and orientation from the display device.
- the image capture device is located in proximity of the display device and is facing the same direction as the display device.
- the image capture device may be an external camera positioned in proximity of the display device.
- the image capture device may be an embedded camera, such as the front facing camera of a mobile device (smartphone, tablet, laptop with webcam, video game console, etc.).
- the image capture device may be a combination of capture devices, such as a combination of a camera and an ambient light sensor.
- the camera and the ambient light sensor are located in proximity of one another such that a scene captured by the camera substantially corresponds to the scene captured by the ambient light sensor. It will be appreciated that various mobile devices are now offered with both a camera and an ambient light sensor.
- the image capture devices may have two or more cameras, which may be operated to capture the scene while providing depth information of objects within the scene (ex: stereoscopic cameras).
- An additional device operable to determine depth such as a time-of-flight sensor, can also be used.
- the images captured of the scene may be down-sampled to a lower resolution, which may improve processing speed (a simple sketch follows). It will be appreciated that steps described herein that operate on captured images of the scene facing the display can refer to the down-sampled captured images.
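A sketch of the down-sampling note above, assuming a single-channel luminance buffer (the function name and the 2x2 box filter are illustrative choices, not specified in the text):

```cpp
#include <cstddef>
#include <vector>

// Halve each dimension of a row-major luminance image with a 2x2 box filter.
std::vector<float> downsample2x(const std::vector<float>& y, int w, int h) {
    int ow = w / 2, oh = h / 2;
    std::vector<float> out(static_cast<std::size_t>(ow) * oh);
    for (int j = 0; j < oh; ++j)
        for (int i = 0; i < ow; ++i) {
            int x = 2 * i, r = 2 * j;
            out[static_cast<std::size_t>(j) * ow + i] =
                0.25f * (y[static_cast<std::size_t>(r) * w + x] +
                         y[static_cast<std::size_t>(r) * w + x + 1] +
                         y[static_cast<std::size_t>(r + 1) * w + x] +
                         y[static_cast<std::size_t>(r + 1) * w + x + 1]);
        }
    return out;
}
```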
- the scene facing the display device represented in the images captured by the image capture device is defined by the field of view of the capture device.
- one or more reflection-inducing zones located within the one or more captured images are identified.
- the reflection-inducing zones correspond to light-generating objects in the scene that can cause specular reflections on the display device.
- the reflection-inducing zones are areas of the captured images that have a sufficiently high luminance value, indicating bright light-generating objects in the scene.
- the specular reflection effect on the display device caused by the light-generating objects, as represented by the reflection-inducing zones in the captured images, is determined.
- the specular reflection effect represents an estimation of how a viewer viewing the display device would perceive specular reflections caused by light-generating objects in the scene facing the display device.
- a target image that is to be displayed on the display device is adjusted based on the reflection effect.
- the target image may be adjusted to reduce or mitigate the reflections perceived by the viewer.
- the adjustment may include digitally processing the target image.
- Referring now to FIG. 3, therein illustrated is a flowchart of the operational steps of an example method for identifying one or more reflection-inducing zones located within the one or more captured images.
- the method may correspond to substeps of step 116.
- an area of the one or more captured images that is not a reflection-inducing zone is identified. This area corresponds to a part of the scene facing the display device that will not cause significant specular reflections to be perceived by the viewer.
- the face of the viewer is captured within the images of the scene facing the display device, and a zone of the images corresponding to a portion of the face of the viewer is used as a reference area to set a threshold for identifying reflection-inducing zones of the captured images.
- an area of the face corresponding to the bridge of the viewer's nose may be used.
- the area may also include parts of the forehead and portions of each eye of the user.
- a threshold for determining reflection-inducing zones is set.
- the threshold may be set as a luminance value that is a multiple (ex: 100 times) of the average of the luminance values of the pixels forming the reference area within the captured images.
- the threshold may alternatively be capped at a factor of the maximum image value (ex: where luminance in the reference area is so high that a multiple of it would exceed the maximum pixel value).
- the areas of the captured image that have luminance values that exceed the threshold are determined to be reflection-inducing zones.
- a smoothing or dilating operation may be applied to remove reflection-inducing zones below a certain size. A sketch of the thresholding rule follows.
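A hedged sketch of the threshold rule above. The 100x multiple and the 0.92 factor are the example values given in the text; the `Rect` type, the function name, and the [0,1] normalization of capture luminance are assumptions.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Rect { int x0, y0, x1, y1; };  // reference area (ex: nose bridge)

float zoneThreshold(const std::vector<float>& y, int width, const Rect& ref,
                    float multiple = 100.f, float capFactor = 0.92f) {
    double sum = 0.0;
    int count = 0;
    for (int j = ref.y0; j < ref.y1; ++j)
        for (int i = ref.x0; i < ref.x1; ++i) {
            sum += y[static_cast<std::size_t>(j) * width + i];
            ++count;
        }
    float avg = count ? static_cast<float>(sum / count) : 0.f;
    // Whichever is smaller: a multiple of the reference-area average, or a
    // factor of the maximum image value (1.0 in a normalized [0,1] range).
    return std::min(multiple * avg, capFactor * 1.0f);
}
```

Pixels above this threshold are then marked as reflection-inducing zones; the later implementation notes describe removing zones that are too small to remedy.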
- the distance of each light-generating object represented by the reflection-inducing zones from the display device is determined. That is, for each reflection-inducing zone identified from step 224, the distance of the real-life light-generating object represented by that reflection-inducing zone is determined.
- the distances of each light-generating object may be determined from known properties of the scene facing the display device. This may be the case where the scene is fixed relative to the display device.
- the location of each light-generating object in the scene, including its distance from the display device, can be predefined, and each identified reflection-inducing zone is matched to its corresponding light-generating object.
- a display device such as computer monitor or a TV may be in a fixed position in a space, such as within a room, and light-generating objects found within the space (ex: walls, windows, light fixtures, lamps) are predefined.
- the display device in a fixed position may be an electronic billboard or other display device positioned in a public space.
- a display device may be in a fixed position inside the interior cabin of an automobile and light-generating objects of the cabin (ex: windows of the vehicle, lights inside the cabin) are predefined. It will be appreciated that although the automobile is movable, the display device remains in a fixed position relative to the interior of the cabin.
- properties of the scene such as location and brightness of light-generating objects, may be known ahead of time.
- properties may be pre-measured and pre-stored.
- the distance of light-generating objects within the scene from the display device is determined from sensed motion of the display device and the movement of reflection-inducing zones within images captured of the scene as the display device is moved.
- the distance of the light-generating objects can be calculated based on parallax effect.
- the determining of distances of light generating objects within the scene may take into account the offset of the image capture device from the display device. Additionally or alternatively, the determining of the distances of the light generating objects within the scene may take into account the position of the viewer, such as the viewer's eyes.
- Referring now to FIG. 4, therein illustrated is a flowchart showing the operational steps of an example method 232 for determining the distance of light-generating objects in the scene from the display device.
- the motion of the image capture device is sensed.
- the motion may be sensed by a sensor external to the image capture device but that is representative of the motion of the image capture device.
- where the display device is a mobile device (ex: smartphone, tablet, laptop, portable video game console), the motion may be sensed with a motion sensor of the mobile device, such as a gyroscope.
- the movement of reflection inducing zones within the scene is determined from the plurality of images of the scene captured during movement of the image capture device.
- the sensed motion of the image capture device is correlated with the determined movement of reflection-inducing zones to determine the distance of the light-generating objects represented by the reflection-inducing zones from the display device.
- lateral movement of the image capture device is sensed.
- edges of the reflection-inducing zones that are approximately perpendicular to the direction of motion are identified.
- the movement of the edges within the scene represented by the plurality of captured images is identified. Edges whose movement is counter to the sensed motion are ignored.
- Edges with the least amount of movement within the captured scene are determined to be located at a greater distance (ex: infinity) from the display device. Edges with greater motion are determined to be closer to the display device (see the parallax sketch below).
- Occlusion objects within the scene are also detected.
- Occlusion objects correspond to objects located between a light-generating object and the display device and act to block some of the reflection exhibited on the display device.
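A sketch of the parallax relation implied above, assuming a pinhole camera model (the model is not spelled out in the text): for a lateral camera translation b and a focal length f in pixels, an edge that shifts by d pixels against the horizon lies at roughly z = f·b/d, and edges with negligible shift are treated as being at infinity.

```cpp
#include <cmath>
#include <limits>

// Estimate the distance of an edge from its apparent shift during a sensed
// lateral translation of the capture device.
double edgeDistance(double baselineMetres, double focalPixels,
                    double disparityPixels) {
    if (std::fabs(disparityPixels) < 1e-6)       // no motion: at infinity
        return std::numeric_limits<double>::infinity();
    if (disparityPixels < 0.0)                   // counter to sensed motion:
        return std::nan("");                     // object is moving in scene
    return focalPixels * baselineMetres / disparityPixels;
}
```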
- Referring now to FIG. 5, therein illustrated is a flowchart of the operational steps of an example method 124 for determining the reflection effect on the display device caused by the reflection-inducing zones identified at step 116.
- the position of a viewer viewing the display device is determined.
- Object recognition of the images captured of the scene facing the display device can be performed to recognize the viewer.
- the eyes of the viewer are located.
- the position of the viewer can be tracked over time.
- at step 416, the reflection-inducing zones identified at step 116 and the distance of each light-generating object represented by the reflection-inducing zones are received.
- reflection zones are determined based on the position of the viewer and information pertaining to the reflection-inducing zones and the distances of the corresponding light-generating objects from the display device.
- the reflection zones that are determined represent the reflection exhibited on the display device, as perceived by the viewer, caused by light-generating objects in the scene. It will be appreciated that the reflection zones may cover only a portion of the area of the display device. For example, some areas of the display device do not correspond to a reflection-inducing zone and are therefore determined not to exhibit reflection.
- Referring now to Figure 6, therein illustrated is a schematic diagram showing the relative positions of a viewer, the display device, and an image capture device.
- reflections on the display device perceived by the viewer correspond to the viewer's view of light-generating objects in the scene as seen from the virtual view position, which corresponds to the position of the viewer's eyes mirrored over the display device.
- the image capture device that captures images of the scene is offset from the virtual view position. Accordingly, an extrapolation is applied to determine how light-generating objects represented as reflection-inducing zones in the images captured would be seen from the virtual view position.
- the reflection zones may be represented as a veiling glare in the form of a 2-D array, wherein values in the array define the luminance value of the reflection zones as perceived by the viewer on the display device.
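A minimal sketch of the virtual view position shown in Figure 6, assuming for illustration that the display lies in the plane z = 0 (the type and function names are assumptions, not the patent's):

```cpp
struct Point3 { double x, y, z; };

// Mirroring the eye position across the display plane z = 0 flips the z
// coordinate; the reflections the viewer perceives on the screen match the
// scene as viewed from this mirrored point.
Point3 virtualViewPosition(const Point3& eye) {
    return { eye.x, eye.y, -eye.z };
}
```

Because the image capture device is offset from this virtual position, the captured highlights must still be extrapolated toward it, as the preceding passage notes.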
- Referring now to FIG. 7, therein illustrated is a flowchart of the operational steps of an example method 132 for adjusting the target image to be displayed based on the reflection effect.
- the reflection zones determined at step 424 are received.
- the veiling glare defining luminance values of the reflection zone is also received.
- the luminance values of the reflection zone are compared with luminance values of the target image to determine the adjustment to be applied to the target image. For example, the luminance value of the reflection zone at a given area on the display device is compared with the luminance value at a corresponding area (when displayed on the display device) of the target image. The comparison of luminance values may be carried out on a pixel-by-pixel basis. Upsampling of the veiling glare may be applied.
- the luminance values of the pixels within the one or more subareas are decreased.
- the adjusted target image is displayed on the display device.
- EXAMPLE IMPLEMENTATION 1. The basic concept is to take continuous video, ambient light, and motion sensor data from a mobile device, and use it to deduce in real time the current reflections seen by the user on his or her screen. The head position is simultaneously tracked in order to reproject the bright regions as seen by the front-facing camera.
- the main challenge with veil subtraction lies in accurately predicting the reflection image as seen from the viewer's perspective. If the reflection image is in the wrong place, the result may be degraded by subtracting the wrong veil, as shown in Figures 8a and 8b.
- Veil estimation requires (a) knowing where the viewer's eyes are located and (b) knowing the brightness and position of significant reflection sources, both relative to the display.
- the device being used is equipped with a front-facing camera, an ambient light sensor, and motion sensors to indicate when and how the display is moving. These features are typical on current mobile devices and are likely to be available with improved specifications in the future, but current capabilities are sufficient. A further aim is to keep the computational complexity low so as not to tax the CPU, GPU, or battery too heavily.
- Edges that move strongly in the wrong direction are ignored as they must be moving in the scene. Edges with the least motion are assumed to be at infinity (vanishing point or horizon). Edges with greater motion are assumed to be nearer, and their distance is computed based on parallax against the horizon line and intrinsic camera parameters.
- Reprojection is also simplified: a small number of discrete positions are moved along a set of 3-D contours, with in-filling to estimate the new highlight locations. This is designed as a lightweight process.
- a threshold is set empirically based on the captured image pixel value at the bridge of the user's nose. Since it is expected this will be proportional to the viewer's eye adaptation and already in camera brightness units, whatever the exposure happens to be, this serves as a convenient way to set the threshold.
- a square area equal to the inter-ocular spacing is averaged, which covers most of the nose, some of the forehead, and roughly half of each eye.
- the highlight threshold is set to a multiple (ex: 100 times) of this average, or a factor (ex: 0.92) of the maximum image value in a [0,1] range, whichever is smaller.
- Down-sampled captured-image pixels that are above the threshold are marked in a 2-D bitmap. This bitmap is subsequently eroded and dilated to remove isolated highlights that would be too small to remedy, as sketched below.
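A sketch of the erode-then-dilate step (a morphological opening). The 3x3 neighborhood is an assumption, as the text does not specify one; border pixels are left unset for brevity.

```cpp
#include <vector>

using Bitmap = std::vector<unsigned char>;  // row-major, 1 = highlight pixel

static Bitmap morph(const Bitmap& in, int w, int h, bool erode) {
    Bitmap out(in.size(), 0);
    for (int j = 1; j < h - 1; ++j)
        for (int i = 1; i < w - 1; ++i) {
            int acc = erode ? 1 : 0;
            for (int dj = -1; dj <= 1; ++dj)
                for (int di = -1; di <= 1; ++di) {
                    int v = in[(j + dj) * w + (i + di)];
                    acc = erode ? (acc & v) : (acc | v);  // AND = erode, OR = dilate
                }
            out[j * w + i] = static_cast<unsigned char>(acc);
        }
    return out;
}

// Erode then dilate: isolated highlights smaller than the neighborhood vanish,
// while larger highlight regions keep their original extent.
Bitmap removeSmallHighlights(const Bitmap& mask, int w, int h) {
    return morph(morph(mask, w, h, /*erode=*/true), w, h, /*erode=*/false);
}
```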
- the front camera geometry and determined head position is used to extrapolate the view behind the screen that corresponds to the reflected image from the user's perspective.
- distance estimates for all the scene pixels are needed, which are obtained from a call-back function. In the demo for a car, this function uses the mock-up car's window geometry.
- the virtual view that would not require any reprojection corresponds to an impossible position behind the display.
- Reprojecting highlights captured by the front camera depends on the distances to objects in the scene.
- the distance to the viewer's head outline can be estimated from eye-tracking data, and other distances based on a fixed automobile demo geometry.
- the diagram shown also simplifies the problem by showing only one eye. Since it is assumed that the viewer has two eyes, the reprojection is performed twice and the results are overlaid. A 50% factor may be used for each highlight after normalization based on the ambient sensor value.
- the overall strategy is to subtract the veil due to highlights where target image values permit, and raise other values as necessary to subtract the veil in target regions that would otherwise be too dark for subtraction. This fits the goal of maintaining local contrast despite highlights, at the expense of manipulating brightness in some highlight regions.
- the overall effect is an interesting one, which sits visually between emissive and reflective displays. In brighter regions of the target image, where veil subtraction just works, the veil largely disappears and colors are restored. In darker regions, the image highlight is like dappled light on a reflection print, bringing up local brightness while preserving contrast and color appearance. This ends up being much less objectionable than other manipulations tested.
- the highlight mitigation method can be described by a formula in terms of the following quantities:
- V_avg: the local average luminance (Y) of the veiling glare
- Target image values are converted to single-channel floating-point in a [0,1] range.
- the L_low and L_high values are calculated within small regions of the target image and are used to set the local multiplier m.
- This down-sampled multiplier image, as well as the veil image, is smoothed (blurred) by a certain amount to avoid the appearance of artificial boundaries.
- the headroom constant k may be used to provide additional range on displays that can boost small regions to bright values but have difficulty maintaining high total output, such as OLED devices. Settings above 1.0 will slightly dim the display everywhere that added brightness is not needed to compensate for highlights. A speculative sketch of this compensation step follows.
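The formula itself does not survive in this text, so the sketch below is speculative: it implements the stated strategy (brighten dark regions just enough that the local veil can be subtracted without clipping, capped by the headroom constant k) rather than the patent's actual expression. The function names and the 1e-3 divide guard are assumptions.

```cpp
#include <algorithm>

// L_low / L_high: darkest and brightest target values in a small region, in
// [0,1]. V_avg: local average veiling-glare luminance in the same units.
float localMultiplier(float lLow, float lHigh, float vAvg, float k = 1.0f) {
    // Brighten enough that the darkest local value keeps its intended level
    // after subtraction: m * L_low - V_avg >= L_low  =>  m >= 1 + V_avg/L_low.
    float need = 1.0f + vAvg / std::max(lLow, 1e-3f);
    // But never push the brightest local value past the display range k.
    float limit = k / std::max(lHigh, 1e-3f);
    return std::max(1.0f, std::min(need, limit));
}

// Per-pixel compensation: scale the target, then subtract the local veil.
float compensatePixel(float target, float veil, float m) {
    return std::min(1.0f, std::max(0.0f, m * target - veil));
}
```

Per the preceding bullets, the multiplier image and the veil would both be blurred before application to avoid visible boundaries; that smoothing is omitted here.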
- EXAMPLE IMPLEMENTATION 2
- the automotive application is a constrained subproblem that avoids the need to estimate the distances to highlight boundaries in the scene, since the rough geometry of the vehicle is known. The viewer's head position still needs to be tracked based on front-camera data. For example, Android comes with built-in calls that perform this task.
- the viewer's eye position together with the camera field of view and position with respect to the display area are used to reproject bright region locations to where they are expected to appear in the reflected screen.
- This step is performed on a pixel-by-pixel basis in over-threshold regions of the captured image, but at a reduced resolution to maintain responsiveness. In this way, the system responds to shifts in the viewer's head position and to changes in the scene behind the viewer. Some delay (ex: on the order of a fraction of a second) is acceptable.
- the example implementation seeks to simulate an in-dash display roughly positioned in front of the driver.
- Static parameters such as display resolution, size, screen reflectance, and front camera geometry and intrinsics (FOV, etc.) are assumed to be known in advance.
- the first C++ class is GetHighlights, and its purpose is to determine the position and relative intensity of reflected highlights visible to the viewer. All of the above information is needed except for the target display image (f), which is applied in another class, HideHighlights.
- the GetHighlights class performs the following operations on the input:
- step (5) is specifically designed to compensate for this lack of calibration, substituting the absolute value from the ambient sensor and inferring that most of the measured light is represented somewhere in the image, even if it shows only as white. By scaling the highlights by the ambient sensor reading, the recorded highlights are brought back into roughly the correct range.
- the only assumption is that the exposure is bright enough to track the user's eye positions and dim enough that everything else is not completely blown out. In cases where there is nothing significantly brighter than the viewer's face, no highlights will be returned and the loop can be paused until the ambient sensor detects a change in the lighting.
- the result produced by the GetHighlights class is a low-resolution image matching the display's aspect ratio with the highlights the viewer is expected to see reflected at the moment the front image was captured.
- This estimated highlight image then gets passed, along with the target display image (f), to the HideHighlights class to perform the following steps: (6) compute a multiplier at each display pixel that brightens the image enough to subtract the offending highlights, as described in the Method section.
- for step (6), how pixel values translate to absolute luminance on the display needs to be known. This should be determined by the brightness setting available in Android, but there seems to be a complex, dynamic relation between this setting and the actual pixel luminances on the OLED display. A hedged interface sketch of the two classes follows.
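A hedged interface sketch of the two classes: only the names GetHighlights and HideHighlights come from the text. The Frame type, member signatures, and the 1000-lux reference level are illustrative assumptions.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Frame { int w = 0, h = 0; std::vector<float> y; };

class GetHighlights {
public:
    // Produces a low-resolution highlight image matching the display's
    // aspect ratio. The thresholding and reprojection stages are elided;
    // only the step (5) ambient normalization is shown.
    Frame estimate(Frame highlights, float ambientLux) const {
        for (float& v : highlights.y)
            v *= ambientLux / 1000.f;  // 1000 lux: assumed reference level
        return highlights;
    }
};

class HideHighlights {
public:
    // Step (6): brighten the target just enough to subtract the highlights
    // (see the multiplier sketch in the Method section above).
    Frame apply(const Frame& target, const Frame& highlights, float m) const {
        Frame out = target;
        for (std::size_t i = 0; i < out.y.size(); ++i)
            out.y[i] = std::max(0.f, m * out.y[i] - highlights.y[i]);
        return out;
    }
};
```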
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Hardware Design (AREA)
- Computer Graphics (AREA)
- Controls And Circuits For Display Device (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662436667P | 2016-12-20 | 2016-12-20 | |
PCT/CA2017/051526 WO2018112609A1 (en) | 2016-12-20 | 2017-12-18 | System and method for compensation of reflection on a display device |
Publications (4)
Publication Number | Publication Date |
---|---|
EP3559900A1 true EP3559900A1 (en) | 2019-10-30 |
EP3559900A4 EP3559900A4 (en) | 2020-06-17 |
EP3559900B1 EP3559900B1 (en) | 2021-08-18 |
EP3559900B8 EP3559900B8 (en) | 2021-09-22 |
Family
ID=62624571
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17883609.4A Active EP3559900B8 (en) | 2016-12-20 | 2017-12-18 | System and method for compensation of reflection on a display device |
Country Status (6)
Country | Link |
---|---|
US (3) | US11250811B2 (en) |
EP (1) | EP3559900B8 (en) |
JP (1) | JP7181202B2 (en) |
CN (1) | CN110235171B (en) |
CA (1) | CA3047805A1 (en) |
WO (1) | WO2018112609A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2018112609A1 (en) | 2016-12-20 | 2018-06-28 | Irystec Software Inc. | System and method for compensation of reflection on a display device |
US11159737B2 (en) * | 2019-10-14 | 2021-10-26 | Google Llc | Exposure change control in low light environments |
EP4053682A1 (en) * | 2021-03-01 | 2022-09-07 | Nokia Technologies Oy | User device screen |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5442484A (en) | 1992-01-06 | 1995-08-15 | Mitsubishi Denki Kabushiki Kaisha | Retro-focus type lens and projection-type display apparatus |
US5854661A (en) | 1997-09-30 | 1998-12-29 | Lucent Technologies Inc. | System and method for subtracting reflection images from a display screen |
US6411306B1 (en) | 1997-11-14 | 2002-06-25 | Eastman Kodak Company | Automatic luminance and contrast adustment for display device |
US20040070565A1 (en) * | 2001-12-05 | 2004-04-15 | Nayar Shree K | Method and apparatus for displaying images |
US7545397B2 (en) | 2004-10-25 | 2009-06-09 | Bose Corporation | Enhancing contrast |
US7725022B2 (en) * | 2006-08-22 | 2010-05-25 | Qualcomm Incorporated | Dynamic automatic exposure compensation for image capture devices |
JP2009031337A (en) * | 2007-07-24 | 2009-02-12 | Funai Electric Co Ltd | Video display device |
JP2009244700A (en) * | 2008-03-31 | 2009-10-22 | Equos Research Co Ltd | Image display |
JP5540537B2 (en) * | 2009-03-24 | 2014-07-02 | 株式会社オートネットワーク技術研究所 | Control device, control method, and computer program |
US9380292B2 (en) * | 2009-07-31 | 2016-06-28 | 3Dmedia Corporation | Methods, systems, and computer-readable storage media for generating three-dimensional (3D) images of a scene |
US20120229487A1 (en) * | 2011-03-11 | 2012-09-13 | Nokia Corporation | Method and Apparatus for Reflection Compensation |
US8559753B2 (en) | 2011-09-23 | 2013-10-15 | The Boeing Company | Reflection removal system |
JP5443533B2 (en) * | 2012-03-22 | 2014-03-19 | 株式会社東芝 | Image processing apparatus, image display apparatus, and image processing method |
CN104584113B (en) * | 2012-08-15 | 2017-03-08 | 富士胶片株式会社 | Display device |
JP2015022525A (en) * | 2013-07-19 | 2015-02-02 | 富士通株式会社 | Information processing device, method for detecting subject portion, and program |
KR20150039458A (en) * | 2013-10-02 | 2015-04-10 | 삼성전자주식회사 | Display apparatus and control method for the same |
JP6432159B2 (en) * | 2014-05-22 | 2018-12-05 | 凸版印刷株式会社 | Information display device, information display method, and information display program |
US9645008B2 (en) * | 2014-08-25 | 2017-05-09 | Apple Inc. | Light sensor windows for electronic devices |
WO2018112609A1 (en) | 2016-12-20 | 2018-06-28 | Irystec Software Inc. | System and method for compensation of reflection on a display device |
-
2017
- 2017-12-18 WO PCT/CA2017/051526 patent/WO2018112609A1/en unknown
- 2017-12-18 CN CN201780079262.3A patent/CN110235171B/en active Active
- 2017-12-18 US US16/471,156 patent/US11250811B2/en active Active
- 2017-12-18 JP JP2019533169A patent/JP7181202B2/en active Active
- 2017-12-18 CA CA3047805A patent/CA3047805A1/en active Pending
- 2017-12-18 EP EP17883609.4A patent/EP3559900B8/en active Active
-
2022
- 2022-02-14 US US17/671,492 patent/US11783796B2/en active Active
-
2023
- 2023-10-09 US US18/377,948 patent/US20240119915A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US20200027423A1 (en) | 2020-01-23 |
EP3559900B1 (en) | 2021-08-18 |
US20220208144A1 (en) | 2022-06-30 |
WO2018112609A1 (en) | 2018-06-28 |
CN110235171A (en) | 2019-09-13 |
EP3559900A4 (en) | 2020-06-17 |
JP2020504836A (en) | 2020-02-13 |
JP7181202B2 (en) | 2022-11-30 |
EP3559900B8 (en) | 2021-09-22 |
US20240119915A1 (en) | 2024-04-11 |
US11783796B2 (en) | 2023-10-10 |
US20200258474A2 (en) | 2020-08-13 |
US11250811B2 (en) | 2022-02-15 |
CA3047805A1 (en) | 2018-06-28 |
CN110235171B (en) | 2023-12-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11783796B2 (en) | System and method for compensation of reflection on a display device | |
EP3602248B1 (en) | Selective application of reprojection processing on layer sub-regions for optimizing late stage reprojection power | |
US11480804B2 (en) | Distributed foveated rendering based on user gaze | |
US10380802B2 (en) | Projecting augmentation images onto moving objects | |
EP3827416B1 (en) | Lighting estimation for augmented reality | |
US9652662B2 (en) | Image processing device and image processing method | |
CN112805755B (en) | Information processing apparatus, information processing method, and recording medium | |
EP3065107B1 (en) | Coherent motion estimation for stereoscopic video | |
US20180144446A1 (en) | Image processing apparatus and method | |
US11308321B2 (en) | Method and system for 3D cornea position estimation | |
US11170578B1 (en) | Occlusion detection | |
US11544910B2 (en) | System and method for positioning image elements in augmented reality system | |
CN114365077A (en) | Viewer synchronized illumination sensing | |
CN114450185A (en) | Image display device, display control method, program, and recording medium | |
CN112106115A (en) | Method of estimating light for augmented reality and electronic device thereof | |
Ward et al. | 75‐3: Reducing Glare from Reflected Highlights in Mobile and Automotive Displays | |
NZ756028B2 (en) | Selective application of reprojection processing on layer sub-regions for optimizing late stage reprojection power |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20190702 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20200520 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G06T 7/55 20170101ALI20200514BHEP Ipc: G06T 5/50 20060101ALI20200514BHEP Ipc: G06T 5/00 20060101AFI20200514BHEP Ipc: G09G 5/36 20060101ALI20200514BHEP |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: GRANT OF PATENT IS INTENDED |
|
INTG | Intention to grant announced |
Effective date: 20210318 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE PATENT HAS BEEN GRANTED |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R081 Ref document number: 602017044499 Country of ref document: DE Owner name: FAURECIA LRYSTEC INC., MONTREAL, CA Free format text: FORMER OWNER: IRYSTEC SOFTWARE INC, MONTREAL, QUEBEC, CA |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PK Free format text: BERICHTIGUNG B8 Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602017044499 Country of ref document: DE |
|
RAP2 | Party data changed (patent owner data changed or rights of a patent transferred) |
Owner name: FAURECIA IRYSTEC INC. |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D Ref country code: AT Ref legal event code: REF Ref document number: 1422313 Country of ref document: AT Kind code of ref document: T Effective date: 20210915 |
|
REG | Reference to a national code |
Ref country code: SE Ref legal event code: TRGR |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG9D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20210818 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1422313 Country of ref document: AT Kind code of ref document: T Effective date: 20210818 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211118 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211118 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211220 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20211119 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602017044499 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20220519 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20211231 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20211218 Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20211218 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20211231 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LI Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20211231 Ref country code: CH Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20211231 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20171218 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20231124 Year of fee payment: 7 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: SE Payment date: 20231121 Year of fee payment: 7 Ref country code: FR Payment date: 20231122 Year of fee payment: 7 Ref country code: DE Payment date: 20231121 Year of fee payment: 7 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20210818 |