EP2845165A2 - Ambient light alert for image sensor
Info
- Publication number
- EP2845165A2 (application EP13727016.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- ambient light
- problematic
- light source
- camera component
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F9/00—Arrangements for program control, e.g. control units
- G06F9/06—Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
- G06F9/44—Arrangements for executing specific programs
- G06F9/451—Execution arrangements for user interfaces
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30168—Image quality inspection
Definitions
- Gated three-dimensional (3-D) cameras, for example time-of-flight (TOF) cameras, provide distance measurements to objects in a scene by illuminating the scene and capturing reflected light from the illumination. The distance measurements make up a depth map of the scene, from which a 3-D image of the scene is generated.
- Ambient lighting of a captured scene can interfere with the light provided by the 3-D camera and can result in incorrect distance measurements.
- "ambient light” is any light not supplied by the 3-D camera. It is therefore known to compensate for moderate levels of ambient light.
- Typically, the 3-D camera captures a frame of ambient light while light from the 3-D camera is turned off or otherwise not received by the camera. The measured ambient light is thereafter subtracted from the light captured when the camera's illumination is on, to allow for accurate distance measurement based on light from the camera alone; a minimal sketch of this subtraction follows.
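A minimal sketch of this ambient-frame subtraction, assuming per-photopixel intensity arrays and a simple clamp at zero; the array names and dtypes are illustrative, not taken from the patent:

```python
import numpy as np

def subtract_ambient(active_frame: np.ndarray, ambient_frame: np.ndarray) -> np.ndarray:
    # active_frame: light captured while the camera's illumination is on
    # (pulsed + ambient); ambient_frame: light captured with it off.
    # Work in a wider signed type so sensor noise cannot wrap around,
    # then clamp negatives to zero.
    corrected = active_frame.astype(np.int32) - ambient_frame.astype(np.int32)
    return np.clip(corrected, 0, None).astype(active_frame.dtype)
```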
- Embodiments of the present technology relate to an image camera component and its method of operation.
- In embodiments, the image camera component detects when ambient light within its field of view interferes with its ability to correctly identify distances to objects within that field of view.
- Upon such detection, the image camera component may cause an alert to be generated so that a user can ameliorate the problematic ambient light source.
- The alert may include displaying a representation of the problematic ambient light source, and a position of the user relative to the problematic ambient light source.
- The alert may further include an indication of the degree of interference of the problematic ambient light source.
- The alert may further suggest an action to ameliorate the problem.
- In one example, the present technology relates to a method of detecting a problematic ambient light source using an image camera component capturing a field of view, the method comprising: (a) measuring ambient light within the field of view; (b) determining whether the amount of ambient light measured in said step (a) interferes with the operation of the image camera component; and (c) alerting a user as to the existence of the problematic ambient light source if it is determined in said step (b) that the amount of ambient light measured in said step (a) interferes with the operation of the image camera component.
- A further example of the present technology relates to a method of detecting a problematic ambient light source using an image camera component measuring distances to objects within a field of view, the method comprising: (a) measuring ambient light within the field of view; (b) determining whether photopixels of the image camera component are receiving ambient light at levels which prevent the image camera component from properly measuring distances to objects within the field of view; and (c) alerting a user as to the existence of the problematic ambient light source if it is determined in said step (b) that photopixels of the image camera component are receiving ambient light at levels which prevent the image camera component from properly measuring distances to objects within the field of view.
- A further example relates to a 3-D camera, the 3-D camera comprising: a photosurface including a plurality of pixels capable of measuring ambient light; a processor for processing data received from the photosurface; and an ambient light feedback engine, executed by the processor, for identifying a problematic ambient light source within the field of view from data received from the photosurface, and for alerting a user of the problematic ambient light source when identified so that the user can intervene to ameliorate the problem caused by the problematic ambient light source.
- Figure 1A illustrates an example embodiment of a target recognition, analysis, and tracking system in which embodiments of the technology can operate.
- Figure 1B illustrates a further example embodiment of a target recognition, analysis, and tracking system in which embodiments of the technology can operate.
- Figure 2 shows a block diagram of an example of a capture device that may be used in the target recognition, analysis, and tracking system in which embodiments of the technology can operate.
- Figure 3 schematically shows an embodiment of a gated 3-D camera which can be used to measure distances to a scene.
- Figure 4 is a flowchart illustrating operation of an embodiment of the present technology.
- Figure 5 is a screen illustration indicating a problematic ambient light source to the user.
- Figure 6 is an illustration of a user interacting with the target recognition, analysis, and tracking system after correction of the problematic ambient light source.
- The camera includes a feedback system which measures ambient light and determines the presence of a source of ambient light in its field of view (FOV) that is disruptive of satisfactory 3-D operation.
- Such a source of ambient light is at times referred to herein as a problematic ambient light source.
- The feedback system may generate an alert to indicate the presence of the problematic source to a user of the 3-D camera.
- The alert may be provided on a visual display which indicates the location of the source so that the user can remove it, or reduce the intensity of the disruption it causes.
- The feedback system may further indicate an example of a corrective action to be undertaken.
- Embodiments of the feedback system of the present disclosure may be provided as part of a time-of-flight 3-D camera used to track moving targets in a target recognition, analysis, and tracking system 10.
- The system 10 may provide a natural user interface (NUI) for gaming and other applications.
- However, the feedback system of the present disclosure may be used in a variety of applications other than a target recognition, analysis, and tracking system 10.
- Moreover, the feedback system may be used in a variety of 3-D cameras other than time-of-flight cameras which use light to measure a distance to objects in the FOV.
- Figures 1A and 1B show a target recognition, analysis, and tracking system 10 which may be used to recognize, analyze, and/or track a human target such as the user 18.
- Embodiments of the target recognition, analysis, and tracking system 10 include a computing device 12 for executing a gaming or other application.
- The computing device 12 may include hardware components and/or software components such that computing device 12 may be used to execute applications such as gaming and non-gaming applications.
- Computing device 12 may include a processor such as a standardized processor, a specialized processor, a microprocessor, or the like that may execute instructions stored on a processor-readable storage device for performing processes of the device 10 when active and running on full power.
- The system 10 further includes a capture device 20 for capturing image and audio data relating to one or more users and/or objects sensed by the capture device.
- The capture device 20 may be used to capture information relating to body and hand movements and/or gestures and speech of one or more users, which information is received by the computing environment and used to render, interact with, and/or control aspects of a gaming or other application. Examples of the computing device 12 and capture device 20 are explained in greater detail below.
- Embodiments of the target recognition, analysis and tracking system 10 may be connected to an audio/visual (A/V) device 16 having a display 14.
- The device 16 may, for example, be a television, a monitor, a high-definition television (HDTV), or the like that may provide game or application visuals and/or audio to a user.
- The computing device 12 may include a video adapter such as a graphics card and/or an audio adapter such as a sound card that may provide audio/visual signals associated with the game or other application.
- The A/V device 16 may receive the audio/visual signals from the computing device 12 and may then output the game or application visuals and/or audio associated with the audio/visual signals to the user 18.
- The audio/visual device 16 may be connected to the computing device 12 via, for example, an S-Video cable, a coaxial cable, an HDMI cable, a DVI cable, a VGA cable, a component video cable, or the like.
- The computing device 12, the A/V device 16 and the capture device 20 may cooperate to render an avatar or on-screen character 19 on display 14.
- Fig. 1A shows a user 18 playing a soccer gaming application. The user's movements are tracked and used to animate the movements of the avatar 19.
- The avatar 19 mimics the movements of the user 18 in real-world space, so that the user 18 may perform movements and gestures which control the movements and actions of the avatar 19 on the display 14.
- In Fig. 1B, the capture device 20 is used in a NUI system where, for example, a user 18 is scrolling through and controlling a user interface 21 with a variety of menu options presented on the display 14.
- The computing device 12 and the capture device 20 may be used to recognize and analyze movements and gestures of a user's body, and such movements and gestures may be interpreted as controls for the user interface.
- Fig. 2 illustrates an example embodiment of the capture device 20 that may be used in the target recognition, analysis, and tracking system 10.
- The capture device 20 may be configured to capture video having a depth image that may include depth values via any suitable technique including, for example, time-of-flight, structured light, or the like.
- The capture device 20 may organize the calculated depth information into "Z layers," or layers that may be perpendicular to a Z axis extending from the depth camera along its line of sight.
- X and Y axes may be defined as being perpendicular to the Z axis.
- The Y axis may be vertical and the X axis may be horizontal. Together, the X, Y and Z axes define the 3-D real-world space captured by capture device 20.
- The capture device 20 may include an image camera component 22.
- The image camera component 22 may be a depth camera that may capture the depth image of a scene.
- The depth image may include a two-dimensional (2-D) pixel area of the captured scene, where each pixel in the 2-D pixel area may represent a depth value such as a length or distance in, for example, centimeters, millimeters, or the like of an object in the captured scene from the camera.
- The image camera component 22 may include an IR light component 24, a 3-D camera 26, and an RGB camera 28 that may be used to capture the depth image of a scene.
- The IR light component 24 of the capture device 20 may emit an infrared light onto the scene and may then use sensors (described in greater detail below with reference to Fig. 3) to detect the backscattered light from the surface of one or more targets and objects in the scene using, for example, the 3-D camera 26 and/or the RGB camera 28.
- In time-of-flight analysis, pulsed infrared light may be used such that the time between an outgoing light pulse and a corresponding incoming light pulse may be measured and used to determine a physical distance from the capture device 20 to a particular location on the targets or objects in the scene. Additionally, in other example embodiments, the phase of the outgoing light wave may be compared to the phase of the incoming light wave to determine a phase shift. The phase shift may then be used to determine a physical distance from the capture device 20 to a particular location on the targets or objects. A worked sketch of both calculations follows.
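Both measurements reduce to simple formulas; the sketch below works them out (the constants and example values are illustrative only, not values from the patent):

```python
from math import pi

C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(round_trip_s: float) -> float:
    # The pulse travels out to the target and back, so halve the round trip.
    return C * round_trip_s / 2.0

def distance_from_phase_shift(phase_rad: float, mod_freq_hz: float) -> float:
    # For a modulated wave, a phase shift of 2*pi corresponds to one full
    # modulation period of round-trip travel: d = c * phi / (4 * pi * f).
    return C * phase_rad / (4.0 * pi * mod_freq_hz)

print(distance_from_round_trip(33e-9))          # 33 ns round trip -> ~4.95 m
print(distance_from_phase_shift(pi / 2, 20e6))  # pi/2 shift at 20 MHz -> ~1.87 m
```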
- Time-of-flight analysis may also be used to indirectly determine a physical distance from the capture device 20 to a particular location on the targets or objects by analyzing the intensity of the reflected beam of light over time via various techniques including, for example, shuttered light pulse imaging.
- In another example embodiment, the capture device 20 may use structured light to capture depth information.
- In such an analysis, patterned light (i.e., light displayed as a known pattern such as a grid pattern or a stripe pattern) may be projected onto the scene; upon striking the surface of one or more targets or objects in the scene, the pattern may become deformed in response.
- Such a deformation of the pattern may be captured by, for example, the 3-D camera 26 and/or the RGB camera 28 and may then be analyzed to determine a physical distance from the capture device 20 to a particular location on the targets or objects.
- As noted above, ambient light may affect the measurements taken by the 3-D camera 26 and/or RGB camera 28.
- The capture device 20 may further include an ambient light feedback engine 100, which is a software engine for detecting a source of ambient light and alerting the user as to the location of that source. Further details of the ambient light feedback engine 100 are explained below. In alternative embodiments, the ambient light feedback engine 100 may be implemented in part or in whole on computing device 12.
- The capture device 20 may further include a processor 32 that may be in operative communication with the image camera component 22 and ambient light feedback engine 100.
- The processor 32 may include a standardized processor, a specialized processor, a microprocessor, or the like that may execute instructions, which may include instructions for receiving the depth image, determining whether a suitable target may be included in the depth image, converting the suitable target into a skeletal representation or model of the target, or any other suitable instruction.
- The capture device 20 may further include a memory component 34 that may store the instructions that may be executed by the processor 32, images or frames of images captured by the 3-D camera or RGB camera, or any other suitable information, images, or the like.
- The memory component 34 may include random access memory (RAM), read-only memory (ROM), cache, Flash memory, a hard disk, or any other suitable storage component.
- The memory component 34 may be a separate component in communication with the image camera component 22 and the processor 32.
- Alternatively, the memory component 34 may be integrated into the processor 32 and/or the image camera component 22.
- The capture device 20 may be in communication with the computing device 12 via a communication link 36.
- The communication link 36 may be a wired connection including, for example, a USB connection, a Firewire connection, an Ethernet cable connection, or the like, and/or a wireless connection such as a wireless 802.11b, g, a, or n connection.
- The computing device 12 may provide a clock to the capture device 20 that may be used to determine when to capture, for example, a scene via the communication link 36.
- The capture device 20 may provide the depth information and images captured by, for example, the 3-D camera 26 and/or the RGB camera 28. With the aid of these devices, a partial skeletal model may be developed, with the resulting data provided to the computing device 12 via the communication link 36.
- Fig. 3 schematically shows an embodiment of a gated 3-D image camera component 22 which can be used to measure distances to a scene 130 having objects schematically represented by objects 131 and 132.
- The camera component 22, which is represented schematically, comprises a lens system, represented by a lens 121; a photosurface 300 with at least two capture areas on which the lens system images the scene; and a suitable light source 24.
- Examples of a suitable light source are a laser or an LED, or an array of lasers and/or LEDs, controllable by control circuitry 124 to illuminate scene 130 with pulses of light.
- Control circuitry 124 comprises clock logic, or has access to a clock, to generate the timing necessary for the synchronization.
- The control circuitry 124 comprises a laser or LED drive circuit using, for example, a current or a voltage which drives electronic circuitry to drive the light source 24 at the predetermined pulse width.
- The control circuitry 124 also has access to a power supply (not shown) and logic for generating different voltage levels as needed.
- The control circuitry 124 may additionally or alternatively have access to the different voltage levels and logic for determining the timing and conductive paths to which to apply the different voltage levels for turning ON and OFF the respective image capture areas.
- In operation, control circuitry 124 controls light source 24 to emit a train of light pulses, schematically represented by a train 140 of square light pulses 141 having a given pulse width, to illuminate scene 130.
- A train of light pulses is typically used because a single light pulse may not provide sufficient energy for enough light to be reflected by objects in the scene back to the camera to provide satisfactory distance measurements to the objects.
- The intensity of the light pulses, and their number in a light pulse train, are set so that the amount of reflected light captured from all the light pulses in the train is sufficient to provide acceptable distance measurements to objects in the scene.
- Typically, the radiated light pulses are infrared (IR) or near-infrared (NIR) light pulses.
- The short capture period may have a duration about equal to the pulse width. In one example, the short capture period may be 10-15 ns and the pulse width may be about 10 ns.
- The long capture period may be 30-45 ns in this example. In another example, the short capture period may be 20 ns and the long capture period may be about 60 ns. These periods are by way of example only, and the time periods in embodiments may vary outside of these ranges and values.
- Control circuitry 124 turns ON, or gates ON, the respective image capture area of photosurface 300 based on whether a gated or ungated period is beginning. When the image capture area is gated ON, light-sensitive or light-sensing elements, such as photopixels, capture light.
- The capture of light refers to receiving light and storing an electrical representation of it.
- The control circuitry 124 sets the short capture period to a duration equal to the light pulse width.
- The light pulse width, the short capture period duration, and a delay time T define a spatial "imaging slice" of scene 130 bounded by minimum and maximum boundary distances.
- During gated capture periods, the camera captures light reflected from the scene only for objects located between the lower bound distance and the upper bound distance; these bounds follow directly from the gate timing, as sketched below. During the ungated period, the camera tries to capture all the light reflected from the pulses by the scene that reaches the camera, for normalization of the gated light image data.
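Under the stated assumption that the gate duration equals the pulse width, the slice bounds can be computed from the gate delay alone; a sketch (the variable names are illustrative):

```python
C = 299_792_458.0  # speed of light in m/s

def imaging_slice(gate_delay_s: float, pulse_width_s: float) -> tuple[float, float]:
    # A reflected pulse overlaps a gate that opens at delay T for a duration
    # equal to the pulse width W only if the object lies between
    # c*(T - W)/2 and c*(T + W)/2.
    lower = max(0.0, C * (gate_delay_s - pulse_width_s) / 2.0)
    upper = C * (gate_delay_s + pulse_width_s) / 2.0
    return lower, upper

print(imaging_slice(30e-9, 10e-9))  # 10 ns pulse, 30 ns delay -> ~(3.0 m, 6.0 m)
```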
- During an additional ambient light capture period, the light from light component 24 is switched off and the pixels receive only ambient light.
- The ambient light may be measured and subtracted from the light (pulsed and ambient) received in the pixels of the photosurface 300, so that the processors may determine distance measurements to objects in the FOV based on light reflected from the light component 24 alone.
- Light reflected by objects in scene 130 from light pulses 141 is schematically represented by trains 145 of light pulses 146 for a few regions 131 and 132 of scene 130.
- The reflected light pulses 146 from objects in scene 130 located in the imaging slice are focused by the lens system 121 and imaged on light-sensitive pixels (or photopixels) 302 of the gated-ON area of the photosurface 300.
- Amounts of light from the reflected pulse trains 145 are imaged on photopixels 302 of photosurface 300 and stored during capture periods for use in determining distances to objects of scene 130 to provide a 3-D image of the scene.
- Control circuitry 124 is communicatively coupled to the processor 32 of the image capture device 20 to communicate messages related to frame timing and frame transfer.
- The stored image data captured by the photosurface 300 is read out to a frame buffer in memory 34 for further processing, for example by the processor 32 and/or computing device 12 of the target recognition, analysis and tracking system 10 shown in Fig. 2.
- As noted above, moderate levels of ambient light may be corrected for when taking distance measurements with image camera component 22. In operation, however, there may be high levels of ambient light on at least portions of the FOV. Generally, where a small number of pixels register too much ambient light, these pixels may be disregarded and the camera component 22 may still return accurate distance measurements. However, where a predetermined number of pixels indicate an amount of ambient light that is too high for correction, the image camera component 22 indicates a malfunction and does not provide distance measurements.
- Embodiments of the present disclosure address this problem by implementing an ambient light feedback engine 100, as shown schematically in Fig. 2, and as now explained with reference to the flowchart of Fig. 4 and the illustrations of Figs. 5 and 6.
- In examples, the ambient light feedback engine 100 may be implemented by processor 32 associated with the image camera component 22.
- Alternatively, the engine 100 may be implemented by processor 32 of capture device 20 and/or by a processor in the computing device 12.
- In a step 200, the amount of light incident on each of the photopixels 302 of photosurface 300 is measured and stored. This may occur during intervals where no light from the IR light component 24 is received on the photosurface 300. Alternatively or additionally, this may occur when the photopixels 302 receive both ambient light and IR light from component 24.
- In a step 204, the ambient light feedback engine 100 determines whether a predetermined number of photopixels have measured ambient light above a threshold value.
- A photopixel receiving ambient light above the threshold is referred to herein as an ambient-saturated photopixel.
- This threshold value for an ambient-saturated photopixel may be an amount of ambient light which prevents accurate determination of the time of flight of the light from the IR component 24 to that photopixel 302. That is, after the interval where ambient light is measured alone, the image camera component 22 is not able to compensate for the ambient light, so that the operation of the image camera component is impaired. In the case of a time-of-flight 3-D camera, this means that the 3-D camera is not able to properly measure distances to objects within the field of view.
- The threshold value for an ambient-saturated photopixel may vary in alternative embodiments. This threshold may be set at a point where ambient light causes even the slightest interference with the determination of distances to objects in the field of view. Alternatively, the threshold may be set at a point where ambient light causes some small but acceptable interference with the determination of distances to objects in the field of view.
- The number of ambient-saturated photopixels 302 which make up the predetermined number of ambient-saturated photopixels may vary.
- The predetermined number of ambient-saturated photopixels may be some number or percentage, for example 10% to 50%, of the total number of photopixels 302 on photosurface 300.
- Alternatively, the predetermined number of ambient-saturated photopixels may be reached when a given percentage of photopixels in a certain cluster of photopixels are ambient-saturated.
- For example, a small lamp in the FOV may provide ambient light which adversely affects only a cluster of photopixels.
- If the percentage of ambient-saturated photopixels in a cluster of photopixels of a given size exceeds some value, for example 50%, this may satisfy the condition of step 204.
- The percentages given above are by way of example only, and may vary above or below those set forth in further embodiments.
- Step 204 may also be satisfied by some combination of the percentage of overall photopixels which are ambient-saturated and the percentage of photopixels within a given cluster that are ambient-saturated; a detection sketch combining both tests follows.
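A sketch of the step 204 determination combining the global and per-cluster tests; the saturation level, fractions, and cluster size are illustrative placeholders, not values from the patent:

```python
import numpy as np

SATURATION_LEVEL = 3500   # ambient reading at which a photopixel counts as saturated
GLOBAL_FRACTION = 0.10    # e.g. 10% of all photopixels on the photosurface
CLUSTER_FRACTION = 0.50   # e.g. 50% of a local cluster
CLUSTER = 32              # cluster edge length in photopixels

def has_problematic_ambient(ambient: np.ndarray) -> bool:
    saturated = ambient >= SATURATION_LEVEL
    # Global test: too many saturated photopixels across the whole photosurface.
    if saturated.mean() >= GLOBAL_FRACTION:
        return True
    # Cluster test: a small bright source (e.g. a lamp) that saturates only
    # a local cluster of photopixels is still detected.
    rows, cols = saturated.shape
    for y in range(0, rows - CLUSTER + 1, CLUSTER):
        for x in range(0, cols - CLUSTER + 1, CLUSTER):
            if saturated[y:y + CLUSTER, x:x + CLUSTER].mean() >= CLUSTER_FRACTION:
                return True
    return False
```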
- If the number of ambient-saturated photopixels is less than the predetermined number in step 204, the engine 100 returns to step 200 for the next measurement of light incident on the photopixels. On the other hand, if the number of ambient-saturated photopixels exceeds the predetermined number in step 204, the engine 100 performs one or more of a variety of steps to notify the user of a problem with an ambient light source in the FOV and, possibly, to suggest corrective action.
- The ambient light feedback engine 100 may notify the user of an excessive ambient light source in the FOV. This notification may be performed by a variety of methods.
- For example, the engine 100 may cause the computing device 12 to display an alert on the display as to the problematic ambient light source.
- Alternatively or additionally, the alert may be audible over speakers associated with the device 10.
- The engine 100 may identify the location of the problematic ambient light source by examining which photopixels 302 are affected. Once the area is identified, the FOV may be shown to the user on display 14 with the problematic ambient light source highlighted. For example, Figs. 1A and 1B show a user 18 in a room with a window 25. The daylight coming in through the window 25 may be providing too much ambient light. As such, in step 212, the engine 100 may cause the computing device 12 to display the FOV with the problematic ambient light source highlighted, as shown for example in Fig. 5. In Fig. 5, the displayed FOV shows highlighting 102 around the window 25 to indicate that it is the source of the problem.
- The problematic ambient light source may be highlighted with an outline 102 around the light source, as shown in Fig. 5, derived for example from a bounding box of the affected photopixels, as sketched below. Alternatively or additionally, the problematic area may be highlighted by shading, as also shown in Fig. 5. The location of the problematic ambient light source may be highlighted in other ways in further embodiments.
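One simple way to derive the highlighted region 102 from the affected photopixels is a bounding box; a sketch under that assumption (a real implementation would likely use connected-component labeling to handle multiple sources):

```python
import numpy as np

def highlight_box(saturated: np.ndarray):
    # Returns (x, y, width, height) of the saturated region in photopixel
    # coordinates, or None if nothing is saturated. The box can then be
    # mapped into display coordinates and drawn as outline 102.
    ys, xs = np.nonzero(saturated)
    if ys.size == 0:
        return None
    return (int(xs.min()), int(ys.min()),
            int(xs.max() - xs.min() + 1), int(ys.max() - ys.min() + 1))
```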
- The view of Fig. 5 may also show the user positioned relative to the problematic light source, to make it easier for the user to identify the location of the problematic ambient light source.
- The view of the scene captured by capture device 20 may be displayed to the user on display 14 from a variety of different perspectives, using known transformation matrices, so that the position of the problematic light source relative to the user may be clearly displayed on display 14.
- The representation of the user and problematic light source displayed to the user may be an animation including an icon representing the highlighted ambient light source and an icon representing the user 18.
- Alternatively, it may be video captured by the capture device 20 showing the user and the problematic ambient light source, with the highlight 102 added to the video.
- In Fig. 5, the view of the user and problematic light source takes up essentially the full display 14.
- Alternatively, the view shown in Fig. 5 may be made smaller, so that it is placed on a portion of the display 14, with the remainder of the screen still showing the original content the user was viewing or interacting with.
- Where there is more than one problematic ambient light source in the FOV, each such problematic area may be identified in steps 200 and 204, and shown to user 18 in step 212.
- In a step 214, the ambient light feedback engine 100 may also determine and display an intensity scale 104 (Fig. 5) indicating the degree, or magnitude, of interference of the problematic ambient light source.
- The processor 32 in the capture device 20 can determine the number of photopixels affected by the problematic ambient light source. The number and proximity of affected photopixels can be translated into a degree of interference, and that degree can be displayed to the user in step 214.
- Fig. 5 shows an intensity scale 104 composed of a number of dots 106.
- The degree of interference can be relayed graphically to the user 18 over display 14 in any of a variety of different ways, including by the length of a bar, a color intensity map, etc.; one possible mapping is sketched below.
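The mapping from affected-photopixel count to dot count can be a simple linear bucketing; an illustrative sketch (the scale and bucket boundaries are assumptions):

```python
def interference_dots(saturated_fraction: float, max_dots: int = 5) -> int:
    # Map the fraction of ambient-saturated photopixels (0.0-1.0) onto
    # 1..max_dots dots for an intensity scale like 104 in Fig. 5.
    dots = 1 + int(saturated_fraction * max_dots)
    return max(1, min(max_dots, dots))
```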
- Step 214 and the intensity scale 104 may be omitted in further embodiments.
- The ambient light feedback engine 100 may further suggest one or more corrective actions in steps 218-230. For example, given the measured amount of ambient light and the shape of the pattern of ambient light, the engine 100 may be able to characterize the source of light by comparison to data representing predefined light sources stored in memory (memory 34 in capture device 20, or memory within the computing device). For example, where it is determined that the problematic ambient light is in the shape of a rectangle on a wall within the FOV, the engine 100 may interpret this as a window. Where it is determined that the problematic ambient light is in the shape of a circle or oval within the FOV, the engine 100 may interpret this as a lamp or light fixture. Other examples of known ambient light sources are contemplated. A rough classification sketch follows.
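A rough sketch of such a shape comparison, using how fully the saturated region fills its bounding box to separate rectangular (window-like) from elliptical (lamp-like) sources; the thresholds are illustrative assumptions, not values from the patent:

```python
import numpy as np

def classify_source(saturated: np.ndarray) -> str:
    ys, xs = np.nonzero(saturated)
    if ys.size == 0:
        return "none"
    box_area = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    fill = ys.size / box_area  # fraction of the bounding box that is saturated
    if fill >= 0.90:
        return "window"                 # a rectangle nearly fills its bounding box
    if fill >= 0.60:
        return "lamp or light fixture"  # an ellipse fills ~pi/4 (~0.785) of it
    return "unknown"
```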
- Having characterized the light source, the engine 100 may suggest a corrective action in step 218.
- Fig. 5 shows a corrective action display 110, which in this example displays the message, "Too much light coming in the window. Try covering the window." It is understood that this specific wording is by way of example, and the concept may be expressed in a wide variety of ways.
- In this example, the user may close a shade 40, as shown in Fig. 6.
- In step 222, the engine 100 checks whether a corrective action was taken. This can be determined by measuring the ambient light on photosurface 300, as explained above. If no corrective action was taken, and there is too much ambient light for accurate distance measurements by camera component 22, the engine 100 may cause the computing device 12 to display an error message in step 224.
- If it is instead determined in step 222 that a corrective action was taken, the engine 100 checks in step 226 whether the corrective action ameliorated the problem of excessive ambient light. Again, this may be performed by measuring the ambient light on photosurface 300. If the problem was successfully corrected, the routine may return to step 200 and begin monitoring light anew. However, if the corrective action did not solve the problem in step 226, the engine 100 can check in step 230 whether other potential corrective actions are available (stored in memory).
- If there are no other available potential corrective actions in step 230, the engine 100 may cause the computing device 12 to display an error message in step 234. If there are further potential corrective actions in step 230, the routine returns to step 218 and displays another potential corrective action. Steps 218 and 230 for suggesting one or more corrective actions may be omitted in further embodiments. A condensed sketch of this loop follows.
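A condensed sketch of steps 218-234, reusing `has_problematic_ambient` from the earlier sketch; it collapses the step 222/226 checks into a single re-measurement, and the callables and wait interval are hypothetical stand-ins for the engine's display and measurement plumbing:

```python
import time

def corrective_action_loop(measure_ambient, suggest, show_error,
                           actions, wait_s: float = 5.0) -> bool:
    for action in actions:        # step 218: suggest a stored corrective action
        suggest(action)
        time.sleep(wait_s)        # give the user time to act, then re-measure
        if not has_problematic_ambient(measure_ambient()):
            return True           # step 226: the action ameliorated the problem
    show_error("Too much ambient light for accurate distance measurement.")
    return False                  # steps 224/234: no corrective action worked
```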
- The present system thus allows a user to solve the problem of excessive ambient light, which in the past could render a device 10 inoperable.
- In particular, a user is alerted as to the existence and location of a problematic ambient light source so that the user can intervene to remove the ambient light source and solve the problem.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Software Systems (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Optics & Photonics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Stroboscope Apparatuses (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/316,211 US9244583B2 (en) | 2011-12-09 | 2011-12-09 | Adjusting user interface screen order and composition |
PCT/US2013/025479 WO2013086543A2 (fr) | 2011-12-09 | 2013-02-11 | Ambient light alert for image sensor
Publications (2)
Publication Number | Publication Date |
---|---|
EP2845165A2 true EP2845165A2 (fr) | 2015-03-11 |
EP2845165A4 EP2845165A4 (fr) | 2015-08-19 |
Family
ID=52462765
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13727016.1A Withdrawn EP2845165A4 (fr) | 2011-12-09 | 2013-02-11 | Ambient light alert for image sensor |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP2845165A4 (fr) |
WO (1) | WO2013086543A2 (fr) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
IL239919A (en) * | 2015-07-14 | 2016-11-30 | Brightway Vision Ltd | Branded template lighting |
US10282614B2 (en) | 2016-02-18 | 2019-05-07 | Microsoft Technology Licensing, Llc | Real-time detection of object scanability |
WO2017145152A1 (fr) * | 2016-02-24 | 2017-08-31 | Superb Reality Ltd. | Système et procédé de différenciation d'objet dans un espace tridimensionnel |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6816200B1 (en) * | 1998-08-31 | 2004-11-09 | Neostar, Inc. | Method and apparatus for detecting camera sensor intensity saturation |
US7714265B2 (en) * | 2005-09-30 | 2010-05-11 | Apple Inc. | Integrated proximity sensor and light sensor |
JP4395150 (ja) * | 2006-06-28 | 2010-01-06 | 富士フイルム株式会社 | Distance image sensor |
WO2009030419A2 (fr) * | 2007-08-28 | 2009-03-12 | Valeo Schalter Und Sensoren Gmbh | Method and system for evaluating brightness values in captured images, for image-interpreting environment recognition systems, in particular in terms of day/night distinction |
US7907061B2 (en) * | 2007-11-14 | 2011-03-15 | Intersil Americas Inc. | Proximity sensors and methods for sensing proximity |
KR101565969B1 (ko) * | 2009-09-01 | 2015-11-05 | 삼성전자주식회사 | Method and apparatus capable of estimating depth information, and signal processing apparatus including the apparatus |
WO2011059127A1 (fr) * | 2009-11-13 | 2011-05-19 | 한국과학기술연구원 | Infrared detector and detection method using the same |
US8866837B2 (en) * | 2010-02-02 | 2014-10-21 | Microsoft Corporation | Enhancement of images for display on liquid crystal displays |
US9194953B2 (en) * | 2010-10-21 | 2015-11-24 | Sony Corporation | 3D time-of-light camera and method |
KR101669412B1 (ko) * | 2010-11-01 | 2016-10-26 | 삼성전자주식회사 | Method and apparatus for measuring depth information for a 3D camera |
-
2013
- 2013-02-11 WO PCT/US2013/025479 patent/WO2013086543A2/fr active Application Filing
- 2013-02-11 EP EP13727016.1A patent/EP2845165A4/fr not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
WO2013086543A2 (fr) | 2013-06-13 |
WO2013086543A3 (fr) | 2013-08-22 |
EP2845165A4 (fr) | 2015-08-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108370438B (zh) | Range-gated depth camera assembly | |
US10927969B2 (en) | Auto range control for active illumination depth camera | |
CN113556528B (zh) | Method and system for capturing video imagery under low light conditions using light emission by a depth-sensing camera | |
EP2997395B1 (fr) | Reduction of interference from time-of-flight (TOF) systems | |
US9148637B2 (en) | Face detection and tracking | |
CN108463740B (zh) | Depth mapping using structured light and time of flight | |
US20150070489A1 (en) | Optical modules for use with depth cameras | |
CN112189147B (zh) | A time-of-flight (ToF) camera and a ToF method | |
US20150085075A1 (en) | Optical modules that reduce speckle contrast and diffraction artifacts | |
US8605205B2 (en) | Display as lighting for photos or video | |
US10616561B2 (en) | Method and apparatus for generating a 3-D image | |
US20130208091A1 (en) | Ambient light alert for an image sensor | |
US11143879B2 (en) | Semi-dense depth estimation from a dynamic vision sensor (DVS) stereo pair and a pulsed speckle pattern projector | |
WO2017112028A1 (fr) | System and method for improving the signal-to-noise ratio performance of a depth camera system | |
US10325377B2 (en) | Image depth sensing method and image depth sensing apparatus | |
WO2013086543A2 (fr) | Ambient light alert for image sensor | |
TWI526706B (zh) | 影像系統 | |
US20190147280A1 (en) | Image processing method and electronic apparatus for foreground image extraction | |
US9953213B2 (en) | Self discovery of autonomous NUI devices | |
TWI542892B (zh) | 影像系統 |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PUAI | Public reference made under article 153(3) EPC to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012
2014-08-13 | 17P | Request for examination filed | Effective date: 20140813
| AK | Designated contracting states | Kind code of ref document: A2. Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
| AX | Request for extension of the european patent | Extension state: BA ME
| RAP1 | Party data changed (applicant data changed or rights of an application transferred) | Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC
2015-07-22 | A4 | Supplementary search report drawn up and despatched | Effective date: 20150722
| RIC1 | Information provided on IPC code assigned before grant | Ipc: G06F 3/01 20060101ALI20150716BHEP; Ipc: G06F 3/048 20130101ALI20150716BHEP; Ipc: G06T 5/00 20060101AFI20150716BHEP; Ipc: G06F 3/03 20060101ALI20150716BHEP
2015-08-04 | 17Q | First examination report despatched | Effective date: 20150804
| STAA | Information on the status of an EP patent application or granted EP patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN
2015-12-15 | 18D | Application deemed to be withdrawn | Effective date: 20151215
| DAX | Request for extension of the european patent (deleted) |