US20190066366A1 - Methods and Apparatus for Decorating User Interface Elements with Environmental Lighting - Google Patents

Methods and Apparatus for Decorating User Interface Elements with Environmental Lighting Download PDF

Info

Publication number
US20190066366A1
Authority
US
United States
Prior art keywords
image
light
light source
scene
exemplary embodiment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/163,305
Inventor
Bobby Gene Burrough
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US16/163,305
Publication of US20190066366A1
Status: Abandoned

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 13/00 - Animation
    • G06T 13/20 - 3D [Three Dimensional] animation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 - 3D [Three Dimensional] image rendering
    • G06T 15/50 - Lighting effects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 - Manipulating 3D models or images for computer graphics
    • G06T 19/006 - Mixed reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 - Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 - Image reproducers
    • H04N 13/388 - Volumetric displays, i.e. systems where the image is built up from picture elements distributed through a volume
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 - Details of colour television systems
    • H04N 9/12 - Picture reproducers
    • H04N 9/31 - Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3179 - Video signal processing therefor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10016 - Video; Image sequence

Definitions

  • the present invention relates to the operation of image processing systems. More specifically, the present invention relates to the processing of images derived from a surrounding environment.
  • User interfaces have evolved significantly over the past forty years. They have progressed through command-driven terminals, mouse-driven two-dimensional (2D) graphical user interfaces, and touch-driven 2D graphical user interfaces.
  • the computer has displayed information via forms and effects crafted by a user interface (UI) designer or developer. Regions of color, brightness, and contrast are crafted in such a way as to imply, for example, depth and lighting. Such effects help the user visually understand and organize large quantities of information. To date, such effects have been implemented statically in two dimensions. However, the way a graphical user interface is rendered does not react to the lighting of the environment in which the computer is used.
  • UI controls are decorated with lighting effects that are derived from the environment surrounding a device. The net effect is that the controls appear as if they exist in the user's real, physical environment, rather than an unlit, or arbitrarily lit, virtual environment.
  • images of the environment surrounding a device are captured via one or more image sensors. Those images are analyzed via one or more heuristics that recognize positions and characteristics of various light sources in the image. The positions of those light sources relative to the device (or device display) are modeled in a virtual three-dimensional (3D) scene.
  • UI elements to be rendered on a device display are located within the 3D scene and decorated based on the location and characteristics of the light sources. For example, the UI elements are decorated with highlights and/or shadows based on the characteristics of the light sources.
  • the decorated UI elements in the 3D scene are subsequently rendered and displayed, user experiences are improved by increasing immersiveness of the displayed UI elements, reducing cognitive load, and increasing the degree to which the users may visually understand what they are seeing.
  • a mobile computing device with an imaging sensor performs operations to decorate UI elements with environmental lighting.
  • the image sensor captures an image of the device's surroundings, which is then analyzed to find the brightest light sources in the image. That information is then used as a control parameter for a Phong shading algorithm which shades UI elements accordingly.
  • the decorated UI elements are then rendered on a device display.
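  • as a concrete illustration of that shading step, a per-surface Phong evaluation driven by a single detected light could look like the sketch below. This is a minimal example with assumed material coefficients and a hypothetical NumPy-based helper, not the patented implementation.

```python
import numpy as np

def phong_shade(normal, light_dir, view_dir, light_color,
                ambient=0.1, diffuse_k=0.7, specular_k=0.3, shininess=16):
    """Shade one surface point lit by a single directional light (Phong model)."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)      # surface -> light
    v = view_dir / np.linalg.norm(view_dir)        # surface -> viewer
    diffuse = max(float(np.dot(n, l)), 0.0)
    r = 2.0 * np.dot(n, l) * n - l                 # reflection of the light about the normal
    specular = max(float(np.dot(r, v)), 0.0) ** shininess if diffuse > 0.0 else 0.0
    color = (ambient + diffuse_k * diffuse) * np.asarray(light_color) \
            + specular_k * specular * np.asarray(light_color)
    return np.clip(color, 0.0, 1.0)
```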
  • in an exemplary embodiment, a method includes acquiring an image of an environment, analyzing the image using at least one heuristic to determine a light configuration of a light source in the image, and decorating a UI element based on the light configuration of the light source to generate a decorated UI element.
  • in an exemplary embodiment, an apparatus includes an image sensor that acquires an image of an environment, a light detector that analyzes the image using at least one heuristic to determine a light configuration of a light source in the image, and a renderer that decorates a UI element based on the light configuration of the light source to generate a decorated UI element.
  • FIG. 1 shows devices comprising exemplary embodiments of a UI element lighting system
  • FIG. 2 shows a device that includes an exemplary embodiment of a UI element lighting system
  • FIG. 3 shows an exemplary embodiment of a 3D scene that illustrates relationships between a notional light source, a 3D UI element, a 2D UI element, and the resulting visual effects shown in the 3D scene;
  • FIG. 4 shows exemplary embodiments of an image of an environment surrounding a device and a brightness map that illustrates regions of brightness in the image
  • FIG. 5A shows how the brightness map shown in FIG. 4 relates to an image coordinate space
  • FIG. 5B shows an exemplary embodiment of a graph that illustrates a latitudinal and longitudinal relationship between the field of view of an image sensor and the brightness map shown in FIG. 5A ;
  • FIG. 6 shows an exemplary embodiment illustrating how a notional light source is positioned in a 3D scene according to latitude and longitude values determined from the graph shown in FIG. 5B ;
  • FIG. 7 shows exemplary embodiments of UI elements for use with exemplary embodiments of the UI element lighting system
  • FIG. 8 shows a detailed exemplary embodiment of an image sensor for use with exemplary embodiments of the UI element lighting system
  • FIG. 9 shows a detailed exemplary embodiment of a UI element lighting system
  • FIG. 10 shows detailed exemplary embodiments of image sensors and image receiver shown in FIG. 9 ;
  • FIG. 11 shows a detailed exemplary embodiment of a lighting processor and memory shown in FIG. 9 ;
  • FIG. 12 shows an exemplary embodiment of a method for decorating UI elements with environmental lighting.
  • FIG. 1 shows devices 100 comprising exemplary embodiments of a UI element lighting system (UIELS).
  • the UIELS operates to decorate UI elements using lighting effects derived from 2D images taken of the surrounding environment.
  • the devices shown include tablet computer 102 , notebook computer 104 , cell phone 106 , and smart phone 108 .
  • embodiments of the UIELS are suitable for use with virtually any type of device to decorate UI elements using lighting effects derived from the surrounding environment and are not limited to the devices shown.
  • the UIELS also is suitable for use with automobile dashboard systems, billboards, stadium big screens and virtually all types of display devices.
  • FIG. 2 shows a device 200 that includes an exemplary embodiment of a UI element lighting system 202 .
  • the UIELS 202 includes lighting configuration unit (LCU) 204 and image sensor 206 .
  • the image sensor 206 operates to acquire real-time 2D images of the environment surrounding the device 200 .
  • the LCU 204 operates to receive the real-time 2D images and decorate UI elements using lighting effects derived from the 2D images taken of the surrounding environment.
  • the device 200 comprises a device display 208 that displays visual UI elements, such as interactive UI elements 210 and 212 .
  • the interactive UI element 210 has a top surface 214 that shows a first function 216 that is performed by the device 200 when a user touches the UI element 210 .
  • the interactive UI element 212 has a top surface 218 that shows a second function 220 that is performed by the device 200 when a user touches the UI element 212 .
  • the UIELS 202 operates to detect and locate light sources in the environment surrounding the device 200 from images acquired by the sensor 206 . For example, the UIELS 202 detects the light source 210 and determines its position relative to the device 200 and/or the orientation of the display screen 208 . The UIELS 202 also detects characteristics or attributes of the light source 210 , such as its size, intensity, location, and color characteristics. The UIELS 202 uses this information about the light source 210 to decorate the UI elements 210 and 212 . Thus, based on the characteristics of the light source 210 , the UIELS 202 operates to provide appropriate highlighting of the surfaces 214 and 218 .
  • the UIELS 202 also provides appropriate shadowing of the UI elements, such as the shadow 222 provided for each UI element that is determined from the relative location, size, and intensity of the light source 210 . It should be noted that the UIELS 202 is not limited to detecting only one light source; thus, multiple light sources can be detected and their associated locations and characteristics used to decorate UI elements provided on the display 208 .
  • changes in the orientation and/or position of the image sensor 206 can be captured in one or more additional real-time images resulting in changes in the relative location of the light sources in the additional images and corresponding changes to the UI element decorations.
  • FIG. 3 shows an exemplary embodiment of a 3D scene 300 that illustrates relationships between a notional light source 302 , a 3D UI element 304 , a 2D UI element 306 , and the resulting visual effects shown in the 3D scene 300 .
  • the 3D scene 300 is intended to encompass all geometric notions such as points, lines, line segments, vectors, edges, faces, polygons, etc., though they may have some number of dimensions other than three. For example, it is possible to place 2D elements such as planes, quads, triangles, etc. inside the 3D scene 300 . Furthermore, a point has zero dimensions, but may be positioned in the 3D scene 300 .
  • 3D scene also includes scenes in which any of the dimensions are ignored or discarded (i.e. a 2D scene is a 3D scene in which one of the dimensions remains zero).
  • 3D encompasses any geometric notion which may be expressed or represented in a three-dimensional space, such as the 3D scene 300 .
  • a notional light source is a notion which is used as a control parameter for rendering of visual effects.
  • the sides of a cube may each have a greater or lesser degree of brightness due to their orientation relative to a notional light source, so as to appear to be “lit” by that light source.
  • Notional light sources do not exist in real, physical space, but are modeled in the 3D scene to illuminate UI elements positioned in the 3D scene.
  • Arranging the notional light source 302 in the 3D scene 300 is accomplished by configuring the size, position, orientation, color, and brightness of the notional light source 302 to match that of an actual light source in the environment surrounding the device.
  • the notional light source 302 is then used to determine highlighting and shadows that result when light from the light source strikes the elements 304 and 306 .
  • the element 304 comprises top surfaces 308 and 310 , which are highlighted by light from the light source 302 .
  • the element 304 also comprises side surfaces 312 and 314 , which also are highlighted by light from the light source 302 .
  • the location of the light source 302 also results in shadow effect 316 , which is cast when the light from the light source 302 strikes the element 304 .
  • the element 306 comprises top surface 318 , which also is highlighted by light from the light source 302 .
  • the location of the light source 302 also results in shadow effect 320 , which is cast when the light from the light source 302 strikes the element 306 .
  • the 3D scene may comprise multiple light sources and that the effects of each light source on the elements 304 and 306 can be modeled.
  • the 3D scene 300 comprises one or more light sources.
  • Each light source comprises a plurality of characteristics, such as location, size, intensity, color, and any other associated characteristics.
  • the 3D scene also comprises one or more UI elements.
  • Each UI element comprises one or more surfaces that are highlighted by the light source.
  • Each UI element may also produce one or more shadows when struck by each light source.
  • each UI element is decorated by the highlighting and shadows to form a decorated UI element for display.
  • FIG. 4 shows exemplary embodiments of an image 402 of an environment surrounding a device and a brightness map 404 that illustrates regions of brightness in the image 402 .
  • the image 402 may be captured by the image sensor 206 shown in FIG. 2 .
  • the captured image includes a bright light source represented by the Sun 408 .
  • the image sensor is arranged such that its focal axis is aligned (or can be translated) to the normal of the device display screen. Given this angle, and a particular field of view of the image sensor, the relative angle between the screen and each pixel in the image 402 can be calculated. For each two-dimensional coordinate in the captured image, corresponding azimuthal angles to features, objects, or characteristics in the image can be calculated.
  • the brightness map 404 is a map of light intensity and illustrates two bright regions 406 in the captured image.
  • the two bright regions are measured and form notional light sources in a 3D scene.
  • the brightness map 404 is created by converting each pixel in the image 402 from its initial red-green-blue (RGB) value to a luminosity value L.
  • the values of R, G, and B are constrained to the range [0, 1].
  • the luminosity value can be computed from the expression L = sqrt(0.299*R^2 + 0.587*G^2 + 0.114*B^2).
  • the pixel(s) with the greatest luminosity are selected to represent one or more light sources. It should be noted that the above implementation is exemplary and that any suitable technique can be used to generate the luminosity map 404 .
  • characteristics of the light source are determined from the image 402 , the brightness map 404 or both. For example, the location of the light source, the size of the light source, the color of the light source, the intensity of the light source, along with any other characteristics are determined from the image 402 , the brightness map 404 or both.
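  • a minimal sketch of the brightness-map step just described, assuming NumPy and an H x W x 3 image with RGB values already scaled to [0, 1] (the helper names are hypothetical):

```python
import numpy as np

def luminosity_map(rgb):
    """Convert an H x W x 3 RGB image in [0, 1] to a per-pixel luminosity map L."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return np.sqrt(0.299 * r ** 2 + 0.587 * g ** 2 + 0.114 * b ** 2)

def brightest_pixel(rgb):
    """Return the (row, col) of the most luminous pixel as a light-source candidate."""
    lum = luminosity_map(rgb)
    return np.unravel_index(np.argmax(lum), lum.shape)

# Example: a dark frame with one bright spot behaves like a single light source.
frame = np.zeros((480, 640, 3))
frame[10, 20] = [1.0, 0.9, 0.8]
print(brightest_pixel(frame))   # row 10, column 20
```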
  • FIG. 5A shows how the brightness map 404 shown in FIG. 4 relates to an image coordinate space 500 .
  • Point P 502 represents a bright region of the map 404 , which has been selected by a suitable heuristic.
  • the image 402 is analyzed according to the heuristic to detect light sources (e.g., bright regions 406 ) in the environment. By recognizing the location of bright regions in the image, the corresponding azimuthal coordinates relative to the device screen are calculated.
  • FIG. 5B shows an exemplary embodiment of a graph 504 that illustrates the latitudinal and longitudinal relationship between the field of view of the image sensor that captured the image 402 and the brightness map 404 shown in FIG. 5A .
  • the point P 502 which resides at the same location as depicted in FIG. 5A , corresponds to a particular latitude and longitude of the detected light source.
  • the image space 504 shows how the corresponding azimuthal coordinates of point P 502 relative to the device screen are calculated. With these coordinates, a notional light source in the 3D scene 300 is positioned at the proper location.
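  • one simple way to realize this mapping is to interpolate linearly across the sensor's field of view, assuming the optical axis passes through the image center and is aligned with the display normal. The field-of-view values below are placeholders, and the linear mapping is a simplification rather than the patent's method:

```python
def pixel_to_lat_lon(px, py, width, height, h_fov_deg=120.0, v_fov_deg=90.0):
    """Map an image coordinate to (latitude, longitude) in degrees relative to
    the display normal. Image y grows downward, so latitude is flipped."""
    lon = (px / (width - 1) - 0.5) * h_fov_deg
    lat = (0.5 - py / (height - 1)) * v_fov_deg
    return lat, lon

# The bright point P found in the brightness map becomes a light direction:
lat, lon = pixel_to_lat_lon(20, 10, 640, 480)
```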
  • FIG. 6 shows an exemplary embodiment illustrating how a notional light source is positioned in a 3D scene 600 according to latitude and longitude values determined from the graph 504 shown in FIG. 5B .
  • a light source 602 is positioned along a sphere 604 based on the latitude and longitude values determined from the graph 504 and oriented such that it emits light in the direction of the origin of the sphere 604 .
  • the light source 602 is positioned according to the latitude and longitude of point P 502 .
  • the color and brightness are configured according to both color and non-color (e.g., ISO level) data provided by the image sensor (e.g., sensor 206 ).
  • Shadow region 608 associated with a UI element 606 is arranged to appear as it would naturally when derived from the position of the arranged light source 602 and further occluded by the UI element 606 .
  • the surfaces 610 and 612 of the UI element 606 are highlighted with color and intensity according to the color and intensity of the configured light source 602 and their orientation to the light source 602 .
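  • placing the notional light on the sphere is then a spherical-to-Cartesian conversion, with the light aimed back at the origin. A sketch under the same assumptions (the radius and axis conventions are illustrative):

```python
import math

def light_position_on_sphere(lat_deg, lon_deg, radius=1.0):
    """Position a light on a sphere centered at the scene origin, aimed at the origin."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    x = radius * math.cos(lat) * math.sin(lon)
    y = radius * math.sin(lat)
    z = radius * math.cos(lat) * math.cos(lon)
    position = (x, y, z)
    direction = (-x / radius, -y / radius, -z / radius)   # unit vector toward the origin
    return position, direction
```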
  • FIG. 7 shows exemplary embodiments of UI elements 700 for use with the exemplary embodiments of the UIELS 202 .
  • the UI elements 700 comprise a graphic UI element 702 , a structural UI element 704 , a button UI element 706 , an emoji UI element 708 , and character UI elements 710 .
  • the UI elements 700 are exemplary and that other types of UI elements can be lighted with the UIELS 202 .
  • FIG. 8 shows a detailed exemplary embodiment of an image sensor 800 for use with the UIELS 202 .
  • the image sensor 800 is suitable for use as part of the image sensors 206 shown in FIG. 2 .
  • the image sensor 800 comprises a sensor body 802 that houses an image sensor 806 that is covered by a lens 804 .
  • the lens 804 may be a wide-angle, fisheye lens. However, other factors such as cost and form factor may affect the choice of lens design for a given implementation.
  • the lens 804 operates to provide a wide field of view of the surrounding environment that is captured by the image sensor 806 .
  • different sensor/lens combinations are used to acquire a desired field of view of the surrounding environment. Evaluation of a particular sensor/lens configuration should consider the accuracy of the system's ability to project the image onto the surface of the image sensor 806 .
  • FIG. 9 shows a detailed exemplary embodiment of a UI element lighting system 900 .
  • the UIELS 900 is suitable for use as the UIELS 202 shown in FIG. 2 .
  • the UIELS 900 comprises one or more image sensors 902 and a lighting configuration unit 904 .
  • the image sensors 902 comprise one or more high-resolution image sensors that output real-time 2D images.
  • each image sensor can output a stream of real-time 2D image frames at 30 frames per second (fps) (or other suitable frame rate).
  • the stream of 2D images output from the image sensors 902 is shown at 912 .
  • the image sensors comprise an ambient light sensor and/or a low pixel count sensor. Information from these sensors is also output to the LCU 904 .
  • the LCU 904 includes an image receiver 906 , lighting processor 908 and memory 910 .
  • the image receiver 906 receives one or more real-time images 912 from the image sensors 902 and processes these images into a real-time 2D image stream 914 that is passed to the lighting processor 908 .
  • the image stream 912 comprises images from multiple image sensors
  • the image receiver 906 operates to combine these images into the real-time 2D image stream 914 .
  • the image receiver 906 stitches together multiple images from the image sensors 902 to generate the real-time 2D image stream 914 that provides a desired (e.g., 360°) field of view around the image sensors 902 .
  • the image receiver 906 also can receive data from an ambient light sensor and/or a low pixel count sensor that are part of the image sensors 902 and passes this data to the lighting processor 908 .
  • the lighting processor 908 receives the stream of images 914 and detects one or more light sources within the images. When a light source is detected, characteristics of the light source are determined. For example, the relative location, size, intensity, and color of each light source is determined. Other characteristics also may be determined. The lighting processor 908 saves the information for each light source as a light configuration 916 in the memory 910 .
  • the lighting processor 908 uses the light configurations 916 to generate light sources in a 3D scene 920 stored in the memory 910 .
  • each light source is positioned along a sphere based on its computed latitude and longitude values as illustrated and described with reference to FIG. 6 .
  • UI elements can be placed in the 3D scene and decorated according to the light configurations 916 and their orientation to the corresponding light sources.
  • the lighting processor 908 uses the light sources in the 3D scene 920 to decorate UI elements. For example, the lighting processor 908 retrieves UI elements 918 from the memory 910 and places and decorates each UI element in the 3D scene. In an exemplary embodiment, the lighting processor 908 uses the light characteristics of each configured light source in the 3D scene 920 to determine highlighting and shadowing for each UI element based on its orientation to each light source. After all the UI elements have been configured and decorated in the 3D scene 920 , the lighting processor 908 retrieves the 3D scene 920 and outputs the lighted UI elements 924 for display on the device display screen.
  • the UI elements 918 are decorated without being placed in the 3D scene 920 .
  • the lighting processor 908 retrieves UI elements 918 that are to be decorated and displayed.
  • the lighting processor 908 retrieves the light configurations 916 from the memory 910 and uses information in the light configurations to directly decorate each UI element. As each UI element is decorated it is output for display as part of the lighted UI elements 924 .
  • FIG. 10 shows detailed exemplary embodiments of the image sensors 902 and the image receiver 906 shown in FIG. 9 .
  • the image sensors 902 comprise one or more image sensors that capture images of the environment (or region) surrounding the device to which the image sensors 902 are mounted.
  • the image sensors 902 comprise one or more camera sensors that are arranged in such a way as to maximally cover the field of view (up to and even beyond 360°).
  • the image sensors 902 comprise two opposing camera sensors, each with 180° field of view, that cover a full sphere encompassing the device to which the image sensors 902 are mounted.
  • the implementation of two camera sensors, each with a 180° field of view enables a bona fide 360° field of view to be obtained.
  • the image sensors may include but are not limited to high resolution (HD) cameras, video cameras (e.g., outputting 30-60 fps), color or black and white cameras, and/or cameras having special lenses (e.g., wide angle or fish eye). If two cameras each having a 180° field of view are used, they may be placed in opposition to each other to obtain a 360° field of view. Other configurations include four cameras each with 90° field of view to obtain a 360° field of view, or multiple cameras with asymmetrical fields of view that are combined to obtain a 360° field of view.
  • the image sensors include but are not limited to a high-resolution imaging sensor(s) that provides color data with which lights and shadows may be configured.
  • the image sensor(s) include but are not limited to an auto-exposing imaging sensor that provides both color data and brightness of the device's surroundings by way of an ISO value which results from the sensor(s) auto-exposure mechanism.
  • the image sensors include but are not limited to an ambient light sensor and/or other low pixel count image sensor that provides either color data or brightness of the device's surroundings or both.
  • the image receiver 906 comprises an image sensor interface (I/F) 1002 , image controller 1004 , and image output I/F 1006 .
  • the image sensor I/F 1002 comprises logic, registers, storage elements, and/or discrete components that operate to receive image data from the image sensors 902 and to pass this image data to the image controller 1004 .
  • the image controller 1004 comprises at least one of a processor, CPU, gate array, programmable logic, registers, logic, and/or discrete components that operate to receive real-time images from the image sensors 902 provided by the image sensor I/F 1002 .
  • the image controller 1004 operates to process those images into a real-time 2D image stream that is output to the image output interface 1006 .
  • the image sensors 902 may include multiple image sensors that each output real-time 2D images or other image related data, such as average brightness.
  • the image controller 1004 operates to combine these multiple real-time images into a real-time 2D image stream where each image provides a wide field of view around the image sensors 902 .
  • each image may provide a 360° field of view around the image sensors 902 .
  • the image controller 1004 operates to stitch together (or combine in any other way) multiple images received from the image sensors 902 to form the real-time 2D output image stream 1010 .
  • the image controller 1004 includes a memory 1008 to facilitate combining images from multiple image sensors.
  • the image controller 1004 outputs the real-time 2D image stream 1010 to the image output I/F 1006 , which generates the real-time 2D image stream 914 output.
  • the real-time 2D image stream 914 is output from the image receiver 906 to the lighting processor 908 .
  • FIG. 11 shows a detailed exemplary embodiment of the lighting processor 908 and memory 910 shown in FIG. 9 .
  • the lighting processor 908 comprises an image input I/F 1102 , image pre-processor 1104 , light detector 1106 , and UI element renderer 1108 .
  • the lighting processor 908 comprises at least one of a CPU, processor, programmable logic, state machine, registers, memory, and/or discrete components that operate to perform the functions described below.
  • the image input I/F 1102 operates to receive real-time 2D images 914 from the image receiver 906 and passes these images to the image pre-processor 1104 .
  • the received images may be stored or buffered by the image input I/F 1102 .
  • the image pre-processor 1104 operates to receive a stream of real-time 2D images from the image input I/F 1102 .
  • the image pre-processor 1104 processes the received images to allow for the detection of light sources in the images.
  • the image pre-processor 1104 converts the images it receives into a luminosity map that indicates intensity or brightness. This pre-processed image is then output to the light detector 1106 and optionally can be stored in the memory 910 .
  • the light detector 1106 receives processed and unprocessed images from the image pre-processor 1104 and detects light sources within the images. For example, the light detector 1106 detects light sources and generates associated light configurations 916 that are stored in the memory 910 .
  • the light configurations 916 include light location values 1110 and light characteristic values 1112 .
  • the light location values 1110 include but are not limited to latitude and longitude position values as illustrated in FIG. 6 .
  • the light characteristic values 1112 include but are not limited to intensity, size, color, and any other light characteristics.
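  • taken together, one plausible in-memory representation of such a light configuration is a small record like the following; the field names and types are assumptions introduced for illustration, not the patent's data structures.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class LightConfiguration:
    latitude: float                      # degrees relative to the display normal
    longitude: float                     # degrees relative to the display normal
    intensity: float                     # e.g., peak luminosity in [0, 1]
    size: float                          # apparent angular size of the bright region, degrees
    color: Tuple[float, float, float]    # (R, G, B) sampled from the bright region
```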
  • the UI element renderer 1108 retrieves the light configurations 916 from the memory 910 and uses these configurations to configure light sources in the 3D scene 920 .
  • the color and brightness of the light sources are configured according to both color and non-color (e.g. ISO level) data provided by the image sensor.
  • the UI element renderer 1108 then retrieves UI elements 918 from the memory 910 and adds these elements to the 3D scene. Based on the light sources, surface highlighting is performed to highlight surfaces of the UI elements based on the location and characteristics of the light sources. Next, shadowing elements for the UI elements are arranged to appear as they would naturally, derived from the position of the arranged scene lights, further occluded by the UI elements in the 3D scene, and cast into the 3D scene. The intensity and color of the shadowing are configured according to both color and non-color data provided by the image sensor(s). Once the 3D scene is created, the UI element renderer 1108 retrieves the scene from the memory 910 and outputs the lighted UI elements 924 to a device display.
  • a 3D scene and its related elements may be expressed explicitly in the memory 910 (e.g., as data structures including position, orientation, and scale) or they may be implicit (e.g., a visual effect like a shadow which is calculated from scene data and either stored as image data or immediately displayed).
  • a 3D scene may include elements which are stored in volatile or non-volatile storage (e.g., main memory, hard disk drive, CPU cache or registers, GPU memory, GPU cache or registers), or sent to a display device.
  • a graphical user interface is provided with UI elements that are predominantly two-dimensional (e.g., planar windows), but includes shadows which are derived from the angle of a notional light source.
  • the light source is directional and expressed in terms of (M, N) where M and N are angles relative to the normal of the display. M and N are the horizontal and vertical components respectively.
  • a window region is expressed in terms of (X, Y) and (H, W) where X and Y are the horizontal and vertical position of the window in screen space respectively, and H and W are the height and width in screen space respectively.
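  • for this mostly 2D case, the shadow of a window can be approximated by offsetting its rectangle away from the light. The sketch below assumes the (M, N) angles and (X, Y, W, H) window description above; the fixed depth used to scale the offset is an invented parameter, not taken from the patent.

```python
import math

def window_shadow_rect(x, y, w, h, m_deg, n_deg, depth=8.0):
    """Approximate a drop shadow for a planar window lit by a directional light.
    m_deg, n_deg: horizontal and vertical light angles relative to the display normal.
    depth: notional distance between window and backdrop, in screen-space units."""
    dx = -depth * math.tan(math.radians(m_deg))   # shadow falls away from the light
    dy = -depth * math.tan(math.radians(n_deg))
    return (x + dx, y + dy, w, h)
```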
  • FIG. 12 shows an exemplary embodiment of a method 1200 for decorating UI elements with environmental lighting.
  • the method 1200 is suitable for use with the UIELS 900 shown in FIG. 9 .
  • a real-time 2D image is acquired.
  • the 2D image is acquired from one or more image sensors 902 .
  • the image sensors can be part of a camera system attached to a hand-held device.
  • the acquired image provides a 360° field of view of the region surrounding the location of the image sensors.
  • the image sensors 902 output images at a frame rate of 30 fps.
  • the image sensors include but are not limited to a high-resolution imaging sensor(s) that provides color data, auto-exposing imaging sensor(s) that provides both color data and brightness of the device's surroundings by way of an ISO value, and/or ambient light sensor(s) or other low-pixel-count image sensor which provides either color data or brightness of the device's surroundings or both.
  • an optional operation is performed to combine images from multiple sensors into the acquired real-time 2D image. For example, if several images are acquired by multiple image sensors, these images are combined into one image by connecting the images together or otherwise stitching the images to form one real-time 2D image. Thus, multiple images and/or image data are combined in virtually any desired way.
  • the image controller 1004 performs this operation.
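  • a naive way to combine two opposing views is to place them side by side; real systems would warp and blend the frames (for example into an equirectangular panorama). The helper below is only a placeholder sketch for that step.

```python
import numpy as np

def combine_opposing_views(front, back):
    """Concatenate two camera frames of equal height into one wide 2D image."""
    if front.shape[0] != back.shape[0]:
        raise ValueError("views must share the same height")
    return np.concatenate([front, back], axis=1)
```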
  • the real-time 2D image is pre-processed.
  • the images are pre-processed to generate a luminosity map that shows intensity values for each pixel of the image.
  • the image pre-processor 1104 performs this operation as described above.
  • light sources and their relative positions and characteristics are detected in the real-time 2D image.
  • the light detector 1106 performs this operation to generate a light configuration for each detected light source.
  • the detector 1106 performs one or more heuristics on the image and/or pre-processed image to detect light sources and their relative positions and characteristics.
  • the generated light configurations are stored in a memory.
  • the detector 1106 stores the generated light configurations in the memory 910 .
  • a 3D scene is generated that includes light sources determined from the stored light configurations.
  • the renderer 1108 generates the 3D scene 920 in the memory 910 that includes light sources based on the stored light configurations.
  • a UI element is retrieved from memory and added to the 3D scene.
  • the renderer 1108 retrieves the UI element from the stored UI elements 918 and adds the UI element to the 3D scene 920 .
  • the UI element is lighted based on the position and characteristics of the light sources configured in the 3D scene.
  • the renderer 1108 performs this operation.
  • the renderer 1108 lights the surfaces of the UI element based on the location of the light sources relative to the surfaces.
  • shadows are added to the UI element that are determined based on the position and characteristics of the light sources configured in the 3D scene. For example, the renderer 1108 performs this operation. For example, the renderer 1108 determines shadows cast by the UI element when struck by light from the light sources. The lighted and shadowed UI element forms a decorated UI element.
  • the decorated (lighted and shadowed) UI elements are displayed on a display screen.
  • the renderer 1108 obtains the lighted and shadowed elements from the 3D scene (or from storage) and outputs them to a device display as indicated at 924 .
  • the method 1200 operates to decorate UI elements with environmental lighting. It should be noted that although the method 1200 describes specific operations, these operations may be changed, modified, rearranged, added to, and subtracted from within the scope of the embodiments.
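  • putting the steps of method 1200 together, a highly simplified single-frame pass might read as follows. It reuses the hypothetical helpers sketched above (luminosity_map, pixel_to_lat_lon, light_position_on_sphere, phong_shade) and illustrates the flow only; it is not the patented implementation, and shadow casting is omitted.

```python
import numpy as np

def decorate_ui_elements(frame, ui_elements, view_dir=(0.0, 0.0, 1.0)):
    """One pass of the environmental-lighting pipeline for a single camera frame.
    Each UI element is assumed to be a dict with a 'normal' vector."""
    # 1-2. Acquire and pre-process: build the luminosity map.
    lum = luminosity_map(frame)
    # 3-4. Detect the brightest region and derive a light configuration.
    py, px = np.unravel_index(np.argmax(lum), lum.shape)
    lat, lon = pixel_to_lat_lon(px, py, frame.shape[1], frame.shape[0])
    light_color = frame[py, px]
    # 5. Place the notional light in the 3D scene and point it at the origin.
    _, toward_origin = light_position_on_sphere(lat, lon)
    light_dir = -np.asarray(toward_origin)        # surface -> light direction
    # 6-7. Light each UI element surface (shadowing is left out of this sketch).
    decorated = []
    for element in ui_elements:
        shade = phong_shade(np.asarray(element["normal"], dtype=float),
                            light_dir, np.asarray(view_dir, dtype=float), light_color)
        decorated.append({**element, "shade": shade})
    # 8. The decorated elements would then be handed to the display pipeline.
    return decorated
```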

Abstract

Methods and apparatus for decorating user interface elements with environment lighting. In an exemplary embodiment, a method is provided that includes acquiring an image of an environment, analyzing the image using at least one heuristic to determine a light configuration of a light source in the image, and decorating a user interface (UI) element based on the light configuration of the light source to generate a decorated UI element. In an exemplary embodiment, an apparatus is provided that includes an image sensor that acquires an image of an environment, a light detector that analyzes the image using at least one heuristic to determine a light configuration of a light source in the image, and a renderer that decorates a UI element based on the light configuration of the light source to generate a decorated UI element.

Description

    PRIORITY
  • This application is a continuation application of U.S. patent application Ser. No. 15/705,671, filed on Sep. 15, 2017 in the name of the same inventor and entitled “Methods and Apparatus for Decorating User Interface Elements with Environmental Lighting,” which further claims the benefit of priority based upon U.S. Provisional Patent Application having Application No. 62/527,778, filed on Jun. 30, 2017, and entitled “GENERATION AND USE OF DYNAMIC REAL-TIME ENVIRONMENT MAPS,” all of which are hereby incorporated herein by reference in their entireties.
  • FIELD
  • The present invention relates to the operation of image processing systems. More specifically, the present invention relates to the processing of images derived from a surrounding environment.
  • BACKGROUND
  • User interfaces have evolved significantly over the past forty years. They have progressed through command-driven terminals, mouse-driven two-dimensional (2D) graphical user interfaces, and touch-driven 2D graphical user interfaces. In each generation of user interface, the computer has displayed information via forms and effects crafted by a user interface (UI) designer or developer. Regions of color, brightness, and contrast are crafted in such a way as to imply, for example, depth and lighting. Such effects help the user visually understand and organize large quantities of information. To date, such effects have been implemented statically in two dimensions. However, the way a graphical user interface is rendered does not react to the lighting of the environment in which the computer is used.
  • Therefore, it would be desirable to have a mechanism that allows characteristics of the surrounding environment to be utilized in rendering visual displays to support a variety of display applications.
  • SUMMARY
  • In various exemplary embodiments, methods and apparatus are provided for decorating UI elements with environmental lighting. For example, UI controls are decorated with lighting effects that are derived from the environment surrounding a device. The net effect is that the controls appear as if they exist in the user's real, physical environment, rather than an unlit, or arbitrarily lit, virtual environment.
  • In various exemplary embodiments, images of the environment surrounding a device are captured via one or more image sensors. Those images are analyzed via one or more heuristics that recognize positions and characteristics of various light sources in the image. The positions of those light sources relative to the device (or device display) are modeled in a virtual three-dimensional (3D) scene. UI elements to be rendered on a device display are located within the 3D scene and decorated based on the location and characteristics of the light sources. For example, the UI elements are decorated with highlights and/or shadows based on the characteristics of the light sources. When the decorated UI elements in the 3D scene are subsequently rendered and displayed, user experiences are improved by increasing immersiveness of the displayed UI elements, reducing cognitive load, and increasing the degree to which the users may visually understand what they are seeing.
  • In an exemplary embodiment, a mobile computing device with an imaging sensor performs operations to decorate UI elements with environmental lighting. The image sensor captures an image of the device's surroundings, which is then analyzed to find the brightest light sources in the image. That information is then used as a control parameter for a Phong shading algorithm which shades UI elements accordingly. The decorated UI elements are then rendered on a device display.
  • In an exemplary embodiment, a method is provided that includes acquiring an image of an environment, analyzing the image using at least one heuristic to determine a light configuration of a light source in the image, and decorating a UI element based on the light configuration of the light source to generate a decorated UI element.
  • In an exemplary embodiment, an apparatus is provided that includes an image sensor that acquires an image of an environment, a light detector that analyzes the image using at least one heuristic to determine a light configuration of a light source in the image, and a renderer that decorates a UI element based on the light configuration of the light source to generate a decorated UI element.
  • Additional features and benefits of the exemplary embodiments of the present invention will become apparent from the detailed description, figures and claims set forth below.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The exemplary embodiments of the present invention will be understood more fully from the detailed description given below and from the accompanying drawings of various embodiments of the invention, which should not be taken to limit the invention to the specific embodiments, but are for explanation and understanding only.
  • FIG. 1 shows devices comprising exemplary embodiments of a UI element lighting system;
  • FIG. 2 shows a device that includes an exemplary embodiment of a UI element lighting system;
  • FIG. 3 shows an exemplary embodiment of a 3D scene that illustrates relationships between a notional light source, a 3D UI element, a 2D UI element, and the resulting visual effects shown in the 3D scene;
  • FIG. 4 shows exemplary embodiments of an image of an environment surrounding a device and a brightness map that illustrates regions of brightness in the image;
  • FIG. 5A shows how the brightness map shown in FIG. 4 relates to an image coordinate space;
  • FIG. 5B shows an exemplary embodiment of a graph that illustrates a latitudinal and longitudinal relationship between the field of view of an image sensor and the brightness map shown in FIG. 5A;
  • FIG. 6 shows an exemplary embodiment illustrating how a notional light source is positioned in a 3D scene according to latitude and longitude values determined from the graph shown in FIG. 5B;
  • FIG. 7 shows exemplary embodiments of UI elements for use with exemplary embodiments of the UI element lighting system;
  • FIG. 8 shows a detailed exemplary embodiment of an image sensor for use with exemplary embodiments of the UI element lighting system;
  • FIG. 9 shows a detailed exemplary embodiment of a UI element lighting system;
  • FIG. 10 shows detailed exemplary embodiments of image sensors and image receiver shown in FIG. 9;
  • FIG. 11 shows a detailed exemplary embodiment of a lighting processor and memory shown in FIG. 9; and
  • FIG. 12 shows an exemplary embodiment of a method for decorating UI elements with environmental lighting.
  • DETAILED DESCRIPTION
  • The purpose of the following detailed description is to provide an understanding of one or more embodiments of the present invention. Those of ordinary skill in the art will realize that the following detailed description is illustrative only and is not intended to be in any way limiting. Other embodiments will readily suggest themselves to such skilled persons having the benefit of this disclosure and/or description.
  • In the interest of clarity, not all of the routine features of the implementations described herein are shown and described. It will, of course, be understood that in the development of any such actual implementation, numerous implementation-specific decisions may be made in order to achieve the developer's specific goals, such as compliance with application and business-related constraints, and that these specific goals will vary from one implementation to another and from one developer to another. Moreover, it will be understood that such a development effort might be complex and time-consuming, but would nevertheless be a routine undertaking of engineering for those of ordinary skill in the art having the benefit of the embodiments of this disclosure.
  • Various exemplary embodiments illustrated in the drawings may not be drawn to scale. Rather, the dimensions of the various features may be expanded or reduced for clarity. In addition, some of the drawings may be simplified for clarity. Thus, the drawings may not depict all of the components of a given apparatus (e.g., device) or method. The same reference indicators will be used throughout the drawings and the following detailed description to refer to the same or like parts.
  • FIG. 1 shows devices 100 comprising exemplary embodiments of a UI element lighting system (UIELS). For example, the UIELS operates to decorate UI elements using lighting effects derived from 2D images taken of the surrounding environment. For example, the devices shown include tablet computer 102, notebook computer 104, cell phone 106, and smart phone 108. It should be noted that embodiments of the UIELS are suitable for use with virtually any type of device to decorate UI elements using lighting effects derived from the surrounding environment and are not limited to the devices shown. For example, the UIELS also is suitable for use with automobile dashboard systems, billboards, stadium big screens and virtually all types of display devices.
  • FIG. 2 shows a device 200 that includes an exemplary embodiment of a UI element lighting system 202. For example, the UIELS 202 includes lighting configuration unit (LCU) 204 and image sensor 206. The image sensor 206 operates to acquire real-time 2D images of the environment surrounding the device 200. The LCU 204 operates to receive the real-time 2D images and decorate UI elements using lighting effects derived from the 2D images taken of the surrounding environment. For example, the device 200 comprises a device display 208 that displays visual UI elements, such as interactive UI elements 210 and 212. The interactive UI element 210 has a top surface 214 that shows a first function 216 that is performed by the device 200 when a user touches the UI element 210. The interactive UI element 212 has a top surface 218 that shows a second function 220 that is performed by the device 200 when a user touches the UI element 212.
  • The UIELS 202 operates to detect and locate light sources in the environment surrounding the device 200 from images acquired by the sensor 206. For example, the UIELS 202 detects the light source 210 and determines its position relative to the device 200 and/or the orientation of the display screen 208. The UIELS 202 also detects characteristics or attributes of the light source 210, such as its size, intensity, location, and color characteristics. The UIELS 202 uses this information about the light source 210 to decorate the UI elements 210 and 212. Thus, based on the characteristics of the light source 210, the UIELS 202 operates to provide appropriate highlighting of the surfaces 214 and 218. The UIELS 202 also provides appropriate shadowing of the UI elements, such as the shadow 222 provided for each UI element that is determined from the relative location, size, and intensity of the light source 210. It should be noted that the UIELS 202 is not limited to detecting only one light source; thus, multiple light sources can be detected and their associated locations and characteristics used to decorate UI elements provided on the display 208.
  • Since the detected light sources and their characteristics are derived from a real-time image acquired from the image sensor 206, changes in the orientation and/or position of the image sensor 206 (e.g., when the device 200 is moved) can be captured in one or more additional real-time images resulting in changes in the relative location of the light sources in the additional images and corresponding changes to the UI element decorations.
  • FIG. 3 shows an exemplary embodiment of a 3D scene 300 that illustrates relationships between a notional light source 302, a 3D UI element 304, a 2D UI element 306, and the resulting visual effects shown in the 3D scene 300. The 3D scene 300 is intended to encompass all geometric notions such as points, lines, line segments, vectors, edges, faces, polygons, etc., though they may have some number of dimensions other than three. For example, it is possible to place 2D elements such as planes, quads, triangles, etc. inside the 3D scene 300. Furthermore, a point has zero dimensions, but may be positioned in the 3D scene 300. The notion of a 3D scene also includes scenes in which any of the dimensions are ignored or discarded (i.e. a 2D scene is a 3D scene in which one of the dimensions remains zero). In summary, the expression “3D” encompasses any geometric notion which may be expressed or represented in a three-dimensional space, such as the 3D scene 300.
  • In an exemplary embodiment, a notional light source is a notion which is used as a control parameter for rendering of visual effects. For example, the sides of a cube may each have a greater or lesser degree of brightness due to their orientation relative to a notional light source, so as to appear to be “lit” by that light source. Notional light sources do not exist in real, physical space, but are modeled in the 3D scene to illuminate UI elements positioned in the 3D scene.
  • Arranging the notional light source 302 in the 3D scene 300 is accomplished by configuring the size, position, orientation, color, and brightness of the notional light source 302 to match that of an actual light source in the environment surrounding the device. The notional light source 302 is then used to determine highlighting and shadows that result when light from the light source strikes the elements 304 and 306. For example, the element 304 comprises top surfaces 308 and 310, which are highlighted by light from the light source 302. The element 304 also comprises side surfaces 312 and 314, which also are highlighted by light from the light source 302. The location of the light source 302 also results in shadow effect 316, which is cast when the light from the light source 302 strikes the element 304. The element 306 comprises top surface 318, which also is highlighted by light from the light source 302. The location of the light source 302 also results in shadow effect 320, which is cast when the light from the light source 302 strikes the element 306. It should be noted that the 3D scene may comprise multiple light sources and that the effects of each light source on the elements 304 and 306 can be modeled.
  • In summary, the 3D scene 300 comprises one or more light sources. Each light source comprises a plurality of characteristics, such as location, size, intensity, color, and any other associated characteristics. The 3D scene also comprises one or more UI elements. Each UI element comprises one or more surfaces that are highlighted by the light source. Each UI element may also produce one or more shadows when struck by each light source. Thus, each UI element is decorated by the highlighting and shadows to form a decorated UI element for display.
  • FIG. 4 shows exemplary embodiments of an image 402 of an environment surrounding a device and a brightness map 404 that illustrates regions of brightness in the image 402. In an exemplary embodiment, the image 402 may be captured by the image sensor 206 shown in FIG. 2. The captured image includes a bright light source represented by the Sun 408. In one exemplary embodiment, the image sensor is arranged such that its focal axis is aligned (or can be translated) to the normal of the device display screen. Given this angle, and a particular field of view of the image sensor, the relative angle between the screen and each pixel in the image 402 can be calculated. For each two-dimensional coordinate in the captured image, corresponding azimuthal angles to features, objects, or characteristics in the image can be calculated. This information is the basis for configuring light sources and their corresponding effects in the 3D scene. The brightness map 404 is a map of light intensity and illustrates two bright regions 406 in the captured image. In an exemplary embodiment, the two bright regions are measured and form notional light sources in a 3D scene.
  • In an exemplary embodiment, the brightness map 404 is created by converting each pixel in the image 402 from its initial red-green-blue (RGB) value to a luminosity value L. The values of R, G, and B are constrained to the range [0, 1]. The luminosity value can be computed from the following expression.

  • L = sqrt(0.299*R^2 + 0.587*G^2 + 0.114*B^2)
  • Once each pixel has been converted to luminosity, the pixel(s) with the greatest luminosity are selected to represent one or more light sources. It should be noted that the above implementation is exemplary and that any suitable technique can be used to generate the luminosity map 404.
  • In an exemplary embodiment, characteristics of the light source are determined from the image 402, the brightness map 404 or both. For example, the location of the light source, the size of the light source, the color of the light source, the intensity of the light source, along with any other characteristics are determined from the image 402, the brightness map 404 or both.
  • FIG. 5A shows how the brightness map 404 shown in FIG. 4 relates to an image coordinate space 500. Point P 502 represents a bright region of the map 404, which has been selected by a suitable heuristic. For example, the image 402 is analyzed according to the heuristic to detect light sources (e.g., bright regions 406) in the environment. By recognizing the location of bright regions in the image, the corresponding azimuthal coordinates relative to the device screen are calculated.
  • FIG. 5B shows an exemplary embodiment of a graph 504 that illustrates the latitudinal and longitudinal relationship between the field of view of the image sensor that captured the image 402 and the brightness map 404 shown in FIG. 5A. The point P 502, which resides at the same location as depicted in FIG. 5A, corresponds to a particular latitude and longitude of the detected light source. For example, the image space 504 shows how the corresponding azimuthal coordinates of point P 502 relative to the device screen are calculated. With these coordinates, a notional light source in the 3D scene 300 is positioned at the proper location.
  • FIG. 6 shows an exemplary embodiment illustrating how a notional light source is positioned in a 3D scene 600 according to latitude and longitude values determined from the graph 504 shown in FIG. 5B. A light source 602 is positioned along a sphere 604 based on the latitude and longitude values determined from the graph 504 and oriented such that it emits light in the direction of the origin of the sphere 604. For example, the light source 602 is positioned according to the latitude and longitude of point P 502. The color and brightness are configured according to both color and non-color (e.g., ISO level) data provided by the image sensor (e.g., sensor 206). Shadow region 608 associated with a UI element 606 is arranged to appear as it would naturally when derived from the position of the arranged light source 602 and further occluded by the UI element 606. The surfaces 610 and 612 of the UI element 606 are highlighted with color and intensity according to the color and intensity of the configured light source 602 and their orientation to the light source 602.
  • FIG. 7 shows exemplary embodiments of UI elements 700 for use with the exemplary embodiments of the UIELS 202. For example, the UI elements 700 comprise a graphic UI element 702, a structural UI element 704, a button UI element 706, an emoji UI element 708, and character UI elements 710. It should be noted that the UI elements 700 are exemplary and that other types of UI elements can be lighted with the UIELS 202.
  • FIG. 8 shows a detailed exemplary embodiment of an image sensor 800 for use with the UIELS 202. For example, the image sensor 800 is suitable for use as part of the image sensors 206 shown in FIG. 2. The image sensor 800 comprises a sensor body 802 that houses an image sensor 806 that is covered by a lens 804. For example, the lens 804 may be a wide-angle, fisheye lens. However, other factors such as cost and form factor may affect the choice of lens design for a given implementation.
  • In this particular embodiment, the lens 804 operates to provide a wide field of view of the surrounding environment that is captured by the image sensor 806. In other embodiments, different sensor/lens combinations are used to acquire a desired field of view of the surrounding environment. Evaluation of a particular sensor/lens configuration should consider the accuracy of the system's ability to project the image onto the surface of the image sensor 806.
  • FIG. 9 shows a detailed exemplary embodiment of a UI element lighting system 900. For example, the UIELS 900 is suitable for use as the UIELS 202 shown in FIG. 2. The UIELS 900 comprises one or more image sensors 902 and a lighting configuration unit 904. The image sensors 902 comprise one or more high-resolution image sensors that output real-time 2D images. For example, each image sensor can output a stream of real-time 2D image frames at 30 frames per second (fps) (or other suitable frame rate). The stream of 2D images output from the image sensors 902 is shown at 912. In another embodiment, the image sensors comprise an ambient light sensor and/or a low pixel count sensor. Information from these sensors is also output to the LCU 904.
  • In an exemplary embodiment, the LCU 904 includes an image receiver 906, a lighting processor 908, and a memory 910. The image receiver 906 receives one or more real-time images 912 from the image sensors 902 and processes these images into a real-time 2D image stream 914 that is passed to the lighting processor 908. For example, if the image stream 912 comprises images from multiple image sensors, the image receiver 906 operates to combine these images into the real-time 2D image stream 914. For example, in an exemplary embodiment, the image receiver 906 stitches together multiple images from the image sensors 902 to generate the real-time 2D image stream 914 that provides a desired (e.g., 360°) field of view around the image sensors 902. The image receiver 906 can also receive data from an ambient light sensor and/or a low pixel count sensor that is part of the image sensors 902 and pass this data to the lighting processor 908.
  • The lighting processor 908 receives the stream of images 914 and detects one or more light sources within the images. When a light source is detected, characteristics of the light source are determined. For example, the relative location, size, intensity, and color of each light source are determined. Other characteristics also may be determined. The lighting processor 908 saves the information for each light source as a light configuration 916 in the memory 910.
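  • For illustration only, a light configuration 916 might be represented by a record such as the following sketch; the field names and types are assumptions rather than part of the original disclosure.

    from dataclasses import dataclass

    @dataclass
    class LightConfiguration:
        # Position of the detected light source relative to the display normal.
        latitude_deg: float
        longitude_deg: float
        # Characteristics of the detected light source.
        intensity: float          # e.g., peak luminosity in [0, 1]
        size: float               # e.g., area of the bright region in pixels
        color: tuple              # (R, G, B) values in [0, 1]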
  • In an exemplary embodiment, the lighting processor 908 uses the light configurations 916 to generate light sources in a 3D scene 920 stored in the memory 910. For example, each light source is positioned along a sphere based on its computed latitude and longitude values as illustrated and described with reference to FIG. 6. Once all the light sources have been positioned in the 3D scene 920, UI elements can be placed in the 3D scene and decorated according to the light configurations 916 and their orientation to the configured light sources.
  • In an exemplary embodiment, the lighting processor 908 uses the light sources in the 3D scene 920 to decorate UI elements. For example, the lighting processor 908 retrieves UI elements 918 from the memory 910 and places and decorates each UI element in the 3D scene. In an exemplary embodiment, the lighting processor 908 uses the light characteristics of each configured light source in the 3D scene 920 to determine highlighting and shadowing for each UI element based on its orientation to each light source. After all the UI elements have been configured and decorated in the 3D scene 920, the lighting processor 908 retrieves the 3D scene 920 and outputs the lighted UI elements 924 for display on the device display screen.
  • In another exemplary embodiment, the UI elements 918 are decorated without being placed in the 3D scene 920. For example, the lighting processor 908 retrieves UI elements 918 that are to be decorated and displayed. The lighting processor 908 retrieves the light configurations 916 from the memory 910 and uses information in the light configurations to directly decorate each UI element. As each UI element is decorated, it is output for display as part of the lighted UI elements 924.
  • FIG. 10 shows detailed exemplary embodiments of the image sensors 902 and the image receiver 906 shown in FIG. 9. In an exemplary embodiment, the image sensors 902 comprise one or more image sensors that capture images of the environment (or region) surrounding the device to which the image sensors 902 are mounted. In an exemplary embodiment, the image sensors 902 comprise one or more camera sensors that are arranged in such a way as to maximally cover the field of view (up to and even beyond 360°). For example, in one embodiment, the image sensors 902 comprise two opposing camera sensors, each with a 180° field of view, that cover a full sphere encompassing the device to which the image sensors 902 are mounted. In an exemplary embodiment, the implementation of two camera sensors, each with a 180° field of view, enables a bona fide 360° field of view to be obtained.
  • In various exemplary embodiments, the image sensors may include but are not limited to high resolution (HD) cameras, video cameras (e.g., outputting 30-60 fps), color or black and white cameras, and/or cameras having special lenses (e.g., wide angle or fish eye). If two cameras each having a 180° field of view are used, they may be placed in opposition to each other to obtain a 360° field of view. Other configurations include four cameras each with 90° field of view to obtain a 360° field of view, or multiple cameras with asymmetrical fields of view that are combined to obtain a 360° field of view.
  • In one embodiment, the image sensors include but are not limited to a high-resolution imaging sensor(s) that provides color data with which lights and shadows may be configured. In another embodiment, the image sensor(s) include but are not limited to an auto-exposing imaging sensor that provides both color data and brightness of the device's surroundings by way of an ISO value which results from the sensor(s) auto-exposure mechanism. In still another embodiment, the image sensors include but are not limited to an ambient light sensor and/or other low pixel count image sensor that provides either color data or brightness of the device's surroundings or both.
  • In an exemplary embodiment, the image receiver 906 comprises an image sensor interface (I/F) 1002, an image controller 1004, and an image output I/F 1006. The image sensor I/F 1002 comprises logic, registers, storage elements, and/or discrete components that operate to receive image data from the image sensors 902 and to pass this image data to the image controller 1004.
  • In an exemplary embodiment, the image controller 1004 comprises at least one of a processor, CPU, gate array, programmable logic, registers, logic, and/or discrete components that operate to receive real-time images from the image sensors 902 provided by the image sensor I/F 1002. The image controller 1004 operates to process those images into a real-time 2D image stream that is output to the image output interface 1006. For example, the image sensors 902 may include multiple image sensors that each output real-time 2D images or other image related data, such as average brightness. The image controller 1004 operates to combine these multiple real-time images into a real-time 2D image stream where each image provides a wide field of view around the image sensors 902. For example, each image may provide a 360° field of view around the image sensors 902. In an embodiment, the image controller 1004 operates to stitch together (or combine in any other way) multiple images received from the image sensors 902 to form the real-time 2D output image stream 1010. In one embodiment, the image controller 1004 includes a memory 1008 to facilitate combining images from multiple image sensors.
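  • As a simplified illustration of combining images from multiple sensors, two opposing 180° views could be placed side by side to approximate a 360° panorama, as in the sketch below; a production implementation would warp and blend the seams, and the function name is an assumption, not part of the original disclosure.

    import numpy as np

    def combine_opposing_views(front: np.ndarray, back: np.ndarray) -> np.ndarray:
        # Naively concatenate two H x W x 3 frames along the width axis to form
        # a single wide frame covering the combined field of view.
        return np.concatenate([front, back], axis=1)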
  • Once acquisition and processing of the image sensor data is complete, the image controller 1004 outputs the real-time 2D image stream 1010 to the image output I/F 1006, which generates the real-time 2D image stream 914 output. For example, as shown in FIG. 9, the real-time 2D image stream 914 is output from the image receiver 906 to the lighting processor 908.
  • FIG. 11 shows a detailed exemplary embodiment of the lighting processor 908 and memory 910 shown in FIG. 9. In an exemplary embodiment, the lighting processor 908 comprises an image input I/F 1102, image pre-processor 1104, light detector 1106, and UI element renderer 1108. In an exemplary embodiment, the lighting processor 908 comprises at least one of a CPU, processor, programmable logic, state machine, registers, memory, and/or discrete components that operate to perform the functions described below.
  • In an exemplary embodiment, the image input I/F 1102 operates to receive real-time 2D images 914 from the image receiver 906 and passes these images to the image pre-processor 1104. In an exemplary embodiment, the received images may be stored or buffered by the image input I/F 1102.
  • In an exemplary embodiment, the image pre-processor 1104 operates to receive a stream of real-time 2D images from the image input I/F 1102. The image pre-processor 1104 processes the received images to allow for the detection of light sources in the images. For example, the image pre-processor 1104 converts the images it receives into a luminosity map that indicates intensity or brightness. This pre-processed image is then output to the light detector 1106 and optionally can be stored in the memory 910.
  • In an exemplary embodiment, the light detector 1106 receives processed and unprocessed images from the image pre-processor 1104 and detects light sources within the images. For example, the light detector 1106 detects light sources and generates associated light configurations 916 that are stored in the memory 910. In an exemplary embodiment, the light configurations 916 include light location values 1110 and light characteristic values 1112. For example, the light location values 1110 include but are not limited to latitude and longitude position values as illustrated in FIG. 6. The light characteristic values 1112 include but are not limited to intensity, size, color, and any other light characteristics.
  • In an exemplary embodiment, the UI element renderer 1108 retrieves the light configurations 916 from the memory 910 and uses these configurations to configure light sources in the 3D scene 920. The color and brightness of the light sources are configured according to both color and non-color (e.g. ISO level) data provided by the image sensor.
  • The UI element renderer 1108 then retrieves UI elements 918 from the memory 910 and adds these elements to the 3D scene. Based on the light sources, surface highlighting is performed to highlight surfaces of the UI elements according to the location and characteristics of the light sources. Next, shadowing elements for the UI elements are arranged to appear as they would naturally: derived from the position of the arranged scene lights, occluded by the UI elements in the 3D scene, and cast into the 3D scene. The intensity and color of the shadowing are configured according to both color and non-color data provided by the image sensor(s). Once the 3D scene is created, the UI element renderer 1108 retrieves the scene from the memory 910 and outputs the lighted UI elements 924 to a device display.
  • Additionally, the form which a 3D scene and its related elements take may be explicitly expressed in the memory 910 (e.g., data structures including position, orientation, and scale) or they may be implicit (e.g., a visual effect like a shadow which is calculated from scene data and either stored as image data or immediately displayed). Furthermore, a 3D scene may include elements which are stored in volatile or non-volatile storage (e.g., main memory, hard disk drive, CPU cache or registers, GPU memory, GPU cache or registers), or sent to a display device.
  • In an exemplary embodiment, a graphical user interface is provided with UI elements that are predominantly two-dimensional (e.g., planar windows), but includes shadows which are derived from the angle of a notional light source.
  • The light source is directional and expressed in terms of (M, N), where M and N are the horizontal and vertical angles, respectively, relative to the normal of the display. A window region is expressed in terms of (X, Y) and (H, W), where X and Y are the horizontal and vertical position of the window in screen space, respectively, and H and W are the height and width in screen space, respectively. For a window placed at (X, Y), the screen-space position (X′, Y′) of a shadow cast U units behind the window plane can be expressed as follows.

  • X′=X−U*tan(M)

  • Y′=Y−U*tan(N)

  • The four corners of the shadow in screen space are represented as follows.

  • (X′, Y′)

  • (X′+W, Y′)

  • (X′, Y′+H)

  • (X′+W, Y′+H)
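  • A minimal sketch of this shadow-placement computation follows, assuming the angles M and N are given in degrees and that the screen-space axes match those of the expressions above; the function name is illustrative only and not part of the original disclosure.

    import math

    def shadow_corners(x: float, y: float, w: float, h: float,
                       m_deg: float, n_deg: float, u: float) -> list:
        # Offset the window position by U * tan(angle) along each axis, then
        # return the four corners of the shadow rectangle in screen space.
        xp = x - u * math.tan(math.radians(m_deg))
        yp = y - u * math.tan(math.radians(n_deg))
        return [(xp, yp), (xp + w, yp), (xp, yp + h), (xp + w, yp + h)]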
  • FIG. 12 shows an exemplary embodiment of a method 1200 for decorating UI elements with environmental lighting. For example, the method 1200 is suitable for use with the UIELS 900 shown in FIG. 9.
  • At block 1202, a real-time 2D image is acquired. For example, in an exemplary embodiment, the 2D image is acquired from one or more image sensors 902. For example, the image sensors can be part of a camera system attached to a hand-held device. In one embodiment, the acquired image provides a 360° field of view of the region surrounding the location of the image sensors. In an exemplary embodiment, the image sensors 902 output images at a frame rate of 30 fps. In exemplary embodiments, the image sensors include but are not limited to a high-resolution imaging sensor(s) that provides color data, auto-exposing imaging sensor(s) that provide both color data and brightness of the device's surroundings by way of an ISO value, and/or ambient light sensor(s) or other low-pixel-count image sensor that provides either color data or brightness of the device's surroundings or both.
  • At block 1204, an optional operation is performed to combine images from multiple sensors into the acquired real-time 2D image. For example, if several images are acquired by multiple image sensors, these images are combined into one image by connecting or otherwise stitching the images together to form one real-time 2D image. Thus, multiple images and/or image data are combined in virtually any desired way. In an exemplary embodiment, the image controller 1004 performs this operation.
  • At block 1206, the real-time 2D image is pre-processed. For example, the images are pre-processed to generate a luminosity map that shows intensity values for each pixel of the image. In an exemplary embodiment, the image pre-processor 1104 performs this operation as described above.
  • At block 1208, light sources and their relative positions and characteristics are detected in the real-time 2D image. For example, in an exemplary embodiment, the light detector 1106 performs this operation to generate a light configuration for each detected light source. In an exemplary embodiment, the detector 1106 applies one or more heuristics to the image and/or pre-processed image to detect light sources and their relative positions and characteristics.
  • At block 1210, the generated light configurations are stored in a memory. For example, the detector 1106 stores the generated light configurations in the memory 910.
  • At block 1212, a 3D scene is generated that includes light sources determined from the stored light configurations. For example, the renderer 1108 generates the 3D scene 920 in the memory 910, which includes light sources based on the stored light configurations.
  • At block 1214, a UI element is retrieved from memory and added to the 3D scene. For example, the renderer 1108 retrieves the UI element from the stored UI elements 918 and adds the UI element to the 3D scene 920.
  • At block 1216, the UI element is lighted based on the position and characteristics of the light sources configured in the 3D scene. For example, the renderer 1108 performs this operation. For example, the renderer 1108 lights the surfaces of the UI element based on the location of the light sources relative to the surfaces.
  • At block 1218, shadows determined from the position and characteristics of the light sources configured in the 3D scene are added to the UI element. For example, the renderer 1108 performs this operation. For example, the renderer 1108 determines the shadows cast by the UI element when struck by light from the light sources. The lighted and shadowed UI element forms a decorated UI element.
  • At block 1220, a determination is made as to whether there are more UI elements to be decorated. If there are no more UI elements to be decorated, the method proceeds to block 1222. If there are more UI elements to be decorated, the decorated UI element is stored and the method proceeds to block 1214. For example, the renderer 1108 performs this operation.
  • At block 1222, the decorated (lighted and shadowed) UI elements are displayed on a display screen. For example, the renderer 1108 obtains the lighted and shadowed elements from the 3D scene (or from storage) and outputs them to a device display as indicated at 924.
  • Thus, the method 1200 operates to decorate UI elements with environmental lighting. It should be noted that although the method 1200 describes specific operations, these operations may be changed, modified, rearranged, added to, and subtracted from within the scope of the embodiments.
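  • For illustration only, the overall flow of the method 1200 could be sketched as follows; the helper objects and their method names are hypothetical stand-ins for the components described with reference to FIGS. 9-11, not an implementation disclosed herein.

    def decorate_ui_elements(frame, ui_elements, preprocessor, detector, scene, renderer):
        # Blocks 1206-1212: pre-process the frame, detect light sources, and
        # configure corresponding light sources in the 3D scene.
        luminosity = preprocessor.luminosity_map(frame)
        light_configs = detector.detect(frame, luminosity)
        scene.set_lights(light_configs)
        # Blocks 1214-1220: place each UI element in the scene and decorate it
        # with highlights and shadows derived from the configured lights.
        decorated = []
        for element in ui_elements:
            scene.add(element)
            decorated.append(renderer.light_and_shadow(element, scene))
        # Block 1222: the decorated elements are ready for display.
        return decorated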
  • While particular embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that, based upon the teachings herein, changes and modifications may be made without departing from these exemplary embodiments of the present invention and its broader aspects. Therefore, the appended claims are intended to encompass within their scope all such changes and modifications as are within the true spirit and scope of these exemplary embodiments of the present invention.

Claims (26)

What is claimed is:
1. A method, comprising:
acquiring an image of an environment;
analyzing the image using at least one heuristic to determine a light configuration of a light source in the image; and
decorating a user interface (UI) element based on the light configuration of the light source to generate a decorated UI element.
2. The method of claim 1, wherein the operation of analyzing comprises calculating a relative angle between an image sensor that captured the image and a light source in the image.
3. The method of claim 2, wherein the operation of decorating further comprises decorating the UI element with shadow based on the relative angle and the light configuration of the light source.
4. The method of claim 3, further comprising configuring an intensity of the shadow based on the light configuration of the light source.
5. The method of claim 1, wherein the operation of analyzing comprises calculating a brightness of the image as part of the light configuration.
6. The method of claim 5, wherein the operation of decorating further comprises decorating the UI element with surface highlights based on the brightness of the image.
7. The method of claim 1, further comprising configuring color of the light source based on data from the image.
8. The method of claim 1, further comprising configuring color of a shadow based on data from the image.
9. The method of claim 1, further comprising rendering the UI element in a 3D scene.
10. The method of claim 9, further comprising storing the rendered images in memory.
11. The method of claim 1, further comprising storing the image in memory.
12. The method of claim 1, further comprising storing the light configuration in memory.
13. The method of claim 1, further comprising displaying the decorated UI element on a display device.
14. The method of claim 1, wherein the image is captured from a plurality of image sensors.
15. The method of claim 1, further comprising performing the method on a computer.
16. The method of claim 15, further comprising performing the method on at least one of a handheld device, smartphone, tablet computer, desktop computer, and laptop computer.
17. An apparatus, comprising:
an image sensor that acquires an image of an environment;
a light detector that analyzes the image using at least one heuristic to determine a light configuration of a light source in the image; and
a renderer that decorates a user interface (UI) element based on the light configuration of the light source to generate a decorated UI element.
18. The apparatus of claim 17, wherein the image sensor is a high-resolution image sensor.
19. The apparatus of claim 17, wherein the image sensor is one of an ambient light sensor and a low-pixel-count image sensor.
20. The apparatus of claim 17, wherein the image sensor comprises a plurality of image sensors that acquire a plurality of images of the environment.
21. The apparatus of claim 17, further comprising a memory that stores the image of the environment.
22. The apparatus of claim 17, further comprising a memory that stores the light configuration.
23. The apparatus of claim 17, further comprising a memory that stores the decorated UI element rendered from a 3D scene, wherein the UI element comprises an object lit by the light source.
24. The apparatus of claim 17, further comprising a display.
25. The apparatus of claim 17, wherein the apparatus is located in a computer.
26. The apparatus of claim 17, wherein the apparatus is located in one of a handheld device, smartphone, tablet computer, desktop computer, and laptop computer.
US16/163,305 2017-06-30 2018-10-17 Methods and Apparatus for Decorating User Interface Elements with Environmental Lighting Abandoned US20190066366A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/163,305 US20190066366A1 (en) 2017-06-30 2018-10-17 Methods and Apparatus for Decorating User Interface Elements with Environmental Lighting

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201762527778P 2017-06-30 2017-06-30
US201715705671A 2017-09-15 2017-09-15
US16/163,305 US20190066366A1 (en) 2017-06-30 2018-10-17 Methods and Apparatus for Decorating User Interface Elements with Environmental Lighting

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US201715705671A Continuation 2017-06-30 2017-09-15

Publications (1)

Publication Number Publication Date
US20190066366A1 true US20190066366A1 (en) 2019-02-28

Family

ID=64738780

Family Applications (3)

Application Number Title Priority Date Filing Date
US15/693,076 Abandoned US20190007672A1 (en) 2017-06-30 2017-08-31 Method and Apparatus for Generating Dynamic Real-Time 3D Environment Projections
US15/801,095 Expired - Fee Related US10540809B2 (en) 2017-06-30 2017-11-01 Methods and apparatus for tracking a light source in an environment surrounding a device
US16/163,305 Abandoned US20190066366A1 (en) 2017-06-30 2018-10-17 Methods and Apparatus for Decorating User Interface Elements with Environmental Lighting

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US15/693,076 Abandoned US20190007672A1 (en) 2017-06-30 2017-08-31 Method and Apparatus for Generating Dynamic Real-Time 3D Environment Projections
US15/801,095 Expired - Fee Related US10540809B2 (en) 2017-06-30 2017-11-01 Methods and apparatus for tracking a light source in an environment surrounding a device

Country Status (1)

Country Link
US (3) US20190007672A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11127212B1 (en) * 2017-08-24 2021-09-21 Sean Asher Wilens Method of projecting virtual reality imagery for augmenting real world objects and surfaces
US10771764B2 (en) * 2018-06-22 2020-09-08 Lg Electronics Inc. Method for transmitting 360-degree video, method for receiving 360-degree video, apparatus for transmitting 360-degree video, and apparatus for receiving 360-degree video
US10916062B1 (en) * 2019-07-15 2021-02-09 Google Llc 6-DoF tracking using visual cues

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110234543A1 (en) * 2010-03-25 2011-09-29 User Interfaces In Sweden Ab System and method for gesture detection and feedback
US20150009130A1 (en) * 2010-08-04 2015-01-08 Apple Inc. Three Dimensional User Interface Effects On A Display
US20150370322A1 (en) * 2014-06-18 2015-12-24 Advanced Micro Devices, Inc. Method and apparatus for bezel mitigation with head tracking

Family Cites Families (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3571954A (en) * 1966-05-13 1971-03-23 Planetaria Inc Space transit simulator planetarium
US4297723A (en) * 1980-01-28 1981-10-27 The Singer Company Wide angle laser display system
US4634384A (en) * 1984-02-02 1987-01-06 General Electric Company Head and/or eye tracked optically blended display system
GB8827952D0 (en) 1988-11-30 1989-01-05 Screen Form Inc Display device
US5638116A (en) 1993-09-08 1997-06-10 Sumitomo Electric Industries, Ltd. Object recognition apparatus and method
US5642293A (en) * 1996-06-03 1997-06-24 Camsys, Inc. Method and apparatus for determining surface profile and/or surface strain
US6369830B1 (en) 1999-05-10 2002-04-09 Apple Computer, Inc. Rendering translucent layers in a display system
US6559853B1 (en) 2000-02-16 2003-05-06 Enroute, Inc. Environment map creation using texture projections with polygonal curved surfaces
US20010056574A1 (en) * 2000-06-26 2001-12-27 Richards Angus Duncan VTV system
US7742073B1 (en) 2000-11-01 2010-06-22 Koninklijke Philips Electronics N.V. Method and apparatus for tracking an object of interest using a camera associated with a hand-held processing device
US7009663B2 (en) 2003-12-17 2006-03-07 Planar Systems, Inc. Integrated optical light sensitive active matrix liquid crystal display
US7883415B2 (en) 2003-09-15 2011-02-08 Sony Computer Entertainment Inc. Method and apparatus for adjusting a view of a scene being displayed according to tracked head motion
JP4532856B2 (en) 2003-07-08 2010-08-25 キヤノン株式会社 Position and orientation measurement method and apparatus
US7694233B1 (en) 2004-04-30 2010-04-06 Apple Inc. User interface presentation of information in reconfigured or overlapping containers
US20070182812A1 (en) * 2004-05-19 2007-08-09 Ritchey Kurtis J Panoramic image-based virtual reality/telepresence audio-visual system and method
KR101112735B1 (en) 2005-04-08 2012-03-13 삼성전자주식회사 3D display apparatus using hybrid tracking system
US7657849B2 (en) 2005-12-23 2010-02-02 Apple Inc. Unlocking a device by performing gestures on an unlock image
US8564544B2 (en) 2006-09-06 2013-10-22 Apple Inc. Touch screen device, method, and graphical user interface for customizing display of content category icons
US8717412B2 (en) * 2007-07-18 2014-05-06 Samsung Electronics Co., Ltd. Panoramic image production
US7857461B2 (en) * 2007-11-06 2010-12-28 Panasonic Corporation Projector and projection method
US8180112B2 (en) 2008-01-21 2012-05-15 Eastman Kodak Company Enabling persistent recognition of individuals in images
DE202009019125U1 (en) 2008-05-28 2016-12-05 Google Inc. Motion-controlled views on mobile computing devices
US8098894B2 (en) 2008-06-20 2012-01-17 Yahoo! Inc. Mobile imaging device as navigator
CN111522493A (en) 2008-08-22 2020-08-11 谷歌有限责任公司 Navigation in three-dimensional environment on mobile device
US8788977B2 (en) 2008-11-20 2014-07-22 Amazon Technologies, Inc. Movement recognition as input mechanism
JP5087532B2 (en) 2008-12-05 2012-12-05 ソニーモバイルコミュニケーションズ株式会社 Terminal device, display control method, and display control program
US20100275122A1 (en) 2009-04-27 2010-10-28 Microsoft Corporation Click-through controller for mobile interaction
JP5393318B2 (en) 2009-07-28 2014-01-22 キヤノン株式会社 Position and orientation measurement method and apparatus
US8762846B2 (en) 2009-11-16 2014-06-24 Broadcom Corporation Method and system for adaptive viewport for a mobile device based on viewing angle
US8922480B1 (en) 2010-03-05 2014-12-30 Amazon Technologies, Inc. Viewer-based device control
US8913004B1 (en) 2010-03-05 2014-12-16 Amazon Technologies, Inc. Action based device control
US8581905B2 (en) 2010-04-08 2013-11-12 Disney Enterprises, Inc. Interactive three dimensional displays on handheld devices
US9204040B2 (en) * 2010-05-21 2015-12-01 Qualcomm Incorporated Online creation of panoramic augmented reality annotations on mobile platforms
US20120249544A1 (en) * 2011-03-29 2012-10-04 Giuliano Maciocci Cloud storage of geotagged maps
CN103562791A (en) * 2011-04-18 2014-02-05 眼见360股份有限公司 Apparatus and method for panoramic video imaging with mobile computing devices
US20120314899A1 (en) 2011-06-13 2012-12-13 Microsoft Corporation Natural user interfaces for mobile image viewing
US9880640B2 (en) 2011-10-06 2018-01-30 Amazon Technologies, Inc. Multi-dimensional interface
US9324183B2 (en) 2011-11-29 2016-04-26 Apple Inc. Dynamic graphical interface shadows
US9582741B2 (en) * 2011-12-01 2017-02-28 Xerox Corporation System diagnostic tools for printmaking devices
US20160292924A1 (en) * 2012-10-31 2016-10-06 Sulon Technologies Inc. System and method for augmented reality and virtual reality applications
KR20140112909A (en) * 2013-03-14 2014-09-24 삼성전자주식회사 Electronic device and method for generating panorama image
KR20150068299A (en) * 2013-12-09 2015-06-19 씨제이씨지브이 주식회사 Method and system of generating images for multi-surface display
US9582731B1 (en) * 2014-04-15 2017-02-28 Google Inc. Detecting spherical images
US20170026659A1 (en) * 2015-10-13 2017-01-26 Mediatek Inc. Partial Decoding For Arbitrary View Angle And Line Buffer Reduction For Virtual Reality Video
US20170169617A1 (en) * 2015-12-14 2017-06-15 II Jonathan M. Rodriguez Systems and Methods for Creating and Sharing a 3-Dimensional Augmented Reality Space
US10311644B2 (en) * 2016-12-14 2019-06-04 II Jonathan M. Rodriguez Systems and methods for creating and sharing a 3-dimensional augmented reality space
US10095928B2 (en) * 2015-12-22 2018-10-09 WorldViz, Inc. Methods and systems for marker identification
US10065049B2 (en) * 2016-01-25 2018-09-04 Accuray Incorporated Presenting a sequence of images associated with a motion model

Also Published As

Publication number Publication date
US20190005675A1 (en) 2019-01-03
US10540809B2 (en) 2020-01-21
US20190007672A1 (en) 2019-01-03

Similar Documents

Publication Publication Date Title
US11182961B2 (en) Method and system for representing a virtual object in a view of a real environment
CN109997167B (en) Directional image stitching for spherical image content
JP5916248B2 (en) Image generating apparatus, image generating method, program, and computer-readable information storage medium
CN112884874B (en) Method, device, equipment and medium for applying applique on virtual model
US10891796B2 (en) Systems and methods for augmented reality applications
US20190066366A1 (en) Methods and Apparatus for Decorating User Interface Elements with Environmental Lighting
EP2933781A2 (en) Method and system for representing a virtual object in a view of a real environment
US11562545B2 (en) Method and device for providing augmented reality, and computer program
CN112262413A (en) Real-time synthesis in mixed reality
US20230072701A1 (en) Ambient light based mixed reality object rendering
US10268216B2 (en) Method and system for providing position or movement information for controlling at least one function of an environment
US11941729B2 (en) Image processing apparatus, method for controlling image processing apparatus, and storage medium
KR20180062867A (en) Display apparatus and controlling method thereof
Santos et al. Supporting outdoor mixed reality applications for architecture and cultural heritage
Schwandt et al. Glossy reflections for mixed reality environments on mobile devices
KR102623700B1 (en) Information processing apparatus, information processing method, and computer program
KR101488647B1 (en) Virtual illumination of operating method and apparatus for mobile terminal
EP2706508B1 (en) Reducing latency in an augmented-reality display
Schwandt et al. Illumination in Mixed Reality
CN115244494A (en) System and method for processing a scanned object
KR20220154780A (en) System and method for real-time ray tracing in 3D environment
CN116486046A (en) Method and equipment for displaying virtual object based on illumination intensity
CN111176452A (en) Method and apparatus for determining display area, computer system, and readable storage medium

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION