WO2020091739A1 - Accessory devices to reflect environmental lighting - Google Patents

Accessory devices to reflect environmental lighting

Info

Publication number
WO2020091739A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
lens
lighting
computing device
environmental lighting
Prior art date
Application number
PCT/US2018/058133
Other languages
French (fr)
Inventor
Craig Peter Sayers
Ian N. Robinson
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P. filed Critical Hewlett-Packard Development Company, L.P.
Priority to PCT/US2018/058133
Publication of WO2020091739A1


Classifications

    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/02Bodies
    • G03B17/17Bodies with reflectors arranged in beam forming the photographic image, e.g. for reducing dimensions of camera
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B17/00Details of cameras or camera bodies; Accessories therefor
    • G03B17/56Accessories
    • G03B17/565Optical accessories, e.g. converters for close-up photography, tele-convertors, wide-angle convertors
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/45Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/57Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Definitions

  • a mobile computing device can include a camera.
  • the camera can include a lens and an optical sensor to capture images of an environment the camera is in.
  • FIG. 1A illustrates an example accessory device to reflect environmental lighting to a camera
  • FIG. 1B illustrates an example computing system using accessory device 100 of FIG. 1A.
  • FIG. 2 illustrates an example accessory device with multiple wide-angle lenses and multiple reflective surfaces
  • FIG. 3 illustrates an example accessory device for a computing device with multiple cameras
  • FIG. 4 illustrates another example accessory device to reflect environmental lighting to a camera
  • FIG. 5 illustrates an example method for generating a lighting effect on an object to be rendered on a display
  • FIG. 6 illustrates a computer system upon which aspects described herein may be implemented.
  • Examples provide for an accessory device to reflect environmental lighting to a camera of a computing device.
  • the camera can capture environmental lighting of an actual environment around the camera.
  • the accessory device can include an accessory housing that can be positioned over the camera of the computing device. Additionally, the accessory device can include an optical structure.
  • the optical structure can be in optical alignment with a lens of the camera. In some examples, the optical structure can reflect the environmental lighting received from a region outside of an original field-of-view of the lens of the camera to the lens of the camera. Examples provide for environmental lighting to be captured in non-cognizable form, meaning lighting information determined from the non-cognizable environmental lighting captured by a sensor of the camera is insufficient to render a recognizable image of an object or a person that may be present in the actual environment.
  • an example accessory device can be used to enhance lighting effects of objects (e.g., three-dimensional objects) depicted on the display of a computing device.
  • a computing device can be used to create a realistic depiction of an object by using lighting effects that reflect the environment within which the computing device is located.
  • the lighting effects can enhance a three-dimensional representation of a depicted object, by making the depicted object appear more realistic than what would otherwise be possible. For example, if there were a window to the left of the computing device then the three-dimensional representation of an object rendered on a display of the computing device may include a brighter illumination from that direction.
  • Many applications can utilize an accessory device as described with examples. In the context of three-dimensional printing, for example, a user may want to preview, on a display of the user's computing device, an object that is to be created. Examples as described enable the user to view a realistic depiction of the object, including lighting effects of the immediate environment, prior to printing the object.
  • a processor of a computing device can determine lighting information based on at least a portion of environmental lighting captured by a camera of the computing device. Additionally, the processor can utilize the lighting information to determine and generate a lighting effect on an object (e.g., three-dimensional object) or modify the object that is to be rendered on a display. That way, the object can be displayed in a way that reflects the lighting conditions similar to that of an actual environment surrounding the camera.
  • an accessory device can passively enable an existing camera of a computing device to capture environmental lighting from a region outside of an original field-of-view of a lens of the camera, including from above and/or behind the camera.
  • the image captured by the camera using the accessory may be non-cognizable to protect the privacy of a person visible to the camera.
  • the accessory device may enable the existing camera to capture environmental lighting of an environment with a wider dynamic range than the camera would otherwise be capable of.
  • Examples described herein provide that methods, techniques, and actions performed by a computing device are performed programmatically, or as a computer-implemented method.
  • Programmatically means through the use of code or computer-executable instructions. These instructions can be stored in a memory resource of the computing device.
  • a programmatically performed step may or may not be automatic.
  • Examples described herein can be implemented using programmatic modules, engines, or components.
  • a programmatic module, engine, or component can include a program, a sub-routine, a portion of a program, or a software component or a hardware component capable of performing stated tasks or functions.
  • a module or component can exist on a hardware component independently of other modules or components.
  • a module or component can be a shared element or process of other modules, programs, or machines.
  • examples described herein can utilize specialized computing devices, including processing and memory resources.
  • examples described may be implemented, in whole or in part, on computing devices such as servers, desktop computers, cellular phones or smartphones, personal digital assistants (PDAs), laptop computers, printers, digital picture frames, network equipment (e.g., routers), wearable computing devices, and tablet devices.
  • Memory, processing, and network resources may all be used in connection with the establishment, use, or performance of any example described herein (including with the performance of any method or with the implementation of any system).
  • For instance, a computing device coupled to a data storage device storing the computer program, and configured to execute the program, corresponds to a special-purpose computing device.
  • any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
  • examples described herein may be implemented through the use of instructions that are executable by a processor. These instructions may be carried on a computer-readable medium.
  • Machines shown or described with figures below provide examples of processing resources and computer-readable mediums on which instructions for implementing examples described can be carried and/or executed.
  • the numerous machines shown with examples described include processor(s) and various forms of memory for holding data and instructions.
  • Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers.
  • Other examples of computer storage mediums include portable storage units, such as CD or DVD units, flash memory (such as carried on smartphones, multifunctional devices or tablets), and magnetic memory.
  • Computers, terminals, network enabled devices are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums.
  • examples may be implemented in the form of computer programs, or a computer usable carrier medium capable of carrying such a program.
  • examples described herein may be implemented through the use of dedicated hardware logic circuits that are comprised of an interconnection of logic gates.
  • Such circuits are typically designed using a hardware description language (HDL), such as Verilog and VHDL. These languages contain instructions that ultimately define the layout of the circuit. However, once the circuit is fabricated, there are no instructions. All the processing is performed by interconnected gates.
  • FIG. 1A illustrates an example accessory device to reflect environmental lighting to a camera.
  • accessory device 100 can reflect environmental lighting to camera 194 of computing device 190.
  • camera 194 may include a lens that is flush or substantially flush on a facade of computing device 190.
  • Accessory device 100 can include accessory housing 102.
  • Accessory housing 102 can be dimensioned to fit a portion of computing device 190 that includes camera 194. That way, accessory housing 102 can be positioned over the portion of computing device 190 that includes camera 194.
  • accessory housing 102 can include an optical structure that is in optical alignment with a lens of camera 194.
  • the optical structure can reflect environmental lighting received from a region outside of original field-of-view 196 of the lens of camera 194, to the lens of camera 194.
  • the environmental lighting can include lighting information of an environment camera 194 is in.
  • the environmental lighting can be in non-cognizable form, meaning lighting information determined from non-cognizable environmental lighting can be insufficient to render a high-resolution image of an object that may be present in the environment camera 194 is in.
  • an optical structure can include wide-angle lens 104 and reflective surface 106 (e.g., a mirror) to receive and reflect environmental lighting from a region outside of original field-of-view 196 of the lens of camera 194 (e.g., environmental lighting from above and/or behind camera 194), to the lens of camera 194.
  • Wide-angle lens 104 can extend or widen original field-of-view 196 of camera 194.
  • reflective surface 106 can receive environmental lighting, or at least a portion of the environmental lighting, from wide-angle lens 104 and reflect the environmental lighting, to a lens of camera 194. That way, the optical structure can enable camera 194 to receive environmental lighting of an environment surrounding computing device 190 over wide-angle field-of-view 110.
  • FIG. 1B illustrates an example computing system using the accessory device 100 of FIG. 1A.
  • the example computing system can include computing device 190.
  • Computing device 190 can include a housing that includes display 192 and camera 194.
  • accessory device 100 can collect environmental lighting from a region outside of an original field-of-view of a lens of camera 194 (e.g., wide-angle field-of-view 110) for camera 194 to capture.
  • accessory device 100 can include wide-angle lens 104 to capture environmental lighting over wide-angle field-of-view 110.
  • accessory device 100 can include a reflective surface (e.g., reflective surface 106 - not shown) within accessory device 100, to reflect the environmental lighting collected by wide-angle lens 104 to the lens of camera 194.
  • FIG. 2 illustrates an example accessory device with multiple wide-angle lenses and multiple reflective surfaces.
  • accessory device 200 can include multiple wide-angle lenses (e.g., wide-angle lens 204 and wide-angle lens 206) and multiple reflective surfaces (e.g., reflective surface 208 and reflective surface 210) to enable camera 194 to capture environmental lighting from a region outside of original field-of-view 196 of a lens of camera 194.
  • the environmental lighting can be in non-cognizable form.
  • Accessory device 200 can include accessory housing 202.
  • Accessory housing 202 can be dimensioned to fit a portion of computing device 190 that includes camera 194. That way, accessory housing 202 can be positioned over the portion of computing device 190 that includes camera 194. In some examples, accessory housing 202 can extend from a back facade of computing device 190 to a front facade of computing device 190 to create an enclosure in front of camera 194. Additionally, accessory housing 202 can be flush against or near to the back facade of computing device 190.
  • accessory housing 202 can include an optical structure that is in optical alignment with a lens of camera 194.
  • the optical structure can include multiple wide-angle lenses, such as wide-angle lens 204 and wide-angle lens 206, and multiple reflective surfaces (e.g., mirrors), such as reflective surface 208 and reflective surface 210.
  • the multiple wide-angle lenses (e.g., wide-angle lens 204 and wide-angle lens 206) can be embedded within accessory housing 202 and the multiple reflective surfaces (e.g., reflective surface 208 and reflective surface 210) can be positioned within the enclosure.
  • wide-angle lens 204 and wide-angle lens 206 can be embedded into or supported by accessory housing 202 such that wide-angle lens 204 and wide-angle lens 206 are positioned above camera 194 along a vertical or Y axis such that wide-angle lens 204 and wide-angle lens 206 can capture or direct environmental lighting into the enclosure.
  • reflective surface 208 can be positioned in front of camera 194 along a horizontal or X axis such that environmental lighting or a portion of the environmental lighting captured by wide-angle lens 204 can be reflected to a lens of camera 194.
  • reflective surface 210 can be positioned in front of camera 194 along the horizontal or X axis such that environmental lighting or a portion of environmental lighting captured by wide-angle lens 206 can be reflected to camera 194.
  • positioning of the multiple reflective surfaces in the enclosure can differ in horizontal and/or vertical positioning so as not to obstruct a pathway of the environmental lighting being reflected from the multiple wide-angle lenses to camera 194. That way, wide-angle lens 204 and wide-angle lens 206 can collect environmental lighting of an environment that is outside original field-of-view 196 of the lens of camera 194 (e.g., the combined field-of-view 212 and field-of-view 214). Additionally, reflective surface 208 and reflective surface 210 can receive and reflect environmental lighting collected from wide-angle lens 204 and wide-angle lens 206 to the lens of camera 194.
  • computing device 190 can include multiple cameras.
  • FIG. 3 illustrates an example accessory device for a computing device with multiple cameras.
  • computing device 190 can include camera 194 on a front facade of computing device 190 and camera 198 on a back facade of computing device 190.
  • an optical structure of an accessory device can include at least one wide-angle lens and reflective surface for each camera that computing device 190 has.
  • Accessory device 300 can include accessory housing 302.
  • Accessory housing 302 can be dimensioned to fit a portion of computing device 190 that includes camera 194 and camera 198. That way, accessory housing 302 can be positioned over the portion of computing device 190 that includes camera 194 and camera 198. In some examples, accessory housing 302 can extend from a back facade of computing device 190 to a front facade of computing device 190 to create an enclosure around camera 194 and camera 198. In such examples, the created enclosure can be around a top portion of computing device 190. In other examples, a top portion of accessory housing 302 can be flush against or next to the top portion of computing device 190. That way two enclosures are created, one positioned in front of camera 194 and the other positioned in front of camera 198.
  • accessory housing 302 can include an optical structure within the enclosure(s) around camera 194 and camera 198.
  • the optical structure is in optical alignment with a lens of camera 194 and a lens of camera 198.
  • the optical structure can receive and reflect environmental lighting received from a region outside of original field-of-view 196 of a lens of camera 194 and original field-of-view 197 of a lens of camera 198, to the lens of camera 194 and the lens of camera 198.
  • the optical structure of accessory device 300 can include at least a wide-angle lens and reflective surface (e.g., a mirror) for each camera (e.g., camera 194 and camera 198) of computing device 190.
  • At least one wide-angle lens can be embedded within accessory housing 302 for each of the multiple cameras of computing device 190.
  • wide-angle lens 308 can be embedded into or supported by accessory housing 302 such that wide-angle lens 308 can be positioned in front of camera 198 along a horizontal or X axis and above camera 198 along a vertical or Y axis.
  • wide-angle lens 310 can be embedded into or supported by accessory housing 302 such that wide-angle lens 310 can be positioned in front of camera 194 along a horizontal or X axis and above camera 194 along a vertical or Y axis. That way, wide-angle lens 308 and wide-angle lens 310 can collect or direct environmental lighting that is outside original field-of-view 196 (e.g., the combined field-of-view 312 and field-of-view 314) of the lens of camera 194 and the lens of camera 198 into the enclosure(s).
  • accessory housing 302 can include at least one reflective surface for each of the multiple cameras of computing device 190.
  • reflective surface 304 can be positioned in front of camera 198 along a horizontal or x axis such that environmental lighting or a portion of environmental lighting collected by wide-angle lens 308 can be reflected to the lens of camera 198.
  • reflective surface 306 can be positioned in front of camera 194 along the horizontal or x axis such that environmental lighting or a portion of environmental lighting collected by wide-angle lens 310 can be reflected to the lens of camera 194.
  • an optical structure of an accessory device can include optical fiber light pipes to collect environmental lighting from multiple directions.
  • FIG. 4 illustrates another example accessory device to reflect environmental lighting to a camera. As illustrated in FIG. 4, accessory device 400 can redirect environmental lighting from a region outside original field-of-view 196 of a lens of camera 194 to the lens of camera 194. The environmental lighting can be in non-cognizable form.
  • Accessory device 400 can include accessory housing 402.
  • Accessory housing 402 can be dimensioned to fit a portion of computing device 190 that includes camera 194. That way, accessory housing 402 can be positioned over the portion of computing device 190 that includes camera 194. In some examples, as illustrated in FIG. 4, accessory housing 402 can extend from the back facade of computing device 190 to a front facade of computing device 190 to create an enclosure in front of camera 194. In such examples, accessory housing 402 can be flush against the back facade of computing device 190.
  • accessory housing 402 can include an optical structure.
  • the optical structure can be in optical alignment with the lens of camera 194. Additionally, the optical structure can reflect environmental lighting received from a region outside of original field-of-view 196 of lens of camera 194, to the lens of camera 194.
  • the optical structure can include multiple optical fiber light pipes (as illustrated in FIG. 4 as optical fiber light pipe 404, optical fiber light pipe 406, optical fiber light pipe 408, optical fiber light pipe 410, and optical fiber light pipe 412) to receive and redirect environmental lighting to camera 194.
  • the multiple optical fibers of the optical structure of accessory device 400 can enable camera 194 to capture environmental lighting from a region outside of original field-of-view 196 of a lens of camera 194.
  • the combined fields-of-view of the multiple optical fiber light pipes can widen original field-of-view 196 of camera 194 to widened field-of-view 414.
  • the multiple optical fiber light pipes can be positioned in front of camera 194 along a horizontal or X axis and extend through an opening on a top portion of accessory housing 402. That way, accessory housing 402 can support and hold the multiple optical fibers.
  • camera 194 can capture environmental lighting in non-cognizable form.
  • an image captured by camera 194 of an environment around computing device 190 can be lower-resolution and preserve the privacy of a person that may be visible to the camera.
  • an image captured by camera 194 that is lower in resolution can utilize a lesser amount of resources (e.g., hardware components, such as integrated circuits or specialized integrated circuits, and/or software or logic stored on the hardware components) than a high-resolution image.
  • a processor of computing device 190 can instruct camera 194 to capture a blurred image and then combine
  • an optical structure of an accessory device (e.g., accessory device 100, accessory device 200, accessory device 300 or accessory device 400) can be visibly imperfect, so that a person examining the accessory can verify that an image captured by camera 194 will be non-cognizable.
  • camera 194 can be coupled to a mechanical device (e.g., a switch, toggle or a component, such as a ring, to alter the focus of the camera) that can cause camera 194 to go out of focus. That way, camera 194 can capture blurred images (e.g., the environmental lighting in non-cognizable form).
  • an optical filter can be included with an optical structure of an accessory device (e.g., accessory device 100, accessory device 200, accessory device 300 or accessory device 400) to enable the camera to capture environmental lighting with higher dynamic range.
  • the optical filter can filter light captured by wide-angle lens 104.
  • the optical filter is a neutral density filter.
  • camera 194 can capture multiple images. For example, at least one image can be captured without use of the neutral density filter and at least one image can be captured with use of the neutral density filter. That way, the captured images contain two similar views of the surrounding environment, but with different levels of light sensitivity (a minimal merge of such captures is sketched after this list).
  • a processor (e.g., a processor of computing device 190) can combine the multiple images to determine lighting information with a higher dynamic range.
  • the optical filter can be a stripe optical filter, where the stripes act as neutral density filters.
  • the stripe optical filter can cause a portion of a field-of-view of camera 194 to be dedicated to capturing low light intensity, while other portions of the field-of-view of camera 194 can be dedicated to capturing high light intensity. That way, camera 194 can capture an image that has portions that are filtered and other portions that are not filtered.
  • a processor (e.g., a processor of computing device 190) can combine the filtered and unfiltered portions of such an image to determine lighting information with a higher dynamic range.
  • an accessory device (e.g., accessory device 100, accessory device 200, accessory device 300 or accessory device 400) can include the optical filter at various positions within its optical structure.
  • the optical filter can be positioned between a wide-angle lens (e.g., wide-angle lens 104, wide-angle lens 204, wide-angle lens 206, wide-angle lens 308 and/or wide-angle lens 310) and a reflective surface (e.g., reflective surface 106, reflective surface 208, reflective surface 210, reflective surface 304 and/or reflective surface 306).
  • the optical filter can be positioned between the reflective surface (e.g., reflective surface 106, reflective surface 208, reflective surface 210, reflective surface 304 and/or reflective surface 306) and camera 194.
  • the optical filter can be positioned between multiple optical fiber light pipes (optical fiber light pipe 404, optical fiber light pipe 406, optical fiber light pipe 408, optical fiber light pipe 410, and optical fiber light pipe 412) and camera 194.
  • a diffusing effect may be applied to a lens of camera 194.
  • a film with a roughened front surface or that is semi-transparent may be placed in front of the lens of camera 194.
  • the film can cause a diffusing effect that causes camera 194 to capture non-cognizable environmental lighting, while preserving the privacy of a user.
  • the film can cause camera 194 to capture a blurry image that includes a user.
  • the blurry image can include lighting effects of the environment without capturing a clear image of the area around computing device 190, including the user.
  • computing device 190 can include a processor to determine a lighting effect for an object that is to be rendered on display 192, based on lighting information of environmental lighting captured by camera 194.
  • the lighting effect on the object can reflect a lighting source of an environment where the environmental lighting is collected by camera 194.
  • the processor can generate the lighting effect on the object to be rendered on display 192 in a manner that reflects the lighting conditions of an environment camera 194 is in. That way, the generated lighting effect can enhance a perception (e.g., a three-dimensional perception) of the object rendered on display 192.
  • the processor can determine lighting information from the environmental lighting captured by camera 194. Examples of lighting information the processor can determine from the environmental lighting captured by camera 194 include a number of light sources in the environment, a location of each light source in the environment, and a coverage of ambient light from each light source.
  • a processor of computing device 190 can determine lighting information of the environment camera 194 is in, from environmental lighting captured by camera 194.
  • the processor can utilize the lighting information to determine and generate a lighting effect on the object to be rendered on display 192.
  • the processor can generate the lighting effect similar to how ambient light affects the lighting conditions in a room. For example, if camera 194 is in a room where ambient light enters through a window, then based on the lighting information, the processor can determine which portions of the object are to be covered by light. Additionally, the processor can determine how intense various portions of the light are to be on the object, based on the lighting information. Additionally, the processor can determine the color of various portions of the light on the object based on the lighting information. Moreover, the processor can determine areas on the object where shadowing occurs, based on the lighting information. That way, the processor can generate a lighting effect on the object to be rendered on display 192 in a manner that reflects how the ambient light from the window affects the lighting conditions in the room (a minimal shading sketch follows this list).
  • an object can be rendered and displayed with a generated environment that combines a pre-existing environment with lighting conditions of an environment camera 194 is in.
  • the processor can select an environmental map (e.g., a pre-generated map of an environment, such as a jungle).
  • the environmental map can be stored on computing device 190.
  • the environmental map can be obtained from a third-party device.
  • the processor of computing device 190 can alter the environmental map, based on lighting information of environmental lighting captured by camera 194. For example, the processor can brighten or darken some areas in the selected environmental map of a jungle. Additionally, the processor can determine and generate a lighting effect on the object in the selected environmental map of a jungle, based on lighting conditions of an office room that camera 194 is in.
  • a lighting effect determined and generated by a processor of computing device 190 can be based on parameters of the object.
  • the object to be displayed can be a preview of an output of a three-dimensional printer system (e.g., a three-dimensional object).
  • a user may input parameters indicating an output of the three-dimensional printer system is to have a reflective surface.
  • the processor can utilize such parameters and determine that the object to be generated in the preview is to have a reflective surface.
  • the processor can generate the lighting effect on the object such that the surface of the object reflects light.
  • the user may input parameters indicating the output of the three-dimensional printer system is to have a more matte-like surface.
  • the processor can utilize such parameters and determine that the object to be generated in the preview is to have a matte-like surface. Additionally, the processor can generate the lighting effect on the object such that the generated surface of the object does not reflect light as much as an object with a reflective surface.
  • FIG. 5 illustrates an example method for generating a lighting effect on an object to be rendered on a display.
  • a processor of computing device 190 can capture environmental lighting of a given environment using camera 194 of computing device 190 (500).
  • computing device 190 can include an accessory device (e.g., accessory device 100, accessory device 200, accessory device 300 or accessory device 400) to receive and reflect environmental lighting from a region outside an original field-of-view of a lens of camera 194. Additionally, the environmental lighting can be in non-cognizable form.
  • the accessory device can include an optical structure that includes an optical filter. Examples of an optical filter that the optical structure can include are a neutral density filter and a stripe filter.
  • computing device 190 can include multiple cameras (e.g., camera 194 and camera 198).
  • in such examples, an optical structure of the accessory device (e.g., accessory device 300) can include a wide-angle lens and a reflective surface for each camera.
  • a processor of computing device 190 can determine lighting information from the captured environmental lighting and determine a lighting effect using the lighting information (502).
  • the lighting effect can be based on the lighting information of environmental lighting captured by camera 194.
  • the processor can render the object on display 192 of computing device 190 and the rendering of the object can include the lighting effect (504).
  • the processor can generate the lighting effect on the object to be rendered on display 192 in a manner that reflects the lighting conditions of an environment camera 194 is in. That way, the generated lighting effect can enhance a perception (e.g., a three-dimensional perception) of the object rendered on display 192.
  • an object can be generated and displayed with a generated environment that emulates lighting conditions of an environment camera 194 is in.
  • a processor of computing device 190 can generate the object with a selected environmental map of a jungle.
  • the selected environmental map can be obtained from a memory resource of computing device 190 that stores a set of environmental maps.
  • the environmental map can be obtained from a third-party device.
  • the processor can generate a lighting effect on the object in the selected environmental map of a jungle, based on lighting conditions of an office room that camera 194 is in.
  • a generated lighting effect can be based on parameters of an object. For example, a user may input parameters indicating characteristics of a surface of the object (e.g., reflective, matte, stippled, etc.). Additionally, the processor can generate the lighting effect on the object in a way that takes into account the input parameters that indicate the characteristics of the surface of the object to be rendered on display 192. For example, if the user inputs parameters that indicate the surface of the object is to be reflective, then the processor can generate a lighting effect in which light appears to reflect off of the reflective surface of the object.
  • FIG. 6 is a block diagram that illustrates a computer system upon which examples described herein may be implemented.
  • a computing device 600 may correspond to a mobile computing device, such as a cellular device that is capable of telephony, messaging, and data services.
  • the computing device 600 can correspond to a device operated by a user. Examples of such devices include smartphones, handsets, tablet devices, or in-vehicle computing devices that communicate with cellular carriers.
  • the computing device 600 includes a processor 610, memory resources 620, a display 630 (e.g., such as a touch-sensitive display device), communication sub-systems 640 (including wireless communication systems), a sensor set 650 (e.g., accelerometer and/or gyroscope, microphone, barometer, etc.), and location detection mechanisms (e.g., GPS component) 660.
  • at least one of the communication sub-systems 640 sends and receives cellular data over data channels and voice channels.
  • the communications sub-systems 640 can include a cellular transceiver and a short-range wireless transceiver.
  • communication sub-systems 640 can send and receive cellular data over network(s) 670 (e.g., data channels and voice channels).
  • the processor 610 can exchange data with a third-party device or system (not illustrated in FIG. 6) via the communications sub-systems 640 to obtain environmental maps.
  • Memory resources 620 can store instructions for a variety of operations. For example, as illustrated in FIG. 6, memory resources 620 can include lighting effect instructions 622. Additionally, processor 610 can execute lighting effect instructions 622 to utilize lighting information of environmental lighting captured by a camera (e.g., camera 194) to determine and generate a lighting effect on an object to be rendered on display 630. That way, the object with the generated lighting effect rendered on display 630 can enhance a perception (e.g., a three-dimensional perception) of the object.
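The neutral-density bullets above describe capturing one image with and one without the filter, but not the merging arithmetic. Below is a minimal sketch, assuming a known 8x filter attenuation and two already-aligned captures as HxWx3 arrays in [0, 1]; merge_hdr and ND_ATTENUATION are illustrative names, not from the patent.

```python
import numpy as np

ND_ATTENUATION = 8.0  # assumed filter strength (2**stops); not from the patent

def merge_hdr(unfiltered: np.ndarray, filtered: np.ndarray,
              clip: float = 0.95) -> np.ndarray:
    """Merge two captures of the same scene into one radiance estimate.

    Where the unfiltered capture saturates, trust the ND-filtered capture
    (scaled back up by the attenuation); elsewhere keep the unfiltered values.
    """
    saturated = unfiltered >= clip
    return np.where(saturated, filtered * ND_ATTENUATION, unfiltered)
```

The same scaling applies per-stripe for the stripe optical filter, with the filtered and unfiltered portions of a single image taking the place of the two captures.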
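The lighting-effect bullets leave the shading model unspecified. The following is a minimal sketch, assuming simple Lambertian (diffuse) shading with a single dominant light direction and color taken from the determined lighting information; all names are illustrative.

```python
import numpy as np

def shade(normals: np.ndarray, albedo: np.ndarray, light_dir: np.ndarray,
          light_color: np.ndarray, ambient: float = 0.1) -> np.ndarray:
    """Shade the previewed object from per-pixel surface normals.

    normals: HxWx3 unit normals of the object's visible surface.
    albedo:  HxWx3 base color; its values also stand in for how matte
             or reflective the user asked the printed surface to be.
    """
    l = light_dir / np.linalg.norm(light_dir)
    lambert = np.clip(normals @ l, 0.0, 1.0)[..., None]  # cosine falloff
    return albedo * (ambient + lambert * light_color)

# E.g., a window to the left of the device brightens left-facing surfaces:
# shade(normals, albedo, light_dir=np.array([-1.0, 0.0, 0.0]),
#       light_color=np.array([1.0, 0.97, 0.9]))
```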

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

An example accessory device can include an accessory housing that is positionable over at least a first camera of a computing device. Additionally, the accessory device can include an optical structure. The optical structure can be provided with the accessory housing in optical alignment with a lens of the first camera. Additionally, the optical structure can reflect environmental lighting received from a region outside of an original field-of-view of the lens of the first camera to the lens of the first camera. Moreover, the environmental lighting can be received in a non-cognizable form.

Description

ACCESSORY DEVICES TO REFLECT ENVIRONMENTAL LIGHTING
BACKGROUND
[0001] A mobile computing device can include a camera. In some examples, the camera can include a lens and an optical sensor to capture images of an environment the camera is in.
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] The disclosure herein is illustrated by way of example, and not by way of limitation, in the figures of the accompanying drawings in which like reference numerals refer to similar elements, and in which:
[0003] FIG. 1A illustrates an example accessory device to reflect environmental lighting to a camera;
[0004] FIG. 1B illustrates an example computing system using accessory device 100 of FIG. 1A;
[0005] FIG. 2 illustrates an example accessory device with multiple wide-angle lenses and multiple reflective surfaces;
[0006] FIG. 3 illustrates an example accessory device for a computing device with multiple cameras;
[0007] FIG. 4 illustrates another example accessory device to reflect environmental lighting to a camera;
[0008] FIG. 5 illustrates an example method for generating a lighting effect on an object to be rendered on a display; and
[0009] FIG. 6 illustrates a computer system upon which aspects described herein may be implemented.
[0010] Throughout the drawings, identical reference numbers designate similar, but not necessarily identical elements. The figures are not necessarily to scale, and the size of some parts may be exaggerated to more clearly illustrate the example shown. Moreover, the drawings provide examples and/or implementations consistent with the description. However, the description is not limited to the examples and/or implementations provided in the drawings.
DETAILED DESCRIPTION
[0011] Examples provide for an accessory device to reflect
environmental lighting to a camera of a computing device. The camera can capture environmental lighting of an actual environment around the camera. The accessory device can include an accessory housing that can be positioned over the camera of the computing device. Additionally, the accessory device can include an optical structure. The optical structure can be in optical alignment with a lens of the camera. In some examples, the optical structure can reflect the environmental lighting received from a region outside of an original field-of-view of the lens of the camera to the lens of the camera. Examples provide for environmental lighting to be captured in non-cognizable form, meaning lighting information determined from the non-cognizable environmental lighting captured by a sensor of the camera is insufficient to render a recognizable image of an object or a person that may be present in the actual environment.
[0012] Among other benefits, an example accessory device can be used to enhance lighting effects of objects (e.g., three-dimensional objects) depicted on the display of a computing device. In particular, a computing device can be used to create a realistic depiction of an object by using lighting effects that reflect the environment within which the computing device is located. As further described by examples, the lighting effects can enhance a three-dimensional representation of a depicted object, by making the depicted object appear more realistic than what would otherwise be possible. For example, if there were a window to the left of the computing device then the three-dimensional representation of an object rendered on a display of the computing device may include a brighter illumination from that direction. Many applications can utilize an accessory device as described with examples. In the context of three-dimensional printing, for example, a user may want to preview, on a display of the user's computing device, an object that is to be created. Examples as described enable the user to view a realistic depiction of the object, including lighting effects of the immediate
environment, prior to the user printing the object.
[0013] As described by various examples, a processor of a computing device can determine lighting information based on at least a portion of environmental lighting captured by a camera of the computing device. Additionally, the processor can utilize the lighting information to determine and generate a lighting effect on an object (e.g., three-dimensional object) or modify the object that is to be rendered on a display. That way, the object can be displayed in a way that reflects the lighting conditions similar to that of an actual environment surrounding the camera. Among other benefits, an accessory device can passively enable an existing camera of a computing device to capture environmental lighting from a region outside of an original field-of-view of a lens of the camera, including from above and/or behind the camera. In some examples, the image captured by the camera using the accessory may be non-cognizable to protect the privacy of a person visible to the camera. Additionally, the accessory device may enable the existing camera to capture environmental lighting of an environment with a wider dynamic range than the camera would otherwise be capable of.
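The determination step above is not spelled out in the patent; the following Python sketch shows one way a processor might summarize a captured frame into the kind of lighting information described later (a number of light sources, a location for each, its color, and its coverage). The function name estimate_lighting, the brightness threshold, and the array format are illustrative assumptions.

```python
import numpy as np

def estimate_lighting(img: np.ndarray, threshold: float = 0.8) -> list:
    """Summarize bright regions of a captured frame as light sources.

    img: HxWx3 float array in [0, 1], e.g. a blurred, low-resolution
    capture of environmental lighting in non-cognizable form.
    """
    luma = img @ np.array([0.2126, 0.7152, 0.0722])  # per-pixel luminance
    bright = luma >= threshold * luma.max()          # candidate source pixels

    # Group bright pixels into connected regions with an iterative flood fill.
    h, w = luma.shape
    labels = np.zeros((h, w), dtype=int)
    region = 0
    for seed in zip(*np.nonzero(bright)):
        if labels[seed]:
            continue
        region += 1
        stack = [seed]
        while stack:
            y, x = stack.pop()
            if 0 <= y < h and 0 <= x < w and bright[y, x] and not labels[y, x]:
                labels[y, x] = region
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]

    sources = []
    for i in range(1, region + 1):
        ys, xs = np.nonzero(labels == i)
        sources.append({
            "position": (ys.mean(), xs.mean()),  # where the source appears
            "color": img[ys, xs].mean(axis=0),   # average color of its light
            "coverage": ys.size / luma.size,     # fraction of the view it fills
        })
    return sources
```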
[0014] Examples described herein provide that methods, techniques, and actions performed by a computing device are performed
programmatically, or as a computer-implemented method. Programmatically, as used, means through the use of code or computer-executable instructions. These instructions can be stored in a memory resource of the computing device. A programmatically performed step may or may not be automatic.
[0015] Additionally, examples described herein can be implemented using programmatic modules, engines, or components. A programmatic module, engine, or component can include a program, a sub-routine, a portion of a program, or a software component or a hardware component capable of performing stated tasks or functions. As used herein, a module or component can exist on a hardware component independently of other modules or components. Alternatively, a module or component can be a shared element or process of other modules, programs, or machines.
[0016] Moreover, examples described herein can utilize specialized computing devices, including processing and memory resources. For example, examples described may be implemented, in whole or in part, on computing devices such as servers, desktop computers, cellular or
smartphones, personal digital assistants (e.g., PDAs), laptop computers, printers, digital picture frames, network equipment (e.g., routers), wearable computing devices, and tablet devices. Memory, processing, and network resources may all be used in connection with the establishment, use, or performance of any example described herein (including with the
performance of any method or with the implementation of any system). For instance, a computing device coupled to a data storage device storing the computer program, and configured to execute the program, corresponds to a special-purpose computing device. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
[0017] Furthermore, examples described herein may be implemented through the use of instructions that are executable by a processor. These instructions may be carried on a computer-readable medium. Machines shown or described with figures below provide examples of processing resources and computer-readable mediums on which instructions for implementing examples described can be carried and/or executed. In particular, the numerous machines shown with examples described include processor(s) and various forms of memory for holding data and instructions. Examples of computer-readable mediums include permanent memory storage devices, such as hard drives on personal computers or servers. Other examples of computer storage mediums include portable storage units, such as CD or DVD units, flash memory (such as carried on smartphones, multifunctional devices or tablets), and magnetic memory. Computers, terminals, network enabled devices (e.g., mobile devices, such as cell phones) are all examples of machines and devices that utilize processors, memory, and instructions stored on computer-readable mediums.
Additionally, examples may be implemented in the form of computer programs, or a computer usable carrier medium capable of carrying such a program.
[0018] Alternatively, examples described herein may be implemented through the use of dedicated hardware logic circuits that are comprised of an interconnection of logic gates. Such circuits are typically designed using a hardware description language (HDL), such as Verilog and VHDL. These languages contain instructions that ultimately define the layout of the circuit. However, once the circuit is fabricated, there are no instructions. All the processing is performed by interconnected gates.
SYSTEM DESCRIPTION
[0019] FIG. 1A illustrates an example accessory to reflect
environmental lighting to a camera. As illustrated in FIG. 1A, accessory device 100 can reflect environmental lighting to camera 194 of computing device 190. In some examples, camera 194 may include a lens that is flush or substantially flush on a facade of computing device 190.
[0020] Accessory device 100 can include accessory housing 102.
Accessory housing 102 can be dimensioned to fit a portion of computing device 190 that includes camera 194. That way, accessory housing 102 can be positioned over the portion of computing device 190 that includes camera 194.
[0021] Additionally, accessory housing 102 can include an optical structure that is in optical alignment with a lens of camera 194. In some examples, the optical structure can reflect environmental lighting received from a region outside of original field-of-view 196 of the lens of camera 194, to the lens of camera 194. In such examples, the environmental lighting can include lighting information of an environment camera 194 is in. Additionally, the environmental lighting can be in non-cognizable form, meaning lighting information determined from non-cognizable environmental lighting can be insufficient to render a high-resolution image of an object that may be present in the environment camera 194 is in.
[0022] In some examples, an optical structure can include wide-angle lens 104 and reflective surface 106 (e.g., a mirror) to receive and reflect environmental lighting from a region outside of original field-of-view 196 of the lens of camera 194 (e.g., environmental lighting from above and/or behind camera 194), to the lens of camera 194. Wide-angle lens 104 can extend or widen original field-of-view 196 of camera 194. Additionally, reflective surface 106 can receive environmental lighting, or at least a portion of the environmental lighting, from wide-angle lens 104 and reflect the environmental lighting, to a lens of camera 194. That way, the optical structure can enable camera 194 to receive environmental lighting of an environment surrounding computing device 190 over wide-angle field-of-view 110.
[0023] FIG. 1B illustrates an example computing system using the accessory device 100 of FIG. 1A. The example computing system can include computing device 190. Computing device 190 can include a housing that includes display 192 and camera 194. Additionally, accessory device 100 can collect environmental lighting from a region outside of an original field-of-view of a lens of camera 194 (e.g., wide-angle field-of-view 110) for camera 194 to capture. For example, accessory device 100 can include wide-angle lens 104 to capture environmental lighting over wide-angle field-of-view 110. Additionally, accessory device 100 can include a reflective surface (e.g., reflective surface 106 - not shown) within accessory device 100, to reflect the environmental lighting collected by wide-angle lens 104 to the lens of camera 194.
[0024] FIG. 2 illustrates an example accessory device with multiple wide-angle lenses and multiple reflective surfaces. As illustrated in FIG. 2, accessory device 200 can include multiple wide-angle lenses (e.g., wide- angle lens 204 and wide-angle lens 206) and multiple reflective surfaces (e.g., reflective surface 208 and reflective surface 210) to enable camera 194 to capture environmental lighting from a region outside of original field-of- view 196 of a lens of camera 194. In some examples, the environmental lighting can be in non-cognizable form.
[0025] Accessory device 200 can include accessory housing 202.
Accessory housing 202 can be dimensioned to fit a portion of computing device 190 that includes camera 194. That way, accessory housing 202 can be positioned over the portion of computing device 190 that includes camera 194. In some examples, accessory housing 202 can extend from a back facade of computing device 190 to a front facade of computing device 190 to create an enclosure in front of camera 194. Additionally, accessory housing 202 can be flush against or near to the back facade of computing device 190.
[0026] Additionally, accessory housing 202 can include an optical structure that is in optical alignment with a lens of camera 194. In some examples, the optical structure can include multiple wide-angle lenses, such as wide-angle lens 204 and wide-angle lens 206, and multiple reflective surfaces (e.g., mirrors), such as reflective surface 208 and reflective surface 210. Additionally, the multiple wide-angle lenses (e.g., wide-angle lens 204 and wide-angle lens 206) can be embedded within accessory housing 202 and the multiple reflective surfaces (e.g., reflective surface 208 and reflective surface 210) can be positioned within the enclosure.
[0027] For instance, as illustrated in FIG. 2, wide-angle lens 204 and wide-angle lens 206 can be embedded into or supported by accessory housing 202 such that wide-angle lens 204 and wide-angle lens 206 are positioned above camera 194 along a vertical or Y axis such that wide-angle lens 204 and wide-angle lens 206 can capture or direct environmental lighting into the enclosure. Within the enclosure, reflective surface 208 can be positioned in front of camera 194 along a horizontal or X axis such that environmental lighting or a portion of the environmental lighting captured by wide-angle lens 204 can be reflected to a lens of camera 194. Additionally, reflective surface 210 can be positioned in front of camera 194 along the horizontal or X axis such that environmental lighting or a portion of environmental lighting captured by wide-angle lens 206 can be reflected to camera 194. In some examples, positioning of the multiple reflective surfaces in the enclosure can differ in horizontal and/or vertical positioning so as not to obstruct a pathway of the environmental lighting being reflected from the multiple wide-angle lenses to camera 194. That way, wide-angle lens 204 and wide-angle lens 206 can collect environmental lighting of an environment that is outside original field-of-view 196 of the lens of camera 194 (e.g., the combined field-of-view 212 and field-of-view 214). Additionally, reflective surface 208 and reflective surface 210 can receive and reflect environmental lighting collected from wide-angle lens 204 and wide-angle lens 206 to the lens of camera 194.
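As an aside that is not drawn from the patent text: the redirection performed by each reflective surface follows the ordinary mirror-reflection rule r = d - 2(d·n)n. A small 2D sketch, assuming an ideal mirror tilted 45 degrees in front of a forward-facing lens (x forward, y up; all names illustrative).

```python
import numpy as np

def reflect(d: np.ndarray, n: np.ndarray) -> np.ndarray:
    """Reflect an incoming ray direction d off a mirror with normal n."""
    n = n / np.linalg.norm(n)
    return d - 2.0 * np.dot(d, n) * n

incoming = np.array([0.0, -1.0])  # light arriving straight down from above
normal = np.array([-1.0, 1.0])    # 45-degree mirror, facing up and back
print(reflect(incoming, normal))  # [-1.  0.]: redirected back into the lens
```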
[0028] In some examples, computing device 190 can include multiple cameras. FIG. 3 illustrates an example accessory device for a computing device with multiple cameras. For example, computing device 190 can include camera 194 on a front facade of computing device 190 and camera 198 on a back facade of computing device 190. In such examples, an optical structure of an accessory device can include at least one wide-angle lens and reflective surface for each camera that computing device 190 has.
[0029] Accessory device 300 can include accessory housing 302.
Accessory housing 302 can be dimensioned to fit a portion of computing device 190 that includes camera 194 and camera 198. That way, accessory housing 302 can be positioned over the portion of computing device 190 that includes camera 194 and camera 198. In some examples, accessory housing 302 can extend from a back facade of computing device 190 to a front facade of computing device 190 to create an enclosure around camera 194 and camera 198. In such examples, the created enclosure can be around a top portion of computing device 190. In other examples, a top portion of accessory housing 302 can be flush against or next to the top portion of computing device 190. That way two enclosures are created, one positioned in front of camera 194 and the other positioned in front of camera 198.
[0030] Additionally, accessory housing 302 can include an optical structure within the enclosure(s) around camera 194 and camera 198. In some examples, the optical structure is in optical alignment with a lens of camera 194 and a lens of camera 198. In such examples, the optical structure can receive and reflect environmental lighting received from a region outside of original field-of-view 196 of a lens of camera 194 and original field-of-view 197 of a lens of camera 198, to the lens of camera 194 and the lens of camera 198. In some examples, the optical structure of accessory device 300 can include at least a wide-angle lens and reflective surface (e.g., a mirror) for each camera (e.g., camera 194 and camera 198) of computing device 190. For example, at least one wide-angle lens can be embedded within accessory housing 302 for each of the multiple cameras of computing device 190. For instance, as illustrated in FIG. 3, wide-angle lens 308 can be embedded into or supported by accessory housing 302 such that wide-angle lens 308 can be positioned in front of camera 198 along a horizontal or X axis and above camera 198 along a vertical or Y axis.
Additionally, wide-angle lens 310 can be embedded into or supported by accessory housing 302 such that wide-angle lens 310 can be positioned in front of camera 194 along a horizontal or X axis and above camera 194 along a vertical or Y axis. That way, wide-angle lens 308 and wide-angle lens 310 can collect or direct environmental lighting that is outside original field-of-view 196 (e.g., the combined field-of-view 312 and field-of-view 314) of the lens of camera 194 and the lens of camera 198 into the enclosure(s).
[0031] Additionally, accessory housing 302 can include at least one reflective surface for each of the multiple cameras of computing device 190. For example, following the example above, reflective surface 304 can be positioned in front of camera 198 along a horizontal or X axis such that environmental lighting or a portion of environmental lighting collected by wide-angle lens 308 can be reflected to the lens of camera 198. Additionally, reflective surface 306 can be positioned in front of camera 194 along the horizontal or X axis such that environmental lighting or a portion of environmental lighting collected by wide-angle lens 310 can be reflected to the lens of camera 194.
[0032] In some examples, an optical structure of an accessory device can include optical fiber light pipes to collect environmental lighting from multiple directions. FIG. 4 illustrates another example accessory device to reflect environmental lighting to a camera. As illustrated in FIG. 4, accessory device 400 can redirect environmental lighting from a region outside original field-of-view 196 of a lens of camera 194 to the lens of camera 194. The environmental lighting can be in non-cognizable form.
[0033] Accessory device 400 can include accessory housing 402.
Accessory housing 402 can be dimensioned to fit a portion of computing device 190 that includes camera 194. That way, accessory housing 402 can be positioned over the portion of computing device 190 that includes camera 194. In some examples, as illustrated in FIG. 4, accessory housing 402 can extend from the back facade of computing device 190 to a front facade of computing device 190 to create an enclosure in front of camera 194. In such examples, accessory housing 402 can be flush against the back facade of computing device 190.
[0034] Additionally, accessory housing 402 can include an optical structure. The optical structure can be in optical alignment with the lens of camera 194. Additionally, the optical structure can reflect environmental lighting received from a region outside of original field-of-view 196 of the lens of camera 194, to the lens of camera 194. In some examples, the optical structure can include multiple optical fiber light pipes (as illustrated in FIG. 4 as optical fiber light pipe 404, optical fiber light pipe 406, optical fiber light pipe 408, optical fiber light pipe 410, and optical fiber light pipe 412) to receive and redirect environmental lighting to camera 194. The multiple optical fiber light pipes of the optical structure of accessory device 400 can enable camera 194 to capture environmental lighting from a region outside of original field-of-view 196 of the lens of camera 194. For example, the combined fields-of-view of the multiple optical fiber light pipes can widen original field-of-view 196 of camera 194 to widened field-of-view 414. Additionally, the multiple optical fiber light pipes can be positioned in front of camera 194 along a horizontal or X axis and extend through an opening on a top portion of accessory housing 402. That way, accessory housing 402 can support and hold the multiple optical fiber light pipes.
[0035] In some examples, camera 194 can capture environmental lighting in non-cognizable form. For example, an image captured by camera 194 of an environment around computing device 190 can be of lower resolution and can preserve the privacy of a person that may be visible to the camera. Additionally, an image captured by camera 194 that is lower in resolution can utilize a lesser amount of resources (e.g., hardware components, such as integrated circuits or specialized integrated circuits, and/or software or logic stored on the hardware components, such as software stored on a non-transitory computer-readable medium) than a high-resolution image.
[0036] In some examples, a processor of computing device 190 can instruct camera 194 to capture a blurred image and then combine information from nearby pixels to create an image of lower resolution but higher dynamic range. That way, such an image can be in non-cognizable form, but with a high-dynamic range of luminosity.
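By way of illustration only, and not as a feature of any claimed example, the pixel combination described above can be approximated as block averaging. The following Python sketch assumes the numpy library; the function name, the 4x4 block size, and the random stand-in capture are illustrative assumptions.

import numpy as np

def bin_pixels(image, block=4):
    # Average each block x block neighborhood of a grayscale capture.
    # Averaging block**2 pixels lowers resolution and noise, extending
    # the usable dynamic range of the combined measurement.
    h, w = image.shape
    h_crop, w_crop = h - h % block, w - w % block
    trimmed = image[:h_crop, :w_crop].astype(np.float64)
    # Expose each block as its own pair of axes, then average over them.
    blocks = trimmed.reshape(h_crop // block, block, w_crop // block, block)
    return blocks.mean(axis=(1, 3))

# Example: a blurred 480x640 capture becomes a 120x160 luminance map.
capture = np.random.rand(480, 640)
low_res = bin_pixels(capture, block=4)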
[0037] In other examples, an optical structure of an accessory device (e.g., accessory device 100, accessory device 200, accessory device 300 or accessory device 400) can be visibly imperfect, so that a person examining the accessory can verify that an image captured by camera 194 will be non-cognizable.
[0038] In yet other examples, camera 194 can be coupled to a mechanical device (e.g., a switch, a toggle or a component, such as a ring, to alter the focus of the camera) that can cause camera 194 to go out of focus. That way, camera 194 can capture blurred images (e.g., the environmental lighting in non-cognizable form).
[0039] In some examples, an optical filter can be included with an optical structure of an accessory device (e.g., accessory device 100, accessory device 200, accessory device 300 or accessory device 400) to enable the camera to capture environmental lighting with a higher dynamic range. The optical filter can filter light captured by wide-angle lens 104. In some examples, the optical filter is a neutral density filter. In such examples, camera 194 can capture multiple images. For example, at least one image can be captured without use of the neutral density filter and at least one image can be captured with use of the neutral density filter. That way, the captured images contain two similar views of the surrounding environment, but with different levels of light sensitivity. A processor (e.g., a processor of computing device 190) can then combine the images to generate an image of the environment that computing device 190 is in that has a high-dynamic range of luminosity.
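By way of illustration only, one plausible way a processor could merge the filtered and unfiltered captures is to substitute rescaled filtered values wherever the unfiltered capture saturates. The Python sketch below assumes numpy, a known attenuation factor for the neutral density filter, and hypothetical names; none of these are features of the disclosure.

import numpy as np

def fuse_nd_pair(unfiltered, nd_filtered, nd_factor=8.0, clip=0.98):
    # Where the unfiltered capture is saturated, use the ND-filtered
    # value scaled back up by the filter's known attenuation factor.
    unfiltered = unfiltered.astype(np.float64)
    recovered = nd_filtered.astype(np.float64) * nd_factor
    return np.where(unfiltered >= clip, recovered, unfiltered)

# Example with a synthetic scene whose true luminance exceeds the
# sensor's range, so the unfiltered capture clips its highlights.
radiance = np.random.rand(120, 160) * 4.0
plain = np.clip(radiance, 0.0, 1.0)
filtered = np.clip(radiance / 8.0, 0.0, 1.0)
hdr = fuse_nd_pair(plain, filtered)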
[0040] In other examples, the optical filter can be a stripe optical filter where the stripes act as neutral density filters. In such examples, the stripe optical filter can cause a portion of a field-of-view of camera 194 to be dedicated to capturing low light intensity, while other portions of the field-of-view of camera 194 can be dedicated to capturing high light intensity. That way, camera 194 can capture an image that has portions that are filtered and other portions that are not filtered. Additionally, a processor (e.g., a processor of computing device 190) can process the image to generate another image of the environment that computing device 190 is in that has a lower pixel resolution, but a high-dynamic range of luminosity.
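By way of illustration only, if alternating horizontal stripes are attenuated, a processor could collapse each pair of adjacent stripes into one output band, giving the lower-resolution, higher-dynamic-range result described above. The Python sketch assumes numpy, even stripes unfiltered, and odd stripes attenuated by a known factor; the layout and names are illustrative assumptions.

import numpy as np

def fuse_striped(image, stripe=8, nd_factor=8.0, clip=0.98):
    # Each pair of adjacent stripes yields one output row band:
    # unfiltered values where unsaturated, rescaled filtered values
    # where the unfiltered stripe clipped.
    h, w = image.shape
    out_rows = []
    for y in range(0, h - 2 * stripe + 1, 2 * stripe):
        plain = image[y:y + stripe].mean(axis=0)
        filt = image[y + stripe:y + 2 * stripe].mean(axis=0) * nd_factor
        out_rows.append(np.where(plain >= clip, filt, plain))
    return np.stack(out_rows)  # fewer rows, wider luminance range

striped_capture = np.random.rand(240, 320)
fused = fuse_striped(striped_capture)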
[0041] In some examples, an accessory device (e.g., accessory device 100, accessory device 200, accessory device 300 or accessory device 400) can include a mechanical switch that, when toggled, can place an optical filter (e.g., a neutral density filter or a stripe optical filter) between a wide-angle lens and camera 194. In some examples, the optical filter can be positioned between a wide-angle lens (e.g., wide-angle lens 104, wide-angle lens 204, wide-angle lens 206, wide-angle lens 308 and/or wide-angle lens 310) and a reflective surface (e.g., reflective surface 106, reflective surface 208, reflective surface 210, reflective surface 304 and/or reflective surface 306).
In other examples, the optical filter can be positioned between the reflective surface (e.g., reflective surface 106, reflective surface 208, reflective surface 210, reflective surface 304 and/or reflective surface 306) and camera 194. In yet other examples, the optical filter can be positioned between multiple optical fiber light pipes (e.g., optical fiber light pipe 404, optical fiber light pipe 406, optical fiber light pipe 408, optical fiber light pipe 410, and optical fiber light pipe 412) and camera 194.
[0042] In some examples, a diffusing effect may be applied to a lens of camera 194. For example, a film with a roughened front surface, or that is semi-transparent, may be placed in front of the lens of camera 194. The film can cause a diffusing effect that causes camera 194 to capture non-cognizable environmental lighting, while preserving the privacy of a user. For example, the film can cause camera 194 to capture a blurry image that includes a user. The blurry image can include lighting effects of the environment without capturing a clear image of the area around computing device 190, including the user.
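By way of illustration only, the effect of such a diffusing film can be simulated in software as a strong blur: fine detail such as a face becomes unrecoverable while the overall distribution of light survives. The Python sketch assumes numpy and scipy; the blur radius is an illustrative assumption, since a real film's diffusion is fixed by its material.

import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_diffusing_film(scene, sigma=12.0):
    # Approximate a roughened, semi-transparent film as a wide
    # Gaussian blur over the captured scene.
    return gaussian_filter(scene.astype(np.float64), sigma=sigma)

scene = np.random.rand(240, 320)       # stand-in for a raw capture
non_cognizable = simulate_diffusing_film(scene)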
[0043] In some examples, computing device 190 can include a processor to determine a lighting effect on an object that is to be rendered on display 192, based on lighting information of environmental lighting captured by camera 194. The lighting effect on the object can reflect a lighting source of the environment where the environmental lighting is collected by camera 194. Additionally, the processor can generate the lighting effect on the object to be rendered on display 192 in a manner that reflects the lighting conditions of the environment camera 194 is in. That way, the generated lighting effect can enhance a perception (e.g., a three-dimensional perception) of the object rendered on display 192. In some examples, the processor can determine lighting information from the environmental lighting captured by camera 194. Examples of lighting information the processor can determine from the environmental lighting captured by camera 194 include a number of light sources in the environment, a location of each light source in the environment, and a coverage of ambient light from each light source.
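By way of illustration only, the listed lighting information could be extracted from a low-resolution capture by thresholding bright pixels and grouping them into connected regions. The Python sketch assumes numpy and scipy; the threshold and the dictionary layout are illustrative assumptions.

import numpy as np
from scipy import ndimage

def find_light_sources(env_image, threshold=0.8):
    # Group bright pixels into connected regions; report each region's
    # centroid (location), mean intensity, and fraction of the frame
    # it covers (coverage of ambient light).
    bright = env_image > threshold
    labels, count = ndimage.label(bright)
    sources = []
    for idx in range(1, count + 1):
        mask = labels == idx
        sources.append({
            "centroid": ndimage.center_of_mass(mask),
            "intensity": float(env_image[mask].mean()),
            "coverage": float(mask.mean()),
        })
    return sources

env = np.zeros((120, 160))
env[10:14, 40:46] = 1.0                 # a bright window region
print(find_light_sources(env))          # one source, with its location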
[0044] In some examples, if computing device 190 is in a room with a window, a processor of computing device 190 can determine lighting information of the environment camera 194 is in from environmental lighting captured by camera 194. The processor can utilize the lighting information to determine and generate a lighting effect on the object to be rendered on display 192. The processor can generate the lighting effect similar to how ambient light affects the lighting conditions in the room. For example, based on the lighting information, the processor can determine which portions of the object are to be covered by light. Additionally, the processor can determine how intense various portions of the light are to be on the object, based on the lighting information. Additionally, the processor can determine the color of various portions of the light on the object based on the lighting information. Moreover, the processor can determine areas on the object where shadowing occurs, based on the lighting information. That way, the processor can generate a lighting effect on the object to be rendered on display 192 in a manner that reflects how the ambient light from the window affects the lighting conditions in the room.
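By way of illustration only, the coverage, intensity, color, and shadowing decisions described above correspond to a simple diffuse (Lambertian) shading computation. The Python sketch below renders a sphere as a stand-in object under one estimated light; numpy, the light parameters, and the geometry are illustrative assumptions.

import numpy as np

def shade_sphere(light_dir, light_color, intensity, ambient=0.15, size=128):
    # Unit surface normals for a sphere rendered into a size x size image.
    ys, xs = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
    zs = np.sqrt(np.clip(1.0 - xs**2 - ys**2, 0.0, None))
    normals = np.dstack([xs, ys, zs])
    light = np.asarray(light_dir, dtype=np.float64)
    light /= np.linalg.norm(light)
    # The dot product decides which portions are lit and how strongly;
    # facets pointing away from the light keep only the ambient term.
    lambert = np.clip(normals @ light, 0.0, None)
    shading = ambient + intensity * lambert
    # Tint the result with the estimated light color per channel.
    return shading[..., None] * np.asarray(light_color, dtype=np.float64)

image = shade_sphere([0.5, -0.5, 0.7], [1.0, 0.95, 0.8], intensity=0.9)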
[0045] In some examples, an object can be rendered and displayed with a generated environment that combines a pre-existing environment with lighting conditions of the environment camera 194 is in. For example, the processor can select an environmental map (e.g., a pre-generated environmental map of a jungle) to generate the object with. In some examples, the environmental map can be stored on computing device 190. In other examples, the environmental map can be obtained from a third-party device. Additionally, the processor of computing device 190 can alter the environmental map, based on lighting information of environmental lighting captured by camera 194. For example, the processor can brighten or darken some areas in the selected environmental map of a jungle. Additionally, the processor can determine and generate a lighting effect on the object in the selected environmental map of a jungle, based on lighting conditions of an office room that camera 194 is in.
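By way of illustration only, a stored environmental map could be brightened or darkened toward the measured level of the room with a single gain factor. The Python sketch assumes numpy; the reference level and the random stand-in map are illustrative assumptions.

import numpy as np

def relight_env_map(env_map, captured_brightness, reference=0.5):
    # Scale the stored map so its overall level matches the mean
    # luminance measured from the camera's environmental capture.
    gain = captured_brightness / reference
    return np.clip(env_map.astype(np.float64) * gain, 0.0, 1.0)

jungle_map = np.random.rand(256, 512, 3)   # stand-in for a stored map
room_level = 0.35                          # from the camera capture
adjusted = relight_env_map(jungle_map, room_level)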
[0046] Additionally, a lighting effect determined and generated by a processor of computing device 190 can be based on parameters of the object. In some examples, the object to be displayed can be a preview of an output of a three-dimensional printer system (e.g., a three-dimensional object). In such an example, a user may input parameters indicating that an output of the three-dimensional printer system is to have a reflective surface. The processor can utilize such parameters and determine that the object to be generated in the preview is to have a reflective surface. Additionally, the processor can generate the lighting effect on the object such that the surface of the object can reflect light. In other examples, the user may input parameters indicating that the output of the three-dimensional printer system is to have a more matte-like surface. In such examples, the processor can utilize such parameters and determine that the object to be generated in the preview is to have a matte-like surface. Additionally, the processor can generate the lighting effect on the object such that the generated surface of the object does not reflect light as much as an object with a reflective surface.
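By way of illustration only, the reflective-versus-matte parameter maps naturally onto the strength of a specular term in a classic Phong-style shading model. The Python sketch assumes numpy and sphere geometry as the preview object; the shininess and strength values are illustrative assumptions.

import numpy as np

def shade_with_material(normals, light_dir, view_dir, shininess, spec_strength):
    # Diffuse term plus a Phong specular term; spec_strength near 1.0
    # models a reflective print, near 0.0 a matte print.
    light = np.asarray(light_dir, dtype=np.float64)
    light /= np.linalg.norm(light)
    view = np.asarray(view_dir, dtype=np.float64)
    view /= np.linalg.norm(view)
    diffuse = np.clip(normals @ light, 0.0, None)
    # Reflect the light direction about each normal: r = 2(n.l)n - l.
    n_dot_l = (normals @ light)[..., None]
    reflect = 2.0 * n_dot_l * normals - light
    specular = np.clip(reflect @ view, 0.0, None) ** shininess
    return diffuse + spec_strength * specular

ys, xs = np.mgrid[-1:1:128j, -1:1:128j]
zs = np.sqrt(np.clip(1.0 - xs**2 - ys**2, 0.0, None))
normals = np.dstack([xs, ys, zs])
shiny = shade_with_material(normals, [0.3, -0.4, 0.85], [0, 0, 1], 32, 0.9)
matte = shade_with_material(normals, [0.3, -0.4, 0.85], [0, 0, 1], 4, 0.05)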
METHODOLOGY
[0047] FIG. 5 illustrates an example method for generating a lighting effect on an object to be rendered on a display. In the below discussion of FIG. 5, reference may be made to reference characters representing features as shown and described with respect to FIG. 1A and FIGS. 2-4, for the purpose of illustrating suitable components for performing an example method being described.
[0048] In some examples, a processor of computing device 190 can capture environmental lighting of a given environment using camera 194 of computing device 190 (500). In such examples, computing device 190 can include an accessory device (e.g., accessory device 100, accessory device 200, accessory device 300 or accessory device 400) to receive and reflect environmental lighting from a region outside an original field-of-view of a lens of camera 194. Additionally, the environmental lighting can be in non-cognizable form. In some examples, the accessory device can include an optical structure that includes an optical filter. Examples of an optical filter that the optical structure can include are a neutral density filter and a stripe filter.
[0049] In some examples, computing device 190 can include multiple cameras (e.g., camera 194 and camera 198). In such examples, as illustrated in FIG. 3, an optical structure of the accessory device (e.g., accessory device 300) can include at least one wide-angle lens and one reflective surface for each camera of computing device 190 (e.g., wide-angle lens 308 and reflective surface 304 for camera 198, and wide-angle lens 310 and reflective surface 306 for camera 194).
[0050] A processor of computing device 190 can determine a lighting effect using lighting information of the captured environmental lighting (502). In some examples, the lighting effect can be based on the lighting information of environmental lighting captured by camera 194. Additionally, the processor can render the object on display 192 of computing device 190 and the rendering of the object can include the lighting effect (504). For example, the processor can generate the lighting effect on the object to be rendered on display 192 in a manner that reflects the lighting conditions of the environment camera 194 is in. That way, the generated lighting effect can enhance a perception (e.g., a three-dimensional perception) of the object rendered on display 192.
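By way of illustration only, the three steps of FIG. 5 can be sketched as a short pipeline: capture (500), determine (502), render (504). The Python sketch assumes numpy; the camera stand-in, the lighting summary, and the gain formula are illustrative assumptions, not the claimed method.

import numpy as np

def capture_environment(camera):
    # Step (500): read one non-cognizable frame from the camera.
    return camera()

def determine_lighting(frame, threshold=0.8):
    # Step (502): reduce the frame to simple lighting information.
    return {"mean_level": float(frame.mean()),
            "bright_fraction": float((frame > threshold).mean())}

def render_with_effect(obj_albedo, lighting):
    # Step (504): apply the derived lighting effect while rendering.
    gain = 0.3 + 0.7 * lighting["mean_level"]
    return np.clip(obj_albedo * gain, 0.0, 1.0)

frame = capture_environment(lambda: np.random.rand(120, 160))
lighting = determine_lighting(frame)
rendered = render_with_effect(np.random.rand(64, 64, 3), lighting)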
[0051] In some examples, an object can be generated and displayed with a generated environment that emulates lighting conditions of the environment camera 194 is in. For example, a processor of computing device 190 can generate the object with a selected environmental map of a jungle. In such examples, the selected environmental map can be obtained from a memory resource of computing device 190 that stores a set of environmental maps. In other examples, the environmental map can be obtained from a third-party device. Additionally, the processor can generate a lighting effect on the object in the selected environmental map of a jungle, based on lighting conditions of an office room that camera 194 is in.
[0052] In other examples, a generated lighting effect can be based on parameters of an object. For example, a user may input parameters indicating characteristics of a surface of the object (e.g., reflective, matte, stippled, etc.). Additionally, the processor can generate the lighting effect on the object in a way that takes into account the input parameters that indicate the characteristics of the surface of the object to be rendered on display 192. For example, if the user inputs parameters that indicate the surface of the object is to be reflective, then the processor can generate a lighting effect in which light appears to reflect off of the reflective surface of the object.
HARDWARE DIAGRAM
[0053] FIG. 6 is a block diagram that illustrates a computer system upon which examples described herein may be implemented. In some examples, a computing device 600 may correspond to a mobile computing device, such as a cellular device that is capable of telephony, messaging, and data services. The computing device 600 can correspond to a device operated by a user. Examples of such devices include smartphones, handsets, tablet devices, or in-vehicle computing devices that communicate with cellular carriers. The computing device 600 includes a processor 610, memory resources 620, a display 630 (e.g., such as a touch-sensitive display device), communication sub-systems 640 (including wireless communication systems), a sensor set 650 (e.g., accelerometer and/or gyroscope, microphone, barometer, etc.), and location detection mechanisms (e.g., GPS component) 660. In one example, at least one of the communication sub-systems 640 sends and receives cellular data over data channels and voice channels. The communication sub-systems 640 can include a cellular transceiver and a short-range wireless transceiver.
[0054] In some examples, communication sub-systems 640 can send and receive cellular data over network(s) 670 (e.g., data channels and voice channels). Communication sub-systems 640 can include a cellular transceiver and a short-range wireless transceiver. In some examples, the processor 610 can exchange data with a third-party device or system (not illustrated in FIG. 6) via the communications sub-systems 640 to obtain environmental maps.
[0055] Memory resources 620 can store instructions for a variety of operations. For example, as illustrated in FIG. 6, memory resources 620 can include lighting effect instructions 622. Additionally, processor 610 can execute lighting effect instructions 622 to perform operations for implementing a method, such as the example method described with respect to FIG. 5. Processor 610 can execute lighting effect instructions 622 to utilize lighting information of environmental lighting captured by a camera (e.g., camera 194) to determine and generate a lighting effect on an object to be rendered on display 630. That way, the object with the generated lighting effect rendered on display 630 can enhance a perception (e.g., a three-dimensional perception) of the object.
[0056] Although specific examples have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that a variety of alternate and/or equivalent implementations may be substituted for the specific examples shown and described without departing from the scope of the disclosure. This application is intended to cover any adaptations or variations of the specific examples discussed herein.

Claims

WHAT IS CLAIMED IS:
1. An accessory device for a computing device, the accessory device comprising:
an accessory housing that is positionable over at least a first camera of the computing device; and
an optical structure, provided with the accessory housing in optical alignment with a lens of the first camera, to reflect environmental lighting received from a region outside of an original field-of-view of the lens of the first camera, in non-cognizable form, to the lens of the first camera.
2. The accessory device of claim 1, wherein the optical structure includes a wide-angle lens to receive the environmental lighting from the region outside of the original field-of-view of the lens of the first camera.
3. The accessory device of claim 2, wherein the optical structure further includes a reflective surface positioned to (i) receive environmental lighting received by the wide-angle lens, and (ii) reflect at least a portion of the environmental lighting to the lens of the first camera.
4. The accessory device of claim 1, wherein the optical structure includes multiple wide-angle lenses, each wide-angle lens of the multiple wide-angle lenses receiving the environmental lighting from the region outside of the original field-of-view of the lens of the first camera.
5. The accessory device of claim 4, wherein the optical structure includes multiple reflective surfaces, each reflective surface of the multiple reflective surfaces being positioned to (i) receive environmental lighting received by a wide-angle lens of the multiple wide-angle lenses, and (ii) reflect at least a portion of the environmental lighting to the lens of the first camera.
6. The accessory device of claim 1, wherein the computing device includes a second camera, and wherein the accessory housing is positionable over the first camera and the second camera and the optical structure is in optical alignment with the lens of the first camera and a lens of the second camera.
7. The accessory device of claim 6, wherein the optical structure includes:
a first wide-angle lens provided on the accessory housing to receive the environmental lighting from the region outside of the original field-of-view of the lens of the first camera; and
a second wide-angle lens provided on the accessory housing to receive the environmental lighting from the region outside of the original field-of-view of the lens of the second camera.
8. The accessory device of claim 7, wherein the optical structure further includes:
a first reflective surface positioned to (i) receive environmental lighting received by the first wide-angle lens, and (ii) reflect at least a first portion of the environmental lighting to the lens of the first camera; and
a second reflective surface positioned to (i) receive environmental lighting received by the second wide-angle lens, and (ii) reflect at least a second portion of the environmental lighting to the lens of the second camera.
9. The accessory device of claim 1, wherein the optical structure further includes:
an optical filter.
10. The accessory device of claim 9, wherein the optical filter is a neutral density filter.
11. A computing device comprising:
a housing including a camera and a display;
an accessory device including:
an accessory housing that is positionable over at least the camera of the computing device;
an optical structure, provided with the accessory housing in optical alignment with a lens of the camera, to reflect environmental lighting received from a region outside of an original field-of-view of the lens of the camera to the lens of the camera, the environmental lighting including lighting information; and
a processor to generate a lighting effect on an object to be rendered on the display, based on the lighting information.
12. The computing device of claim 11, wherein the optical structure includes a wide-angle lens to receive the environmental lighting from the region outside of the original field-of-view of the lens of the camera.
13. The computing device of claim 12, wherein the optical structure further includes a reflective surface positioned to (i) receive environmental lighting received by the wide-angle lens, and (ii) reflect at least a portion of the environmental lighting to the lens of the camera.
14. The computing device of claim 11, wherein the lighting effect on the object reflects a lighting source of an environment where the environmental lighting is collected from.
15. A method for operating a computing device, the method comprising:
capturing environmental lighting of an environment using a camera of the computing device, the environmental lighting being (i) captured from a region outside an original field-of-view of a lens of the camera, and (ii) in non-cognizable form;
determining a lighting effect using the environmental lighting; and
rendering an object on a display of the computing device, the rendering including the lighting effect.

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2018/058133 WO2020091739A1 (en) 2018-10-30 2018-10-30 Accessory devices to reflect environmental lighting


Publications (1)

Publication Number Publication Date
WO2020091739A1 (en)

Family

ID=70463861


Country Status (1)

Country Link
WO (1) WO2020091739A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024019709A1 (en) * 2022-07-20 2024-01-25 Hewlett-Packard Development Company, L.P. Light altering devices for light sensors

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100328420A1 (en) * 2009-06-29 2010-12-30 Roman Kendyl A Optical adapters for mobile devices with a camera
US20130177304A1 (en) * 2012-01-11 2013-07-11 Targus Group International, Inc. Portable electronic device case accessory with interchangeable camera lens system
US20140313377A1 (en) * 2011-11-09 2014-10-23 Mark Ross Hampton In relation to a lens system for a camera
US20150220766A1 * 2012-10-04 2015-08-06 The Code Corporation Barcode-reading enhancement system for a computing device that comprises a camera and an illumination system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18939069; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18939069; Country of ref document: EP; Kind code of ref document: A1)