WO2013140359A2 - Apparatus and system for imaging in video calls - Google Patents

Apparatus and system for imaging in video calls

Info

Publication number
WO2013140359A2
Authority
WO
WIPO (PCT)
Prior art keywords
camera module
camera
arm
accordance
image
Prior art date
Application number
PCT/IB2013/052240
Other languages
French (fr)
Other versions
WO2013140359A3 (en)
Inventor
Tadmor Shalon
Original Assignee
Svip Llc, Series 9
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Svip Llc, Series 9 filed Critical Svip Llc, Series 9
Publication of WO2013140359A2 publication Critical patent/WO2013140359A2/en
Publication of WO2013140359A3 publication Critical patent/WO2013140359A3/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/14Systems for two-way working
    • H04N7/141Systems for two-way working between two video terminals, e.g. videophone
    • H04N7/142Constructional details of the terminal equipment, e.g. arrangements of the camera and the display
    • H04N7/144Constructional details of the terminal equipment, e.g. arrangements of the camera and the display camera and display on the same optical axis, e.g. optically multiplexing the camera and display for eye to eye contact

Definitions

  • One or more embodiments relate to imaging in applications such as, for example and without limitation, video conferencing.
  • gaze awareness and eye contact are important because they serve, among other things, as signals for turn-taking in a conversation.
  • gaze awareness and eye contact express attributes such as attentiveness, confidence and cooperativeness. For example, people who make increased eye contact get more help from others, generate more learning as teachers, and have better success in job interviews.
  • One or more embodiments solve one or more of the above-identified problems.
  • one or more such embodiments provide one or more of the following during a video call and/or a video conference: (a) improved eye contact between a user and a participant; (b) improved gaze awareness between the user and the participant; (c) an improved user image with respect to lighting even where the user is in a poorly-lit ambient; (d) an improved user image wherein background in the user ambient is muted; (e) an improved user image wherein lens distortion is reduced; and (f) reduced or eliminated video stutter for the user image which results in improved Internet communication.
  • FIG. 1 shows orthogonal views of an embodiment that includes a camera module disposed at the tip of a transparent arm, an illuminator, a microphone and a magnetic coupler.
  • FIG. 2 shows a front view of another embodiment that includes a camera module disposed at the tip of an ultra-thin, displaceable arm, a USB cable and a magnetic coupler.
  • FIG. 3 shows orthogonal views of an embodiment that includes a camera module disposed at the tip of an ultra-thin camera module arm, an illuminator, a USB cable, and a carrying case capable of holding the camera module, the camera module arm, the illuminator, and the USB cable.
  • FIG. 4 shows an embodiment in use where the embodiment is mounted on a laptop computer for use in a video call or video conference.
  • FIG. 5 shows orthogonal views of an embodiment that includes a camera module disposed at the tip of an ultra-thin, displaceable camera module arm, and a data port coupler.
  • FIG. 6 shows orthogonal views of an embodiment that includes a camera module and an illuminator mounted on a fixture housing, a retractable, flexible camera module arm attached to the fixture housing, wherein the fixture housing is capable of storing the camera module arm.
  • FIG. 7 shows orthogonal views of an embodiment that includes a camera module disposed at the tip of an ultra-thin camera module arm, the camera module arm being built into a tablet device and being capable of rotating onto a display screen of the tablet, and an illuminator built into a display screen bezel of the tablet.
  • FIG. 8 shows two embodiments in use during a video call or video conference.
  • FIG. 9 shows an embodiment that includes a miniature camera module disposed on a retractable camera module arm that moves behind a mobile device wherein the retractable camera module arm is fabricated using nitinol and a magnet wire harness.
  • FIG. 10 shows an embodiment that includes a miniature camera module attached to an ultra-thin, retractable, camera module arm that moves onto a top bezel of a mobile device.
  • FIG. 11 shows a MIPI mobile device interface flex PCB ("printed circuit board") with shielding used to fabricate one or more embodiments: (a) disposed in a flat configuration (i.e., with a BGA sensor module at the bottom) before being folded and attached to nitinol wires; and (b) in a folded position where it is connected to a camera module.
  • FIG. 12 is a schematic of camera imager support electronics used to fabricate one or more embodiments, which electronics may be positioned near an imager in the camera module to reduce conductor count in a transparent or ultra-thin camera module arm connected to the camera module.
  • FIG. 13 illustrates the quantum efficiency of an imager spectral filter having an IR notch near 810nm overlaid with the imager's spectral sensitivity curves.
  • FIG. 14 is a graph that indicates gaze angles useful in maintaining eye contact.

Description

  • One or more embodiments are camera systems or systems that provide one or more of the following during a video call and/or a video conference: (a) improved eye contact between a user and a participant; (b) improved gaze awareness between the user and the participant; (c) an improved user image with respect to lighting even where the user is in a poorly-lit ambient; (d) an improved user image wherein background in the user ambient is muted; (e) an improved user image wherein lens distortion is reduced; and (f) reduced or eliminated video stutter for the user image.
  • an inventive camera system may be built into a device, for example and without limitation, into a device display screen bezel.
  • a camera module and a camera module arm of the camera system may be stored in a slot built into the device display screen bezel (i.e., an embedded storage slot).
  • a thumb activated turn knob is disposed flush with the bezel, and the thumb activated turn knob includes a rotation hinge/helical cam that causes the camera module to lift up and rotate 90 degrees from its position in the embedded storage slot and to swing onto the device display screen to a portrait mode for use in a video call.
  • a camera in the camera module may be used as a camera for the device.
  • a camera module arm of the camera system may be disposed, at least in part, in or on the back of the device.
  • an inventive camera system is an add-on to the device.
  • the camera module arm and the camera module may be stored, i.e., when not deployed for use during a video call or video conference, in a housing which is attachable to the device.
  • a thumb activated turn knob is disposed flush with the housing, and the thumb activated turn knob includes a rotation hinge/helical cam that causes the camera module to lift up and rotate 90 degrees from its position in the embedded storage slot and to swing onto the device display screen to a portrait mode for use in a video call or video conference.
  • One or more such embodiments are a camera system for use with a device, which camera system comprises: (a) a camera module; (b) a camera module arm that is mechanically and electrically connected to the camera module, which camera module arm: (i) may be electrically connected to camera system electronics, which camera system electronics is, in turn, electrically connectable to device system electronics, or (ii) may be electrically connectable to device system electronics; and (c) a housing that is mechanically connected to the camera module arm, may hold camera system electronics, and is mechanically connectable to the device.
  • the term device or the term communications device means a device and/or a device system that includes a display screen and is capable of receiving images from a camera or over a communications channel (for example, and without limitation, the Internet) and displaying the images on the display screen.
  • a device includes, without limitation, a personal computer ("PC"), a laptop computer, a tablet, a smartphone, and so forth.
  • the housing is attachable to the device, and the camera module is positionable with respect to a device display screen.
  • the camera module arm is capable of movement which enables the camera module to be positioned with respect to the device display screen.
  • the camera module may be deployed, i.e., positioned, with respect to an image of the participant on the device display screen to provide eye contact between the user and the participant.
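The benefit of positioning the camera module near the participant's on-screen eyes can be illustrated with a short gaze-angle calculation (FIG. 14 presents actual gaze-angle data); the camera offsets and viewing distance below are illustrative assumptions, not values from the disclosure:

```python
import math

def gaze_angle_deg(camera_offset_m: float, viewing_distance_m: float) -> float:
    """Angle between the user's line of sight (aimed at the participant's
    on-screen eyes) and the camera's optical axis."""
    return math.degrees(math.atan2(camera_offset_m, viewing_distance_m))

# A bezel-mounted webcam roughly 10 cm above the participant's on-screen
# eyes, viewed from roughly 50 cm, produces a large vertical gaze error:
bezel_error = gaze_angle_deg(0.10, 0.50)      # about 11.3 degrees

# A deployable camera module about 1 cm from the on-screen eyes:
deployed_error = gaze_angle_deg(0.01, 0.50)   # about 1.1 degrees
```

The smaller the angle, the more the captured image reads as direct eye contact; positioning the camera module on the display screen itself, as described above, minimizes this offset.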
  • camera system electronics is disposed in, or is affixed to, the housing in electrical contact with the camera module arm, which camera system electronics receives and processes image information transmitted thereto from the camera module and outputs processed signals to a connector, which connector is capable of transmitting the processed signals to the device system electronics.
  • the camera module arm transmits image information from the camera module to the device system electronics.
  • the device system electronics receives and processes image information transmitted thereto from the camera module.
  • the camera system electronics includes a processor that executes software which, in response to image information transmitted thereto, processes the image (as described below) and outputs the processed image to the device system electronics.
  • the device system electronics transmits the processed image over a communications channel such as, for example and without limitation, the Internet, to another device.
  • the device system electronics includes a processor that executes software which, in response to image information transmitted thereto, processes the image (as described below) and transmits the processed image over a communications channel such as, for example and without limitation, the Internet, to another device.
  • One or more embodiments are a camera system for use with a device, which camera system comprises: (a) a camera module; and (b) a camera module arm, which camera module arm: (i) may be electrically connected to camera system electronics, which camera system electronics is, in turn, electrically connectable to device system electronics, or (ii) may be electrically connectable to device system electronics.
  • the camera module is positionable with respect to a device display screen.
  • the camera module arm is capable of movement which enables the camera module to be positioned with respect to the device display screen.
  • the camera module may be deployed, i.e., positioned, with respect to an image of the participant on the device display screen to provide eye contact between the user and the participant.
  • camera system electronics is disposed in or on the device in electrical contact with the camera module arm, which camera system electronics receives and processes image information transmitted thereto from the camera module and outputs processed signals to the device system electronics.
  • the camera module arm transmits image information from the camera module to the device system electronics.
  • the camera system electronics includes a processor that executes software which, in response to image information transmitted thereto, processes the image (as described below) and outputs the processed image to the device system electronics.
  • the device system electronics transmits the processed image over a communications channel such as, for example and without limitation, the Internet, to another device.
  • device system electronics includes a processor that executes software which, in response to image information transmitted thereto, processes the image (as described below) and transmits the processed image over a communications channel such as, for example and without limitation, the Internet, to another device.
  • (a) embodiments of a built-in camera system include a camera module (referred to as camera module 110) and a camera module arm (referred to as camera module arm 120); and (b) embodiments of an add-on camera system (referred to as camera system 200) include a camera module (referred to as camera module 210), a camera module arm (referred to as camera module arm 220) and a housing (referred to as housing 230).
  • camera module arm 120 is mechanically and electrically connected to camera module 110, and camera module arm 120 is capable of movement which causes camera module 110 (for example, a miniature camera) to be positioned with respect to a device display screen.
  • camera module arm 220 is mechanically and electrically connected to camera module 210, and camera module arm 220 is capable of movement which causes camera module 210 (for example, a miniature camera) to be positioned with respect to a device display screen.
  • a miniature camera module 110 is attached to camera module arm 120, and camera module arm 120 is fabricated (as described below) so that camera module 110 can be positioned in a stable and repeatable manner on a device display screen.
  • camera module 110 can be positioned with respect to an image of a participant on the device display screen to provide eye contact between a device user and the participant during a video call and/or video conference.
  • a miniature camera module 210 is attached to camera module arm 220, and camera module arm 220 is fabricated (as described below) so that camera module 210 can be positioned in a stable and repeatable manner on a device display screen.
  • camera module 210 can be positioned with respect to an image of a participant on the device display screen to provide eye contact between a device user and the participant during a video call and/or video conference.
  • camera module 110 includes a miniature camera which comprises, for example and without limitation, an OmniVision OVM7695 sensor or an OmniVision OVM7239 sensor (both of which are available from OmniVision, Inc. of Santa Clara, California) and a suitable lens system (as described below).
  • camera module 110 (or camera module 210) is approximately 3 mm by 3 mm to 5 mm by 5 mm in cross section and 2 mm to 3 mm deep.
  • camera imager support electronics (FIG. 12 is a schematic of camera imager support electronics used to fabricate one or more embodiments) is disposed in camera system electronics (see above).
  • cameras for devices such as PCs and smartphones typically have a landscape orientation.
  • although the landscape orientation may be suitable for group video conference calls, the inventor has discovered that, for most video calls (for example, personal video calls), a portrait orientation provides a better image than the landscape orientation.
  • camera module 110, when deployed in position for use, is oriented so that the camera lens is disposed in a portrait orientation rather than a landscape orientation.
  • the camera lens has a 50-80 mm focal length.
  • the camera optics are suited for close range portrait photography.
  • the camera optics have a low F-number lens such as, for example and without limitation, an F number in a range from about 2 to about 3, to provide a shallow depth of focus that blurs the background, together with a flattering 50-70 mm DSLR-equivalent lens angle of approximately 40 degrees field of view.
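The roughly 40 degree figure follows from the DSLR-equivalent focal length; a minimal sketch, assuming the standard 36 mm full-frame sensor width and a horizontal (rather than diagonal) field-of-view convention:

```python
import math

FULL_FRAME_WIDTH_MM = 36.0  # 35 mm "full frame" reference sensor width

def horizontal_fov_deg(equiv_focal_mm: float) -> float:
    """Horizontal field of view for a lens described by its
    35 mm-equivalent focal length."""
    return 2 * math.degrees(math.atan(FULL_FRAME_WIDTH_MM / (2 * equiv_focal_mm)))

fov_50 = horizontal_fov_deg(50)  # about 39.6 degrees -- the "~40 degrees" cited
fov_70 = horizontal_fov_deg(70)  # about 28.8 degrees
```

Longer equivalent focal lengths narrow the field of view, which, combined with the low F number, yields the flattering close-range portrait framing described above.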
  • Positioning a camera module at a relevant location (for example, a location that enables a user and a participant to maintain eye contact during a video call or video conference) on a device display screen during a video call/video conference requires meeting a challenge.
  • the challenge is how to transmit control/data/power signals via a minimally obscuring camera module arm that: (a) is mechanically robust enough for consumer use; and (b) orients the camera module in a repeatable manner on the device display screen.
  • the camera module arm comprises a semi-rigid member (for example and without limitation, nitinol or other flexible material) that is laminated with thin wires or a flex circuit board (the flexible material enables the camera module arm to attain a repeatable location during deployment without the arm breaking during use).
  • in use, i.e., when deployed for use, the camera module arm is oriented like a blade perpendicular to the device display screen (such an orientation may be seen, for example, in one of the views in FIG. 3 and the views in FIG. 5).
  • the camera module arm has a profile on the device display screen (i.e., a footprint on the device display screen) whose width is less than an amount which is in a range from about 0.3 mm to about 0.5 mm.
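Simple arithmetic shows how little of the display such a footprint obscures; the screen width below is an illustrative assumption for a typical laptop display, not a value from the disclosure:

```python
def obscured_fraction(arm_width_mm: float, screen_width_mm: float) -> float:
    """Fraction of the display width covered by the deployed arm."""
    return arm_width_mm / screen_width_mm

# Assuming a 13-inch laptop display roughly 287 mm wide, even the widest
# 0.5 mm footprint from the range above covers well under 0.2% of it:
worst_case = obscured_fraction(0.5, 287.0)  # about 0.0017
```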
  • the camera module arm is mounted so that it can swing from a first position (for example, off the device display screen) when not in use, i.e., when not deployed for use, to a second position on the device display screen when in use, i.e., when deployed for use.
  • Some issues that are overcome to provide such first embodiments include: (i) how to connect the wires/flex circuit board in the camera module arm to the camera module without connectors in some embodiments since the connectors might make the footprint of the camera module on the device display screen too big (as described below), and (ii) how to laminate the wires/flex printed circuit board ("PCB") to the semi-rigid, flexible member (stainless steel, nitinol, and so forth) without increasing the bulk or the profile of the camera module arm (as described below).
  • the camera module arm is the same as in the first embodiment, but as a difference (i.e., instead of being mounted so that it can swing from a first position, off the device display screen when not deployed for use, to a second position, on the device display screen when deployed for use) the camera module arm is mounted so that it can be retracted into the device display screen bezel (see for example, the views of FIG. 10) or through the device display screen bezel to the back of the device (see for example, the views of FIGs. 8 and 9).
  • any camera module arm that is bendable will have mechanical hysteresis when it is deployed from a curved, retracted position (whether the curve results from being positioned in the device display screen bezel or from being curved through the device display screen bezel and onto the back of the device), and thus will not attain a repeatable, desired deployment position on the device display screen.
  • the use of nitinol enables the issue to be resolved.
  • the camera module arm comprises a rigid, transparent member that, when deployed, is oriented parallel to the device display screen.
  • the rigid, transparent member is made of a suitable, strong material such as, for example, and without limitation, polycarbonate.
  • the signal/power conductors are transparent or mostly transparent.
  • the camera module arm can be rotated from a first position (for example, off the device display screen) when not in use, i.e., when not deployed for use, to a second position on the device display screen when in use, i.e., when deployed for use (see, for example, views in FIGs. 1 and 7).
  • camera module arm 120 (or camera module arm 220) comprises a polycarbonate (or other suitable transparent polymer) pendant arm having wires (for example, miniature wires) embedded therein.
  • camera module arm 120 (or camera module arm 220) can be made of 0.01" thick polycarbonate or a material having similar properties and a suitably small thickness.
  • the wires are, for example and without limitation, 30 micron diameter wire (for example, 50 gauge insulated wire (also referred to herein as ultra-thin wire) disposed, for example, in twisted pairs to minimize EMI).
  • the wires are embedded in camera module arm 120 (or camera module arm 220) or they are laminated onto camera module arm 120 (or camera module arm 220) using a flexible glue such as, for example and without limitation, UV cured, Dymax® 3025 encapsulant or clear, UV cured, Dymax® 9001-EV3 encapsulant (both of which are available from Dymax® Corporation, website address www.dymax.com).
  • a flexible glue such as, for example and without limitation, UV cured, Dymax® 3025 encapsulant or clear, UV cured, Dymax® 9001-EV3 encapsulant (both of which are available from Dymax® Corporation, website address www.dymax.com).
  • the wires carry power, clock, data and control signals or additional signals required by standard interfaces such as the standard MIPI camera interface.
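As a sanity check on how few conductors such an arm must carry, a one-lane MIPI CSI-2 style interface can be tallied; the breakdown below is an assumption for illustration (actual pinouts vary by sensor), but the total is consistent with the ten to fifteen thin wires described elsewhere in this disclosure:

```python
# Illustrative conductor tally for a one-lane MIPI CSI-2 style camera
# interface; the exact breakdown is an assumption, not the disclosed pinout.
conductors = {
    "data lane (differential pair)": 2,
    "clock lane (differential pair)": 2,
    "I2C control (SCL, SDA)": 2,
    "power and ground": 2,
    "reset / enable": 1,
}
total = sum(conductors.values())  # 9 conductors at minimum
```

Twisting signal pairs, as noted above, and moving support electronics into the camera module (FIG. 12) keep this count near the minimum so the arm stays thin.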
  • camera module arm 120 (or camera module arm 220) has a thickness in a range between about 1 mm to about 3 mm.
  • camera module arm 120 is pivoted on a hinge which is affixed to the device display screen bezel (refer to FIG. 7), or camera module arm 220 is pivoted on a hinge which is affixed to housing 230.
  • the hinge allows only 90 degrees of rotation to ensure that electrical connectors between camera module arm 120 (or camera module arm 220) and other electronics do not overextend.
  • camera module arm 120 can be mounted on the top of the device display screen, either from a bezel or from a housing of the device system electronics, and camera module arm 220 can be mounted on housing 230.
  • a nitinol wire (available from NDC Inc. of Fremont, California), for example and without limitation, a 0.08" diameter nitinol wire, is embedded in, or is attached to, camera module arm 120 (or camera module arm 220).
  • the embedded or attached nitinol enables camera module arm 120 to maintain a lateral flexibility, which lateral flexibility enables camera module arm 120 to slide: (a) in and out of an aperture in a device display screen bezel to a guide or slot, for example and without limitation, a concealed guide or slot, that is embedded in the device display screen bezel (refer to FIG.), wherein camera module arm 120 bends along its short axis (where the short axis faces the user if the slot is configured sideways along the device display screen bezel); or (b) in and out of a channel in the device display screen bezel to a device electronic system, wherein the channel extends through the bezel from the front to the back of the device (refer to FIG. 9) (camera module arm 120 bends by 90 degrees where camera module arm 120 passes through the device display screen bezel to the back of the device), and wherein the sliding occurs as a result of a user's pushing attached camera module 110 up or down with respect to the aperture or the channel in the bezel, respectively.
  • the device display screen bezel includes a slot: (a) in a retracted position, camera module arm 120 is stored in the camera arm slot, and flexible wiring connects the wires or traces in camera module arm 120 to any camera system electronics disposed in the slot; and (b) in a deployed position, camera module arm 120 is disposed partially in the slot and partially on the device display screen.
  • the embedded or attached nitinol enables camera module arm 220 to maintain a lateral flexibility, which lateral flexibility enables camera module arm 220 to slide in and out of an aperture in housing 230 to a guide or slot, for example and without limitation, a concealed guide or slot, that is embedded in housing 230; camera module arm 220 bends along its short axis (where the short axis faces the user if the slot is configured sideways along housing 230), and the sliding occurs as a result of a user's pushing attached camera module 210 up or down with respect to the aperture in housing 230.
  • housing 230 includes a slot: (a) in a retracted position, camera module arm 220 is stored in the camera arm slot, and flexible wiring connects the wires or traces in camera module arm 220 to any camera system electronics disposed in the slot; and (b) in a deployed position, camera module arm 220 is disposed partially in the slot and partially on the device display screen.
  • wires affixed to, or embedded in, a polycarbonate arm can form a loop at the back of camera module arm 120 (or camera module arm 220).
  • the wires in the loop may be connected to the device electronics system by, for example and without limitation, a PCB; and the back end of camera module arm 120 may move up and down (i) inside a slot internal to the device, for example and without limitation, a slot in the device display screen bezel, or (ii) the back end of camera module arm 120 may be wound sideways inside the device display screen.
  • a user may position camera module 110 for use by moving it down (for example and without limitation, by dragging with the user's thumb) from a first or non-use or non-deployed position (for example and without limitation, in the first or non-use or non-deployed position, camera module 110 is disposed adjacent to or on the bezel) to a second or use or deployed position which is adjacent to or on the device display screen. Further, a user may move camera module 110 from the use or deployed position to the non-use or non-deployed position by moving camera module 110 up (for example and without limitation, by pushing with the user's thumb).
  • the wires in the loop may be connected to the device electronics system by, for example and without limitation, a PCB; and (b) the back end of camera module arm 220 may move up and down (i) inside a slot in housing 230, or (ii) the back end of camera module arm 220 may be wound sideways inside the slot.
  • a user may position camera module 210 for use by moving it down (for example and without limitation, by dragging with the user's thumb) from a first or non-use or non-deployed position (for example and without limitation, in the first or non-use or non-deployed position, camera module 210 is disposed adjacent to or on housing 230) to a second or use or deployed position which is adjacent to or on the device display screen. Further, a user may move camera module 210 from the use or deployed position to the non-use or non-deployed position by moving camera module 210 up (for example and without limitation, by pushing with the user's thumb).
  • wires, for example and without limitation, ten to fifteen 50 gauge wires (some of which may be configured in twisted pairs to reduce EMI), are positioned between nitinol wires, for example and without limitation, two 0.008" diameter nitinol wires, and the assembly is held together using a flexible glue such as, for example and without limitation, UV cured, Dymax® 3025 encapsulant or clear, UV cured, Dymax® 9001-EV3 encapsulant (both of which are available from Dymax® Corporation, website address www.dymax.com).
  • camera module arm 120 (or camera module arm 220) of the aforementioned assembly is flat and has a thickness of about 0.3 mm and a width less than about 2 mm.
  • camera module arm 120 is disposed through a channel in a device display screen bezel (the channel extending through the bezel from the front to the back of the device), and camera module arm 120 may be looped to a back or a side of the device.
  • the wires in camera module arm 120 may be connected to a device electronics system by, for example and without limitation, a PCB.
  • a user may position camera module 110 for use by moving it down (for example and without limitation, by dragging with the user's thumb) from a first or non-use or non-deployed position (for example and without limitation, in the first or non-use or non-deployed position, camera module 110 is disposed adjacent to the channel in the bezel) to a second or use or deployed position which is adjacent to or on the device screen display. Further, a user may move camera module 110 from the use or deployed position to the non-use or non-deployed position by moving camera module 110 up (for example and without limitation, by pushing with the user's thumb).
  • the assembly, which includes two nitinol wires, ensures mechanical stability so that camera module arm 120 does not buckle when it is pushed back into the non-use position, and ensures that camera module 110 maintains a flat and repeatable orientation with respect to the screen.
  • camera module arm 120 (or camera module arm 220) comprises an assembly of multiple 50 gauge wires (some of which may be configured in twisted pairs to reduce EMI) which are laminated to edges of a polycarbonate or other suitable flexible thin arm.
  • camera module arm 120 can slide up or down through a channel in the device display screen bezel (the channel extending through the bezel from the front to the back of the device) or rotate down from a guide or slot in the device display screen bezel through an aperture disposed in the device display screen bezel.
  • camera module arm 220 can rotate down from a guide or slot in housing 230 through an aperture disposed in housing 230.
  • the inventor has discovered that, by locating the wires on the edges of camera module arm 120 (or camera module arm 220), when the embodiment is in use and camera module 110 (or camera module 210) is disposed on or adjacent to the device display screen, the wires are stacked normal to a user's line of vision and, as a result, provide little obstruction of the device display screen.
  • such a configuration of the wires can have as little as 30 microns of optical thickness on the edge of camera module arm 120 (or camera module arm 220).
  • camera module arm 120 (or camera module arm 220) is made from flex PCB that is folded into an orientation wherein, when the embodiment is in use and camera module 110 (or camera module 210) is disposed on or adjacent to the device display screen, the PCB is oriented orthogonal to the device display screen (see, for example, the views of FIG. 11).
  • wires on the flex PCB carry signals such as standard parallel interface signals or MIPI interface signals, and a metal RF shield may be affixed to a back of the flex PCB to minimize EMI.
  • the flex PCB is glued (using a flexible glue such as, for example and without limitation, UV cured, Dymax® 3025 encapsulant or clear, UV cured, Dymax® 9001-EV3 encapsulant (both of which are available from Dymax® Corporation, website address www.dymax.com)) between two nitinol wires such as, for example and without limitation, 0.015" diameter nitinol wires, to form a "blade."
  • a camera module arm will have sufficient mechanical robustness so that, when the embodiment is in use and camera module 110 (or camera module 210) is disposed on or adjacent to the device display screen, such a camera module arm will maintain the position of camera module 110 (or camera module 210) at the same spot and in the same orientation while being thin.
  • a camera module arm would be less than about 0.4 mm thick, and therefore, would not obscure the screen image underneath.
  • such a camera module arm can
  • the flex PCB arm described above can be fabricated using thinner nitinol wires than described above to form a retractable camera module arm that can: (a) slide up or down through a channel in the device display screen bezel (the channel extending through the bezel from the front to the back of the device), and be concealed in the back of the device; (b) slide through an aperture in the device display screen bezel into and out of a concealed, horizontally disposed, guide or slot in the bezel; or (c) slide through an aperture in housing 230 into and out of a horizontally disposed, guide or slot in housing 230.
  • a user may position camera module 110 for use by moving it down (for example and without limitation, by dragging with the user's thumb) from a first or non-use or non-deployed position (for example and without limitation, in the first or non-use or non-deployed position, camera module 110 is disposed adjacent to the channel or the aperture in the bezel) to a second or use or deployed position which is adjacent to or on the device screen display. Further, a user may move camera module 110 from the use or deployed position to the non-use or non-deployed position by moving camera module 110 up (for example and without limitation, by pushing with the user's thumb).
  • the guide or slot is fabricated using a lubricious tube such as, for example and without limitation, a Teflon tube, or a concealed guide may be formed by cutting a slit in the bezel material and coating it with a lubricious material.
  • a user may position camera module 210 for use by moving it down (for example and without limitation, by dragging with the user's thumb) from a first or non-use or non-deployed position (for example and without limitation, in the first or non-use or non-deployed position, camera module 210 is disposed adjacent to the aperture in housing 230) to a second or use or deployed position which is adjacent to or on the device screen display.
  • a user may move camera module 210 from the use or deployed position to the non-use or non-deployed position by moving camera module 210 up (for example and without limitation, by pushing with the user's thumb).
  • the guide or slot is fabricated using a lubricious tube such as, for example and without limitation, a Teflon tube, or channel in housing 230.
  • metal is deposited on the edge of the polycarbonate (where the edge is a surface that is disposed orthogonal to the device display screen when the embodiment is in use) to carry power.
  • thin traces of deposited metal such as, for example and without limitation, gold, silver, copper or aluminum traces or ITO traces (such as ITO traces used in LCD screen manufacturing) are used to fabricate traces that conduct signals between camera module 110 (or camera module 210) and the device electronics system, which traces are nearly invisible.
  • camera module arm 120 (or camera module arm 220) is anti-reflective (“AR") coated to reduce reflection and, thereby, decrease obstruction of the device display screen.
  • wires comprising camera module arm 120 are soldered to a PCB of camera module 110 (or camera module 210). Further, for the built-in embodiments, the wires are soldered to a device system electronics PCB.
  • housing 230 of camera system 200 may be a thin plastic bar or a thin metal bar.
  • housing 230 may include one or more magnets affixed, for example and without limitation, by being glued, to a back side of housing 230 to secure it to the device adjacent an edge of the device display screen.
  • the magnets may be affixed to the device, for example and without limitation, by being glued to a top of the device display screen bezel.
  • housing 230 includes a clamp to secure it to the device adjacent an edge of the device display screen.
  • housing 230 can be mounted: (a) on either side of the device display screen bezel; (b) on either side of the top bezel without obscuring any built-in device webcam; or (c) on the bottom bezel.
  • a user mounts housing 230 by first affixing two metal tabs to the device display screen bezel with contact adhesive (for example and without limitation, SLE300 contact adhesive available from the 3M Corporation).
  • the magnet(s) affixed to housing 230 enable quick connect/disconnect to/from the metal tabs while enabling the user to return housing 230 to a specific location quickly.
  • housing 230 can be removed quickly from the magnetic tabs or camera module 210 can be swung out of the way. Further, in accordance with one or more embodiments, housing 230 can be attached to a device display screen bezel using repositionable adhesive tape (for example and without limitation, 9416 adhesive tape or 9425 adhesive tape available from the 3M Corporation) which is attached to the back of housing 230.
  • add-on camera system 200 further includes a connector that plugs into the device such as, for example and without limitation, a device USB port to provide power, communications, and a mechanical fixation.
  • housing 230 includes space in which electronic processing and signal conditioning circuitry is disposed.
  • the electronic processing and signal conditioning circuitry transforms camera image signals input thereto from camera module arm 220 and outputs signals (that are compatible with USB electronics and protocols in accordance with any one of a number of methods that are well known to those of ordinary skill in the art) to a connector, for example and without limitation, a USB connector.
  • the electronic processing and signal conditioning circuitry controls the camera imager electronics (as will be described in detail below) to optimize signal-to-noise based on a captured image, providing color balance, combining illuminated and natural video frames, and applying computed luminosity and color masks as described below.
Camera Screen Position During Use or Deployment
  • a user can position a camera module on a device display screen. Further, in accordance with one or more such embodiments, during an initial installation of a built-in or add-on camera system for use with a device (or any time thereafter that calibration is desired), the user may calibrate preferred camera module positions on the device display screen using device software, for example and without limitation, a "video conference app," by moving a display screen cursor to one or more positions of the camera module on or adjacent to the device display screen. When the cursor has moved to a position, the user may, for example, provide a click. In response, the video conference app will record the position, for example and without limitation, in device storage, for future use. Such a video conference app may be constructed routinely and without undue experimentation using any one of a number of methods that are well known to one of ordinary skill in the art.
  • the user may calibrate preferred camera module positions on the device display screen using device software, for example and without limitation, a "video conference app,” by moving the camera module to various positions on or adjacent to the device display screen.
  • When the camera module has moved to a position, the user may, for example, provide a click, whereupon the video conference app sends a signal to device system electronics or to camera system embedded electronics to generate a voltage spike in predetermined wires in the camera module arm.
  • the wires are used as a pickup antenna for the voltage spike which is created when a particular transistor on an LCD screen is turned on or off to control its underlying pixel.
  • the video conference app causes a horizontal and vertical scan of the device display screen and thus triggers sequentially all the transistors controlling the pixels on the screen.
  • the video conference app uses the voltage spike picked up by the antenna in combination with pixel location of a moving scan line on the screen to identify the location of the antenna in the camera module arm, and hence, the position of the camera module.
  • Such a video conference app may be constructed routinely and without undue experimentation using any one of a number of methods that are well known to those of ordinary skill in the art.
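The scan-and-identify calibration above can be sketched in a few lines; this is a minimal illustration of the principle, not the patented implementation, and `spike_detected` is a hypothetical stand-in for reading the antenna wires while a given pixel's drive transistor is toggled.

```python
def locate_antenna(width, height, spike_detected):
    """Raster-scan every screen pixel in order; the (x, y) at which a
    spike is sensed on the antenna wires is the camera-arm position."""
    for y in range(height):
        for x in range(width):
            # toggling pixel (x, y) would induce a spike only near the antenna
            if spike_detected(x, y):
                return (x, y)
    return None  # camera module not on the screen
```

In practice the scan would be driven by the display controller and the spike read through the arm's wires, but the mapping from scan index to screen position is the same.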
  • the camera module includes a photo detector, for example and without limitation, a photodiode, which is disposed so that it faces the device display screen during deployment.
  • the user may calibrate preferred camera module positions on the device display screen using device software, for example and without limitation, a "video conference app,” by moving the camera module to various positions on or adjacent to the device display screen.
  • the user may, for example, provide a click, whereupon, the video conference app causes a moving predetermined scan feature (for example and without limitation, a white line on a black screen) to be displayed on the screen or predetermined video information to be displayed on the screen. Then, the video conference app uses a signal from the photo detector in combination with pixel location of, for example, the moving scan feature to identify the position of the camera module.
  • Such a video conference app may be constructed routinely and without undue experimentation using any one of a number of methods that are well known to those of ordinary skill in the art.
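The photo-detector variant can be sketched as two sweeps of a white line across a black screen, one per axis; the line position at which the photodiode reading peaks gives the detector's row and column. `show_line` and `read_detector` are hypothetical stand-ins for the display and photodiode I/O.

```python
def locate_photodetector(n_rows, n_cols, read_detector, show_line):
    """Sweep a white line row-by-row, then column-by-column; return the
    (column, row) at which the photodiode signal is strongest."""
    def sweep(axis, n):
        best, best_val = 0, float("-inf")
        for i in range(n):
            show_line(axis, i)          # display the line at index i
            val = read_detector()       # sample the photodiode
            if val > best_val:
                best, best_val = i, val
        return best
    row = sweep("horizontal", n_rows)
    col = sweep("vertical", n_cols)
    return (col, row)
```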
  • an imager die is placed on a semitransparent substrate instead of an opaque substrate.
  • the semitransparent substrate is disposed so that it faces the device display screen during deployment.
  • the imager can be sensitive to light coming from the device display screen in addition to radiation impinging thereon from the camera lens.
  • a video app uses a signal from the imager in combination with pixel location of, for example, a moving scan feature to identify the position of the camera module.
  • the portion of the device display screen disposed under the camera module during deployment is darkened to avoid leakage of light from the screen into the user's image.
  • add-on camera system 200 comprises electronics located in housing 230, which electronics are used to capture and process data output from the camera's imager.
  • the electronics includes a USB bridge processor (for example and without limitation, a suitable USB bridge processor is available from SunPak Co., Ltd. of Taichung City, Taiwan).
  • the USB bridge processor receives image output data via typical parallel or MIPI interfaces, and packages the data into USB compatible signals for delivery to the device using a USB connector (refer to FIGs. 2 and 3).
  • the camera system derives power from the device via the USB connector.
  • in accordance with one or more alternative embodiments, instead of using a USB connection to derive power from the device, housing 230 contains its own power source such as, for example and without limitation, one or more batteries. Further, in accordance with one or more alternative embodiments, housing 230 contains a "near-field" communication device such as, for example and without limitation, a Bluetooth device, to communicate with the device.
  • device software, for example and without limitation, a "video conference app," will periodically analyze: (a) video images received from the camera system; and (b) a window on the device display screen which contains: (i) a video image of a participant (received from the participant's device), (ii) a static picture of the participant, or (iii) a generic face, to identify the participant's face.
  • Such software may be fabricated using FaceDetect software which is available from Fraunhofer IIS 2001, website address www.iis.fraunhofer.de/en/bf/bsy/fue/isyst/detetechnisch/tech.html.
  • the video conference app will position the received video image of the participant in a video conferencing window by sizing, cropping and panning the received video image in accordance with any one of a number of methods that are well known to those of ordinary skill in the art so that the camera module, and hence its camera, is located (with respect to the received video image) to cause the user to maintain eye contact with the received video image of the participant when the user looks at the participant's eyes in the received video image.
  • suitable camera module positions include: (a) an area above a line connecting the participant's pupils, and preferably providing a gaze angle of more than three (3) degrees; and (b) to either side of the participant's face at eye level.
  • FIG. 14 is a graph which indicates gaze angles useful in maintaining eye contact.
  • the user will be provided with instructions on how to position the video image of the participant with respect to the camera module.
  • the video conference app will automatically position a speaking participant in a screen window associated with the camera module, and hence its camera, so the speaking participant would perceive the user to be looking attentively at the speaking participant.
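The positioning step above (placing the participant's eyes a few degrees below the camera module so eye contact is preserved) can be sketched as follows. The function names, the default 5-degree gaze angle (within the "more than 3 degrees, up to about 7 degrees" range the document describes), and the viewing distance and DPI values are illustrative assumptions, not values from the patent.

```python
import math

def eye_offset_pixels(gaze_deg, viewing_distance_in, dpi):
    # vertical on-screen distance (pixels) corresponding to a gaze angle
    return int(math.tan(math.radians(gaze_deg)) * viewing_distance_in * dpi)

def position_window(cam_xy, eye_xy_in_video, gaze_deg=5.0,
                    distance_in=20.0, dpi=96):
    """Return the top-left offset at which to draw the received video so
    the participant's eye line lands `gaze_deg` degrees below the camera."""
    dy = eye_offset_pixels(gaze_deg, distance_in, dpi)
    cam_x, cam_y = cam_xy
    eye_x, eye_y = eye_xy_in_video
    # shift the video so the eyes sit directly below the camera by dy pixels
    return (cam_x - eye_x, cam_y + dy - eye_y)
```

A real app would combine this offset with the cropping and panning the document mentions; the geometry, however, reduces to this tangent calculation.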
  • "near-field" illumination, for example and without limitation, monochromatic, visible or invisible illumination, is used to provide a low noise image of a user's face.
  • color information and the low noise image are combined to provide a webcam image that the device communicates to a participant in a video call or video conference.
  • in accordance with one or more embodiments: (a) of add-on camera system 200, housing 230 includes an illuminator comprised of one or more infrared ("IR") radiation sources or one or more visible light sources to provide "near-field" illumination of a user's face (refer to FIGs. 1 and 6); and (b) of built-in camera system 100, the device includes an illuminator which is located, for example and without limitation, in the device display screen bezel, and which is comprised of one or more IR radiation sources or one or more visible light sources to provide "near-field" illumination of a user's face (refer to FIG. 7).
  • the IR source is, for example and without limitation, an IR LED such as an LZ1-00R400 IR LED manufactured by LedEngin which is available from Mouser Electronics, website address www.mouser.com.
  • the IR source can be pulsed, and the pulses may be synchronized under software control with the frame rate of the camera in the camera module.
  • such software may be embedded in the USB bridge processor or in device software, for example, a "video conference app.”
  • a conventional IR filter or a notch filter is disposed between a lens and an imager of the camera in the camera module.
  • the conventional IR filter has a spectral band pass notch in the wavelength region of the IR source, for example and without limitation, a notch of 30 nm centered at a wavelength approximately equal to 810 nm.
  • FIG. 13 illustrates the quantum efficiency of an imager spectral filter having an IR notch near 810nm overlaid with the imager's spectral sensitivity curves. This notch does not disrupt the normal color balance of the camera and provides a spectral window for the IR source.
  • software in the USB bridge processor or device software records an RGB signal in one conventional frame (i.e., a frame captured where there is no IR illumination) and an RGBI signal in another frame (i.e., a frame in which there is IR illumination). Then, the software combines the signals from the RGBI pixels to create the luminance of the image in accordance with any one of a number of methods that are well known to those of ordinary skill in the art and uses the RGB information to provide balanced color channels using a conventional white balance method (for example and without limitation, a method described in an article entitled "Automatic White Balance for Digital Still Cameras," Journal of Information Science and Engineering, pp. 497-509 (2006), which article is incorporated by reference herein in its entirety) and, thereby, creates a well-lit, color corrected, clear rendition of the user's face.
  • a conventional white balance method for example and without limitation, a method described in an article entitled "Automatic White Balance for Digital Still Cameras," Journal of Information Science and Engineering, pp. 497-509 (2006), which article
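The frame-combination step above can be sketched as follows. This is a minimal sketch, not the patented method: it uses a simple gray-world white balance as a stand-in for the cited algorithm, takes the low-noise luminance from the IR-illuminated (RGBI) frame, and re-imposes it on the color-balanced unlit (RGB) frame.

```python
import numpy as np

def combine_frames(rgb_frame, rgbi_frame):
    """Merge a low-noise IR-lit frame with a color-correct unlit frame."""
    # luminance from the IR-illuminated frame (low noise)
    y = rgbi_frame.astype(np.float32).mean(axis=2)
    # gray-world white balance on the unlit RGB frame (illustrative
    # stand-in for the white-balance method cited in the text)
    rgb = rgb_frame.astype(np.float32)
    means = rgb.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / np.maximum(means, 1e-6)
    balanced = rgb * gains
    # rescale each pixel so its luminance matches the low-noise channel
    cur_y = balanced.mean(axis=2, keepdims=True)
    out = balanced * (y[..., None] / np.maximum(cur_y, 1e-6))
    return np.clip(out, 0, 255).astype(np.uint8)
```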
  • the inventor has recognized that in many circumstances supplying sufficient light (for example and without limitation, 300 lux to a user's face vs. 100 lux available in a typical office setting) in the manner described above substantially reduces image noise, i.e., increases the signal-to-noise ratio ("SNR"), in the resultant well-lit image; in accordance with one or more embodiments, adding illumination might produce, for example and without limitation, as much as a 400% reduction in noise.
  • a video conference software CODEC disposed, for example and without limitation, in the video conference app, compresses the user's image prior to transmission to the participant's device, and the available communications bandwidth typically drives the software CODEC compression settings.
  • the reduced noise in the well-lit image is advantageous because, post compression, the well-lit, low-noise user's image has fewer bits. As a result, as the inventor has also recognized, this avoids overwhelming the communication channel between the user's device and the participant's device with poorly compressed frames (due to excessive image noise) which result in loss of connection or stutter.
  • the illuminator or illuminators can be located outside the housing to provide both key and fill lights, and the illuminator(s) can be synchronized, electronically or optically (for example and without limitation, by picking up a pulsed IR signal via a photo detector disposed in the illuminator), with frame capture software in the device software to capture image frames with and without illumination.
  • the illuminator or illuminators can be used in a non-pulsed mode wherein their intensity is controlled by the device software via pulse width modulation or other suitable methods.
  • an illuminator is a "bright" red LED (for example and without limitation, a PU1WLEE, 1 watt, red LED which is available from American Opto Plus Led Corp., web site address www.aopled.com) which causes less apparent brightness for a user than white light or other light in the visible spectrum, but which still falls within a suitable range of image sensor sensitivity.
  • the red LED can be pulsed as described above in the case of an illuminator comprised of an IR source.
  • a frame illuminated using the red LED is used to derive luminance by adding the signal collected by the red CCD channel, while a white-balanced image is derived in accordance with any one of a number of methods that are well known to those of ordinary skill in the art (same as described for the IR case above) from frames where the red LED was not illuminated.
  • a CCD optical filter, such as the one used with the IR source (as described above), need not be used.
  • a "video conference app” applies a conventional YUV compression to an image frame obtained from an RGB sensor in the camera to provide an illuminated image frame.
  • the video conference app: (a) masks (as described below by utilizing the "near-field" lighting effect) the luminance image frame Y to minimize extraneous background information; (b) masks UV color channels in the image frame; and then (c) spatially decimates the masked UV color channels (for example and without limitation, by averaging the value of each 4, 8 or 16 adjacent pixels; the inventor has discovered that this is useful since a face has fairly uniform color).
  • the retained information from the combined image frame (i.e., after masking and decimating the YUV channels) comprises a low pixel noise signal because of the well-lit Y channel and reduced information in the masked areas.
  • the combination results in an image which can be more effectively compressed, thereby, resulting in lower bit rate transmission for a more reliable and persistent video call, and an image in which the local viewer's face is the focus of the image and the surrounding visual clutter is visually suppressed.
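The masking and chroma-decimation steps above can be sketched as follows, assuming Y, U and V planes of equal size and a block size of 8 (one of the 4/8/16 options the text mentions); the function name and the shape conventions are illustrative.

```python
import numpy as np

def mask_and_decimate(y, u, v, mask, block=8):
    """Apply a transparency mask to Y, U and V, then decimate the masked
    chroma planes by block averaging (a face has fairly uniform color)."""
    ym = y * mask
    um, vm = u * mask, v * mask
    def avg(c):
        h, w = c.shape
        hb, wb = h - h % block, w - w % block
        # average each block x block tile ...
        coarse = c[:hb, :wb].reshape(hb // block, block,
                                     wb // block, block).mean(axis=(1, 3))
        # ... then expand back so the plane keeps its original grid
        return np.repeat(np.repeat(coarse, block, axis=0), block, axis=1)
    return ym, avg(um), avg(vm)
```

The averaged chroma carries far less information per frame, which is what makes the subsequent compression step cheaper.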
  • pulsed "near-field” illumination that is synchronized with video frames, is used to visually mute background and, thereby, to enhance a user's image.
  • IR LEDs or red LEDs having an appropriate light cone angle (for example and without limitation, an angle in a range from about 5 degrees to about 10 degrees) illuminate the user's face at a typical distance from the screen (for example and without limitation, for a laptop a typical distance might be about one to about two feet, and for a smartphone, a typical distance might be about one-half to about one foot).
  • the LEDs are mounted on a device display screen bezel or a housing to form "near-field" lighting. The inventor has discovered that such a configuration of LEDs lights up the nearby user's face, but casts little light into the background.
  • an LED-unlit image frame is subtracted from an LED-lit image frame, pixel by pixel, and a threshold is applied to the resulting image frame data to provide a transparency map— the areas whose light intensity doesn't change much represent background, and the remaining areas represent a near screen object, i.e., a face.
  • the threshold is determined periodically, for example, by comparing the change in luminosity between frames and selecting a value which best separates minimum and maximum change.
  • the pixels that are contiguous and are above the threshold comprise the transparent portion of the digital mask while the others comprise the more opaque portion of the mask.
  • the digital transparency mask is applied to each frame (to provide an effect similar to a lens vignette) by multiplying each pixel's luminosity channel value by a corresponding pixel value in the digital transparency mask, thereby reducing the brightness of visual information in the background.
  • the digital transparency mask is further applied to color channels to de-saturate them by multiplying the value of each pixel in the digital transparency mask by the corresponding pixel value in the color channel. This procedure results in an image that tends to focus the participant's attention on the user's face. In addition, this procedure reduces the information content of a frame, thereby resulting in a more reliable Internet transmission of images.
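The lit/unlit subtraction, thresholding and mask application described above can be sketched as follows. The midpoint threshold is one simple way to "separate minimum and maximum change" as the text puts it, and the 0.2 opacity floor for the background is an illustrative choice; multiplying the chroma planes assumes they are signed (zero-centered), so scaling them toward zero de-saturates.

```python
import numpy as np

def transparency_mask(lit, unlit):
    """Build a transparency map: pixels whose brightness changes little
    between LED-lit and LED-unlit frames are treated as background."""
    diff = np.abs(lit.astype(np.int16) - unlit.astype(np.int16)).astype(np.float32)
    # threshold midway between the least- and most-changed pixels
    thresh = 0.5 * (float(diff.min()) + float(diff.max()))
    fg = (diff > thresh).astype(np.float32)
    # foreground fully transparent (1.0); background dimmed, not black
    return 0.2 + 0.8 * fg

def apply_mask(y, u, v, mask):
    # dim background luminance and de-saturate background color
    return y * mask, u * mask, v * mask
```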
  • multiple images are obtained from: (a) a device webcam, if any; (b) the camera in the camera module, with or without off axis illumination; and (c) "near-field” illumination, as described above.
  • device software for example, a "video conference app,” analyzes the multiple images for parallax to ascertain the user's head position and an orientation of the user's head in three dimensions using any one of a number of methods that are well known to those of ordinary skill in the art.
  • the housing contains one or more microphones (refer to FIG. 1).
  • Device software for example, a "video conference app,” analyzes the microphone output using any one of a number of methods that are well known to those of ordinary skill in the art to create a directional microphone capable of HD sound.
  • device software for example, a "video conference app” analyzes the 3D head position and orientation information described above using any one of a number of methods that are well known to those of ordinary skill in the art to further aim the directional microphone to preferentially record the user's voice and to reduce recording and transmission of background noise.
  • a gimbaled parabolic sound projector is mounted on or behind the device display screen.
  • Device software, for example, a "video conference app," analyzes the 3D head position and orientation information described above using any one of a number of methods that are well known to those of ordinary skill in the art to cause the gimbaled parabolic sound projector to project sound to the user.
  • the inventor has discovered that such embodiments may avoid the use of headphones, and can make the transmitted image appear more natural.
  • the inventor has discovered that such embodiments may be useful in noisy environments such as a cubicle or a public space.
  • FIG. 1 shows orthogonal views of an add-on embodiment that includes a camera module disposed at the tip of a transparent arm, an illuminator, a microphone and a magnetic coupler.
  • housing 101 includes magnets 104, illuminator 106 and dual microphones 101.
  • camera module 103 hangs down from housing 101 via camera module arm 102 which swivels on axis 105.
  • FIG. 2 shows a front view of another add-on embodiment that includes a camera module disposed at the tip of an ultra-thin, displaceable arm, a USB cable and a magnetic coupler.
  • housing 207 includes dual microphones 202, and illuminator 201.
  • housing 207 supports camera module arm 204 and camera module 203.
  • camera module arm 204 can swivel so that camera module 203 may be moved between a deployed or "down" position and a non-deployed or "up" position 205 abutting housing 207.
  • camera module 203 may be held in the non-deployed position, for example and without limitation, by a magnet.
  • the add-on embodiment communicates with a device via USB cable 206.
  • FIG. 3 shows orthogonal views of an add-on embodiment that includes a camera module disposed at the tip of an ultra-thin camera module arm, an illuminator, a USB cable, and a carrying case capable of holding the camera module, the camera module arm, the illuminator, and the USB cable.
  • a housing includes illuminator 301, thin camera module arm 303, and camera module 302, which camera module 302 is supported by camera module arm 303.
  • the add-on embodiment is powered by, and communicates with, the device via USB cable 307.
  • carrying case 308 serves as a holder for camera module 302, camera module arm 303, illuminator 301, and USB cable 307.
  • FIG. 4 shows an add-on embodiment, like that shown in FIG. 2, in use where the add-on embodiment is mounted on a laptop computer for use in a video call or video conference.
  • the add-on embodiment includes a housing that supports illuminator 401, thin camera module arm 403, and camera module 404, which camera module 404 is supported by camera module arm 403.
  • the add-on embodiment is affixed to laptop 402 and laptop 402 displays an image of participant 405 and laptop user 408.
  • camera module arm 403 has been rotated so that camera module 404 is disposed at non-deployed position 407.
  • FIG. 4 shows the add-on embodiment detached from laptop 402 and disposed in carrying case 406.
  • the add-on embodiment includes a housing that supports illuminator 501, thin camera module arm 502, camera module 503, which camera module 503 is supported by camera module arm 502, and USB connector 504.
  • the add-on embodiment is affixed to a device by connecting USB connector 504 to a device USB port.
  • FIG. 6 shows orthogonal views of an embodiment that includes a camera module and an illuminator mounted on a fixture housing, and a retractable, flexible camera module arm attached to the fixture housing, wherein the fixture housing is capable of storing the camera module arm.
  • fixture housing 608 includes dual microphones 603 and 605, dual illuminators 601 and 602, and spring arm 604.
  • a retractable, flexible camera module arm (not shown) supports camera module 607, and as shown in FIG. 6, the retractable, flexible camera module arm is stored in housing 608 in its non-deployed position.
  • spring arm 604 is disposed in an open position. When housing 608 is placed on a device display screen bezel and spring arm 604 is released from the open position, spring arm 604 applies force that pulls the back of the housing 609 against the bezel for secure positioning.
  • FIG. 7 shows orthogonal views of a built-in embodiment that includes a camera module disposed at the tip of an ultra-thin camera module arm, the camera module arm being built into a tablet device and being capable of rotating onto a display screen of the tablet, and an illuminator built into a display screen bezel of the tablet.
  • tablet 704 includes thin camera module arm 702 that supports camera module 703.
  • camera module arm 702 is connected to finger activated, rotating pivot 701.
  • camera module 703 fits into recess 705 in the bezel.
  • FIG. 8 shows two built-in embodiments in use during a video call or video conference.
  • a tablet or phone includes camera module 806 attached to rotating, rigid, transparent, camera module arm 804—shown in a deployed position;
  • a tablet or phone includes camera module 807 attached to rotating, rigid, transparent, camera module arm 808, shown in a non-deployed position where camera module arm 808 (and hence camera module 807) has been rotated from the deployed position, and camera module 807 has been stored in a recess area in a bezel.
  • (a) a tablet or phone includes camera module 805 attached to retractable camera module arm 801, shown in a deployed position; and (b) a tablet or phone includes camera module 802 attached to a retractable camera module arm, shown in a non-deployed position where the camera module arm is not seen as it has been retracted into the device.
  • FIG. 9 shows a built-in embodiment that includes a miniature camera module disposed on a retractable camera module arm that moves behind a device wherein the retractable camera module arm is fabricated using nitinol and a magnet wire harness.
  • camera module 902 is shown attached to phone 901 in a retracted or non-deployed position and is also shown attached to flexible camera module arm 904 in an expanded or deployed position.
  • camera module arm 906 which is attached to camera module 907 is guided through tube 905.
  • FIG. 10 shows a built-in embodiment that includes a miniature camera module attached to an ultra-thin, retractable, camera module arm that moves onto a top bezel of a mobile device.
  • camera module 1001 is shown in a retracted or non-deployed position; camera module 1001 is attached to thin camera module arm 1003, which camera module arm 1003 is oriented so that its thin profile faces toward the user and which camera module arm 1003 is retracted into a bezel of phone or tablet 1004.
  • when camera module arm 1003 is retracted, it is guided through guides 1002.
  • in a deployed position, camera module 1001 is positioned on the screen.
  • FIG. 11 shows a MIPI mobile device interface flex PCB ("printed circuit board") with shielding used to fabricate one or more embodiments: (a) disposed in a flat configuration (i.e., with a BGA sensor module at the bottom before being folded and attached to nitinol wires); and (b) in a folded position where it is connected to a camera module.
  • flex circuit board 1100 (shown flat) is fabricated with ball grid array 1102 to connect to camera module 1105 and MIPI interface traces 1101 and 1109 and to terminate in connector traces 1103 which connect camera module 1105 to a CPU.
  • camera module 1105 and its associated capacitors and electronics 1106 and the flex circuit board arms 1101 and 1109 are folded over as 1107 to create a thin viewing profile and are laminated to nitinol 1104 with glue 1108.
  • FIG. 13 illustrates the quantum efficiency of an imager spectral filter having an IR notch near 810 nm overlaid with the imager's spectral sensitivity curves.
  • an IR filter having transparent window 1302 in the visible wavelengths as well as narrow window 1301 in the IR.
  • FIG. 14 is a graph that indicates gaze angles useful in maintaining eye contact. As shown in FIG. 14, the eyes of viewer 1404 are directed to one of multiple LEDs 1401 arranged in an array surrounding camera module 1403, which camera module 1403 records the viewer's gaze. The graph shows that multiple observers of the viewer's gaze determine that eye contact is established generally when the viewer is looking at the camera or up to 7 degrees below it.
  • device software (for example, a "video conference app") analyzes a participant's image using any one of a number of methods that are well known to those of ordinary skill in the art to determine gaze. Since the video conference app does not know where the participant is displaying the user's screen image on the participant's screen, the video conference app: (a) includes an attention blink on each of the two diagonal vertices of a transmitted user's screen image; and (b) then uses this information to map the participant's eye gaze between these two diagonal vertices to the user's screen. Next, the video conference app causes the participant's gaze to be displayed on the device display screen, for example and without limitation, as a faint red circle with light traces so the user can be aware of where (or whether) the remote participant's attention is focused during the presentation.
  • selected steps could be implemented as a plurality of software instructions being executed by a processing unit using any suitable operating system.
  • selected steps of the methods and systems could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
  • Embodiments of the present invention described above are exemplary. As such, many changes and modifications may be made to the disclosure set forth above while remaining within the scope of the invention.
  • materials, methods, and mechanisms suitable for fabricating embodiments of the present invention have been described above by providing specific, non-limiting examples and/or by relying on the knowledge of one of ordinary skill in the art. Materials, methods, and mechanisms suitable for fabricating various embodiments or portions of various embodiments of the present invention described above have not been repeated, for sake of brevity, wherever it should be well understood by those of ordinary skill in the art that the various embodiments or portions of the various embodiments could be fabricated utilizing the same or similar previously described materials, methods or mechanisms.
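The FIG. 14 eye-contact band and the two-vertex gaze-mapping step described in the bullets above can be sketched as follows. The function names, the linear-interpolation assumption, and the exact cutoff handling are illustrative assumptions, not part of the disclosure:

```python
def appears_as_eye_contact(vertical_gaze_deg):
    """Return True when a viewer's vertical gaze angle (in degrees,
    negative = below the camera axis) falls in the band that observers
    generally judge to be eye contact: looking at the camera, or up to
    about 7 degrees below it, per the FIG. 14 discussion."""
    return -7.0 <= vertical_gaze_deg <= 0.0

def make_gaze_mapper(gaze_at_top_left, gaze_at_bottom_right,
                     screen_w, screen_h):
    """Return a function mapping raw gaze estimates (gx, gy) -- as
    recorded while the participant looks at the two diagonal attention
    blinks -- to user-screen pixel coordinates, assuming gaze varies
    linearly between the two calibrated vertices."""
    (x0, y0) = gaze_at_top_left
    (x1, y1) = gaze_at_bottom_right

    def to_screen(gx, gy):
        u = (gx - x0) / (x1 - x0)  # normalized horizontal position
        v = (gy - y0) / (y1 - y0)  # normalized vertical position
        return (u * screen_w, v * screen_h)

    return to_screen
```

Under this linear-mapping assumption, a gaze estimate halfway between the two calibration points maps to the center of the user's screen.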

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

Apparatus and system for facilitating video communication between two devices which enable, among other things, maintenance of eye contact.

Description

APPARATUS AND SYSTEM FOR IMAGING IN VIDEO CALLS
Field
One or more embodiments relate to imaging in applications such as, for example and without limitation, video conferencing.
Background
Many personal video conferencing systems are problematic because they make it difficult for a user (a "local person") and a participant (a "remote person") to maintain eye contact. This is problematic because eye contact is important for telepresence and effective communication. In typical prior art video conferencing systems, cameras are located on a side or a top of a viewing screen so that, when a user follows his/her instincts to look at a participant's eyes, the user looks away from the camera, and the resulting image of the user's eyes disrupts eye contact between the user and the participant. Such prior art video conferencing systems are further problematic because they do not enable gaze awareness—gaze awareness is an ability to tell what someone is looking at by watching the direction of his/her eyes. In face-to-face communication, gaze awareness and eye contact are important because gaze awareness and eye contact serve, among other things, as signals for turn-taking in a conversation. In addition, gaze awareness and eye contact express attributes such as attentiveness, confidence and cooperativeness. For example, people using increased eye contact get more help from others, generate more learning as teachers, and have better success in job interviews.
Several attempts have been made to create gaze awareness with videoconferencing systems using specialized hardware (see an article by J. Gemmell et al. entitled "Gaze Awareness for Video-conferencing: a Software Approach" in IEEE MultiMedia, pp. 26-35, Oct-Dec 2000). The article states: "Among the hardware techniques that support gaze-aware videoconferencing systems are half-silvered mirrors and pinhole cameras in displays. The Virtual Space and Hydra systems support gaze awareness by deploying a small display and camera for each party. If you place the display far enough away from users, they're unlikely to notice the angle between the camera and the display images."
In addition to the problems discussed above, these personal video conferencing systems are typically used under ambient conditions that include poor and mixed-color lighting, distracting backgrounds and noisy soundscapes. This is problematic not only because poorly lit images do not look good, but also because poorly lit images are noisy, and noisy images produce video stutter during Internet transmission.
Lastly, these personal video conferencing systems are problematic because traditional webcams have a wide angle lens that is oriented in a landscape mode. As a result, when a user is positioned close to the webcam, his/her face is distorted by the wide angle lens.
Summary
One or more embodiments solve one or more of the above-identified problems. In particular, one or more such embodiments provide one or more of the following during a video call and/or a video conference: (a) improved eye contact between a user and a participant; (b) improved gaze awareness between the user and the participant; (c) an improved user image with respect to lighting even where the user is in a poorly-lit ambient; (d) an improved user image wherein background in the user ambient is muted; (e) an improved user image wherein lens distortion is reduced; and (f) reduced or eliminated video stutter for the user image which results in improved Internet communication.
Brief Description of the Drawings
FIG. 1 shows orthogonal views of an embodiment that includes a camera module disposed at the tip of a transparent arm, an illuminator, a microphone and a magnetic coupler.
FIG. 2 shows a front view of another embodiment that includes a camera module disposed at the tip of an ultra-thin, displaceable arm, a USB cable and a magnetic coupler.
FIG. 3 shows orthogonal views of an embodiment that includes a camera module disposed at the tip of an ultra-thin camera module arm, an illuminator, a USB cable, and a carrying case capable of holding the camera module, the camera module arm, the illuminator, and the USB cable.
FIG. 4 shows an embodiment in use where the embodiment is mounted on a laptop computer for use in a video call or video conference.
FIG. 5 shows orthogonal views of an embodiment that includes a camera module disposed at the tip of an ultra-thin, displaceable camera module arm, and a data port coupler.
FIG. 6 shows orthogonal views of an embodiment that includes a camera module and an illuminator mounted on a fixture housing, a retractable, flexible camera module arm attached to the fixture housing, wherein the fixture housing is capable of storing the camera module arm.
FIG. 7 shows orthogonal views of an embodiment that includes a camera module disposed at the tip of an ultra-thin camera module arm, the camera module arm being built into a tablet device and being capable of rotating onto a display screen of the tablet, and an illuminator built into a display screen bezel of the tablet.
FIG. 8 shows two embodiments in use during a video call or video conference.
FIG. 9 shows an embodiment that includes a miniature camera module disposed on a retractable camera module arm that moves behind a mobile device wherein the retractable camera module arm is fabricated using nitinol and a magnet wire harness.
FIG. 10 shows an embodiment that includes a miniature camera module attached to an ultra-thin, retractable, camera module arm that moves onto a top bezel of a mobile device.
FIG. 11 shows a MIPI mobile device interface flex PCB ("printed circuit board") with shielding used to fabricate one or more embodiments: (a) disposed in a flat configuration (i.e., with a BGA sensor module at the bottom before being folded and attached to nitinol wires); and (b) in a folded position where it is connected to a camera module.
FIG. 12 is a schematic of camera imager support electronics used to fabricate one or more embodiments, which electronics may be positioned near an imager in the camera module to reduce conductor count in a transparent or ultra-thin camera module arm connected to the camera module.
FIG. 13 illustrates the quantum efficiency of an imager spectral filter having an IR notch near 810 nm overlaid with the imager's spectral sensitivity curves.
FIG. 14 is a graph that indicates gaze angles useful in maintaining eye contact.
Description
One or more embodiments are camera systems or systems that provide one or more of the following during a video call and/or a video conference: (a) improved eye contact between a user and a participant; (b) improved gaze awareness between the user and the participant; (c) an improved user image with respect to lighting even where the user is in a poorly-lit ambient; (d) an improved user image wherein background in the user ambient is muted; (e) an improved user image wherein lens distortion is reduced; and (f) reduced or eliminated video stutter for the user image.
In accordance with one or more embodiments, an inventive camera system may be built into a device, for example and without limitation, into a device display screen bezel. In accordance with one or more embodiments where the camera system is built into the device, a camera module and a camera module arm of the camera system (as set forth in detail below) may be stored in a slot built into the device display screen bezel (i.e., an embedded storage slot). In accordance with one or more such embodiments, a thumb activated turn knob is disposed flush with the bezel, and the thumb activated turn knob includes a rotation hinge/helical cam that causes the camera module to lift up and rotate 90 degrees from its position in the embedded storage slot and to swing onto the device display screen to a portrait mode for use in a video call. Further, in accordance with one or more such embodiments where the camera module of the camera system is stored in the embedded storage slot in the device display screen bezel, a camera in the camera module (as described below) may be used as a camera for the device. Further, in accordance with one or more embodiments where the camera system is built into the device, a camera module arm of the camera system may be disposed, at least in part, in or on the back of the device. Still further, in accordance with one or more still further embodiments, an inventive camera system is an add-on to the device. In accordance with one or more such embodiments where the camera system is an add-on, the camera module arm and the camera module may be stored, i.e., when not deployed for use during a video call or video conference, in a housing which is attachable to the device.
In accordance with one or more such embodiments, a thumb activated turn knob is disposed flush with the housing, and the thumb activated turn knob includes a rotation hinge/helical cam that causes the camera module to lift up and rotate 90 degrees from its position in the embedded storage slot and to swing onto the device display screen to a portrait mode for use in a video call or video conference.
One or more such embodiments are a camera system for use with a device, which camera system comprises: (a) a camera module; (b) a camera module arm that is mechanically and electrically connected to the camera module, which camera module arm: (i) may be electrically connected to camera system electronics, which camera system electronics is, in turn, electrically connectable to device system electronics, or (ii) may be electrically connectable to device system electronics; and (c) a housing that is mechanically connected to the camera module arm, may hold camera system electronics, and is mechanically connectable to the device. For purposes of this document, the term device or the term communications device means a device and/or a device system that includes a display screen and is capable of receiving images from a camera or over a communications channel (for example, and without limitation, the Internet) and displaying the images on the display screen. Such a device includes, without limitation, a personal computer ("PC"), a laptop computer, a tablet, a smartphone, and so forth. In accordance with one or more such embodiments, the housing is attachable to the device, and the camera module is positionable with respect to a device display screen. Further, in accordance with one or more such embodiments, the camera module arm is capable of movement which enables the camera module to be positioned with respect to the device display screen. In particular, in a video call and/or video conference between a user and a participant, the camera module may be deployed, i.e., positioned, with respect to an image of the participant on the device display screen to provide eye contact between the user and the participant.
In accordance with one or more such embodiments, camera system electronics is disposed in, or is affixed to, the housing in electrical contact with the camera module arm, which camera system electronics receives and processes image information transmitted thereto from the camera module and outputs processed signals to a connector, which connector is capable of transmitting the processed signals to the device system electronics. In accordance with one or more alternative embodiments, the camera module arm transmits image information from the camera module to the device system electronics. In accordance with one or more alternative embodiments, the device system electronics receives and processes image information transmitted thereto from the camera module.
In accordance with one or more further embodiments, the camera system electronics includes a processor that executes software which, in response to image information transmitted thereto, processes the image (as described below) and outputs the processed image to the device system electronics. In response, the device system electronics transmits the processed image over a communications channel such as, for example and without limitation, the Internet, to another device. In accordance with one or more such further embodiments, the device system electronics includes a processor that executes software which, in response to image information transmitted thereto, processes the image (as described below) and transmits the processed image over a communications channel such as, for example and without limitation, the Internet, to another device.
One or more embodiments are a camera system for use with a device, which camera system comprises: (a) a camera module; and (b) a camera module arm, which camera module arm: (i) may be electrically connected to camera system electronics, which camera system electronics is, in turn, electrically connectable to device system electronics, or (ii) may be electrically connectable to device system electronics. In accordance with one or more such embodiments, the camera module is positionable with respect to a device display screen. Further, in accordance with one or more such embodiments, the camera module arm is capable of movement which enables the camera module to be positioned with respect to the device display screen. In particular, in a video call and/or video conference between a user and a participant, the camera module may be deployed, i.e., positioned, with respect to an image of the participant on the device display screen to provide eye contact between the user and the participant.
In accordance with one or more such embodiments, camera system electronics is disposed in or on the device in electrical contact with the camera module arm, which camera system electronics receives and processes image information transmitted thereto from the camera module and outputs processed signals to the device system electronics. In accordance with one or more alternative embodiments, the camera module arm transmits image information from the camera module to the device system electronics.
In accordance with one or more further embodiments, the camera system electronics includes a processor that executes software which, in response to image information transmitted thereto, processes the image (as described below) and outputs the processed image to the device system electronics. In response, the device system electronics transmits the processed image over a communications channel such as, for example and without limitation, the Internet, to another device. In accordance with one or more such further embodiments, device system electronics includes a processor that executes software which, in response to image information transmitted thereto, processes the image (as described below) and transmits the processed image over a communications channel such as, for example and without limitation, the Internet, to another device.
For sake of the discussion that follows: (a) embodiments of a built-in camera system (referred to as camera system 100) include a camera module (referred to as camera module 110), and a camera module arm (referred to as camera module arm 120); and (b) embodiments of an add-on camera system (referred to as camera system 200) include a camera module (referred to as camera module 210), a camera module arm (referred to as camera module arm 220) and a housing (referred to as housing 230).
In accordance with one or more embodiments, camera module arm 120 is mechanically and electrically connected to camera module 110, and camera module arm 120 is capable of movement which causes camera module 110 (for example, a miniature camera) to be positioned with respect to a device display screen. In accordance with one or more embodiments, camera module arm 220 is mechanically and electrically connected to camera module 210, and camera module arm 220 is capable of movement which causes camera module 210 (for example, a miniature camera) to be positioned with respect to a device display screen.
Camera Module
In accordance with one or more embodiments of camera system 100, a miniature camera module 110 is attached to camera module arm 120, and camera module arm 120 is fabricated (as described below) so that camera module 110 can be positioned in a stable and repeatable manner on a device display screen. In particular, as will be described below, camera module 110 can be positioned with respect to an image of a participant on the device display screen to provide eye contact between a device user and the participant during a video call and/or video conference. In accordance with one or more embodiments of camera system 200, a miniature camera module 210 is attached to camera module arm 220, and camera module arm 220 is fabricated (as described below) so that camera module 210 can be positioned in a stable and repeatable manner on a device display screen. In particular, as will be described below, camera module 210 can be positioned with respect to an image of a participant on the device display screen to provide eye contact between a device user and the participant during a video call and/or video conference.
In accordance with one or more embodiments, camera module 110 (or camera module 210) includes a miniature camera which comprises, for example and without limitation, an OmniVision OVM7695 sensor or an OmniVision OVM7239 sensor (both of which are available from OmniVision, Inc. of Santa Clara, California) and a suitable lens system (as described below). In accordance with one or more such embodiments, camera module 110 (or camera module 210) is approximately 3 mm by 3 mm to 5 mm in cross section and 2 mm to 3 mm deep. In accordance with one or more such embodiments, camera imager support electronics (FIG. 12 is a schematic of camera imager support electronics used to fabricate one or more embodiments) is disposed in camera system electronics (see above).
The inventor has discovered that cameras (sometimes referred to as webcams) for devices such as PCs and smartphones typically have a landscape orientation. Although the landscape orientation may be suitable for group video conference calls, the inventor has discovered that, for most video calls (for example, personal video calls), a portrait orientation provides a better image than the landscape orientation. As such, in accordance with one or more embodiments that effectuate the inventor's discovery, camera module 110 (or camera module 210), when deployed in position for use, is oriented so that the camera lens is disposed in a portrait orientation rather than a landscape orientation.
In addition, the inventor has discovered that webcams typically use a wide angle lens (for example, a 15-30 mm effective DSLR lens), and that such a wide angle lens distorts the image of a user's face when the user is close to the device display screen and the webcam (for example, and without limitation, during a personal video call). As such, in accordance with one or more embodiments that effectuate a solution to the inventor's discovery, the camera lens has a 50-80 mm (DSLR equivalent) focal length. As a result, and in accordance with one or more such embodiments, the camera optics are suited for close range portrait photography. For example, in accordance with one or more such embodiments, the camera optics have a low F number lens such as, for example and without limitation, an F number in a range from about 2 to about 3 to provide a shallow depth of focus that blurs the background and a flattering 50-70 mm DSLR equivalent lens angle of approximately 40 degrees field of view.
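The approximately 40 degree figure quoted above follows from the thin-lens field-of-view relation for a full-frame (35 mm) equivalent focal length. The following is a back-of-the-envelope sketch, with the constant and function names chosen here for illustration:

```python
import math

# Diagonal of a 36 mm x 24 mm full-frame sensor, the reference format
# for "DSLR equivalent" focal lengths.
FULL_FRAME_DIAGONAL_MM = math.hypot(36.0, 24.0)  # ~43.27 mm

def diagonal_fov_deg(equiv_focal_mm):
    """Diagonal field of view, in degrees, for a given full-frame
    equivalent focal length (thin-lens approximation)."""
    return math.degrees(2 * math.atan(FULL_FRAME_DIAGONAL_MM /
                                      (2 * equiv_focal_mm)))
```

For a 60 mm equivalent focal length this evaluates to roughly 40 degrees diagonally, consistent with the 50-70 mm range cited above; shorter (wider) equivalents in the 15-30 mm webcam range give substantially larger angles.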
Camera Module Arm
Positioning a camera module at a relevant location (for example, a location that enables a user and a participant to maintain eye contact during a video call or video conference) on a device display screen during a video call/video conference requires meeting a challenge. The challenge is how to transmit control/data/power signals via a minimally obscuring camera module arm that: (a) is mechanically robust enough for consumer use; and (b) orients the camera module in a repeatable manner on the device display screen.
As described below, there are several types of camera module arm embodiments, including, for example and without limitation, the following. In accordance with a first type of camera module arm embodiment, the camera module arm comprises a semi-rigid member (for example and without limitation, one made of nitinol or other flexible material) that is laminated with thin wires or a flex circuit board (the flexible material enables the camera module arm to attain a repeatable location during deployment without having the arm break during use). In use, i.e., when deployed for use, the camera module arm is oriented like a blade perpendicular to the device display screen (such an orientation may be seen, for example, in one of the views in FIG. 3 and the views in FIG. 5). As such, the camera module arm has a profile on the device display screen (i.e., a footprint on the device display screen) whose width is less than an amount which is in a range from about 0.3 mm to about 0.5 mm. In accordance with one or more such embodiments, the camera module arm is mounted so that it can swing from a first position (for example, off the device display screen) when not in use, i.e., when not deployed for use, to a second position on the device display screen when in use, i.e., when deployed for use. Some issues that are overcome to provide such first embodiments include: (i) how to connect the wires/flex circuit board in the camera module arm to the camera module without connectors in some embodiments since the connectors might make the footprint of the camera module on the device display screen too big (as described below), and (ii) how to laminate the wires/flex printed circuit board ("PCB") to the semi-rigid, flexible member (stainless steel, nitinol, and so forth) without increasing the bulk or the profile of the camera module arm (as described below).
In accordance with a second type of camera module arm embodiment, the camera module arm is the same as in the first embodiment, but as a difference (i.e., instead of being mounted so that it can swing from a first position, off the device display screen when not deployed for use, to a second position, on the device display screen when deployed for use) the camera module arm is mounted so that it can be retracted into the device display screen bezel (see for example, the views of FIG. 10) or through the device display screen bezel to the back of the device (see for example, the views of FIGs. 8 and 9). An issue that is overcome to provide such second embodiments, in addition to the issues addressed with respect to the first embodiments, is that any camera module arm that is bendable will have mechanical hysteresis when it is deployed from a curved, retracted position (whether the curve results from being positioned in the device display screen bezel or the curve results from being curved through the device display screen bezel and onto the back of the device), and thus, not attain a repeatable desired deployment position on the device display screen. As described below, the use of nitinol enables the issue to be resolved. In accordance with a third type of camera module arm embodiment, the camera module arm comprises a rigid, transparent member that, when deployed, is oriented parallel to the device display screen. In accordance with one or more such embodiments, the rigid, transparent member is made of a suitable, strong material such as, for example, and without limitation, polycarbonate. Further, the signal/power conductors are transparent or mostly transparent. 
In accordance with one or more such embodiments, the camera module arm can be rotated from a first position (for example, off the device display screen) when not in use, i.e., when not deployed for use, to a second position on the device display screen when in use, i.e., when deployed for use (see, for example, views in Figs. 1 and 7). In accordance with one or more embodiments, camera module arm 120 (or camera module arm 220) comprises a polycarbonate (or other suitable transparent polymer) pendant arm having wires (for example, miniature wires) embedded therein. In accordance with one or more such embodiments, camera module arm 120 (or camera module arm 220) can be made of 0.01" thick polycarbonate or a material having similar properties and a suitably small thickness. In accordance with one or more such embodiments, the wires are, for example and without limitation, 30 micron diameter wire (for example, 50 gauge insulated wire (also referred to herein as ultra-thin wire) disposed, for example, in twisted pairs to minimize EMI). In accordance with one or more such embodiments, the wires are embedded in camera module arm 120 (or camera module arm 220) or they are laminated onto camera module arm 120 (or camera module arm 220) using a flexible glue such as, for example and without limitation, UV cured, Dymax® 3025 encapsulant or clear, UV cured, Dymax® 9001-EV3 encapsulant (both of which are available from Dymax® Corporation, website address www.dymax.com). As one of ordinary skill in the art will readily appreciate, the wires carry power, clock, data and control signals or additional signals required by standard interfaces such as the standard MIPI camera interface.
In accordance with one or more embodiments, camera module arm 120 (or camera module arm 220) has a thickness in a range from about 1 mm to about 3 mm. In addition, and in accordance with one or more embodiments, camera module arm 120 is pivoted on a hinge which is affixed to the device display screen bezel (refer to FIG. 7), or camera module arm 220 is pivoted on a hinge which is affixed to housing 230. In accordance with one or more such embodiments, the hinge allows only 90 degrees of rotation to ensure that electrical connectors between camera module arm 120 (or camera module arm 220) and other electronics do not overextend. In accordance with one or more further embodiments, camera module arm 120 can be mounted on the top of the device display screen, either from a bezel or from a housing of the device system electronics, and camera module arm 220 can be mounted on housing 230.
In accordance with one or more further embodiments, a nitinol wire (available from NDC Inc. of Fremont, California), for example and without limitation, a 0.08" diameter nitinol wire, is embedded in, or is attached to, camera module arm 120 (or camera module arm 220). As the inventor has discovered, the embedded or attached nitinol enables camera module arm 120 to maintain a lateral flexibility, which lateral flexibility enables camera module arm 120 to slide: (a) in and out of an aperture in a device display screen bezel to a guide or slot, for example and without limitation, a concealed guide or slot, that is embedded in the device display screen bezel (refer to FIG. 10)—camera module arm 120 bends along its short axis (where the short axis faces the user if the slot is configured sideways along the device display screen bezel); or (b) in and out of a channel in the device display screen bezel to a device electronic system, wherein the channel extends through the bezel from the front to the back of the device (refer to FIG. 9) (camera module arm 120 bends by 90 degrees where camera module arm 120 passes through the device display screen bezel to the back of the device), and wherein the sliding occurs as a result of a user's pushing attached camera module 110 up or down with respect to the aperture or the channel in the bezel, respectively. Where the device display screen bezel includes a slot: (a) in a retracted position, camera module arm 120 is stored in the camera arm slot, and flexible wiring connects the wires or traces in camera module arm 120 to any camera system electronics disposed in the slot; and (b) in a deployed position, camera module arm 120 is disposed partially in the slot and partially on the device display screen.
As the inventor has also discovered, the embedded or attached nitinol enables camera module arm 220 to maintain a lateral flexibility, which lateral flexibility enables camera module arm 220 to slide in and out of an aperture in housing 230 to a guide or slot, for example and without limitation, a concealed guide or slot, that is embedded in housing 230—camera module arm 220 bends along its short axis (where the short axis faces the user if the slot is configured sideways along housing 230), wherein the sliding occurs as a result of a user's pushing attached camera module 210 up or down with respect to the aperture in housing 230. Where housing 230 includes a slot: (a) in a retracted position, camera module arm 220 is stored in the camera arm slot, and flexible wiring connects the wires or traces in camera module arm 220 to any camera system electronics disposed in the slot; and (b) in a deployed position, camera module arm 220 is disposed partially in the slot and partially on the device display screen. As a result, such embodiments enable horizontal and vertical positioning of camera module 110 (or camera module 210) with respect to the device display screen. In accordance with one or more such embodiments where a nitinol wire is embedded in camera module arm 120 (or camera module arm 220), wires affixed to, or embedded in, a polycarbonate arm can form a loop at the back of camera module arm 120 (or camera module arm 220). In such embodiments for built-in camera system 100, (a) the wires in the loop may be connected to the device electronics system by, for example and without limitation, a PCB; and (b) the back end of camera module arm 120 may move up and down (i) inside a slot internal to the device, for example and without limitation, a slot in the device display screen bezel, or (ii) the back end of camera module arm 120 may be wound sideways inside the device display screen.
In either case, a user may position camera module 110 for use by moving it down (for example and without limitation, by dragging with the user's thumb) from a first or non-use or non-deployed position (for example and without limitation, in the first or non-use or non-deployed position, camera module 110 is disposed adjacent to or on the bezel) to a second or use or deployed position which is adjacent to or on the device screen display. Further, a user may move camera module 110 from the use or deployed position to the non-use or non-deployed position by moving camera module 110 up (for example and without limitation, by pushing with the user's thumb). In such embodiments (for add-on camera system 200), (a) the wires in the loop may be connected to the device electronics system by, for example and without limitation, a PCB; and (b) the back end of camera module arm 220 may move up and down (i) inside a slot in housing 230, or (ii) the back end of camera module arm 220 may be wound sideways inside the slot. In either case, a user may position camera module 210 for use by moving it down (for example and without limitation, by dragging with the user's thumb) from a first or non-use or non-deployed position (for example and without limitation, in the first or non-use or non-deployed position, camera module 210 is disposed adjacent to or on housing 230) to a second or use or deployed position which is adjacent to or on the device screen display. Further, a user may move camera module 210 from the use or deployed position to the non-use or non-deployed position by moving camera module 210 up (for example and without limitation, by pushing with the user's thumb).
In accordance with one or more further embodiments, camera module arm 120
(or camera module arm 220) comprises an assembly of signal and power wires disposed between two parallel nitinol wires. For example and without limitation, in accordance with one or more such further embodiments, ten to fifteen 50 gauge wires (some of which may be configured in twisted pairs to reduce EMI) are positioned between nitinol wires, for example and without limitation, two 0.008" diameter nitinol wires, and the assembly is held together using a flexible glue such as, for example and without limitation, UV cured, Dymax® 3025 encapsulant or clear, UV cured, Dymax® 9001-EV3 encapsulant (both of which are available from Dymax® Corporation, website address www.dymax.com). In accordance with one or more such further embodiments, camera module arm 120 (or camera module arm 220) of the aforementioned assembly is flat and has a thickness of about 0.3 mm and a width less than about 2 mm. In accordance with one or more embodiments that use such a flat camera module arm 120 (for built-in camera system 100), camera module arm 120 is disposed through a channel in a device display screen bezel (the channel extending through the bezel from the front to the back of the device), and camera module arm 120 may be looped to a back or a side of the device. In accordance with one or more such embodiments, the wires in camera module arm 120 may be connected to a device electronics system by, for example and without limitation, a PCB. In operation, a user may position camera module 110 for use by moving it down (for example and without limitation, by dragging with the user's thumb) from a first or non-use or non-deployed position (for example and without limitation, in the first or non-use or non-deployed position, camera module 110 is disposed adjacent to the channel in the bezel) to a second or use or deployed position which is adjacent to or on the device screen display.
Further, a user may move camera module 110 from the use or deployed position to the non-use or non-deployed position by moving camera module 110 up (for example and without limitation, by pushing with the user's thumb). The inventor has discovered that the assembly which includes two nitinol wires ensures mechanical stability so that camera module arm 120 does not buckle when it is pushed back into the non-use position, and ensures that camera module 110 maintains a flat and repeatable orientation with respect to the screen.
In accordance with one or more further embodiments, camera module arm 120 (or camera module arm 220) comprises an assembly of multiple 50 gauge wires (some of which may be configured in twisted pairs to reduce EMI) which are laminated to edges of a polycarbonate or other suitable flexible thin arm. In such embodiments (for built-in camera system 100), camera module arm 120 can slide up or down through a channel in the device display screen bezel (the channel extending through the bezel from the front to the back of the device) or rotate down from a guide or slot in the device display screen bezel through an aperture disposed in the device display screen bezel. In such embodiments (for add-on camera system 200), camera module arm 220 can rotate down from a guide or slot in housing 230 through an aperture disposed in housing 230. The inventor has discovered that, by locating the wires on the edges of camera module arm 120 (or camera module arm 220), when the embodiment is in use and camera module 110 (or camera module 210) is disposed on or adjacent to the device display screen, the wires are stacked normal to a user's line of vision, and as a result, the wires provide little obstruction of the device display screen. For example, such a configuration of the wires can have as little as 30 microns of optical thickness on the edge of camera module arm 120 (or camera module arm 220).
In accordance with one or more further embodiments, camera module arm 120 (or camera module arm 220) is made from flex PCB that is folded into an orientation wherein, when the embodiment is in use and camera module 110 (or camera module 210) is disposed on or adjacent to the device display screen, the PCB is oriented orthogonal to the device display screen (see, for example, the views of FIG. 11). In accordance with one or more such further embodiments, wires on the flex PCB carry signals such as standard parallel interface signals or MIPI interface signals, and a metal RF shield may be affixed to a back of the flex PCB to minimize EMI. In accordance with one or more such further embodiments, the flex PCB is glued (using a flexible glue such as, for example and without limitation, UV cured, Dymax® 3025 encapsulant or clear, UV cured, Dymax® 9001-EV3 encapsulant (both of which are available from Dymax® Corporation, website address www.dymax.com)) between two nitinol wires such as, for example and without limitation, 0.015" diameter nitinol wires, to form a "blade." The inventor has discovered that such a camera module arm will have sufficient mechanical robustness so that, when the embodiment is in use and camera module 110 (or camera module 210) is disposed on or adjacent to the device display screen, such a camera module arm will maintain the position of camera module 110 (or camera module 210) at the same spot and in the same orientation while remaining thin. For example and without limitation, such a camera module arm would be less than about 0.4 mm thick, and therefore, would not obscure the screen image underneath. Further, such a camera module arm can be rotated on and off the device display screen.
In accordance with one or more further embodiments, the flex PCB arm described above can be fabricated using thinner nitinol wires than described above to form a retractable camera module arm that can: (a) slide up or down through a channel in the device display screen bezel (the channel extending through the bezel from the front to the back of the device), and be concealed in the back of the device; (b) slide through an aperture in the device display screen bezel into and out of a concealed, horizontally disposed, guide or slot in the bezel; or (c) slide through an aperture in housing 230 into and out of a horizontally disposed, guide or slot in housing 230. In operation (for built-in camera system 100), a user may position camera module 110 for use by moving it down (for example and without limitation, by dragging with the user's thumb) from a first or non-use or non-deployed position (for example and without limitation, in the first or non-use or non-deployed position, camera module 110 is disposed adjacent to the channel or the aperture in the bezel) to a second or use or deployed position which is adjacent to or on the device screen display. Further, a user may move camera module 110 from the use or deployed position to the non-use or non-deployed position by moving camera module 110 up (for example and without limitation, by pushing with the user's thumb). In accordance with one or more such further embodiments, the guide or slot is fabricated using a lubricious tube such as, for example and without limitation, a Teflon tube, or a concealed guide may be formed by cutting a slit in the bezel material and coating it with a lubricious material.
In operation (for add-on camera system 200), a user may position camera module 210 for use by moving it down (for example and without limitation, by dragging with the user's thumb) from a first or non-use or non-deployed position (for example and without limitation, in the first or non-use or non-deployed position, camera module 210 is disposed adjacent to the aperture in housing 230) to a second or use or deployed position which is adjacent to or on the device screen display. Further, a user may move camera module 210 from the use or deployed position to the non-use or non-deployed position by moving camera module 210 up (for example and without limitation, by pushing with the user's thumb). In accordance with one or more such further embodiments, the guide or slot is fabricated using a lubricious tube such as, for example and without limitation, a Teflon tube, or a channel in housing 230.
In accordance with one or more of the above-described embodiments that utilize polycarbonate to fabricate camera module arm 120 (or camera module arm 220), metal is deposited on the edge of the polycarbonate (where the edge is a surface that is disposed orthogonal to the device display screen when the embodiment is in use) to carry power.
In addition, in accordance with one or more of the above-described embodiments, instead of wires, thin traces of deposited metal such as, for example and without limitation, gold, silver, copper or aluminum traces or ITO traces (such as ITO traces used in LCD screen manufacturing) are used to fabricate traces that conduct signals between camera module 110 (or camera module 210) and the device electronics system, which traces are nearly invisible.
In further addition, in accordance with one or more of the above-described embodiments, camera module arm 120 (or camera module arm 220) is anti-reflective ("AR") coated to reduce reflection and, thereby, decrease obstruction of the device display screen.
In further addition, in accordance with one or more of the above-described embodiments, wires comprising camera module arm 120 (or camera module arm 220) are soldered to a PCB of camera module 110 (or camera module 210). Further, for the built-in embodiments, the wires are soldered to a device system electronics PCB.
Housing
In accordance with one or more embodiments, housing 230 of camera system 200 may be a thin plastic bar or a thin metal bar. In accordance with one or more further embodiments, housing 230 may include one or more magnets affixed, for example and without limitation, by being glued, to a back side of housing 230 to secure it to the device adjacent an edge of the device display screen. Alternatively, the magnets may be affixed to the device, for example and without limitation, by being glued to a top of the device display screen bezel. Alternatively, housing 230 includes a clamp to secure it to the device adjacent an edge of the device display screen. In accordance with one or more such embodiments, housing 230 can be mounted: (a) on either side of the device display screen bezel; (b) on either side of the top bezel without obscuring any built-in device webcam; or (c) on the bottom bezel. In accordance with one or more such embodiments, a user mounts housing 230 by first affixing two metal tabs to the device display screen bezel with contact adhesive (for example and without limitation, SLE300 contact adhesive available from the 3M Corporation). As such, the magnet(s) affixed to housing 230 enable quick connect/disconnect to/from the metal tabs while enabling the user to return housing 230 to a specific location quickly. The inventor has discovered that such an embodiment may be particularly useful if the device is a laptop with a folding screen or a tablet or a smart phone. When camera system 200 is not in use, housing 230 can be removed quickly from the magnetic tabs or camera module 210 can be swung out of the way. Further, in accordance with one or more embodiments, housing 230 can be attached to a device display screen bezel using repositionable adhesive tape (for example and without limitation, 9416 adhesive tape or 9425 adhesive tape available from the 3M Corporation) which is attached to the back of housing 230.
In accordance with one or more embodiments, add-on camera system 200 further includes a connector that plugs into the device such as, for example and without limitation, a device USB port to provide power, communications, and a mechanical fixation.
In accordance with one or more embodiments of add-on camera system 200, housing 230 includes space in which electronic processing and signal conditioning circuitry is disposed. In accordance with one or more such embodiments, the electronic processing and signal conditioning circuitry transforms camera image signals input thereto from camera module arm 220 and outputs signals (that are compatible with USB electronics and protocols in accordance with any one of a number of methods that are well known to those of ordinary skill in the art) to a connector, for example and without limitation, a USB connector.
In addition, and in accordance with one or more such embodiments of camera system 100 or camera system 200, the electronic processing and signal conditioning circuitry controls the camera imager electronics (as will be described in detail below) to optimize signal-to-noise based on a captured image, provide color balance, combine illuminated and natural video frames, and apply computed luminosity and color masks as described below.
Camera Screen Position During Use or Deployment
In accordance with one or more embodiments, a user can position a camera module on a device display screen. Further, in accordance with one or more such embodiments, during an initial installation of a built-in or add-on camera system for use with a device (or any time thereafter that calibration is desired), the user may calibrate preferred camera module positions on the device display screen using device software, for example and without limitation, a "video conference app," by moving a display screen cursor to one or more positions of the camera module on or adjacent to the device display screen. When the cursor has moved to a position, the user may, for example, provide a click. In response, the video conference app will record the position, for example and without limitation, in device storage, for future use. Such a video conference app may be constructed routinely and without undue experimentation using any one of a number of methods that are well known to one of ordinary skill in the art.
Further, in accordance with one or more embodiments, during an initial installation of a camera system for use with a device (or any time thereafter that calibration is desired), the user may calibrate preferred camera module positions on the device display screen using device software, for example and without limitation, a "video conference app," by moving the camera module to various positions on or adjacent to the device display screen. When the camera module has moved to a position, the user may, for example, provide a click, whereupon, the video conference app sends a signal to device system electronics or to camera system embedded electronics to generate a voltage spike in predetermined wires in the camera module arm. In accordance with one or more such embodiments, while not carrying camera image information, the wires are used as a pickup antenna for the voltage spike which is created when a particular transistor on an LCD screen is turned on or off to control its underlying pixel. At the same time, the video conference app causes a horizontal and vertical scan of the device display screen and thus triggers sequentially all the transistors controlling the pixels on the screen. Then, the video conference app uses the voltage spike picked up by the antenna in combination with pixel location of a moving scan line on the screen to identify the location of the antenna in the camera module arm, and hence, the position of the camera module. Such a video conference app may be constructed routinely and without undue experimentation using any one of a number of methods that are well known to those of ordinary skill in the art.
Still further, in accordance with one or more embodiments, the camera module includes a photo detector, for example and without limitation, a photodiode, which is disposed so that it faces the device display screen during deployment. In accordance with one or more such embodiments, during an initial installation of a camera system for use with a device (or any time thereafter that calibration is desired), the user may calibrate preferred camera module positions on the device display screen using device software, for example and without limitation, a "video conference app," by moving the camera module to various positions on or adjacent to the device display screen. When the camera module has moved to a position, the user may, for example, provide a click, whereupon, the video conference app causes a moving predetermined scan feature (for example and without limitation, a white line on a black screen) to be displayed on the screen or predetermined video information to be displayed on the screen. Then, the video conference app uses a signal from the photo detector in combination with pixel location of, for example, the moving scan feature to identify the position of the camera module. Such a video conference app may be constructed routinely and without undue experimentation using any one of a number of methods that are well known to those of ordinary skill in the art. In accordance with one or more such embodiments, instead of using a photodiode, an imager die in the camera module is placed on a semitransparent substrate rather than an opaque substrate. The semitransparent substrate is disposed so that it faces the device display screen during deployment. As a result, the imager can be sensitive to light coming from the device display screen in addition to radiation impinging thereon from the camera lens.
In such embodiments, a video app uses a signal from the imager in combination with pixel location of, for example, a moving scan feature to identify the position of the camera module.
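The scan-based calibrations above (whether the pickup is the photo detector or the arm wires acting as an antenna) reduce to correlating the detector signal with the position of the moving scan feature. A minimal sketch in Python, assuming hypothetical `draw_scan_line` and `read_photodetector` driver callables; neither name appears in the original description:

```python
def locate_camera_module(screen_w, screen_h, draw_scan_line, read_photodetector):
    """Return the (x, y) screen position of the camera module.

    Sweeps a horizontal and then a vertical scan line across the display
    and keeps the line position that produces the strongest detector
    response: the detector faces the screen, so the signal peaks when
    the scan line passes directly under the camera module.
    """
    def sweep(orientation, extent):
        best_pos, best_signal = 0, float("-inf")
        for pos in range(extent):
            draw_scan_line(orientation, pos)  # e.g. a white line on black
            signal = read_photodetector()
            if signal > best_signal:
                best_pos, best_signal = pos, signal
        return best_pos

    y = sweep("horizontal", screen_h)
    x = sweep("vertical", screen_w)
    return x, y
```

In practice the sweep would be driven by the display's own frame scan, but the correlation logic is the same.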
In addition, in accordance with one or more of the above-described embodiments, after determining the location of the camera module, the portion of the device display screen disposed under the camera module during deployment is darkened to avoid leakage of light from the screen into the user's image.
Camera System Electronics
In accordance with one or more embodiments, add-on camera system 200 comprises electronics located in housing 230, which electronics is used to capture and process data output from the camera's imager. In accordance with one or more such embodiments, the electronics includes a USB bridge processor (for example and without limitation, a suitable USB bridge processor is available from SunPak Co., Ltd. of Taichung City, Taiwan). For example and without limitation, the USB bridge processor receives image output data via typical parallel or MIPI interfaces, and packages the data into USB compatible signals for delivery to the device using a USB connector (refer to FIGs. 2 and 3). In accordance with one or more such embodiments using a USB connector, the camera system derives power from the device via the USB connector.
In accordance with one or more alternative add-on embodiments, instead of using a USB connection to derive power from the device, housing 230 contains its own power source such as, for example and without limitation, one or more batteries. Further, in accordance with one or more alternative embodiments, housing 230 contains a "near-field" communication device such as, for example and without limitation, a Bluetooth device, to communicate with the device.
Video Conference Software
In accordance with one or more embodiments, during a video conference, device software, for example and without limitation, a "video conference app," will periodically analyze: (a) video images received from the camera system; and (b) a window on the device display screen which contains: (i) a video image of a participant (received from the participant's device), (ii) a static picture of the participant, or (iii) a generic face, to identify the participant's face. Such software may be fabricated using FaceDetect software which is available from Fraunhofer IIS, website address www.iis.fraunhofer.de/en/bf/bsy/fue/isyst/detektion/tech.html. In response to the image analysis, the video conference app will position the received video image of the participant in a video conferencing window by sizing, cropping and panning the received video image in accordance with any one of a number of methods that are well known to those of ordinary skill in the art so that the camera module, and hence its camera, is located (with respect to the received video image) to cause the user to maintain eye contact with the received video image of the participant when the user looks at the participant's eyes in the received video image. In accordance with one or more such embodiments, suitable camera module positions include: (a) an area above a line connecting the participant's pupils, and preferably providing a gaze angle of more than three (3) degrees; and (b) to either side of the participant's face at eye level. FIG. 14 is a graph which indicates gaze angles useful in maintaining eye contact. However, if such a video conference app is not available, the user will be provided with instructions on how to position the video image of the participant with respect to the camera module.
In accordance with one or more such embodiments, positioning the received video image with respect to the location of the camera module, and hence its camera, is done in a slow pan such as, for example and without limitation, 25% of a screen window dimension per second so as not to draw excessive attention of the user to the movement of the image or to the camera module while allowing natural eye contact to be re-established periodically. As a result, when the user looks at the participant's eyes, the user image sent to the participant will appear to the participant as if the user is looking at the participant's eyes. Further, through use of such embodiments, eye contact is established and maintained.
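The slow pan of 25% of a window dimension per second can be sketched as a per-frame clamp; the function name and signature are illustrative:

```python
def pan_step(current, target, window_dim, fps, rate=0.25):
    """Per-frame pan displacement (in pixels): move toward `target` at
    `rate` of the window dimension per second, clamped so the image
    never overshoots the target position."""
    max_step = rate * window_dim / fps
    delta = target - current
    return max(-max_step, min(max_step, delta))
```

Called once per rendered frame, this keeps the motion slow enough not to draw the user's attention while still re-establishing eye contact within a few seconds.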
In accordance with one or more such embodiments, if the user is engaged in a multi-party video chat such as a Google video chat or a Facebook video chat, the video conference app will automatically position a speaking participant in a screen window associated with the camera module, and hence its camera, so the speaking participant would perceive the user to be looking attentively at the speaking participant.
Illuminator
As will be described below, a "near-field" illumination, for example and without limitation, monochromatic, visible or invisible, illumination, is used to provide a low noise image of a user's face. And, as will be described below, color information and the low noise image are combined to provide a webcam image that the device communicates to a participant in a video call or video conference.
In accordance with one or more embodiments: (a) of add-on camera system 200, housing 230 includes an illuminator comprised of one or more infrared ("IR") radiation sources or one or more visible light sources to provide "near-field" illumination of a user's face (refer to FIGs. 1 and 6); and (b) of built-in camera system 100, the device includes an illuminator which is located, for example and without limitation, in the device display screen bezel, and which is comprised of one or more infrared ("IR") radiation sources or one or more visible light sources to provide "near-field" illumination of a user's face (refer to FIG. 7). In accordance with one or more such embodiments, an IR source (for example and without limitation, an IR LED such as, for example and without limitation, an LZ1-00R400 IR LED manufactured by LedEngin which is available from Mouser Electronics, website address www.mouser.com/) may be used so as not to blind a user. In accordance with one or more such embodiments, the IR source can be pulsed, and the pulses may be synchronized under software control with the frame rate of the camera in the camera module. In accordance with one or more such embodiments, such software may be embedded in the USB bridge processor or in device software, for example, a "video conference app."
In accordance with one or more such embodiments that utilize an IR source, a conventional IR filter or a notch filter is disposed between a lens and an imager of the camera in the camera module. In accordance with one or more such embodiments, the conventional IR filter has a spectral band pass notch in the wavelength region of the IR source, for example and without limitation, a notch of 30 nm centered at a wavelength approximately equal to 810 nm. FIG. 13 illustrates the quantum efficiency of an imager spectral filter having an IR notch near 810 nm overlaid with the imager's spectral sensitivity curves. This notch does not disrupt the normal color balance of the camera and provides a spectral window for the IR source. In accordance with one or more such embodiments, software in the USB bridge processor or device software records an RGB signal in one conventional frame (i.e., a frame captured where there is no IR illumination) and an RGBI signal in another frame (i.e., a frame in which there is IR illumination). Then, the software combines the signals from the RGBI pixels to create the luminance of the image in accordance with any one of a number of methods that are well known to those of ordinary skill in the art and uses the RGB information to provide balanced color channels using a conventional white balance method (for example and without limitation, a method described in an article entitled "Automatic White Balance for Digital Still Cameras," Journal of Information Science and Engineering, pp. 497-509 (2006), which article is incorporated by reference herein in its entirety) and, thereby, creates a well-lit, color corrected, clear rendition of the user's face.
The inventor has recognized that in many circumstances supplying sufficient light (for example and without limitation, 300 lux to a user's face vs. 100 lux available in a typical office setting) in the manner described above substantially reduces image noise, i.e., improves the signal-to-noise ratio ("SNR"), in the resultant well-lit image. In accordance with one or more embodiments, adding illumination might produce, for example and without limitation, as much as a 400% improvement in SNR. A video conference software CODEC disposed, for example and without limitation, in the video conference app, compresses the user's image prior to transmission to the participant's device, and the available communications bandwidth typically drives the software CODEC compression settings. The inventor has discovered that the reduced noise in the well-lit image is advantageous because, post compression, the well-lit, low-noise user's image requires fewer bits. As a result, as the inventor has also recognized, this avoids overwhelming the communication channel between the user's device and the participant's device with poorly compressed frames (due to excessive image noise), which would otherwise result in loss of connection or stutter.
In accordance with one or more embodiments, the illuminator or illuminators can be located outside the housing to provide both key light and fill lights, and the illuminator(s) can be synchronized electronically or optically by picking up a pulsed IR signal via a photo detector, for example and without limitation, disposed in the illuminator, with frame capture software in the device software to capture image frames with and without illumination.
In accordance with one or more embodiments, the illuminator or illuminators can be used in a non-pulsed mode wherein their intensity is controlled by the device software via pulse width modulation or other suitable methods.
In accordance with one or more further embodiments, an illuminator is a "bright" red LED (for example and without limitation, a PU1WLEE, 1 watt, red LED which is available from American Opto Plus Led Corp., website address www.aopled.com) which causes less apparent brightness for a user than white light or other light in the visible spectrum, but which still falls within a suitable range of image sensor sensitivity. In accordance with one or more such further embodiments, the red LED can be pulsed as described above in the case of an illuminator comprised of an IR source. Also, in accordance with one or more such further embodiments, a frame illuminated using the red LED is used to derive luminance by adding the signal collected by the red CCD channel, while a white-balanced image is derived in accordance with any one of a number of methods that are well known to those of ordinary skill in the art (the same as described for the IR case above) from frames where the red LED was not illuminated. When using a red LED, a CCD optical filter, such as the one used with the IR source (as described above), need not be used.
Image Compression
In accordance with one or more embodiments, where the illuminator is utilized as described above, device software, for example, a "video conference app," applies a conventional YUV compression to an image frame obtained from an RGB sensor in the camera to provide an illuminated image frame. In accordance with one or more such embodiments, the video conference app: (a) masks (as described below by utilizing the "near-field" lighting effect) the luminance image frame Y to minimize extraneous background information; (b) masks the UV color channels in the image frame; and then (c) spatially decimates the masked UV color channels (for example and without limitation, by averaging the value of each 4, 8 or 16 adjacent pixels; the inventor has discovered that this is useful since a face has fairly uniform color). The retained information from the combined image frame (i.e., after masking and decimating the YUV channels) comprises a low pixel noise signal because of the well-lit Y channel and the reduced information in the masked areas. The combination results in an image which can be more effectively compressed, thereby resulting in lower bit rate transmission for a more reliable and persistent video call, and in an image in which the local viewer's face is the focus and the surrounding visual clutter is visually suppressed.
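The masking and chroma-decimation steps described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the function names, the NumPy representation of the channels, and the 0..1 mask convention are assumptions.

```python
import numpy as np

def decimate_chroma(channel: np.ndarray, block: int = 4) -> np.ndarray:
    """Average each block x block tile of a chroma channel (assumes
    channel dimensions are divisible by `block`)."""
    h, w = channel.shape
    tiles = channel.reshape(h // block, block, w // block, block)
    # Mean over the two intra-tile axes yields one value per tile.
    return tiles.mean(axis=(1, 3))

def mask_and_decimate(y, u, v, mask, block=4):
    """Apply a 0..1 transparency mask to the Y, U and V channels, then
    spatially decimate the masked chroma channels as described above."""
    y_m = y * mask
    u_m = decimate_chroma(u * mask, block)
    v_m = decimate_chroma(v * mask, block)
    return y_m, u_m, v_m
```

A 4x4 decimation block reduces each chroma channel to 1/16 of its original pixel count, which is why the combined frame compresses to fewer bits.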
Muted Background Image
As will be described below, pulsed "near-field" illumination that is synchronized with video frames, is used to visually mute background and, thereby, to enhance a user's image.
In accordance with one or more embodiments, LEDs (IR LEDs or red LEDs) having an appropriate light cone angle (for example and without limitation, an angle in a range from about 5 degrees to about 10 degrees) illuminate the user's face at a typical distance from the screen (for example and without limitation, for a laptop a typical distance might be about one to about two feet, and for a smartphone, a typical distance might be about one-half to about one foot). The LEDs are mounted on a device display screen bezel or a housing to form "near-field" lighting. The inventor has discovered that such a configuration of LEDs lights up the nearby user's face but casts little light into the background. Hence, in accordance with one or more such embodiments, an LED-unlit image frame is subtracted from an LED-lit image frame, pixel by pixel, and a threshold is applied to the resulting image frame data to provide a transparency map: the areas whose light intensity does not change much represent background, and the remaining areas represent a near-screen object, i.e., a face. The threshold is determined periodically, for example, by comparing the change in luminosity between frames and selecting a value which best separates minimum and maximum change. The pixels that are contiguous and above the threshold comprise the transparent portion of the digital mask, while the others comprise the more opaque portion of the mask. The inventor has discovered that this computation, whether carried out in hardware, device software, or USB processor software, requires minimal computational and power resources, which is an advantage for a battery-operated or battery-powered accessory.
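The frame-subtraction and thresholding step above can be sketched in a few lines. This is a hedged sketch, not the patent's implementation: the function names and the midpoint-based threshold heuristic are assumptions, and the contiguity test on above-threshold pixels is omitted for brevity.

```python
import numpy as np

def pick_threshold(lit: np.ndarray, unlit: np.ndarray) -> float:
    """Choose a value midway between the minimum and maximum per-pixel
    change in luminosity, a simple stand-in for 'selecting a value which
    best separates minimum and maximum change'."""
    diff = lit.astype(float) - unlit.astype(float)
    return (diff.min() + diff.max()) / 2.0

def transparency_map(lit: np.ndarray, unlit: np.ndarray,
                     threshold: float) -> np.ndarray:
    """Subtract the LED-unlit frame from the LED-lit frame, pixel by
    pixel, and threshold the difference: pixels whose luminosity changed
    little are background (0.0); the rest are treated as the near-screen
    face (1.0)."""
    diff = lit.astype(float) - unlit.astype(float)
    return (diff > threshold).astype(float)
```

In practice the threshold would be recomputed only periodically, as the text notes, rather than on every frame pair.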
Next, in accordance with one or more such embodiments, the digital transparency mask is applied to each frame (to provide an effect similar to a lens vignette) by multiplying each pixel's luminosity channel value by the corresponding pixel value in the digital transparency mask, thereby reducing the brightness of visual information in the background. In accordance with one or more such embodiments, the digital transparency mask is further applied to the color channels to de-saturate them by multiplying the value of each pixel in the digital transparency mask and the corresponding pixel value in the color channel. This procedure results in an image that tends to focus the participant's attention on the user's face. In addition, this procedure reduces the information content of a frame, thereby resulting in a more reliable Internet transmission of images.
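The mask-application step can be illustrated as follows; again this is a hedged sketch with assumed function names, not the patent's implementation. Each pixel of the luminosity channel, and of each color channel to be de-saturated, is multiplied by the corresponding mask value, so background pixels (mask near 0) are darkened and de-saturated while face pixels (mask near 1) pass through unchanged.

```python
import numpy as np

def apply_vignette(y: np.ndarray, u: np.ndarray, v: np.ndarray,
                   mask: np.ndarray):
    """Multiply the luminosity channel, and each color channel, by the
    digital transparency mask (values in 0..1), darkening and
    de-saturating the background as described above."""
    return y * mask, u * mask, v * mask
```

In a production pipeline the binary mask would typically be blurred first so the vignette falls off smoothly rather than at a hard edge; that refinement is an assumption beyond the text.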
In addition, and in accordance with one or more further embodiments, multiple images are obtained from: (a) a device webcam, if any; (b) the camera in the camera module, with or without off axis illumination; and (c) "near-field" illumination, as described above. In accordance with one or more such further embodiments, device software, for example, a "video conference app," analyzes the multiple images for parallax to ascertain the user's head position and an orientation of the user's head in three dimensions using any one of a number of methods that are well known to those of ordinary skill in the art.
Sound Recording
In accordance with one or more embodiments, the housing contains one or more microphones (refer to FIG. 1). Device software, for example, a "video conference app," analyzes the microphone output using any one of a number of methods that are well known to those of ordinary skill in the art to create a directional microphone capable of HD sound. In accordance with one or more such embodiments, device software, for example, a "video conference app," analyzes the 3D head position and orientation information described above using any one of a number of methods that are well known to those of ordinary skill in the art to further aim the directional microphone to preferentially record the user's voice and to reduce recording and transmission of background noise.
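One of the well-known methods alluded to above for creating a directional microphone from multiple microphone outputs is delay-and-sum beamforming. The following is an illustrative sketch only (the function name, the use of integer-sample delays, and the NumPy representation are assumptions, not the patent's method): signals are delayed so that sound arriving from the look direction adds coherently, while off-axis sound partially cancels.

```python
import numpy as np

def delay_and_sum(mic_signals, delays):
    """Steer a virtual directional microphone: delay each microphone's
    signal by the given number of samples so a source in the look
    direction aligns across channels, then average the aligned signals."""
    # Trim all channels to the length remaining after the largest delay.
    n = min(len(s) - d for s, d in zip(mic_signals, delays))
    aligned = [np.asarray(s, dtype=float)[d:d + n]
               for s, d in zip(mic_signals, delays)]
    return np.mean(aligned, axis=0)
```

The delays would be chosen from the microphone geometry and the 3D head position described above, so the beam tracks the user's mouth.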
Sound Projector
In accordance with one or more embodiments, a gimbaled parabolic sound projector is mounted on or behind the device display screen. Device software, for example, a "video conference app," analyzes the 3D head position and orientation information described above using any one of a number of methods that are well known to those of ordinary skill in the art to cause the gimbaled parabolic sound projector to project sound to the user. The inventor has discovered that such embodiments may avoid the use of headphones, and can make the transmitted image appear more natural. In addition, the inventor has discovered that such embodiments may be useful in noisy environments such as a cubicle or a public space.
Summary of the Figures
In light of the description above, the following summary description of the figures will enable one of ordinary skill in the art to make and use the invention.
FIG. 1 shows orthogonal views of an add-on embodiment that includes a camera module disposed at the tip of a transparent arm, an illuminator, a microphone and a magnetic coupler. As shown in FIG. 1, housing 101 includes magnets 104, illuminator 106 and dual microphones 101. As further shown in FIG. 1, camera module 103 hangs down from housing 101 via camera module arm 102 which swivels on axis 105.
FIG. 2 shows a front view of another add-on embodiment that includes a camera module disposed at the tip of an ultra-thin, displaceable arm, a USB cable and a magnetic coupler. As shown in FIG. 2, housing 207 includes dual microphones 202 and illuminator 201. As further shown in FIG. 2, housing 207 supports camera module arm 204 and camera module 203. In accordance with one or more such embodiments, camera module arm 204 can swivel so that camera module 203 may be moved between a deployed or "down" position and a non-deployed or "up" position 205 abutting housing 207. In accordance with one or more such embodiments, camera module 203 may be held in the non-deployed position, for example and without limitation, by a magnet. As further shown in FIG. 2, the add-on embodiment communicates with a device via USB cable 206.
FIG. 3 shows orthogonal views of an add-on embodiment that includes a camera module disposed at the tip of an ultra-thin camera module arm, an illuminator, a USB cable, and a carrying case capable of holding the camera module, the camera module arm, the illuminator, and the USB cable. As shown in FIG. 3, a housing includes illuminator 301, thin camera module arm 303, and camera module 302, which camera module 302 is supported by camera module arm 303. As further shown in FIG. 3, the add-on embodiment is powered by, and communicates with, the device via USB cable 307. As further shown in FIG. 3, carrying case 308 serves as a holder for camera module 302, camera module arm 303, illuminator 301, and USB cable 307.
FIG. 4 shows an add-on embodiment, like that shown in FIG. 2, in use where the add-on embodiment is mounted on a laptop computer for use in a video call or video conference. As shown in FIG. 4, the add-on embodiment includes a housing that supports illuminator 401, thin camera module arm 403, and camera module 404, which camera module 404 is supported by camera module arm 403. As further shown in FIG. 4, the add-on embodiment is affixed to laptop 402, and laptop 402 displays an image of participant 405 and laptop user 408. As further shown in FIG. 4, camera module arm 403 has been rotated so that camera module 404 is disposed at non-deployed position 407. Lastly, FIG. 4 shows the add-on embodiment detached from laptop 402 and disposed in carrying case 406.

FIG. 5 shows orthogonal views of an add-on embodiment that includes a camera module disposed at the tip of an ultra-thin, displaceable camera module arm, and a data port coupler. As shown in FIG. 5, the add-on embodiment includes a housing that supports illuminator 501, thin camera module arm 502, camera module 503, which camera module 503 is supported by camera module arm 502, and USB connector 504. In accordance with one or more such embodiments, the add-on embodiment is affixed to a device by connecting USB connector 504 to a device USB port.
FIG. 6 shows orthogonal views of an embodiment that includes a camera module and an illuminator mounted on a fixture housing, a retractable, and a flexible camera module arm attached to the fixture housing, wherein the fixture housing is capable of storing the camera module arm. As shown in FIG. 6, fixture housing 608 includes dual microphones 603 and 605, dual illuminators 601 and 602, and spring arm 604. A retractable, flexible camera module arm (not shown) supports camera module 607, and as shown in FIG. 6, the retractable, flexible camera module arm is stored in housing 608 in its non-deployed position. As further shown in FIG. 6, spring arm 604 is disposed in an open position. When housing 608 is placed on a device display screen bezel and spring arm 604 is released from the open position, spring arm 604 applies force that pulls the back of the housing 609 against the bezel for secure positioning.
FIG. 7 shows orthogonal views of a built-in embodiment that includes a camera module disposed at the tip of an ultra-thin camera module arm, the camera module arm being built into a tablet device and being capable of rotating onto a display screen of the tablet, and an illuminator built into a display screen bezel of the tablet. As shown in FIG. 7, tablet 704 includes thin camera module arm 702 that supports camera module 703. As further shown in FIG. 7, camera module arm 702 is connected to finger activated, rotating pivot 701. As further shown in FIG. 7, in its stored or non-deployed position, camera module 703 fits into recess 705 in the bezel. When camera module 703 is in the stored or non-deployed position, camera module 703, camera module arm 702, and rotating pivot 701 are nearly flush with the bezel.
FIG. 8 shows two built-in embodiments in use during a video call or video conference. As shown in FIG. 8, in one built-in embodiment: (a) a tablet or phone includes camera module 806 attached to rotating, rigid, transparent, camera module arm 804, shown in a deployed position; and (b) a tablet or phone includes camera module 807 attached to rotating, rigid, transparent, camera module arm 808, shown in a non-deployed position where camera module arm 808 (and hence camera module 807) has been rotated from the deployed position, and camera module 807 has been stored in a recess area in a bezel. As further shown in FIG. 8, in a second built-in embodiment: (a) a tablet or phone includes camera module 805 attached to retractable camera module arm 801, shown in a deployed position; and (b) a tablet or phone includes camera module 802 attached to a retractable camera module arm, shown in a non-deployed position where the camera module arm is not seen as it has been retracted into the device.
FIG. 9 shows a built-in embodiment that includes a miniature camera module disposed on a retractable camera module arm that moves behind a device, wherein the retractable camera module arm is fabricated using nitinol and a magnet wire harness. In FIG. 9, camera module 902 is shown attached to phone 901 in a retracted or non-deployed position and is also shown attached to flexible camera module arm 904 in an expanded or deployed position. As shown in FIG. 9, camera module arm 906, which is attached to camera module 907, is guided through tube 905.
FIG. 10 shows a built-in embodiment that includes a miniature camera module attached to an ultra-thin, retractable, camera module arm that moves onto a top bezel of a mobile device. As shown in FIG. 10, in a retracted or non-deployed position, camera module 1001 is attached to thin, camera module arm 1003, which camera module arm 1003 is oriented so that its thin profile faces toward the user and which camera module arm 1003 is retracted into a bezel of phone or tablet 1004. As further shown in FIG. 10, as camera module arm 1003 is retracted, it is guided through guides 1002. As further shown in FIG. 10, in an expanded or deployed position, camera module 1001 is positioned on the screen.
FIG. 11 shows an MIPI mobile device interface flex PCB ("printed circuit board") with shielding used to fabricate one or more embodiments: (a) disposed in a flat configuration (i.e., with a BGA sensor module at the bottom) before being folded and attached to nitinol wires; and (b) in a folded position where it is connected to a camera module. As shown in FIG. 11, flex circuit board 1100 (shown flat) is fabricated with ball grid array 1102 to connect to camera module 1105, with MIPI interface traces 1101 and 1109 that terminate in connector traces 1103 which connect camera module 1105 to a CPU. As assembled, camera module 1105, its associated capacitors and electronics 1106, and flex circuit board arms 1101 and 1109 are folded over (as shown at 1107) to create a thin viewing profile and are laminated to nitinol 1104 with glue 1108.
FIG. 13 illustrates the quantum efficiency of an imager spectral filter having an IR notch near 810 nm, overlaid with the imager's spectral sensitivity curves. As shown in FIG. 13, the IR filter has transparent window 1302 in the visible wavelengths as well as narrow window 1301 in the IR.
FIG. 14 is a graph that indicates gaze angles useful in maintaining eye contact. As shown in FIG. 14, the eyes of viewer 1404 are directed to one of multiple LEDs 1401 arranged in an array surrounding camera module 1403, which camera module 1403 records the viewer's gaze. The graph shows that multiple observers of the viewer's gaze determine that eye contact is established generally when the viewer is looking at the camera or up to 7 degrees below it.
Participant Gaze
In accordance with one or more embodiments where a user shares his/her screen with a participant, device software, for example, a "video conference app," analyzes a participant's image using any one of a number of methods that are well known to those of ordinary skill in the art to determine gaze. Since the video conference app does not know where the participant is displaying the user's screen image on the participant's screen, the video conference app: (a) includes an attention blink on each of the two diagonal vertices of a transmitted user's screen image; and (b) then uses this information to map the participant's eye gaze between these two diagonal vertices to the user's screen. Next, the video conference app causes the participant's gaze to be displayed on the device display screen, for example and without limitation, as a faint red circle with light traces so the user can be aware of where (or whether) the remote participant's attention is focused during the presentation.
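The mapping of the participant's gaze between the two diagonal attention-blink vertices onto the user's screen amounts to a linear interpolation. The sketch below is illustrative only; the function name and the coordinate conventions (top-left and bottom-right vertices, pixel units) are assumptions, not details from the patent.

```python
def map_gaze_to_screen(gaze, v0, v1, screen_w, screen_h):
    """Map a gaze point located between the two diagonal attention-blink
    vertices v0 (top-left) and v1 (bottom-right) of the transmitted
    screen image onto the user's screen coordinates.

    gaze, v0, v1 are (x, y) pairs in the participant-side frame."""
    # Normalize the gaze position within the rectangle spanned by v0..v1.
    fx = (gaze[0] - v0[0]) / (v1[0] - v0[0])
    fy = (gaze[1] - v0[1]) / (v1[1] - v0[1])
    # Scale the normalized position to the user's screen.
    return (fx * screen_w, fy * screen_h)
```

The returned point is where the faint red circle would be drawn on the user's display.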
Unless otherwise defined, all technical and scientific terms used herein have the same meaning as is commonly understood by one of ordinary skill in the art to which these inventions belong. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and not intended to be limiting, and methods and materials similar or equivalent to those described herein can be used to make and use embodiments of the present invention. Implementation of the methods and systems of various embodiments of the present invention involves performing or completing selected tasks or steps manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the methods and systems, several selected steps could be implemented by hardware or by software. For example, as hardware, selected steps could be implemented as a chip or a circuit. As software, selected steps could be implemented as a plurality of software instructions being executed by a processing unit using any suitable operating system. In any case, selected steps of the methods and systems could be described as being performed by a data processor, such as a computing platform for executing a plurality of instructions.
Embodiments of the present invention described above are exemplary. As such, many changes and modifications may be made to the disclosure set forth above while remaining within the scope of the invention. In addition, materials, methods, and mechanisms suitable for fabricating embodiments of the present invention have been described above by providing specific, non-limiting examples and/or by relying on the knowledge of one of ordinary skill in the art. Materials, methods, and mechanisms suitable for fabricating various embodiments or portions of various embodiments of the present invention described above have not been repeated, for sake of brevity, wherever it should be well understood by those of ordinary skill in the art that the various embodiments or portions of the various embodiments could be fabricated utilizing the same or similar previously described materials, methods or mechanisms.
The scope of the invention should be determined with reference to the appended claims along with their full scope of equivalents.

Claims

WHAT IS CLAIMED IS:
1. A system for videoconferencing comprising:
(a) an assembly attachable to, or integrated with, a computer, said assembly including a camera positionable within a display area of said computer; and
(b) an image positioning software executable by said computer and being capable of positioning an image of a remote individual within a portion of said display area occupied by said camera such that an image captured by said camera of a local individual aligns eye contact between said remote individual and said local individual.
2. The system of claim 1, further comprising (c) head tracking software for tracking a position and/or angle of a head of said local individual with respect to said camera, wherein said image positioning software is also capable of positioning said image of said remote individual within a portion of said display area according to a position and/or angle of said head of said local individual.
3. The system of claim 1, wherein said camera is connected to said assembly via a rotating arm.
4. The system of claim 1, wherein said camera is attached to said assembly via a substantially transparent arm.
5. The system of claim 1, wherein said assembly attaches to a top portion of a display of said computer and said camera hangs down into said display area.
6. The system of claim 1, wherein said assembly attaches to a side portion of a display of said computer and said camera protrudes into said display area.
7. The system of claim 1, wherein said assembly further includes a light source.
8. The system of claim 7, wherein said light source includes infrared light.
9. The system of claim 8, further comprising image analysis software for separating said local individual from a background in an image captured by said camera.
10. The system of claim 1, wherein said assembly further comprises a microphone and sound processing software for processing sound captured by said microphone and a built in microphone of said computer.
11. The system of claim 2, wherein said head tracking software processes images captured by said camera and optionally by a second camera.
PCT/IB2013/052240 2012-03-21 2013-03-21 Apparatus and system for imaging in video calls WO2013140359A2 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201261613900P 2012-03-21 2012-03-21
US61/613,900 2012-03-21
US201261679561P 2012-08-03 2012-08-03
US61/679,561 2012-08-03
US201261718045P 2012-10-24 2012-10-24
US61/718,045 2012-10-24

Publications (2)

Publication Number Publication Date
WO2013140359A2 true WO2013140359A2 (en) 2013-09-26
WO2013140359A3 WO2013140359A3 (en) 2015-08-13

Family

ID=49223412

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2013/052240 WO2013140359A2 (en) 2012-03-21 2013-03-21 Apparatus and system for imaging in video calls

Country Status (1)

Country Link
WO (1) WO2013140359A2 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102021108848A1 (en) 2021-04-09 2022-10-13 Klaus Rempe Computer screen device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO1994030015A1 (en) * 1993-06-03 1994-12-22 Target Technologies, Inc. Data and television network for digital computer workstations
WO2007061678A2 (en) * 2005-11-18 2007-05-31 Roy Sandberg Screen mounted video teleconferencing camera apparatus
US8345082B2 (en) * 2008-10-08 2013-01-01 Cisco Technology, Inc. System and associated methodology for multi-layered site video conferencing
US8659637B2 (en) * 2009-03-09 2014-02-25 Cisco Technology, Inc. System and method for providing three dimensional video conferencing in a network environment
WO2011146259A2 (en) * 2010-05-20 2011-11-24 Irobot Corporation Mobile human interface robot



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13763978

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct app. not ent. europ. phase

Ref document number: 13763978

Country of ref document: EP

Kind code of ref document: A2