US20140139668A1 - Projection capture system and method - Google Patents

Projection capture system and method

Info

Publication number
US20140139668A1
Authority
US
United States
Prior art keywords
projector
camera
work surface
capture
mirror
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/234,030
Other languages
English (en)
Inventor
David Bradley Short
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett Packard Development Co LP filed Critical Hewlett Packard Development Co LP
Publication of US20140139668A1
Assigned to HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P. Assignment of assignors interest (see document for details). Assignors: SHORT, DAVID BRADLEY
Current legal status: Abandoned

Classifications

    • G03B 21/00 Projectors or projection-type viewers; Accessories therefor
    • G03B 15/00 Special procedures for taking photographs; Apparatus therefor
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G03B 17/54 Details of cameras or camera bodies; Accessories therefor adapted for combination with a projector
    • G03B 21/14 Projectors or projection-type viewers; Details
    • G03B 21/28 Reflectors in projection beam
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G06F 3/03545 Pens or stylus
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by opto-electronic transducing means
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/51 Housings
    • H04N 23/56 Cameras or camera modules provided with illuminating means
    • H04N 9/3111 Projection devices for colour picture display using two-dimensional electronic spatial light modulators, displaying the colours sequentially, e.g. by using sequentially activated light sources
    • G03B 21/147 Optical correction of image distortions, e.g. keystone
    • G03B 21/2033 LED or laser light sources
    • G03B 27/323 Copying cameras
    • G03B 33/06 Colour photography, other than mere exposure or projection of a colour film, by additive-colour projection apparatus
    • G03B 5/02 Lateral adjustment of lens

Definitions

  • a new projection capture system has been developed in an effort to improve the digital capture of images of documents and other objects, and to improve the interactive user experience of working with real objects and projected objects on a physical work surface.
  • FIGS. 1A and 1B are perspective, exterior views illustrating a new projection capture system, according to one example of the invention.
  • In FIG. 1A, the image of a two dimensional object (a hardcopy photograph) has been captured and displayed.
  • In FIG. 1B, the image of a three dimensional object (a cube) has been captured and displayed.
  • FIG. 2 is a perspective, interior view illustrating a projection capture system, such as the system of FIG. 1 , according to one example of the invention.
  • FIG. 3 is a block diagram of the projection capture system shown in FIG. 2 .
  • FIG. 4 is a block diagram illustrating one example of a user input device in the system shown in FIGS. 2 and 3 .
  • FIGS. 5 and 6 are side and front elevation views, respectively, illustrating the positioning of the camera and the projector in the projection capture system shown in FIGS. 2 and 3 .
  • FIGS. 7-11 are a progression of side elevation views showing various positions for the projector and the camera in a projection capture system, illustrating some of the problems associated with moving the glare spot out of the camera capture area.
  • FIGS. 12 and 13 illustrate one example of the camera in the projection capture system shown in FIGS. 2 and 3 .
  • FIG. 14 illustrates one example of the projector in the projection capture system shown in FIGS. 2 and 3 .
  • FIGS. 15 and 16 illustrate examples of the user input device in the projection capture system shown in FIGS. 2 and 3 .
  • a digital camera, a projector, and a mirror are housed together as a single unit in which, when the unit is deployed for use with a work surface, the camera is positioned above the projector, the projector is positioned below the camera, and the mirror is positioned above the projector and configured to reflect light from the projector into the camera capture area.
  • the projector provides a light source for camera image capture, and the camera, the projector, and the mirror are positioned with respect to one another such that the glare spot from the projector light lies outside the camera capture area.
  • FIGS. 1A and 1B are perspective, exterior views illustrating one example of a new projection capture system 10 and an interactive workspace 12 associated with system 10 .
  • FIG. 2 is a perspective view illustrating one example of a projection capture system 10 with exterior housing 13 removed.
  • FIG. 3 is a block diagram of system 10 shown in FIG. 2 .
  • projection capture system 10 includes a digital camera 14 , a projector 16 , and a controller 18 .
  • Camera 14 and projector 16 are operatively connected to controller 18 for camera 14 capturing an image of an object 20 in workspace 12 and projector 16 projecting the object image 22 into workspace 12 and, in some examples, for camera 14 capturing an image of the projected object image 22 .
  • the lower part of housing 13 includes a transparent window 21 over projector 16 (and infrared camera 30 ).
  • a two dimensional object 20 (a hardcopy photograph) placed onto a work surface 24 in workspace 12 has been photographed by camera 14 ( FIG. 2 ), object 20 removed to the side of workspace 12 , and object image 22 projected onto a work surface 24 where it can be photographed by camera 14 ( FIG. 2 ) and/or otherwise manipulated by a user.
  • a three dimensional object 20 (a cube) placed onto work surface 24 has been photographed by camera 14 ( FIG. 2 ), object 20 removed to the side of workspace 12 , and object image 22 projected into workspace 12 where it can be photographed by camera 14 and/or otherwise manipulated by a user.
  • System 10 also includes a user input device 26 that allows the user to interact with system 10 .
  • a user may interact with object 20 and/or object image 22 in workspace 12 through input device 26 ; object image 22 may be transmitted to other workspaces 12 on remote systems 10 (not shown) for collaborative user interaction; and, if desired, object image 22 may be photographed by camera 14 and re-projected into local and/or remote workspaces 12 for further user interaction.
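
The workflow sketched by these excerpts (photograph an object, re-project its image, share it, and optionally re-capture the projection) can be summarized in pseudocode. The sketch below is only an illustration; the camera, projector, and remote-workspace interfaces are hypothetical names, not anything defined in the patent.

```python
# Illustrative sketch of the projection-capture workflow (hypothetical APIs).
def scan_and_share(camera, projector, remote_workspaces):
    """Photograph an object, re-project its image, and share it for collaboration."""
    object_image = camera.capture_image()        # camera 14 photographs object 20
    projector.project(object_image)              # projector 16 displays object image 22
    for workspace in remote_workspaces:          # optional sharing with remote systems 10
        workspace.send(object_image)
    # The projected image can itself be re-captured for further manipulation.
    return camera.capture_image()
```
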
  • work surface 24 is part of the desktop or other underlying support structure 23 .
  • work surface 24 is on a portable mat 25 that may include touch sensitive areas.
  • In FIG. 1A , a user control panel 27 is projected onto work surface 24 , while in FIG. 1B control panel 27 may be embedded in a touch sensitive area of mat 25 .
  • an A4, letter or other standard size document placement area 29 may be projected onto work surface 24 in FIG. 1A or printed on a mat 25 in FIG. 1B .
  • other configurations for work surface 24 are possible.
  • control panel 27 and document placement area 29 may be projected onto the blank mat 25 in FIG. 1B just as they are projected onto the desktop 23 in FIG. 1A .
  • user input device 26 includes an infrared digital stylus 28 and an infrared camera 30 for detecting stylus 28 in workspace 12 .
  • a digital stylus has the advantage of allowing input in three dimensions, including along work surface 24 , without a sensing pad or other special surface.
  • system 10 can be used on a greater variety of work surfaces 24 .
  • the usually horizontal orientation of work surface 24 makes it useful for many common tasks.
  • the ability to use traditional writing instruments on work surface 24 is advantageous over vertical or mobile computing interfaces. Projecting an interactive display onto a working desktop mixes computing tasks with the standard objects that may exist on a real desktop, so that physical objects can coexist with projected objects.
  • a three-dimensional pad-free digital stylus enables annotation on top of or next to physical objects without having a sensing pad get in the way of using traditional instruments on work surface 24 .
  • projector 16 serves as the light source for camera 14 .
  • the light path from projector 16 through workspace 12 to work surface 24 should be positioned with respect to camera 14 to enable user display interaction with minimal shadow occlusion while avoiding specular glare off work surface 24 and objects in workspace 12 that would otherwise blind camera 14 .
  • the system configuration described below avoids the glare induced artifacts that would result from a conventional camera lighting geometry while still maintaining a sufficiently steep incident angle for the projector light path desired for proper illumination and projection of two and three dimensional objects in workspace 12 .
  • Ideally, projector 16 would be mounted directly over workspace 12 at an infinite height above work surface 24 to ensure parallel light rays. This configuration, of course, is not realistic. Even if projector 16 were moved down to a realistic height above work surface 24 (but still pointing straight down), the projector's light would be reflected off glossy and semi-glossy surfaces and objects straight back into camera 14 , creating a blinding specular glare. Thus, the glare spot must be moved out of camera capture area 32 . (Specular glare refers to glare from specular reflection, in which the angle of incidence of the incident light ray and the angle of reflection of the reflected light ray are equal and the incident, reflected, and normal directions are coplanar.)
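
As a rough illustration of the specular-reflection geometry just described (this calculation and its coordinates are illustrative, not from the patent), the glare point on a flat, glossy work surface can be found by intersecting the line from the camera to the projector's mirror image below the surface with the surface plane, then checking whether it falls inside the capture area:

```python
# Where does the specular glare spot from a point-like projector land on a flat
# work surface at z = 0? For a mirror-like surface, the glare point is where the
# line from the camera to the projector's mirror image (reflected below the
# surface) crosses the surface plane.

def glare_spot(projector, camera):
    """Return (x, y) of the specular glare point on the work surface z = 0."""
    px, py, pz = projector
    cx, cy, cz = camera
    t = cz / (cz + pz)                 # where the camera-to-mirror-image line hits z = 0
    return (cx + t * (px - cx), cy + t * (py - cy))

def inside(area, point):
    """area = (xmin, ymin, xmax, ymax) of the camera capture area on the surface."""
    x, y = point
    xmin, ymin, xmax, ymax = area
    return xmin <= x <= xmax and ymin <= y <= ymax

# Example (all values in mm, made up): a near-overhead projector puts the glare
# spot inside the capture area, while a far-off simulated projector position
# pushes it outside.
capture_area = (-200.0, -150.0, 200.0, 150.0)
print(inside(capture_area, glare_spot((50.0, 0.0, 600.0), (0.0, 100.0, 500.0))))     # True: glare blinds the camera
print(inside(capture_area, glare_spot((50.0, -700.0, 900.0), (0.0, 100.0, 500.0))))  # False: glare spot moved out
```
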
  • camera 14 and projector 16 are shifted away from the center of capture and display areas 32 , 34 and projector 16 is positioned low, near base 36 , as shown in FIGS. 5 and 6 , and a fold mirror 38 is introduced into the projector's light path to simulate a projector position high above work surface 24 .
  • the simulated position of projector 16 and the corresponding light path above mirror 38 are shown in phantom lines in FIGS. 5 and 6 .
  • To understand the configuration of FIGS. 5 and 6 , it is helpful to consider the problems associated with other possible configurations for moving the glare spot out of camera capture area 32 .
  • In FIG. 7 , camera 14 is positioned at the center of capture area 32 with an overhead projector 16 slightly off center so that camera 14 does not block the projector light path.
  • the specular glare spot 39 (at the intersection of incident light ray 41 and reflected light ray 43 ) falls within capture area 32 and, thus, will blind camera 14 to some objects and images in capture area 32 .
  • in the configuration of FIG. 7 , system 10 would also be top heavy and, thus, not desirable for a commercial product implementation. If projector 16 is positioned to the side, far enough to move glare spot 39 out of camera capture area 32 as shown in FIG. 8 , the corresponding projector lens offset required would not be feasible. Also, any product implementation for the configuration of system 10 shown in FIG. 8 would be undesirably broad and top heavy.
  • Moving camera 14 off center over capture area 32 brings projector 16 in to make the system less broad, as shown in FIG. 9 , but the projector lens offset is still too great and the product is still top heavy.
  • In FIG. 10 , projector 16 is raised to a height so that it may be brought in close enough for an acceptable lens offset but, of course, the product is now too tall and top heavy.
  • the most desirable solution is a “folded” light path for projector 16 , shown in FIGS. 5 and 11 , in which the “high and tight” configuration of FIG. 10 is simulated using fold mirror 38 .
  • camera 14 is placed in front of the mirror 38 over workspace 12 so that it does not block the projector's light path.
  • Camera 14 is positioned off center in the Y direction ( FIG. 5 ) as part of the overall geometry to keep glare spot 39 out of capture area 32 with an acceptable offset for both camera 14 and projector 16 .
  • Projector 16 is focused on mirror 38 so that light from projector 16 is reflected off mirror 38 into workspace 12 .
  • glare spot 39 is kept out of capture area 32 with an acceptable projector offset and system 10 is sufficiently narrow, short and stable (not top heavy) to support a commercially attractive product implementation.
  • controller 18 may include a processor 42 , a memory 44 , and an input/output 46 housed together in device 40 .
  • the system programming to control and coordinate the functions of camera 14 and projector 16 may reside substantially on controller memory 44 for execution by processor 42 , thus enabling a standalone device 40 and reducing the need for special programming of camera 14 and projector 16 .
  • in other examples, controller 18 is formed in whole or in part using a computer or server remote from camera 14 and projector 16 .
  • a compact standalone appliance such as device 40 shown in FIGS. 1A , 1B and 2 offers the user full functionality in an integrated, compact mobile device.
  • camera 14 is positioned in front of mirror 38 above workspace 12 at a location offset from the center of capture area 32 . As noted above, this offset position for camera 14 helps avoid specular glare when photographing objects in workspace 12 without blocking the light path of projector 16 .
  • while camera 14 represents generally any suitable digital camera for selectively capturing still and video images in workspace 12 , it is expected that a high resolution digital camera will be used in most applications for system 10 .
  • a “high resolution” digital camera as used in this document means a camera having a sensor array of at least 12 megapixels. Lower resolution cameras may be acceptable for some basic scan and copy functions, but resolutions below 12 megapixels currently are not adequate to generate a digital image sufficiently detailed for a full range of manipulative and collaborative functions.
  • a high resolution sensor paired with the high performance digital signal processing (DSP) chips available in many digital cameras affords sufficiently fast image processing times, for example a click-to-preview time of less than a second, to deliver acceptable performance for most system 10 applications.
  • camera sensor 50 is oriented in a plane parallel to the plane of work surface 24 and light is focused on sensor 50 through a shift lens 52 .
  • This configuration for sensor 50 and lens 52 may be used to correct keystone distortion optically, without digital keystone correction in the object image.
  • the field of view of camera 14 defines a three dimensional capture space 51 in work space 12 within which camera 14 can effectively capture images.
  • Capture space 51 is bounded in the X and Y dimensions by camera capture area 32 on work surface 24 .
  • Lens 52 may be optimized for a fixed distance, fixed focus, and fixed zoom corresponding to capture space 51 .
  • projector 16 is positioned near base 36 outside projector display area 34 and focused on mirror 38 so that light from projector 16 is reflected off mirror 38 into workspace 12 .
  • Projector 16 and mirror 38 define a three dimensional display space 53 in workspace 12 within which projector 16 can effectively display images. Projector display space 53 overlaps camera capture space 51 ( FIG. 12 ) and is bounded in the X and Y dimensions by display area 34 on work surface 24 .
  • while projector 16 represents generally any suitable light projector, the compact size and power efficiency of an LED or laser based DLP (digital light processing) projector will be desirable for most applications of system 10 .
  • Projector 16 may also employ a shift lens to allow for complete optical keystone correction in the projected image.
  • the use of mirror 38 increases the length of the projector's effective light path, mimicking an overhead placement of projector 16 , while still allowing a commercially reasonable height for an integrated, standalone device 40 .
  • because projector 16 acts as the light source for camera 14 for still and video capture, the projector light must be bright enough to swamp out any ambient light that might cause defects from specular glare. It has been determined that a projector light of 200 lumens or greater will be sufficiently bright to swamp out ambient light for the typical desktop application of system 10 and device 40 .
  • projector 16 shines white light into workspace 12 to illuminate object(s) 20 .
  • the time sequencing of the red, green, and blue LEDs that make up the white light is synchronized with the video frame rate of camera 14 .
  • the refresh period of projector 16 and each LED sub-frame refresh period should divide evenly into the camera's exposure time for each captured frame to avoid “rainbow banding” and other unwanted effects in the video image.
  • the camera's video frame rate should be synchronized with the frequency of any ambient fluorescent lighting that typically flickers at twice the AC line frequency (e.g., 120 Hz for a 60 Hz AC power line).
  • An ambient light sensor can be used to sense the ambient light frequency and adjust the video frame rate for camera 14 accordingly.
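
The timing relationships described in the preceding excerpts can be expressed as a simple divisibility check. The sketch below is an assumption-laden illustration, not the patent's method: it only tests that each camera exposure spans a whole number of ambient-flicker periods and projector refresh periods.

```python
# Illustrative timing check: each camera exposure should hold a whole number of
# ambient-flicker periods (flicker at 2x the AC line frequency) and a whole
# number of projector refresh / LED sub-frame periods.

def exposure_is_flicker_free(exposure_s, ac_line_hz, projector_refresh_hz, tol=1e-6):
    flicker_period = 1.0 / (2 * ac_line_hz)       # e.g. 120 Hz flicker for a 60 Hz line
    refresh_period = 1.0 / projector_refresh_hz

    def holds_whole_number(period):
        ratio = exposure_s / period
        return abs(ratio - round(ratio)) < tol

    return holds_whole_number(flicker_period) and holds_whole_number(refresh_period)

# Example: a 1/30 s exposure works with a 60 Hz line and a 120 Hz projector refresh,
# while a 1/50 s exposure does not (2.4 flicker periods per exposure).
print(exposure_is_flicker_free(1.0 / 30, ac_line_hz=60, projector_refresh_hz=120))  # True
print(exposure_is_flicker_free(1.0 / 50, ac_line_hz=60, projector_refresh_hz=120))  # False
```
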
  • the projector's red, green, and blue LEDs can be turned on simultaneously for the camera flash to increase light brightness in workspace 12 , helping swamp out ambient light and allowing faster shutter speeds and/or smaller apertures to reduce noise in the image.
  • the example configuration for system 10 integrated into a standalone device 40 shown in the figures and described above achieves a desirable balance among product size, performance, usability, and cost.
  • the folded light path for projector 16 reduces the height of device 40 while maintaining an effective placement of the projector high above workspace 12 to prevent specular glare in the capture area of camera 14 .
  • the projector's light path shines on a horizontal work surface 24 at a steep angle enabling 3D object image capture. This combination of a longer light path and steep angle minimizes the light fall off across the capture area to maximize the light uniformity for camera flash.
  • the folded light path enables the placement of projector 16 near base 36 for product stability.
  • Suitable input devices and techniques for use in system 10 include, for example, finger touch, touch gestures, stylus, in-air gestures, voice recognition, head tracking and eye tracking.
  • a touch pad can be used to enable a multi-touch interface for navigating a graphical user interface or performing intuitive gesture actions like push, flick, swipe, scroll, pinch-to-zoom, and two-finger-rotate.
  • Depth cameras using structured light, time-of-flight, disturbed light pattern, or stereoscopic vision might also be used to enable in-air gesturing or limited touch and touch gesture detection without a touch pad.
  • a touch-free digital stylus is particularly well suited as a user input 26 for system 10 .
  • user input 26 includes an infrared digital stylus 28 and an infrared camera 30 for detecting stylus 28 in workspace 12 .
  • a touch-free digital stylus has the advantage of allowing input in three dimensions, including along work surface 24 , without a sensing pad or other special surface.
  • input device 26 includes infrared stylus 28 , infrared camera 30 and a stylus charging dock 54 .
  • Stylus 28 includes an infrared light 56 , a touch sensitive nib switch 58 to turn on and off light 56 automatically based on touch, and a manual on/off switch 60 to manually turn on and off light 56 .
  • Nib switch 58 and manual switch 60 are shown in the block diagram of FIG. 4 .
  • Light 56 may be positioned, for example, in the tip of stylus 28 as shown in FIG. 15 to help maintain a clear line-of-sight between camera 30 and light 56 .
  • Light 56 may also emit visible light to help the user determine if the light is on or off.
  • Nib switch 58 may be touch sensitive to about 2 grams of force, for example, to simulate a traditional writing instrument.
  • when the nib of stylus 28 touches work surface 24 or another object, nib switch 58 detects the contact and turns on light 56 .
  • Light 56 turning on is detected by camera 30 which signals a touch contact event (similar to a mouse button click or a finger touch on a touch pad).
  • Camera 30 continues to signal contact, tracking any movement of stylus 28 , as long as light 56 stays on.
  • the user can slide stylus 28 around on any surface like a pen to trace the surface or to activate control functions.
  • when the nib is lifted off the surface, light 56 is switched off and camera 30 signals no contact.
  • Manual light switch 60 may be used to signal a non-touching event. For example, when working in a three dimensional workspace 12 the user may wish to modify, alter, or otherwise manipulate a projected image above work surface 24 by manually signaling a “virtual” contact event.
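
A minimal sketch of how a controller might turn the infrared camera's view of light 56 into touch-down, move, and touch-up events; the class and method names are hypothetical, not from the patent.

```python
# Hedged sketch (hypothetical, not the patent's implementation) of interpreting
# detection of stylus light 56 as touch events, similar to mouse down/move/up.

class StylusTracker:
    def __init__(self):
        self.touching = False

    def update(self, light_detected, position):
        """Called once per infrared camera frame with the detected nib position (or None)."""
        events = []
        if light_detected and not self.touching:
            self.touching = True
            events.append(("touch_down", position))   # nib switch 58 turned light 56 on
        elif light_detected and self.touching:
            events.append(("touch_move", position))   # camera 30 keeps tracking stylus 28
        elif not light_detected and self.touching:
            self.touching = False
            events.append(("touch_up", None))         # light 56 off: no contact signaled
        return events
```
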
  • Infrared camera 30 and mirror 38 define a three dimensional infrared capture space 61 in workspace 12 within which infrared camera 30 can effectively detect light from stylus 28 .
  • Capture space 61 is bounded in the X and Y dimensions by an infrared camera capture area 62 on work surface 24 .
  • infrared camera capture space 61 is coextensive with projector display space 53 .
  • infrared camera 30 may capture stylus activation anywhere in display space 53 .
  • camera 30 is integrated into the projection light path such that the projector field-of-view and the infrared camera field-of-view are coincident, helping to ensure that stylus 28 , and thus the tracking signal from infrared camera 30 , is properly aligned with the projector display anywhere in workspace 12 .
  • visible light 64 generated by red, green and blue LEDs 66 , 68 , and 70 in projector 16 passes through various optics 72 (including a shift lens 74 ) out to mirror 38 ( FIG. 14 ).
  • Infrared light 75 from stylus 28 in workspace 12 reflected off mirror 38 toward projector 16 is directed to infrared camera sensor 76 by an infrared beam splitter 78 through a shift lens 80 .
  • infrared light sensor 76 for camera 30 may be oriented in a plane parallel to the plane of work surface 24 , with light focused on sensor 76 through shift lens 80 for full optical keystone correction.
  • It may be desirable for some commercial implementations to house projector 16 and infrared camera 30 together in a single housing 82 , as shown in FIG. 16 .
  • the geometrical configuration for infrared camera 30 shown in FIG. 16 helps ensure that the stylus tracking signal is aligned with the display no matter what height stylus 28 is above work surface 24 . If the projector field-of-view and the infrared camera field-of-view are not coincident, it may be difficult to calibrate the stylus tracking at more than one height above work surface 24 , creating the risk of a parallax shift between the desired stylus input position and the resultant displayed position.
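
The parallax risk mentioned above can be estimated with similar triangles. The sketch below is illustrative only, with made-up geometry: it maps the stylus nib to the work surface from two different viewpoints and shows the mismatch vanishing at the surface and growing with nib height.

```python
# Rough similar-triangles estimate (illustrative, not from the patent) of the
# mismatch when the infrared camera's viewpoint is offset from the projector's
# simulated viewpoint. Each viewpoint maps the raised nib to the surface along
# its own line of sight; non-coincident viewpoints disagree above the surface.

def surface_projection(viewpoint_x, viewpoint_h, nib_x, nib_h):
    """Where a viewpoint at (viewpoint_x, viewpoint_h) sees the nib land on the surface."""
    return viewpoint_x + (nib_x - viewpoint_x) * viewpoint_h / (viewpoint_h - nib_h)

# Example (units arbitrary): camera viewpoint offset by 100 from the projector's viewpoint.
camera_x, projector_x, height = 0.0, 100.0, 500.0
for nib_h in (0.0, 25.0, 50.0):
    cam = surface_projection(camera_x, height, nib_x=150.0, nib_h=nib_h)
    prj = surface_projection(projector_x, height, nib_x=150.0, nib_h=nib_h)
    print(nib_h, round(abs(cam - prj), 1))   # 0.0 on the surface, growing with nib height
```
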
  • while workspace 12 usually will include a physical work surface 24 for supporting an object 20 , workspace 12 could also be implemented as a wholly projected workspace without a physical work surface.
  • workspace 12 may be implemented as a three dimensional workspace for working with two and three dimensional objects or as a two dimensional workspace for working with only two dimensional objects. While the configuration of workspace 12 usually will be determined largely by the hardware and programming elements of system 10 , the configuration of workspace 12 can also be affected by the characteristics of a physical work surface 24 .
  • for some examples of system 10 and device 40 , it may be appropriate to consider that workspace 12 is part of system 10 in the sense that the virtual workspace accompanies system 10 to be manifested in a physical workspace when device 40 is operational, and in other examples it may be appropriate to consider that workspace 12 is not part of system 10 .
  • system 10 examples shown in the figures do not preclude the use of two or more cameras 14 and/or two or more projectors 16 . Indeed, it may be desirable in some applications for a system 10 to include more than one camera, more than one projector or more than one of other system components.
  • the articles “a” and “an” as used in this document mean one or more.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Projection Apparatus (AREA)
  • Accessories Of Cameras (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Video Image Reproduction Devices For Color Tv Systems (AREA)
  • Camera Bodies And Camera Details Or Accessories (AREA)
US14/234,030 2011-08-02 2011-08-02 Projection capture system and method Abandoned US20140139668A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2011/046253 WO2013019217A1 (en) 2011-08-02 2011-08-02 Projection capture system and method

Publications (1)

Publication Number Publication Date
US20140139668A1 true US20140139668A1 (en) 2014-05-22

Family

ID=47629565

Family Applications (1)

Application Number Title Priority Date Filing Date
US14/234,030 Abandoned US20140139668A1 (en) 2011-08-02 2011-08-02 Projection capture system and method

Country Status (8)

Country Link
US (1) US20140139668A1 (de)
EP (2) EP2740008A4 (de)
JP (2) JP6059223B2 (de)
KR (2) KR101825779B1 (de)
CN (2) CN103827744B (de)
BR (2) BR112014002463B1 (de)
IN (1) IN2014CN00543A (de)
WO (2) WO2013019217A1 (de)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140085265A1 (en) * 2011-12-22 2014-03-27 Apple Inc. Directional Light Sensors
US20150181072A1 (en) * 2013-12-25 2015-06-25 Pfu Limited Image capturing system
WO2016018413A1 (en) * 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Object capture and illumination
WO2016018416A1 (en) * 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Determining the location of a user input device
WO2016039736A1 (en) * 2014-09-10 2016-03-17 Hewlett-Packard Development Company, L.P. Image projection and capture with simultaneous display of led light
WO2016118173A1 (en) * 2015-01-23 2016-07-28 Hewlett-Packard Development Company, L.P. Tracking a handheld device on surfaces with optical patterns
US9659371B2 (en) * 2015-10-08 2017-05-23 Christie Digital Systems Usa, Inc. System and method for online projector-camera calibration from one or more images
US10089772B2 (en) 2015-04-23 2018-10-02 Hasbro, Inc. Context-aware digital play
EP3550330A4 (de) * 2016-11-29 2019-10-30 Panasonic Intellectual Property Management Co., Ltd. Entfernungsmessvorrichtung

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6175866B2 (ja) * 2013-04-02 2017-08-09 富士通株式会社 インタラクティブプロジェクタ
KR101832042B1 (ko) * 2013-07-31 2018-02-23 휴렛-팩커드 디벨롭먼트 컴퍼니, 엘.피. 프로젝터 유닛 및 컴퓨터를 가진 시스템
JP2016528647A (ja) * 2013-08-22 2016-09-15 ヒューレット−パッカード デベロップメント カンパニー エル.ピー.Hewlett‐Packard Development Company, L.P. プロジェクティブコンピューティングシステム
US9569892B2 (en) 2013-09-26 2017-02-14 Qualcomm Incorporated Image capture input and projection output
CN104656869B (zh) * 2013-11-20 2018-07-06 联想(北京)有限公司 一种数据处理方法及电子设备
WO2015130320A1 (en) * 2014-02-28 2015-09-03 Hewlett-Packard Development Company, L.P. Calibration of sensors and projector
JP2015215416A (ja) * 2014-05-08 2015-12-03 富士通株式会社 プロジェクタ装置
US10073567B2 (en) * 2014-05-23 2018-09-11 Piqs Technology (Shenzhen) Limited Interactive display systems
WO2016018372A1 (en) * 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. White flash generation from a light emitting diode (led) projector
CN106796385A (zh) * 2014-07-31 2017-05-31 惠普发展公司,有限责任合伙企业 具有针对白点的调整的图像投影和捕获
US20170223321A1 (en) * 2014-08-01 2017-08-03 Hewlett-Packard Development Company, L.P. Projection of image onto object
US10191637B2 (en) * 2014-08-04 2019-01-29 Hewlett-Packard Development Company, L.P. Workspace metadata management
EP3191918B1 (de) * 2014-09-12 2020-03-18 Hewlett-Packard Development Company, L.P. Entwicklung kontextueller informationen aus einem bild
CN106210702A (zh) * 2016-08-30 2016-12-07 四川川大智胜软件股份有限公司 一种同步三维数据采集仪、采集系统及采集方法
US11204662B2 (en) * 2017-01-17 2021-12-21 Hewlett-Packard Development Company, L.P. Input device with touch sensitive surface that assigns an action to an object located thereon
CN106791747A (zh) * 2017-01-25 2017-05-31 触景无限科技(北京)有限公司 台灯互动展示的分时处理方法、装置以及台灯
EP3605223B1 (de) * 2017-03-23 2022-04-27 Sony Group Corporation Projektor mit detektionsfunktion
JP6399135B1 (ja) * 2017-03-30 2018-10-03 日本電気株式会社 画像入出力装置及び画像入出力方法
EP3857307A4 (de) 2018-09-26 2022-05-04 Hewlett-Packard Development Company, L.P. Bilderfassungsanordnungen
CN115379063B (zh) * 2022-07-19 2024-01-16 北京麦哲科技有限公司 一种高拍仪

Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2529664A (en) * 1948-02-14 1950-11-14 Ward Hickok Visual educational projector
US5511148A (en) * 1993-04-30 1996-04-23 Xerox Corporation Interactive copying system
US5890787A (en) * 1996-02-16 1999-04-06 Videotronic Systems Desktop large image and eye-contact projection display
US6067112A (en) * 1996-07-12 2000-05-23 Xerox Corporation Interactive desktop display system for automatically adjusting pan and zoom functions in response to user adjustment of a feedback image
US6305818B1 (en) * 1998-03-19 2001-10-23 Ppt Vision, Inc. Method and apparatus for L.E.D. illumination
US6431711B1 (en) * 2000-12-06 2002-08-13 International Business Machines Corporation Multiple-surface display projector with interactive input capability
US20020180726A1 (en) * 2000-11-06 2002-12-05 Jianbo Shi Paper-based remote sketching system
US20030007135A1 (en) * 2001-07-06 2003-01-09 Sciammarella Eduardo A. Interactive projection system
US6618076B1 (en) * 1999-12-23 2003-09-09 Justsystem Corporation Method and apparatus for calibrating projector-camera system
US20040141162A1 (en) * 2003-01-21 2004-07-22 Olbrich Craig A. Interactive display device
US20050068442A1 (en) * 2003-09-26 2005-03-31 Corey Billington Presentation system and method of use
US20050180631A1 (en) * 2004-02-17 2005-08-18 Zhengyou Zhang System and method for visual echo cancellation in a projector-camera-whiteboard system
US6999118B2 (en) * 2000-11-27 2006-02-14 Sony Corporation Method for driving solid-state imaging pickup device at a variable frame rate and camera
US20070097333A1 (en) * 2005-10-31 2007-05-03 Masoud Zavarehi Determining an adjustment
US20070201863A1 (en) * 2006-02-28 2007-08-30 Microsoft Corporation Compact interactive tabletop with projection-vision
US20070276258A1 (en) * 2006-03-28 2007-11-29 Crane Robert L Synchronization of Illumination Source and Sensor for Improved Visualization of Subcutaneous Structures
US20080036897A1 (en) * 2006-08-09 2008-02-14 Fuji Xerox Co., Ltd. Image processing apparatus
US20090262098A1 (en) * 2008-04-21 2009-10-22 Masafumi Yamada Electronics device having projector module
US7843470B2 (en) * 2005-01-31 2010-11-30 Canon Kabushiki Kaisha System, image processing apparatus, and information processing method
US20110197147A1 (en) * 2010-02-11 2011-08-11 Apple Inc. Projected display shared workspaces
US8035614B2 (en) * 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Interactive video window
US20110288964A1 (en) * 2010-05-24 2011-11-24 Massachusetts Institute Of Technology Kinetic Input/Output
US8294784B2 (en) * 2009-12-23 2012-10-23 Cisco Technology, Inc. Method, apparatus, and computer-readable storage medium for removing flickering in video images of objects illuminated by ambient lighting and screen light
US20120320157A1 (en) * 2011-06-14 2012-12-20 Microsoft Corporation Combined lighting, projection, and image capture without video feedback

Family Cites Families (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08336065A (ja) * 1995-06-09 1996-12-17 Nikon Corp 原稿照明装置及び画像入力装置
WO2001011880A1 (en) * 1999-08-10 2001-02-15 Peter Mcduffie White Communications system
JP4023048B2 (ja) * 1999-10-05 2007-12-19 株式会社日立製作所 背面投写型映像表示システム
US6965460B1 (en) * 2000-08-08 2005-11-15 Hewlett-Packard Development Company, L.P. Method and system for scanning an image using a look-down linear array scanner
JP2003152851A (ja) * 2001-11-14 2003-05-23 Nec Corp 携帯端末装置
JP2003276399A (ja) * 2002-03-25 2003-09-30 Matsushita Electric Ind Co Ltd 位置検出方法および装置ならびに電子黒板装置
JP2004265185A (ja) * 2003-03-03 2004-09-24 Canon Inc デジタイザ付きカメラ付き画像投影装置
JP2004320123A (ja) * 2003-04-11 2004-11-11 Nec Viewtechnology Ltd 資料提示装置
JP2005128413A (ja) * 2003-10-27 2005-05-19 Pentax Corp 照明制御装置
JP2005250392A (ja) * 2004-03-08 2005-09-15 Olympus Corp カメラ
JP2006004010A (ja) * 2004-06-15 2006-01-05 Brother Ind Ltd 画像入出力装置
JP2005354306A (ja) * 2004-06-09 2005-12-22 Brother Ind Ltd 画像入出力装置
US7519223B2 (en) * 2004-06-28 2009-04-14 Microsoft Corporation Recognizing gestures and using gestures for interacting with software applications
US7557966B2 (en) * 2004-08-11 2009-07-07 Acushnet Company Apparatus and method for scanning an object
JP4433960B2 (ja) * 2004-09-21 2010-03-17 株式会社ニコン 携帯型情報機器
JP3901185B2 (ja) * 2004-09-27 2007-04-04 カシオ計算機株式会社 投影装置、投影画像の撮影方法及びプログラム
JP4695645B2 (ja) * 2005-06-30 2011-06-08 株式会社リコー 投影画像表示装置
US7690795B2 (en) * 2006-10-06 2010-04-06 Hewlett-Packard Development Company, L.P. Projector/camera system
US7907781B2 (en) * 2007-06-21 2011-03-15 Mitsubishi Electric Research Laboraties, Inc. System and method for determining geometries of scenes
US9377874B2 (en) * 2007-11-02 2016-06-28 Northrop Grumman Systems Corporation Gesture recognition light and video image projector
JP5504570B2 (ja) * 2008-03-27 2014-05-28 カシオ計算機株式会社 カメラ内蔵プロジェクタ及びカメラ内蔵プロジェクタの撮影方法
JP5309724B2 (ja) * 2008-06-24 2013-10-09 船井電機株式会社 プロジェクタ
US8619178B2 (en) * 2009-01-28 2013-12-31 Hewlett-Packard Development Company, L.P. Image rendition and capture
US8355038B2 (en) * 2009-01-28 2013-01-15 Hewlett-Packard Development Company, L.P. Systems for capturing images through a display
JP2010224015A (ja) * 2009-03-19 2010-10-07 Sanyo Electric Co Ltd 投写型映像表示装置及び記載ボード及び投写映像システム
JP2010238213A (ja) * 2009-03-30 2010-10-21 Plus Vision Corp タブレットpcシステム及び電子筆記シート
JP5395507B2 (ja) * 2009-05-21 2014-01-22 キヤノン株式会社 三次元形状測定装置、三次元形状測定方法及びコンピュータプログラム
US8842096B2 (en) * 2010-01-08 2014-09-23 Crayola Llc Interactive projection system

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2529664A (en) * 1948-02-14 1950-11-14 Ward Hickok Visual educational projector
US5511148A (en) * 1993-04-30 1996-04-23 Xerox Corporation Interactive copying system
US5890787A (en) * 1996-02-16 1999-04-06 Videotronic Systems Desktop large image and eye-contact projection display
US6067112A (en) * 1996-07-12 2000-05-23 Xerox Corporation Interactive desktop display system for automatically adjusting pan and zoom functions in response to user adjustment of a feedback image
US6305818B1 (en) * 1998-03-19 2001-10-23 Ppt Vision, Inc. Method and apparatus for L.E.D. illumination
US6618076B1 (en) * 1999-12-23 2003-09-09 Justsystem Corporation Method and apparatus for calibrating projector-camera system
US20020180726A1 (en) * 2000-11-06 2002-12-05 Jianbo Shi Paper-based remote sketching system
US6999118B2 (en) * 2000-11-27 2006-02-14 Sony Corporation Method for driving solid-state imaging pickup device at a variable frame rate and camera
US6431711B1 (en) * 2000-12-06 2002-08-13 International Business Machines Corporation Multiple-surface display projector with interactive input capability
US20030007135A1 (en) * 2001-07-06 2003-01-09 Sciammarella Eduardo A. Interactive projection system
US8035614B2 (en) * 2002-05-28 2011-10-11 Intellectual Ventures Holding 67 Llc Interactive video window
US20040141162A1 (en) * 2003-01-21 2004-07-22 Olbrich Craig A. Interactive display device
US20050068442A1 (en) * 2003-09-26 2005-03-31 Corey Billington Presentation system and method of use
US20050180631A1 (en) * 2004-02-17 2005-08-18 Zhengyou Zhang System and method for visual echo cancellation in a projector-camera-whiteboard system
US7843470B2 (en) * 2005-01-31 2010-11-30 Canon Kabushiki Kaisha System, image processing apparatus, and information processing method
US20070097333A1 (en) * 2005-10-31 2007-05-03 Masoud Zavarehi Determining an adjustment
US20070201863A1 (en) * 2006-02-28 2007-08-30 Microsoft Corporation Compact interactive tabletop with projection-vision
US20070276258A1 (en) * 2006-03-28 2007-11-29 Crane Robert L Synchronization of Illumination Source and Sensor for Improved Visualization of Subcutaneous Structures
US20080036897A1 (en) * 2006-08-09 2008-02-14 Fuji Xerox Co., Ltd. Image processing apparatus
US20090262098A1 (en) * 2008-04-21 2009-10-22 Masafumi Yamada Electronics device having projector module
US8294784B2 (en) * 2009-12-23 2012-10-23 Cisco Technology, Inc. Method, apparatus, and computer-readable storage medium for removing flickering in video images of objects illuminated by ambient lighting and screen light
US20110197147A1 (en) * 2010-02-11 2011-08-11 Apple Inc. Projected display shared workspaces
US20110288964A1 (en) * 2010-05-24 2011-11-24 Massachusetts Institute Of Technology Kinetic Input/Output
US20120320157A1 (en) * 2011-06-14 2012-12-20 Microsoft Corporation Combined lighting, projection, and image capture without video feedback

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9582083B2 (en) * 2011-12-22 2017-02-28 Apple Inc. Directional light sensors
US20140085265A1 (en) * 2011-12-22 2014-03-27 Apple Inc. Directional Light Sensors
US20150181072A1 (en) * 2013-12-25 2015-06-25 Pfu Limited Image capturing system
US9350894B2 (en) * 2013-12-25 2016-05-24 Pfu Limited Image capturing system
WO2016018413A1 (en) * 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Object capture and illumination
WO2016018416A1 (en) * 2014-07-31 2016-02-04 Hewlett-Packard Development Company, L.P. Determining the location of a user input device
US11460956B2 (en) 2014-07-31 2022-10-04 Hewlett-Packard Development Company, L.P. Determining the location of a user input device
US11431959B2 (en) 2014-07-31 2022-08-30 Hewlett-Packard Development Company, L.P. Object capture and illumination
EP3192243A4 (de) * 2014-09-10 2018-05-02 Hewlett-Packard Development Company, L.P. Bildprojektion und -erfassung mit simultaner anzeige von led-licht
WO2016039736A1 (en) * 2014-09-10 2016-03-17 Hewlett-Packard Development Company, L.P. Image projection and capture with simultaneous display of led light
US10409143B2 (en) * 2015-01-23 2019-09-10 Hewlett-Packard Development Company, L.P. Tracking a handheld device on surfaces with optical patterns
US20170329207A1 (en) * 2015-01-23 2017-11-16 Hewlett-Packard Development Company, L.P. Tracking a handheld device on surfaces with optical patterns
WO2016118173A1 (en) * 2015-01-23 2016-07-28 Hewlett-Packard Development Company, L.P. Tracking a handheld device on surfaces with optical patterns
US10089772B2 (en) 2015-04-23 2018-10-02 Hasbro, Inc. Context-aware digital play
US9659371B2 (en) * 2015-10-08 2017-05-23 Christie Digital Systems Usa, Inc. System and method for online projector-camera calibration from one or more images
EP3550330A4 (de) * 2016-11-29 2019-10-30 Panasonic Intellectual Property Management Co., Ltd. Entfernungsmessvorrichtung
US11448757B2 (en) 2016-11-29 2022-09-20 Nuvoton Technology Corporation Japan Distance measuring device

Also Published As

Publication number Publication date
CN103828341B (zh) 2017-02-15
KR20140054146A (ko) 2014-05-08
IN2014CN00543A (de) 2015-04-03
EP2740259A1 (de) 2014-06-11
BR112014002463A2 (pt) 2017-02-21
BR112014002448B1 (pt) 2021-12-07
KR20140068902A (ko) 2014-06-09
BR112014002463B1 (pt) 2020-12-08
BR112014002448A2 (pt) 2017-02-21
CN103827744A (zh) 2014-05-28
CN103828341A (zh) 2014-05-28
JP6068392B2 (ja) 2017-01-25
KR101787180B1 (ko) 2017-10-18
EP2740008A4 (de) 2015-09-16
WO2013019252A1 (en) 2013-02-07
CN103827744B (zh) 2016-04-06
JP6059223B2 (ja) 2017-01-11
EP2740259A4 (de) 2015-04-22
WO2013019217A1 (en) 2013-02-07
EP2740008A1 (de) 2014-06-11
JP2014529925A (ja) 2014-11-13
KR101825779B1 (ko) 2018-02-05
JP2014239441A (ja) 2014-12-18

Similar Documents

Publication Publication Date Title
US9521276B2 (en) Portable projection capture device
US20140139668A1 (en) Projection capture system and method
US9560281B2 (en) Projecting an image of a real object
TWI682689B (zh) 智慧型照明裝置及其控制方法
US7176881B2 (en) Presentation system, material presenting device, and photographing device for presentation
US9148573B2 (en) Non-uniform correction illumination pattern
JP6078884B2 (ja) カメラ式マルチタッチ相互作用システム及び方法
WO2012124730A1 (ja) 検出装置、入力装置、プロジェクタ、及び電子機器
JP2022065419A (ja) 表示方法、及び表示装置

Legal Events

Date Code Title Description
AS Assignment

Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:SHORT, DAVID BRADLEY;REEL/FRAME:033395/0875

Effective date: 20110805

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION