US20170347078A1 - Projection device - Google Patents
- Publication number
- US20170347078A1 (application US 15/595,965)
- Authority
- US
- United States
- Prior art keywords
- projection
- module
- camera module
- area
- optical axis
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G03B21/00 — Projectors or projection-type viewers; accessories therefor
- G03B21/14 — Projectors or projection-type viewers; details
- G03B21/142 — Adjusting of projection optics
- G03B21/20 — Lamp housings
- G03B21/2046 — Positional adjustment of light sources
- G03B21/2053 — Intensity control of illuminating light
- G03B15/02 — Illuminating scene
- G03B15/03 — Combinations of cameras with lighting apparatus; flash units
- G03B17/54 — Cameras adapted for combination with a projector
- G06F3/017 — Gesture-based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0421 — Opto-electronic digitisers interrupting or reflecting a light beam, e.g. optical touch-screen
- G06F3/0425 — Opto-electronic digitisers using a single imaging device to track objects with respect to an imaged reference surface
- G06F3/043 — Digitisers using propagating acoustic waves
- G06F3/045 — Digitisers using resistive elements
- G06F3/04817 — GUI interaction using icons
- G06F3/0484 — GUI techniques for the control of specific functions or operations
- G06F3/04842 — Selection of displayed objects or displayed text elements
- G06F3/04845 — GUI techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/0486 — Drag-and-drop
- G06F3/04883 — Touch-screen or digitiser input of data by handwriting, e.g. gesture or text
- G06F3/167 — Audio in a user interface, e.g. using voice commands
- G06F16/58 — Retrieval of still image data characterised by using metadata
- G06F18/22 — Pattern recognition: matching criteria, e.g. proximity measures
- G06K7/1417 — Optical code recognition: 2D bar codes
- G06T11/60 — Editing figures and text; combining figures or text
- G06V10/761 — Proximity, similarity or dissimilarity measures
- G06V40/28 — Recognition of hand or arm movements, e.g. recognition of deaf sign language
- H04N5/2628 — Alteration of picture size, shape, position or orientation, e.g. zooming, rotation
- H04N5/33 — Transforming infrared radiation
- H04N5/2256; H04N5/2258; H04N5/23296; H04N9/07 — legacy codes
- H04N9/31 — Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141 — Constructional details thereof
- H04N9/3147 — Multi-projection systems
- H04N9/315 — Modulator illumination systems
- H04N9/3155 — Modulator illumination systems for controlling the light source
- H04N9/3176 — Projection device incorporated in a camera, adapted for enhanced portability
- H04N9/3179 — Video signal processing therefor
- H04N9/3182 — Colour adjustment, e.g. white balance, shading or gamut
- H04N9/3194 — Testing thereof including sensor feedback
- H04N23/12 — Generating image signals from different wavelengths with one sensor only
- H04N23/45 — Generating image signals from two or more image sensors of different type or operating in different modes
- H04N23/56 — Cameras or camera modules provided with illuminating means
- H04N23/62 — Control of camera parameters via user interfaces
- H04N23/69 — Control of means for changing the angle of the field of view, e.g. optical zoom
- H04N23/71 — Circuitry for evaluating the brightness variation
- H04N23/72 — Combination of two or more compensation controls
- H04N23/73 — Brightness compensation by influencing the exposure time
- H04N23/74 — Brightness compensation by influencing the scene brightness using illuminating means
- H04N23/75 — Brightness compensation by influencing optical camera components
- H04N23/90 — Arrangement of cameras or camera modules, e.g. multiple cameras
- H05B47/10 — Controlling the light source
- H05B47/105 — Controlling the light source in response to determined parameters
- H05B47/11 — Controlling the light source by determining the brightness or colour temperature of ambient light
- Y02B20/40 — Energy-saving lighting control techniques, e.g. smart controller or presence detection
Definitions
- the present invention relates to a projection device, and more particularly to a projection device with operation-detection functions.
- with the development of technologies in the projector industry, the size of projection modules has been significantly reduced. Thus, in recent years, projection modules have been gradually integrated into other electronic products, such as interactive electronic products.
- a projector has a camera capable of detecting infrared light and uses an infrared light-emitting module to generate an infrared light curtain over the display surface.
- when an object (e.g., a user's finger) touches the display surface, reflection spots of infrared light are generated.
- the reflection spots on the display screen can be captured by the camera capable of detecting infrared light, and control instructions are executed according to the positions of the reflection spots, enabling the projector to project various images.
- a color camera can also be used to capture and recognize a user's gesture so as to control the projector to project different images.
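The infrared-touch scheme described above — bright reflection spots appearing where a finger crosses the infrared light curtain — amounts to blob detection on an IR camera frame. The sketch below is purely illustrative and not taken from the patent: the threshold value, the 8-bit frame format, and the flood-fill labelling are all assumptions.

```python
import numpy as np

def find_reflection_spots(ir_frame, threshold=200):
    """Locate bright reflection spots in an infrared camera frame.

    ir_frame: 2-D array of 8-bit intensity values from the IR camera.
    Returns a list of (row, col) centroids, one per connected bright region.
    Illustrative sketch only; the threshold and labelling are assumptions.
    """
    mask = ir_frame >= threshold          # pixels lit by a curtain reflection
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for r in range(mask.shape[0]):
        for c in range(mask.shape[1]):
            if mask[r, c] and labels[r, c] == 0:
                current += 1
                stack = [(r, c)]          # flood-fill one connected region
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]
                            and mask[y, x] and labels[y, x] == 0):
                        labels[y, x] = current
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    spots = []
    for lbl in range(1, current + 1):
        ys, xs = np.nonzero(labels == lbl)
        spots.append((ys.mean(), xs.mean()))  # centroid = touch position
    return spots
```

Each centroid would then be mapped from camera coordinates into projection-area coordinates before a control instruction is executed.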
- An objective of the present invention is to provide a projection device with improved detection and operation performance.
- an embodiment of a projection device of the invention includes a projection module and a first camera module.
- the projection module has a first optical axis and is configured to form a projection area, wherein a projection of the first optical axis on an X-Z plane of the projection device is perpendicular to the X-Y plane on which the projection area is formed.
- the first camera module is disposed on a side of the projection module and includes a second optical axis, wherein the first camera module is configured to form a first shooting area, and the second optical axis forms a first angle θ1 with respect to the first optical axis.
- the projection area at least partially overlaps the first shooting area to form an overlapping area, and the first angle θ1 is a function of the distance between the projection module and the first camera module.
- the projection module projects an image to a bearing surface bearing the projection device to form the projection area.
- the projection device further includes a base on the bearing surface; the projection module and the first camera module are on aside of the base.
- the bearing surface has a first distance Z 1 to the projection module and a second distance Z 2 to the first camera module.
- a first gap D 1 is formed between the projection module and the first camera module.
- the first shooting area is quadrilateral and includes two long sides and two wide sides, and a length of the long side near the base is 2X.
- the projection device further includes a reference plane.
- the projection module and the first camera module are disposed on the reference plane.
- the projection device further includes a baseline perpendicular to the reference plane.
- the first optical axis of the projection module is parallel to the baseline, and the second optical axis of the first camera module forms the first angle ⁇ 1 with respect to the baseline.
- the first camera module is a color camera module.
- the projection device further includes a light emitting module configured to form a sensing area.
- the first camera module shoots movements of a user occurring in the sensing area, and the first shooting area covers the sensing area.
- the first camera module is an infrared camera module and the light emitting module is an infrared emitting module.
- the projection device further comprising a light emitting module configured to form a sensing area, wherein the first camera module shoots movements of a user occurring in the sensing area, and the first shooting area covers the sensing area.
- the first camera module is an infrared camera module
- the light emitting module is an infrared emitting module
- the projection device further includes a processing module electrically connected to the projection module and the first camera module.
- the processing module is configured to enable the projection module and the first camera module.
- the projection device further includes a camera driving module electrically connected to the processing module.
- the processing module enables the camera driving module to drive the first camera module to rotate to a specific angle on a Y-Z plane when the projection module rotates to the specific angle on the Y-Z plane.
- the camera driving module includes at least one servo motor and a gear set.
- the projection device further includes a second camera module and a light emitting module.
- the second camera module is disposed between the projection module and the first camera module, wherein the second camera module includes a third optical axis and is configured to form a second shooting area, the third optical axis forms a second angle ⁇ 2 with respect to the first optical axis, and the second shooting area.
- the first shooting area and the projection area at least partially overlap one another to form the overlapping area.
- the light emitting module is configured to form a sensing area, wherein the second camera module is configured to shoot movements of a user occurring in the sensing area, and the second shooting area covers the sensing area.
- the projection module projects an image to a bearing surface bearing the projection device to form the projection area.
- the projection device further includes a base on the bearing surface; the projection module and the first camera module are on aside of the base.
- the bearing surface has a first distance Z 1 to the projection module, a second distance Z 2 to the first camera module.
- the second camera module has a third distance Z 3 to the bearing surface.
- a first gap D 1 is formed between the projection module and the first camera module, a second gap D 2 is formed between the projection module and the second camera module.
- the second shooting area is quadrilateral and includes two long sides and two wide sides; and a length of the long side near the base is 2X 1 .
- the projection device further includes a reference plane.
- the projection module, the first camera module and the second camera module are disposed on the reference plane.
- the projection device further includes a baseline perpendicular to the reference plane.
- the second optical axis of the first camera module forms the first angle ⁇ 1 with respect to the baseline
- the third optical axis of the second camera module forms the second angle ⁇ 2 with respect to the baseline
- the first optical axis of the projection module forms a third angle ⁇ 3 with respect to the baseline.
- the third angle falls within the range of 0 to 30 degrees.
- the first camera module is a color camera module
- the second camera module is an infrared camera module
- the light emitting module is an infrared emitting module
- In another embodiment, a projection device includes a projection module and a first camera module. The projection module includes a first optical axis and is configured to form a projection area. The first camera module is disposed on a first side of the projection module and includes a second optical axis. The first camera module and the projection module are disposed on a reference plane, and the first optical axis and the second optical axis are perpendicular to the reference plane. The first camera module is configured to form a first shooting area, and the projection area at least partially overlaps the first shooting area to form an overlapping area.
- In another embodiment, the projection device further includes a second camera module and a light emitting module. The second camera module is disposed on a second side of the projection module opposite to the first side and on the reference plane. The second camera module includes a third optical axis and is configured to form a second shooting area, and the third optical axis is perpendicular to the reference plane. The projection area, the first shooting area and the second shooting area at least partially overlap one another to form the overlapping area. The light emitting module is configured to form a sensing area; the second camera module is configured to shoot movements of a user occurring in the sensing area, and the second shooting area covers the sensing area.
- In another embodiment, the first camera module is connected to the first side of the projection module, and the second camera module is connected to the second side of the projection module.
- In another embodiment, the first camera module is a color camera module, the second camera module is an infrared camera module, and the light emitting module is an infrared emitting module.
- FIG. 1 is a schematic diagram of an embodiment of a projection device of the present invention;
- FIG. 2 is a schematic diagram of a projection area of a projection module and a shooting area of the first camera module of the projection device of FIG. 1;
- FIG. 3 is a schematic diagram of another embodiment of a projection device of the present invention;
- FIG. 4 is a schematic diagram of another embodiment of a projection device of the present invention;
- FIG. 5 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 4;
- FIG. 6 is a block diagram of another embodiment of a projection device of the present invention;
- FIG. 7 is a schematic diagram of another embodiment of a projection device of the present invention;
- FIG. 8 is a schematic diagram of another embodiment of a projection device of the present invention;
- FIG. 9 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 8;
- FIG. 10 is a schematic diagram of another embodiment of a projection device of the present invention; and
- FIG. 11 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 10.
- FIG. 1 is a schematic diagram of an embodiment of a projection device of the present invention, and FIG. 2 is a schematic diagram of a projection area of a projection module and a shooting area of the first camera module of the projection device of FIG. 1 .
- the projection device 1 of this embodiment includes a projection module 10 and a first camera module 11 .
- the projection module 10 includes a first optical axis AX 1 , and the projection module 10 is configured to form a projection area PA.
- the projection module 10 projects an image to a bearing surface 100 bearing the projection device 1 to form the projection area PA.
- a projection of the first optical axis AX 1 on a projection plane formed by an X-axis and a Z-axis is perpendicular to the projection area PA , which lies on a plane formed by the X-axis and a Y-axis.
- the first camera module 11 is disposed on a side of the projection module 10 and has a second optical axis AX 2 .
- the first camera module 11 is configured to form a first shooting area CA 1 on the bearing surface 100 .
- the first camera module 11 is a color camera module, but the invention is not limited thereto.
- the color camera module captures a user's gestures or operational movements on a mouse or a keyboard in the first shooting area CA 1 so that the projection module 10 can be controlled to project different images.
- the second optical axis AX 2 forms a first angle Δθ1 with respect to the first optical axis AX 1 .
- the projection area PA of the projection module 10 at least partially overlaps the first shooting area CA 1 of the first camera module 11 to form an overlapping area OA.
- the first angle Δθ1 between the first optical axis AX 1 and the second optical axis AX 2 is a function of a distance (a first gap D 1 ) between the projection module 10 and the first camera module 11 .
- the projection module 10 and the first camera module 11 are disposed over the bearing surface 100 .
- the first camera module 11 is disposed in the housing 15
- the projection module 10 is disposed on a side of the housing 15 .
- the housing 15 is connected to a base 17 through a frame 16 , and the base 17 is disposed on the bearing surface 100 .
- the projection module 10 and the first camera module 11 are disposed above the base 17 .
- the bearing surface 100 has a first distance Z 1 to the projection module 10 and a second distance Z 2 to the first camera module 11 .
- the first distance Z 1 and the second distance Z 2 range from 350 mm to 450 mm.
- the projection module 10 and the first camera module 11 are disposed on the same reference plane RP.
- the reference plane RP is parallel to the bearing surface 100 . That is, the height (the first distance Z 1 ) of the projection module 10 with respect to the bearing surface 100 is equal to the height (the second distance Z 2 ) of the first camera module 11 with respect to the bearing surface 100 .
- the invention is not limited thereto.
- the first distance Z 1 is not equal to the second distance Z 2 .
- the first gap D 1 is formed between the first optical axis AX 1 of the projection module 10 and the second optical axis AX 2 of the first camera module 11 .
- the first gap D 1 ranges from 160 mm to 170 mm, but the invention is not limited thereto.
- the first shooting area CA 1 formed on the bearing surface 100 by the first camera module 11 is quadrilateral, and the length of its long side near the base 17 is 2X.
- the projection device 1 of this embodiment further includes a baseline L perpendicular to the reference plane RP.
- the first optical axis AX 1 of the projection module 10 is parallel to the baseline L.
- the second optical axis AX 2 forms the first angle Δθ1 with respect to the baseline L .
- the first angle Δθ1 ranges from 3 degrees to 5 degrees. That is, the projection direction of the projection module 10 is maintained, while the shooting direction of the first camera module 11 is tilted by 3 to 5 degrees with respect to the projection direction of the projection module 10 so that the projection area PA at least partially overlaps the first shooting area CA 1 of the first camera module 11 to form the overlapping area OA .
- the projection module 10 has a view angle θF1 ranging from 60 degrees to 70 degrees.
- the first camera module 11 has a view angle θF2 ranging from 60 degrees to 75 degrees.
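The overlap geometry described above can be sanity-checked numerically. The sketch below is illustrative only: the 4-degree tilt and the 65/70-degree view angles are assumed sample values within the stated ranges, and the footprints are reduced to one dimension along the axis joining the two modules. It shows that tilting the camera toward the projection axis enlarges the overlapping area on the bearing surface:

```python
import math

def footprint(height_mm, view_angle_deg, offset_mm=0.0, tilt_deg=0.0):
    """1-D footprint [lo, hi] on the bearing surface of a module mounted
    height_mm above the surface at x = offset_mm, with its axis tilted
    tilt_deg toward -x (toward the projection axis)."""
    half = view_angle_deg / 2.0
    lo = offset_mm - height_mm * math.tan(math.radians(tilt_deg + half))
    hi = offset_mm - height_mm * math.tan(math.radians(tilt_deg - half))
    return lo, hi

def overlap(a, b):
    """Length of the intersection of two intervals (0 if disjoint)."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

# Sample values: Z1 = Z2 = 400 mm, D1 = 165 mm, view angles 65 and 70 deg.
pa = footprint(400, 65)                                      # projection area
ca_tilted = footprint(400, 70, offset_mm=165, tilt_deg=4)    # camera, tilt 4 deg
ca_straight = footprint(400, 70, offset_mm=165, tilt_deg=0)  # camera, no tilt

# Tilting the camera toward the projection axis enlarges the overlap.
print(overlap(pa, ca_tilted) > overlap(pa, ca_straight) > 0)  # → True
```

With these sample numbers the tilt grows the shared strip from roughly 370 mm to roughly 414 mm, which is the effect the embodiment relies on.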
- the first angle Δθ1 is the angle between the second optical axis AX 2 of the first camera module 11 and the first optical axis AX 1 of the projection module 10 .
- the first angle Δθ1 is a function of the distance (the first gap D 1 ) between the projection module 10 and the first camera module 11 .
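The summary later gives this function explicitly as Δθ1 = arctan((D1 + X)/Z2) − arctan(X/Z2) for the case where the projection area is entirely included in the first shooting area. A minimal sketch of that relationship follows; the sample distances passed in are assumptions for illustration, not values taken from the embodiments:

```python
import math

def first_angle_deg(d1_mm, x_mm, z2_mm):
    """Camera tilt Δθ1 (degrees) as a function of the gap D1, per
    Δθ1 = arctan((D1 + X) / Z2) - arctan(X / Z2),
    where 2X is the long side of the shooting area and Z2 the camera height."""
    return math.degrees(math.atan((d1_mm + x_mm) / z2_mm)
                        - math.atan(x_mm / z2_mm))

print(first_angle_deg(0, 300, 400))   # no gap means no tilt is needed: 0.0
# A wider gap between the modules requires a larger tilt:
print(first_angle_deg(165, 300, 400) > first_angle_deg(115, 300, 400))  # → True
```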
- FIG. 3 is a schematic diagram of another embodiment of a projection device of the present invention.
- the projection device 1 a is similar to the projection device 1 of FIG. 1 .
- the projection device 1 a of this embodiment further includes a light emitting module 13 .
- the light emitting module 13 is an infrared emitting module
- the first camera module 11 is an infrared camera module.
- the light emitting module 13 is configured to form a sensing area above the bearing surface 100 (not shown in FIG. 3 ).
- the sensing area is an infrared curtain.
- the first shooting area of the first camera module 11 (similar to the first shooting area CA 1 of FIG. 2 ) includes the sensing area, and a user's operational movements are captured in the sensing area.
- the user's fingers enter the sensing area and reflect light to generate a reflected light spot (such as a reflected infrared spot), and the first camera module 11 captures an image including the reflected light spot.
- the position of the reflected light spot is identified to perform corresponding operational commands to enable the projection module 10 to project different images.
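The spot-locating step can be pictured with a toy routine. This is purely illustrative: the patent does not specify an algorithm, and the frame format, the brightness threshold, and the linear pixel-to-area mapping below are all assumptions:

```python
def find_spot(frame, threshold=200):
    """Centroid (row, col) of pixels at or above threshold, or None if absent."""
    pts = [(r, c) for r, row in enumerate(frame)
                  for c, v in enumerate(row) if v >= threshold]
    if not pts:
        return None
    n = len(pts)
    return (sum(r for r, _ in pts) / n, sum(c for _, c in pts) / n)

def to_projection_coords(spot, frame_hw, area_hw):
    """Linearly map a camera-pixel position into projection-area coordinates."""
    return (spot[0] / frame_hw[0] * area_hw[0],
            spot[1] / frame_hw[1] * area_hw[1])

frame = [[0] * 8 for _ in range(8)]
frame[2][5] = 255                      # one bright reflected spot
spot = find_spot(frame)                # (2.0, 5.0)
print(to_projection_coords(spot, (8, 8), (300, 400)))  # → (75.0, 250.0)
```

The mapped position would then be compared against the projected image to decide which operational command to perform.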
- the projection area of the projection module 10 , the first shooting area of the first camera module 11 and the overlapping area formed by the projection area at least partially overlapping the first shooting area are similar to that of FIG. 2 , and the description for them is thus omitted.
- FIG. 4 is a schematic diagram of another embodiment of a projection device of the present invention
- FIG. 5 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 4
- the projection device 1 b is similar to the projection device 1 of FIG. 1
- the projection device 1 b of this embodiment further includes a second camera module 12 and a light emitting module 13 .
- the second camera module 12 is disposed between the projection module 10 and the first camera module 11 .
- the second camera module 12 has a third optical axis AX 3 to form a second shooting area CA 2 .
- the light emitting module 13 is configured to form a sensing area (not shown in FIGS. 4 and 5 ).
- the sensing area is an infrared curtain.
- the second shooting area CA 2 of the second camera module 12 includes the sensing area and a user's operational movements are captured in the sensing area.
- the first camera module 11 is a color camera module
- the second camera module 12 is an infrared camera module
- light emitting module 13 is an infrared emitting module.
- the projection device 1 b is operated by gestures captured by the color camera module or touch-controlled by movements captured by the infrared camera module and the infrared emitting module.
- the second optical axis AX 2 of the first camera module 11 has the first angle Δθ1 with respect to the first optical axis AX 1 of the projection module 10
- the third optical axis AX 3 of the second camera module 12 has the second angle Δθ2 with respect to the first optical axis AX 1 of the projection module 10 .
- the projection area PA of the projection module 10 , the first shooting area CA 1 of the first camera module 11 and the second shooting area CA 2 of the second camera module 12 at least partially overlap one another to form an overlapping area OA′.
- the projection module 10 , the first camera module 11 and the second camera module 12 are disposed above the bearing surface 100 of the projection device 1 .
- the first camera module 11 and the second camera module 12 are disposed within the housing 15 , and the projection module 10 is disposed on a side of the housing 15 .
- the housing 15 is connected to the base 17 through the frame 16 .
- the base 17 is on the bearing surface 100 . That is, the projection module 10 , the first camera module 11 and the second camera module 12 are disposed above the base 17 .
- the projection module 10 is spaced from the bearing surface 100 bearing the projection device 1 for a first distance Z 1
- the first camera module 11 is spaced from the bearing surface 100 for a second distance Z 2
- the second camera module 12 is spaced from the bearing surface 100 for a third distance Z 3 .
- the first distance Z 1 , the second distance Z 2 and the third distance Z 3 range from 350 mm to 450 mm.
- the projection module 10 , the first camera module 11 and the second camera module 12 are located on the same reference plane RP.
- the height (the first distance Z 1 ) of the projection module 10 with respect to the bearing surface 100 , the height (the second distance Z 2 ) of the first camera module 11 with respect to the bearing surface 100 and the height (the third distance Z 3 ) of the second camera module 12 with respect to the bearing surface 100 are equal.
- the invention is not limited thereto.
- the first distance Z 1 , the second distance Z 2 and the third distance Z 3 are unequal.
- the projection module 10 is spaced from the first camera module 11 for a first gap D 1 .
- the first gap D 1 ranges from 160 mm to 170 mm.
- the projection module 10 is spaced from the second camera module 12 for a second gap D 2 .
- the second gap D 2 ranges from 110 mm to 120 mm.
- the invention is not limited thereto.
- the first gap D 1 between the projection module 10 and the first camera module 11 ranges from 110 mm to 120 mm
- the second gap D 2 between the projection module 10 and the second camera module 12 ranges from 160 mm to 170 mm.
- the second shooting area CA 2 formed on the bearing surface 100 by the second camera module 12 is quadrilateral, and the length of its long side near the base 17 is 2X 1 .
- the projection device 1 of this embodiment further includes a baseline L perpendicular to the reference plane RP.
- the first optical axis AX 1 of the projection module 10 is parallel to the baseline L.
- the second optical axis AX 2 of the first camera module 11 forms a first angle Δθ1 with respect to the baseline L .
- the first angle Δθ1 ranges from 3 degrees to 5 degrees.
- the third optical axis AX 3 of the second camera module 12 forms a second angle Δθ2 with respect to the baseline L .
- the second angle Δθ2 ranges from 3 degrees to 5 degrees.
- the projection direction of the projection module 10 is maintained, while the shooting directions of the first camera module 11 and the second camera module 12 are tilted by 3 to 5 degrees with respect to the projection direction of the projection module 10 to allow the projection area PA , the first shooting area CA 1 and the second shooting area CA 2 to at least partially overlap one another to form the overlapping area OA′ .
- the projection module 10 of this embodiment has a view angle θF1 ranging from 60 degrees to 70 degrees.
- the first camera module 11 has a view angle θF2 ranging from 60 degrees to 75 degrees.
- the second camera module 12 has a view angle θF3 ranging from 65 degrees to 75 degrees.
- the first angle Δθ1 between the second optical axis AX 2 of the first camera module 11 and the first optical axis AX 1 of the projection module 10 is equal to the second angle Δθ2 between the third optical axis AX 3 of the second camera module 12 and the first optical axis AX 1 of the projection module 10 .
- the invention is not limited thereto.
- the first angle Δθ1 and the second angle Δθ2 are different.
- the first angle Δθ1 is the angle between the second optical axis AX 2 of the first camera module 11 and the first optical axis AX 1 of the projection module 10
- the second angle Δθ2 is the angle between the third optical axis AX 3 of the second camera module 12 and the first optical axis AX 1 of the projection module 10
- the first angle Δθ1 is a function of the distance (the first gap D 1 ) between the projection module 10 and the first camera module 11 .
- the second angle Δθ2 is a function of the distance (the second gap D 2 ) between the projection module 10 and the second camera module 12 .
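The summary gives the analogous relationship for the second camera as Δθ2 = arctan((D2 + X1)/Z3) − arctan(X1/Z3), so the camera sitting closer to the projection module needs less tilt, all else being equal. A sketch of that comparison follows; the gap values are taken from the stated ranges, while equal camera heights and equal footprint half-lengths are assumptions made for the comparison:

```python
import math

def tilt_deg(gap_mm, half_side_mm, height_mm):
    """Required camera tilt (degrees) for a camera at gap_mm from the
    projection axis: arctan((gap + X) / Z) - arctan(X / Z)."""
    return math.degrees(math.atan((gap_mm + half_side_mm) / height_mm)
                        - math.atan(half_side_mm / height_mm))

d1, d2 = 165, 115   # first gap D1 and second gap D2 (sample values, mm)
x = x1 = 300        # assumed half-lengths of the long sides (mm)
z2 = z3 = 400       # assumed camera heights (mm)

# The second camera sits closer to the projection module, so Δθ2 < Δθ1.
print(tilt_deg(d2, x1, z3) < tilt_deg(d1, x, z2))  # → True
```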
- FIG. 6 is a block diagram of another embodiment of a projection device of the present invention.
- the projection device 1 c of this embodiment is similar to the projection device 1 of FIG. 1 .
- the projection device 1 c of this embodiment further includes a processing module 14 electrically connected to the projection module 10 and the first camera module 11 .
- the processing module 14 is configured to enable the projection module 10 and the first camera module 11 .
- the projection module 10 projects an image to the bearing surface 100 according to the image signals provided by the processing module 14 .
- the processing module 14 controls the projection module 10 to project another image according to the image captured by the first camera module 11 .
- the projection device 1 c further includes a camera driving module 18 electrically connected to the processing module 14 .
- the processing module 14 enables the camera driving module 18 to drive the first camera module 11 to rotate to a specific angle on the Y-Z plane (for example, the first angle Δθ1 of the aforementioned embodiments).
- the camera driving module 18 includes at least one servo motor 181 and at least one gear set 182 . When the camera driving module 18 is enabled, the servo motor 181 rotates the gear set 182 so as to rotate the first camera module 11 to a corresponding angle on the Y-Z plane.
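The servo-plus-gear chain can be pictured as follows. This is entirely hypothetical: the patent does not give a gear ratio, a servo step resolution, or any control interface, so the numbers below are placeholders:

```python
def servo_shaft_angle(camera_angle_deg, gear_ratio=3.0, step_deg=0.5):
    """Servo shaft angle producing the requested camera angle through the
    gear set, quantized to the servo's step resolution (all values assumed)."""
    raw = camera_angle_deg * gear_ratio      # the gear set multiplies the motion
    return round(raw / step_deg) * step_deg  # snap to the nearest servo step

# Rotating the first camera module to a 4-degree tilt:
print(servo_shaft_angle(4.0))  # → 12.0
```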
- FIG. 7 is a schematic diagram of another embodiment of a projection device of the present invention.
- the projection device 1 d is similar to the projection device 1 b of FIG. 4 .
- the first optical axis AX 1 of the projection module 10 of the projection device 1 d forms a third angle Δθ3 with respect to the baseline L .
- the third angle Δθ3 between the first optical axis AX 1 of the projection module 10 and the baseline L ranges from 0 degrees to 30 degrees.
- the overlapping area formed by the projection area of the projection module 10 , the first shooting area of the first camera module 11 and the second shooting area of the second camera module 12 is effectively increased.
- Other structures of the projection device 1 d are similar to that of the projection device 1 b of FIG. 4 , and the description for them is thus omitted.
- the projection area of the projection module 10 , the first shooting area of the first camera module 11 , the second shooting area of the second camera module 12 and the overlapping area formed by the projection area at least partially overlapping the first shooting area and the second shooting area are similar to that of FIG. 5 , and the description for them is thus omitted.
- FIG. 8 is a schematic diagram of another embodiment of a projection device of the present invention
- FIG. 9 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 8
- the projection device 1 e is similar to the projection device 1 b of FIG. 4
- in the projection device 1 e of this embodiment, the projection module 10 , the first camera module 11 and the second camera module 12 are located on the same reference plane RP .
- the first optical axis AX 1 of the projection module 10 , the second optical axis AX 2 of the first camera module 11 and the third optical axis AX 3 of the second camera module 12 are perpendicular to the reference plane RP .
- the projection area PA of the projection module 10 , the first shooting area CA 1 of the first camera module 11 and the second shooting area CA 2 of the second camera module 12 at least partially overlap one another to form an overlapping area OA′′ which is substantially quadrilateral.
- the projection module 10 has a projection element with a smaller view angle so as to form a smaller projection area PA and increase the overlapping area OA′′ formed by the projection area PA of the projection module 10 , the first shooting area CA 1 of the first camera module 11 and the second shooting area CA 2 of the second camera module 12 .
- FIG. 10 is a schematic diagram of another embodiment of a projection device of the present invention
- FIG. 11 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 10
- the projection device 1 f is similar to the projection device 1 e of FIG. 8
- the first camera module 11 of the projection device 1 f of this embodiment is connected to a first side 101 of the projection module 10
- the second camera module 12 is connected to a second side 102 of the projection module 10 .
- the projection area PA of the projection module 10 , the first shooting area CA 1 of the first camera module 11 and the second shooting area CA 2 of the second camera module 12 at least partially overlap one another to form an overlapping area OA′′′ which is substantially quadrilateral.
- the second optical axis of the first camera module and the third optical axis of the second camera module are inclined with respect to the first optical axis of the projection module to enlarge the overlapping area formed by the projection area, the first shooting area and the second shooting area so as to improve operational performance.
Description
- The present invention relates to a projection device, and more particularly to a projection device with functions of operation-detection.
- With the development of technologies in the projector industry, the size of projection modules has been significantly reduced. Thus, in recent years, projection modules have been gradually integrated into other electronic products, such as interactive electronic products.
- In various interactive electronic products, for example, a projector has a camera capable of detecting infrared light, and uses an infrared light emitting module to generate an infrared curtain over the display surface. When the infrared curtain is blocked by an object (e.g., a user's finger), reflection spots of infrared light are generated. The reflection spots on the display screen can be captured by the camera capable of detecting infrared light, and control instructions are performed according to the positions of the reflection spots to enable the projector to project various images. In addition, a color camera can also be used to capture and recognize a user's gesture so as to control the projector to project different images.
- As seen from the above, how to improve the detection ability and controllability of an interactive projector has been the focus among the persons skilled in the technical field.
- An objective of the present invention is to provide a projection device with improved detection and operation performances.
- Other objectives and advantages of the present invention can be further understood by the technical features disclosed by the invention.
- To achieve the objectives, an embodiment of a projection device of the invention includes a projection module and a first camera module. The projection module includes a first optical axis and is configured to form a projection area, wherein a projection of the first optical axis on an X-Z plane of the projection device is perpendicular to an X-Y plane on which the projection area is formed. The first camera module is disposed on a side of the projection module and includes a second optical axis, wherein the first camera module is configured to form a first shooting area, and the second optical axis forms a first angle Δθ1 with respect to the first optical axis. The projection area at least partially overlaps the first shooting area to form an overlapping area, and the first angle Δθ1 is a function of the distance between the projection module and the first camera module.
- In another embodiment, the projection module projects an image to a bearing surface bearing the projection device to form the projection area. The projection device further includes a base on the bearing surface; the projection module and the first camera module are on a side of the base. The bearing surface has a first distance Z1 to the projection module and a second distance Z2 to the first camera module. A first gap D1 is formed between the projection module and the first camera module. The first camera module shoots towards the bearing surface to form the first shooting area on the bearing surface when Δθ1=0. The first shooting area is quadrilateral and includes two long sides and two wide sides, and a length of the long side near the base is 2X.
- In another embodiment, the projection device further includes a reference plane. The projection module and the first camera module are disposed on the reference plane.
- In another embodiment, the projection device further includes a baseline perpendicular to the reference plane. The first optical axis of the projection module is parallel to the baseline, and the second optical axis of the first camera module forms the first angle Δθ1 with respect to the baseline.
- In another embodiment, a value of the first angle Δθ1 is the function of the distance between the projection module and the first camera module, and Δθ1=ƒ(D1)=arctan((D1+X)/Z2)−arctan(X/Z2) when the projection area of the projection module is entirely included in the first shooting area.
- In another embodiment, the first camera module is a color camera module.
- In another embodiment, the projection device further includes a light emitting module configured to form a sensing area. The first camera module shoots movements of a user occurring in the sensing area, and the first shooting area covers the sensing area.
- In another embodiment, the first camera module is an infrared camera module and the light emitting module is an infrared emitting module.
- In another embodiment, the projection device further comprises a light emitting module configured to form a sensing area, wherein the first camera module shoots movements of a user occurring in the sensing area, and the first shooting area covers the sensing area.
- In another embodiment, the first camera module is an infrared camera module, and the light emitting module is an infrared emitting module.
- In another embodiment, the projection device further includes a processing module electrically connected to the projection module and the first camera module. The processing module is configured to enable the projection module and the first camera module.
- In another embodiment, the projection device further includes a camera driving module electrically connected to the processing module. The processing module enables the camera driving module to drive the first camera module to rotate to a specific angle on a Y-Z plane when the projection module rotates to the specific angle on the Y-Z plane.
- In another embodiment, the camera driving module includes at least one servo motor and a gear set.
- In another embodiment, the projection device further includes a second camera module and a light emitting module. The second camera module is disposed between the projection module and the first camera module, wherein the second camera module includes a third optical axis and is configured to form a second shooting area, and the third optical axis forms a second angle Δθ2 with respect to the first optical axis. The second shooting area, the first shooting area and the projection area at least partially overlap one another to form the overlapping area. The light emitting module is configured to form a sensing area, wherein the second camera module is configured to shoot movements of a user occurring in the sensing area, and the second shooting area covers the sensing area.
- In another embodiment, the projection module projects an image onto a bearing surface bearing the projection device to form the projection area. The projection device further includes a base on the bearing surface; the projection module and the first camera module are on a side of the base. The bearing surface has a first distance Z1 to the projection module and a second distance Z2 to the first camera module. The second camera module has a third distance Z3 to the bearing surface. A first gap D1 is formed between the projection module and the first camera module, and a second gap D2 is formed between the projection module and the second camera module. The second camera module shoots towards the bearing surface to form the second shooting area on the bearing surface when the second angle Δθ2=0. The second shooting area is quadrilateral and includes two long sides and two short sides; a length of the long side near the base is 2X1.
- In another embodiment, the projection device further includes a reference plane. The projection module, the first camera module and the second camera module are disposed on the reference plane.
- In another embodiment, the projection device further includes a baseline perpendicular to the reference plane. The second optical axis of the first camera module forms the first angle Δθ1 with respect to the baseline, the third optical axis of the second camera module forms the second angle Δθ2 with respect to the baseline, and the first optical axis of the projection module forms a third angle Δθ3 with respect to the baseline. When the projection area of the projection module is entirely included in the first shooting area, a value of the first angle Δθ1=ƒ(D1)=arctan((D1+X)/Z2)−arctan(X/Z2). When the projection area of the projection module is entirely included in the second shooting area, a value of the second angle Δθ2=ƒ(D2)=arctan((D2+X1)/Z3)−arctan(X1/Z3). The third angle Δθ3 falls within the range of 0 to 30 degrees.
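The angle relations above can be evaluated numerically. The following is a minimal sketch of Δθ=ƒ(D)=arctan((D+X)/Z)−arctan(X/Z); the function name and the sample values for the gap, near-edge distance and mounting height are illustrative assumptions, not values prescribed by the claims.

```python
import math

def tilt_angle_deg(gap_mm, near_edge_mm, height_mm):
    """Camera tilt Dtheta = arctan((D + X)/Z) - arctan(X/Z):
    the tilt needed so that a projection area offset by gap D still
    falls inside a shooting area whose near long side lies X from the
    camera's foot point, with the camera mounted at height Z."""
    return math.degrees(
        math.atan((gap_mm + near_edge_mm) / height_mm)
        - math.atan(near_edge_mm / height_mm))

# Hypothetical sample values: D1 = 165 mm, X = 800 mm, Z2 = 400 mm.
print(round(tilt_angle_deg(165, 800, 400), 2))
```

The same function gives Δθ2 when called with D2, X1 and Z3 in place of D1, X and Z2. With the sample values above, the result falls inside the 3-to-5-degree range mentioned in the embodiments below.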
- In another embodiment, the first camera module is a color camera module, the second camera module is an infrared camera module, and the light emitting module is an infrared emitting module.
- In another embodiment, a projection device includes a projection module and a first camera module. The projection module includes a first optical axis and is configured to form a projection area. The first camera module is disposed on a first side of the projection module and includes a second optical axis. The first camera module and the projection module are disposed on a reference plane, and the first optical axis and the second optical axis are perpendicular to the reference plane. The first camera module is configured to form a first shooting area, and the projection area at least partially overlaps the first shooting area to form an overlapping area.
- In another embodiment, the projection device further includes a second camera module and a light emitting module. The second camera module is disposed on a second side opposite to the first side of the projection module and on the reference plane. The second camera module includes a third optical axis and is configured to form a second shooting area. The third optical axis is perpendicular to the reference plane. The projection area, the first shooting area and the second shooting area at least partially overlap one another to form the overlapping area. The light emitting module is configured to form a sensing area. The second camera module is configured to shoot movements of a user occurring in the sensing area, and the second shooting area covers the sensing area.
- In another embodiment, the first camera module is connected to the first side of the projection module, and the second camera module is connected to the second side of the projection module.
- In another embodiment, the first camera module is a color camera module, the second camera module is an infrared camera module, and the light emitting module is an infrared emitting module.
- The present invention will become more readily apparent to those ordinarily skilled in the art after reviewing the following detailed description and accompanying drawings, in which:
-
FIG. 1 is a schematic diagram of an embodiment of a projection device of the present invention; -
FIG. 2 is a schematic diagram of a projection area of a projection module and a shooting area of the first camera module of the projection device of FIG. 1 ; -
FIG. 3 is a schematic diagram of another embodiment of a projection device of the present invention; -
FIG. 4 is a schematic diagram of another embodiment of a projection device of the present invention; -
FIG. 5 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 4 ; -
FIG. 6 is a block diagram of another embodiment of a projection device of the present invention; -
FIG. 7 is a schematic diagram of another embodiment of a projection device of the present invention; -
FIG. 8 is a schematic diagram of another embodiment of a projection device of the present invention; -
FIG. 9 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 8 ; -
FIG. 10 is a schematic diagram of another embodiment of a projection device of the present invention; and -
FIG. 11 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 10 . - The present invention will now be described more specifically with reference to the following embodiments. It is to be noted that the following descriptions of preferred embodiments of this invention are presented herein for purpose of illustration and description only. It is not intended to be exhaustive or to be limited to the precise form disclosed.
- Referring to
FIGS. 1 and 2 . FIG. 1 is a schematic diagram of an embodiment of a projection device of the present invention, and FIG. 2 is a schematic diagram of a projection area of a projection module and a shooting area of the first camera module of the projection device of FIG. 1 . The projection device 1 of this embodiment includes a projection module 10 and a first camera module 11. The projection module 10 includes a first optical axis AX1, and the projection module 10 is configured to form a projection area PA. The projection module 10 projects an image onto a bearing surface 100 bearing the projection device 1 to form the projection area PA. A projection of the first optical axis AX1 on a projection plane formed by an X-axis and a Z-axis is perpendicular to the projection area PA, which is on a plane formed by the X-axis and a Y-axis. The first camera module 11 is disposed on a side of the projection module 10 and has a second optical axis AX2. The first camera module 11 is configured to form a first shooting area CA1 on the bearing surface 100. In this embodiment, the first camera module 11 is a color camera module, but the invention is not limited thereto. The color camera module captures a user's gestures or operational movements on a mouse or a keyboard in the first shooting area CA1 so that the projection module 10 is controlled to project different images. In this embodiment, the second optical axis AX2 forms a first angle Δθ1 with respect to the first optical axis AX1. The projection area PA of the projection module 10 at least partially overlaps the first shooting area CA1 of the first camera module 11 to form an overlapping area OA. The first angle Δθ1 between the first optical axis AX1 and the second optical axis AX2 is a function of a distance (a first gap D1) between the projection module 10 and the first camera module 11. - Other detailed structures of the
projection device 1 of the embodiment are described as follows. - As shown in
FIGS. 1 and 2 , the projection module 10 and the first camera module 11 are disposed over the bearing surface 100. The first camera module 11 is disposed in the housing 15, and the projection module 10 is disposed on a side of the housing 15. The housing 15 is connected to a base 17 through a frame 16, and the base 17 is disposed on the bearing surface 100. The projection module 10 and the first camera module 11 are disposed above the base 17. In this embodiment, the bearing surface 100 has a first distance Z1 to the projection module 10 and a second distance Z2 to the first camera module 11. The first distance Z1 and the second distance Z2 range from 350 mm to 450 mm. In this embodiment, the projection module 10 and the first camera module 11 are disposed on the same reference plane RP. The reference plane RP is parallel to the bearing surface 100. That is, a height (the first distance Z1) of the projection module 10 with respect to the bearing surface 100 is equal to a height (the second distance Z2) of the first camera module 11 with respect to the bearing surface 100. However, the invention is not limited thereto. In another embodiment, the first distance Z1 is not equal to the second distance Z2. In addition, the first gap D1 is formed between the first optical axis AX1 of the projection module 10 and the second optical axis AX2 of the first camera module 11. In this embodiment, the first gap D1 ranges from 160 mm to 170 mm, but the invention is not limited thereto. In addition, in this embodiment, the first shooting area CA1 formed on the bearing surface 100 by the first camera module 11 is quadrilateral and has a long side X near the base 17. - As shown in
FIGS. 1 and 2 , the projection device 1 of this embodiment further includes a baseline L perpendicular to the reference plane RP. In this embodiment, the first optical axis AX1 of the projection module 10 is parallel to the baseline L. The second optical axis AX2 has the first angle Δθ1 with respect to the baseline L. The first angle Δθ1 ranges from 3 degrees to 5 degrees. That is, the projection direction of the projection module 10 is maintained, but the shooting direction of the first camera module 11 is shifted by 3 degrees to 5 degrees with respect to the projection direction of the projection module 10 to allow the projection area PA to at least partially overlap the first shooting area CA1 of the first camera module 11 to form the overlapping area OA. In addition, the projection module 10 has a view angle θF1 ranging from 60 degrees to 70 degrees. The first camera module 11 has a view angle θF2 ranging from 60 degrees to 75 degrees. - Particularly, since the first optical axis AX1 of the
projection module 10 is parallel to the baseline L, the first angle Δθ1 is the angle between the second optical axis AX2 of the first camera module 11 and the first optical axis AX1 of the projection module 10. In this embodiment, the first angle Δθ1 is a function of the distance (the first gap D1) between the projection module 10 and the first camera module 11. When the projection area PA of the projection module 10 is entirely included in the first shooting area CA1 of the first camera module 11, -
Δθ1=ƒ(D1)=arctan((D1+X)/Z2)−arctan(X/Z2). - Referring to
FIG. 3 . FIG. 3 is a schematic diagram of another embodiment of a projection device of the present invention. As shown in FIG. 3 , the projection device 1 a is similar to the projection device 1 of FIG. 1 . However, the projection device 1 a of this embodiment further includes a light emitting module 13. In this embodiment, the light emitting module 13 is an infrared emitting module, and the first camera module 11 is an infrared camera module. The light emitting module 13 is configured to form a sensing area above the bearing surface 100 (not shown in FIG. 3 ). For example, the sensing area is an infrared curtain. The first shooting area of the first camera module 11 (similar to the first shooting area CA1 of FIG. 2 ) includes the sensing area, and a user's operational movements are captured in the sensing area. When the user's fingers enter the sensing area, they reflect light to generate a reflected light spot (such as a reflected infrared spot), and the first camera module 11 captures an image including the reflected light spot. The position of the reflected light spot is identified to perform corresponding operational commands to enable the projection module 10 to project different images. As other structures of the projection device 1 a are similar to those of the projection device 1 of FIG. 1 , the description is thus omitted here. In addition, the projection area of the projection module 10, the first shooting area of the first camera module 11 and the overlapping area formed by the projection area at least partially overlapping the first shooting area are similar to those of FIG. 2 , and the description for them is thus omitted. - Referring to
FIGS. 4 and 5 . FIG. 4 is a schematic diagram of another embodiment of a projection device of the present invention, and FIG. 5 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 4 . As shown in FIGS. 4 and 5 , the projection device 1 b is similar to the projection device 1 of FIG. 1 . However, the projection device 1 b of this embodiment further includes a second camera module 12 and a light emitting module 13. The second camera module 12 is disposed between the projection module 10 and the first camera module 11. The second camera module 12 has a third optical axis AX3 to form a second shooting area CA2. The light emitting module 13 is configured to form a sensing area (not shown in FIGS. 4 and 5 ). For example, the sensing area is an infrared curtain. The second shooting area CA2 of the second camera module 12 includes the sensing area, and a user's operational movements are captured in the sensing area. In this embodiment, for example, the first camera module 11 is a color camera module, the second camera module 12 is an infrared camera module and the light emitting module 13 is an infrared emitting module. However, the invention is not limited thereto. The projection device 1 b is operated by gestures captured by the color camera module or touch-controlled by movements captured by the infrared camera module and the infrared emitting module. In this embodiment, the second optical axis AX2 of the first camera module 11 has the first angle Δθ1 with respect to the first optical axis AX1 of the projection module 10, and the third optical axis AX3 of the second camera module 12 has the second angle Δθ2 with respect to the first optical axis AX1 of the projection module 10.
The projection area PA of the projection module 10, the first shooting area CA1 of the first camera module 11 and the second shooting area CA2 of the second camera module 12 at least partially overlap one another to form an overlapping area OA′. - Referring to
FIGS. 4 and 5 . The projection module 10, the first camera module 11 and the second camera module 12 are disposed above the bearing surface 100 of the projection device 1. The first camera module 11 and the second camera module 12 are disposed within the housing 15, and the projection module 10 is disposed on a side of the housing 15. The housing 15 is connected to the base 17 through the frame 16. The base 17 is on the bearing surface 100. That is, the projection module 10, the first camera module 11 and the second camera module 12 are disposed above the base 17. In this embodiment, the projection module 10 is spaced from the bearing surface 100 bearing the projection device 1 by a first distance Z1, the first camera module 11 is spaced from the bearing surface 100 by a second distance Z2, and the second camera module 12 is spaced from the bearing surface 100 by a third distance Z3. The first distance Z1, the second distance Z2 and the third distance Z3 range from 350 mm to 450 mm. In this embodiment, the projection module 10, the first camera module 11 and the second camera module 12 are located on the same reference plane RP. That is, the height (the first distance Z1) of the projection module 10 with respect to the bearing surface 100, the height (the second distance Z2) of the first camera module 11 with respect to the bearing surface 100 and the height (the third distance Z3) of the second camera module 12 with respect to the bearing surface 100 are equal. However, the invention is not limited thereto. In another embodiment, the first distance Z1, the second distance Z2 and the third distance Z3 are unequal. In addition, the projection module 10 is spaced from the first camera module 11 by a first gap D1. In this embodiment, the first gap D1 ranges from 160 mm to 170 mm. The projection module 10 is spaced from the second camera module 12 by a second gap D2. In this embodiment, the second gap D2 ranges from 110 mm to 120 mm. However, the invention is not limited thereto.
In another embodiment, the first gap D1 between the projection module 10 and the first camera module 11 ranges from 110 mm to 120 mm, and the second gap D2 between the projection module 10 and the second camera module 12 ranges from 160 mm to 170 mm. In addition, in this embodiment, the second shooting area CA2 formed on the bearing surface 100 by the second camera module 12 is quadrilateral and has a long side X1 near the base 17. - As shown in
FIGS. 4 and 5 , the projection device 1 of this embodiment further includes a baseline L perpendicular to the reference plane RP. In this embodiment, the first optical axis AX1 of the projection module 10 is parallel to the baseline L. The second optical axis AX2 of the first camera module 11 forms a first angle Δθ1 with respect to the baseline L. The first angle Δθ1 ranges from 3 degrees to 5 degrees. The third optical axis AX3 of the second camera module 12 forms a second angle Δθ2 with respect to the baseline L. The second angle Δθ2 ranges from 3 degrees to 5 degrees. That is, the projection direction of the projection module 10 is maintained, but the shooting direction of the first camera module 11 is shifted by 3 degrees to 5 degrees with respect to the projection direction of the projection module 10 to allow the projection area PA, the first shooting area CA1 and the second shooting area CA2 to at least partially overlap one another to form the overlapping area OA′. In addition, the projection module 10 of this embodiment has a view angle θF1 ranging from 60 degrees to 70 degrees. The first camera module 11 has a view angle θF2 ranging from 60 degrees to 75 degrees. The second camera module 12 has a view angle θF3 ranging from 65 degrees to 75 degrees. - In this embodiment, the first angle Δθ1 between the second optical axis AX2 of the
first camera module 11 and the first optical axis AX1 of the projection module 10 is equal to the second angle Δθ2 between the third optical axis AX3 of the second camera module 12 and the first optical axis AX1 of the projection module 10. However, the invention is not limited thereto. In another embodiment, the first angle Δθ1 and the second angle Δθ2 are different. - Particularly, since the first optical axis AX1 of the
projection module 10 is parallel to the baseline L, the first angle Δθ1 is the angle between the second optical axis AX2 of the first camera module 11 and the first optical axis AX1 of the projection module 10, and the second angle Δθ2 is the angle between the third optical axis AX3 of the second camera module 12 and the first optical axis AX1 of the projection module 10. In this embodiment, the first angle Δθ1 is a function of the distance (the first gap D1) between the projection module 10 and the first camera module 11. When the projection area PA of the projection module 10 is entirely included in the first shooting area CA1 of the first camera module 11, Δθ1=ƒ(D1)=arctan((D1+X)/Z2)−arctan(X/Z2). The second angle Δθ2 is a function of the distance (the second gap D2) between the projection module 10 and the second camera module 12. When the projection area PA of the projection module 10 is entirely included in the second shooting area CA2 of the second camera module 12, -
Δθ2=ƒ(D2)=arctan((D2+X1)/Z3)−arctan(X1/Z3). - Referring to
FIG. 6 . FIG. 6 is a block diagram of another embodiment of a projection device of the present invention. As shown in FIG. 6 , the projection device 1 c of this embodiment is similar to the projection device 1 of FIG. 1 . However, the projection device 1 c of this embodiment further includes a processing module 14 electrically connected to the projection module 10 and the first camera module 11. The processing module 14 is configured to enable the projection module 10 and the first camera module 11. For example, the projection module 10 projects an image onto the bearing surface 100 according to the image signals provided by the processing module 14. When a user operates with a gesture or touch control, the processing module 14 controls the projection module 10 to project another image according to the image captured by the first camera module 11. - As shown in
FIG. 6 , the projection device 1 c further includes a camera driving module 18 electrically connected to the processing module 14. When the projection module 10 rotates to an angle and is enabled on the Y-Z plane (for example, the plane perpendicular to the bearing surface 100 or the plane not parallel to the bearing surface 100), the processing module 14 enables the camera driving module 18 to drive the first camera module 11 to rotate to a specific angle on the Y-Z plane (for example, the first angle Δθ1 of the aforementioned embodiments). In addition, the camera driving module 18 includes at least one servo motor 181 and at least one gear set 182. When the camera driving module 18 is enabled, the servo motor 181 rotates the gear set 182 so as to rotate the first camera module 11 to a corresponding angle on the Y-Z plane. - Referring to
FIG. 7 . FIG. 7 is a schematic diagram of another embodiment of a projection device of the present invention. As shown in FIG. 7 , the projection device 1 d is similar to the projection device 1 b of FIG. 4 . However, the first optical axis AX1 of the projection module 10 of the projection device 1 d forms a third angle Δθ3 with respect to the baseline L. In this embodiment, the third angle Δθ3 between the first optical axis AX1 of the projection module 10 and the baseline L ranges from 0 degrees to 30 degrees. When the projection direction, the shooting direction of the first camera module 11 and the shooting direction of the second camera module 12 are inclined simultaneously, the overlapping area formed by the projection area of the projection module 10, the first shooting area of the first camera module 11 and the second shooting area of the second camera module 12 is effectively increased. Other structures of the projection device 1 d are similar to those of the projection device 1 b of FIG. 4 , and the description for them is thus omitted. In addition, the projection area of the projection module 10, the first shooting area of the first camera module 11, the second shooting area of the second camera module 12 and the overlapping area formed by the projection area at least partially overlapping the first shooting area and the second shooting area are similar to those of FIG. 5 , and the description for them is thus omitted. - Referring to
FIGS. 8 and 9 . FIG. 8 is a schematic diagram of another embodiment of a projection device of the present invention, and FIG. 9 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 8 . As shown in FIGS. 8 and 9 , the projection device 1 e is similar to the projection device 1 b of FIG. 4 . However, in the projection device 1 e of this embodiment, the projection module 10, the first camera module 11 and the second camera module 12 are located on the same reference plane RP, and the first optical axis AX1 of the projection module 10, the second optical axis AX2 of the first camera module 11 and the third optical axis AX3 of the second camera module 12 are perpendicular to the reference plane RP. In such a structure, the projection area PA of the projection module 10, the first shooting area CA1 of the first camera module 11 and the second shooting area CA2 of the second camera module 12 at least partially overlap one another to form an overlapping area OA″ which is substantially quadrilateral. In this embodiment, the projection module 10 has a projection element with a smaller view angle so as to form a smaller projection area PA and increase the overlapping area OA″ formed by the projection area PA of the projection module 10, the first shooting area CA1 of the first camera module 11 and the second shooting area CA2 of the second camera module 12. - Referring to
FIGS. 10 and 11 . FIG. 10 is a schematic diagram of another embodiment of a projection device of the present invention, and FIG. 11 is a schematic diagram of a projection area of a projection module, a shooting area of the first camera module and a shooting area of the second camera module of the projection device of FIG. 10 . As shown in FIGS. 10 and 11 , the projection device 1 f is similar to the projection device 1 e of FIG. 8 . However, the first camera module 11 of the projection device 1 f of this embodiment is connected to a first side 101 of the projection module 10, and the second camera module 12 is connected to a second side 102 of the projection module 10. That is, no gap is formed between the first camera module 11 and the projection module 10, and no gap is formed between the second camera module 12 and the projection module 10. In such a structure, the projection area PA of the projection module 10, the first shooting area CA1 of the first camera module 11 and the second shooting area CA2 of the second camera module 12 at least partially overlap one another to form an overlapping area OA′″ which is substantially quadrilateral. - In the structure of the projection device of the present invention, the second optical axis of the first camera module and the third optical axis of the second camera module are inclined with respect to the first optical axis of the projection module to enlarge the overlapping area formed by the projection area, the first shooting area and the second shooting area so as to improve operational performance.
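The effect summarized above, that tilting an optical axis shifts a module's coverage toward the projection area, can be sketched with elementary trigonometry. The following is an illustrative sketch only; the function name and the height, view-angle and tilt values are assumptions chosen to be representative of the ranges mentioned in the embodiments, not values prescribed by them.

```python
import math

def footprint_edges(height_mm, view_angle_deg, tilt_deg=0.0):
    """Near and far edges (in mm, measured from the point directly
    below the module) of the area a module covers on the bearing
    surface, given its mounting height, full view angle, and the tilt
    of its optical axis toward the positive direction."""
    half = view_angle_deg / 2.0
    near = height_mm * math.tan(math.radians(tilt_deg - half))
    far = height_mm * math.tan(math.radians(tilt_deg + half))
    return near, far

# Height Z2 = 400 mm, view angle 70 degrees (representative values).
print(footprint_edges(400, 70))     # untilted: symmetric about the foot point
print(footprint_edges(400, 70, 4))  # tilted 4 degrees: coverage shifts toward
                                    # the projection area, enlarging the overlap
```

Both edges move in the tilt direction, which is why a 3-to-5-degree shift of the camera's shooting direction lets the first shooting area cover the offset projection area.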
- While the invention has been described in terms of what is presently considered to be the most practical and preferred embodiments, it is to be understood that the invention need not be limited to the disclosed embodiment. On the contrary, it is intended to cover various modifications and similar arrangements included within the spirit and scope of the appended claims, which are to be accorded the broadest interpretation so as to encompass all such modifications and similar structures.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/595,965 US10437140B2 (en) | 2016-05-24 | 2017-05-16 | Projection device with camera module |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662341053P | 2016-05-24 | 2016-05-24 | |
US201662361477P | 2016-07-12 | 2016-07-12 | |
US201662361470P | 2016-07-12 | 2016-07-12 | |
US201662370682P | 2016-08-03 | 2016-08-03 | |
US15/595,965 US10437140B2 (en) | 2016-05-24 | 2017-05-16 | Projection device with camera module |
Publications (2)
Publication Number | Publication Date |
---|---|
US20170347078A1 true US20170347078A1 (en) | 2017-11-30 |
US10437140B2 US10437140B2 (en) | 2019-10-08 |
Family
ID=60417952
Family Applications (8)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/594,676 Active US10750144B2 (en) | 2016-05-24 | 2017-05-15 | Smart lighting device and operation mode transforming method of a smart lighting device for switching between a first operation mode and second operation mode |
US15/594,674 Active 2037-08-14 US10481475B2 (en) | 2016-05-24 | 2017-05-15 | Smart lighting device and control method thereof |
US15/594,677 Active 2037-12-30 US10338460B2 (en) | 2016-05-24 | 2017-05-15 | Projection apparatus |
US15/594,671 Active 2037-10-06 US10719001B2 (en) | 2016-05-24 | 2017-05-15 | Smart lighting device and control method thereof |
US15/595,961 Active 2037-09-15 US11048332B2 (en) | 2016-05-24 | 2017-05-16 | Picture selection method of projection touch |
US15/595,962 Abandoned US20170344190A1 (en) | 2016-05-24 | 2017-05-16 | Computer system having sensing function |
US15/595,965 Active 2037-12-22 US10437140B2 (en) | 2016-05-24 | 2017-05-16 | Projection device with camera module |
US16/427,363 Active 2037-06-13 US11385720B2 (en) | 2016-05-24 | 2019-05-31 | Picture selection method of projection touch |
Family Applications Before (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/594,676 Active US10750144B2 (en) | 2016-05-24 | 2017-05-15 | Smart lighting device and operation mode transforming method of a smart lighting device for switching between a first operation mode and second operation mode |
US15/594,674 Active 2037-08-14 US10481475B2 (en) | 2016-05-24 | 2017-05-15 | Smart lighting device and control method thereof |
US15/594,677 Active 2037-12-30 US10338460B2 (en) | 2016-05-24 | 2017-05-15 | Projection apparatus |
US15/594,671 Active 2037-10-06 US10719001B2 (en) | 2016-05-24 | 2017-05-15 | Smart lighting device and control method thereof |
US15/595,961 Active 2037-09-15 US11048332B2 (en) | 2016-05-24 | 2017-05-16 | Picture selection method of projection touch |
US15/595,962 Abandoned US20170344190A1 (en) | 2016-05-24 | 2017-05-16 | Computer system having sensing function |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/427,363 Active 2037-06-13 US11385720B2 (en) | 2016-05-24 | 2019-05-31 | Picture selection method of projection touch |
Country Status (3)
Country | Link |
---|---|
US (8) | US10750144B2 (en) |
CN (19) | CN107426554B (en) |
TW (8) | TWI653563B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10338460B2 (en) * | 2016-05-24 | 2019-07-02 | Compal Electronics, Inc. | Projection apparatus |
Families Citing this family (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107113380B (en) * | 2014-12-19 | 2020-03-06 | 惠普发展公司,有限责任合伙企业 | Computing system, method and machine-readable non-transitory storage medium |
US10754476B2 (en) * | 2016-08-25 | 2020-08-25 | Tactual Labs Co. | Systems and methods for ultrasonic, millimeter wave and hybrid sensing |
TWI671621B (en) | 2017-05-24 | 2019-09-11 | 仁寶電腦工業股份有限公司 | Electronic device and host base |
FR3075425A1 (en) * | 2017-12-14 | 2019-06-21 | Societe Bic | APPARATUS FOR ENHANCED REALITY APPLICATION |
CN108091123A (en) * | 2017-12-28 | 2018-05-29 | 科勒(中国)投资有限公司 | Project touch-control control device, sanitaryware and control method |
TWI687812B (en) * | 2018-03-19 | 2020-03-11 | 仁寶電腦工業股份有限公司 | Electronic device and operating method thereof |
TWI656362B (en) * | 2018-03-26 | 2019-04-11 | 仁寶電腦工業股份有限公司 | Electronic device and object rework method thereof |
JP6987347B2 (en) * | 2018-03-29 | 2021-12-22 | マクセル株式会社 | projector |
CN108712808A (en) * | 2018-04-04 | 2018-10-26 | 湖南城市学院 | A kind of intelligent environment artistic decoration lamp light control system |
JP7054362B2 (en) * | 2018-05-07 | 2022-04-13 | キヤノン株式会社 | Imaging devices, light emitting devices and their control methods, programs |
CN110712589A (en) * | 2018-07-13 | 2020-01-21 | 神讯电脑(昆山)有限公司 | Image capturing device for vehicle and setting method thereof |
CN108957927A (en) * | 2018-07-19 | 2018-12-07 | 苏州博学智能科技有限公司 | A kind of office projector |
JP7042464B2 (en) * | 2018-08-28 | 2022-03-28 | パナソニックIpマネジメント株式会社 | Lighting equipment group setting method and lighting system |
CN109377795A (en) * | 2018-09-27 | 2019-02-22 | 广东小天才科技有限公司 | Learning interaction method of a smart device, and the smart device |
FR3087721B1 (en) * | 2018-10-24 | 2021-07-30 | Valeo Vision | System and method for lighting a side region of a vehicle |
EP3948490A1 (en) * | 2019-03-26 | 2022-02-09 | Telefonaktiebolaget LM Ericsson (publ) | Determining a transformation between coordinate systems in an ultrasonic haptic device and a visual sensor device |
WO2020210841A1 (en) * | 2019-04-12 | 2020-10-15 | Daniel Seidel | Projection system with interactive exclusion zones and topological adjustment |
CN111857487A (en) * | 2019-04-26 | 2020-10-30 | 台达电子工业股份有限公司 | Method and device for setting direction of conversion direction key |
CN110162225A (en) * | 2019-05-05 | 2019-08-23 | 青岛小鸟看看科技有限公司 | A kind of projection lamp and the touch control method for projection lamp |
CN110223619A (en) * | 2019-06-11 | 2019-09-10 | 上海易视计算机科技股份有限公司 | Method for controlling projection, device, optical filter and optical projection system |
WO2021034681A1 (en) * | 2019-08-16 | 2021-02-25 | Bossa Nova Robotics Ip, Inc. | Systems and methods for image capture and shelf content detection |
CN110781339A (en) * | 2019-09-23 | 2020-02-11 | 厦门盈趣科技股份有限公司 | Rendering method, system, rendering device and storage medium |
CN110769164B (en) * | 2019-11-21 | 2020-09-18 | 天津九安医疗电子股份有限公司 | Method for automatically adjusting illumination level of target scene |
CN110928126B (en) * | 2019-12-09 | 2020-09-22 | 四川长虹电器股份有限公司 | Projection equipment capable of automatically adjusting brightness |
JP7330636B2 (en) * | 2020-01-27 | 2023-08-22 | 株式会社ディスコ | Method for adjusting brightness of illuminator in processing equipment |
KR102177384B1 (en) * | 2020-05-19 | 2020-11-12 | (주)웅진씽크빅 | System and method for supporting reading by linking additional content to book |
CN113781823B (en) * | 2020-06-09 | 2022-11-22 | 恒景科技股份有限公司 | Ambient light estimation system |
CN111796715A (en) * | 2020-06-24 | 2020-10-20 | 歌尔光学科技有限公司 | Detection method and detection device of touch control light film |
CN113889014B (en) * | 2020-07-02 | 2024-03-22 | 苏州佳世达电通有限公司 | Display system and control method |
CN112764300B (en) * | 2021-03-05 | 2022-03-22 | 深圳市火乐科技发展有限公司 | Optical machine module angle adjusting mechanism and projector |
JP7215513B2 (en) * | 2021-03-29 | 2023-01-31 | セイコーエプソン株式会社 | Lighting device with projector and lighting device |
CN113382174B (en) * | 2021-05-18 | 2022-09-27 | 北京优彩科技有限公司 | Method and device for adjusting light supplement lamp of ultraviolet imaging device |
CN116165904A (en) * | 2021-11-24 | 2023-05-26 | 昆山扬皓光电有限公司 | Projection environment control system and method |
TWI800188B (en) * | 2021-12-29 | 2023-04-21 | 群光電子股份有限公司 | Device and method for image capturing |
TWI822376B (en) * | 2022-10-05 | 2023-11-11 | 張啓川 | Smart lamp with color temperature and illuminance light source output information |
CN115790840B (en) * | 2023-02-10 | 2023-04-28 | 济宁市质量计量检验检测研究院(济宁半导体及显示产品质量监督检验中心、济宁市纤维质量监测中心) | Device and method for testing illumination of operation shadowless lamp |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020021287A1 (en) * | 2000-02-11 | 2002-02-21 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US20070025612A1 (en) * | 2004-03-31 | 2007-02-01 | Brother Kogyo Kabushiki Kaisha | Image input-and-output apparatus |
US20130241820A1 (en) * | 2012-03-13 | 2013-09-19 | Samsung Electronics Co., Ltd. | Portable projector and image projecting method thereof |
US20130271573A1 (en) * | 2011-09-30 | 2013-10-17 | Steinbichler Optotechnik Gmbh | Method and apparatus for determining the 3d coordinates of an object |
US20140046184A1 (en) * | 2011-03-30 | 2014-02-13 | Koninklijke Philips N.V. | Contactless sleep disorder screening system |
US20140139717A1 (en) * | 2011-07-29 | 2014-05-22 | David Bradley Short | Projection capture system, programming and method |
US20140184751A1 (en) * | 2012-12-27 | 2014-07-03 | Industrial Technology Research Institute | Device for acquiring depth image, calibrating method and measuring method therefor |
US20160103497A1 (en) * | 2014-10-08 | 2016-04-14 | Canon Kabushiki Kaisha | Information processing apparatus |
Family Cites Families (178)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5541820A (en) * | 1995-01-26 | 1996-07-30 | Mclaughlin; Michael K. | Combined lamp and movie projector |
GB9614837D0 (en) | 1996-07-12 | 1996-09-04 | Rank Xerox Ltd | Interactive desktop system with multiple image capture and display modes |
US6628283B1 (en) * | 2000-04-12 | 2003-09-30 | Codehorse, Inc. | Dynamic montage viewer |
US20020047047A1 (en) * | 2000-09-06 | 2002-04-25 | Paul Poloniewicz | Zero-footprint camera-based point-of-sale bar code presentation scanning system |
KR20030072591A (en) * | 2001-01-08 | 2003-09-15 | 브이케이비 인코포레이티드 | A data input device |
US7710391B2 (en) * | 2002-05-28 | 2010-05-04 | Matthew Bell | Processing an image utilizing a spatially varying pattern |
TWI242162B (en) | 2002-08-13 | 2005-10-21 | Kye Systems Corp | Finger-operated cursor input device |
US20040100563A1 (en) * | 2002-11-27 | 2004-05-27 | Sezai Sablak | Video tracking system and method |
CN1275123C (en) * | 2003-06-10 | 2006-09-13 | 仁宝电脑工业股份有限公司 | Method and apparatus for fast switching of operation modes for touch control equipment |
US7204596B2 (en) * | 2003-09-19 | 2007-04-17 | Nec Corporation | Projector with tilt angle measuring device |
CN1918532A (en) * | 2003-12-09 | 2007-02-21 | 雷阿卡特瑞克斯系统公司 | Interactive video window display system |
JP2005203859A (en) * | 2004-01-13 | 2005-07-28 | Olympus Corp | Projector |
US7379562B2 (en) * | 2004-03-31 | 2008-05-27 | Microsoft Corporation | Determining connectedness and offset of 3D objects relative to an interactive surface |
US7204428B2 (en) * | 2004-03-31 | 2007-04-17 | Microsoft Corporation | Identification of object on interactive display surface by identifying coded pattern |
US20050227217A1 (en) * | 2004-03-31 | 2005-10-13 | Wilson Andrew D | Template matching on interactive surface |
EP1757087A4 (en) * | 2004-04-16 | 2009-08-19 | James A Aman | Automatic event videoing, tracking and content generation system |
US7394459B2 (en) * | 2004-04-29 | 2008-07-01 | Microsoft Corporation | Interaction between objects and a virtual environment display |
US7397464B1 (en) * | 2004-04-30 | 2008-07-08 | Microsoft Corporation | Associating application states with a physical object |
US7134756B2 (en) * | 2004-05-04 | 2006-11-14 | Microsoft Corporation | Selectable projector and imaging modes of display table |
US7467380B2 (en) * | 2004-05-05 | 2008-12-16 | Microsoft Corporation | Invoking applications with virtual objects on an interactive display |
US7787706B2 (en) * | 2004-06-14 | 2010-08-31 | Microsoft Corporation | Method for controlling an intensity of an infrared source used to detect objects adjacent to an interactive display surface |
US7358962B2 (en) * | 2004-06-15 | 2008-04-15 | Microsoft Corporation | Manipulating association of data with a physical object |
US7593593B2 (en) * | 2004-06-16 | 2009-09-22 | Microsoft Corporation | Method and system for reducing effects of undesired signals in an infrared imaging system |
US7432917B2 (en) * | 2004-06-16 | 2008-10-07 | Microsoft Corporation | Calibration of an interactive display system |
US7168813B2 (en) * | 2004-06-17 | 2007-01-30 | Microsoft Corporation | Mediacube |
US7519223B2 (en) * | 2004-06-28 | 2009-04-14 | Microsoft Corporation | Recognizing gestures and using gestures for interacting with software applications |
US7743348B2 (en) * | 2004-06-30 | 2010-06-22 | Microsoft Corporation | Using physical objects to adjust attributes of an interactive display application |
US7576725B2 (en) * | 2004-10-19 | 2009-08-18 | Microsoft Corporation | Using clear-coded, see-through objects to manipulate virtual objects |
TWI262352B (en) * | 2005-03-10 | 2006-09-21 | Asustek Comp Inc | Luminant device |
CN100468190C (en) | 2005-03-15 | 2009-03-11 | 华硕电脑股份有限公司 | Luminous device |
US7570249B2 (en) * | 2005-03-30 | 2009-08-04 | Microsoft Corporation | Responding to change of state of control on device disposed on an interactive display surface |
US7499027B2 (en) * | 2005-04-29 | 2009-03-03 | Microsoft Corporation | Using a light pointer for input on an interactive display surface |
TWI275979B (en) * | 2005-05-20 | 2007-03-11 | Chung Shan Inst Of Science | Open virtual input and display device and method thereof |
US7525538B2 (en) * | 2005-06-28 | 2009-04-28 | Microsoft Corporation | Using same optics to image, illuminate, and project |
US7911444B2 (en) * | 2005-08-31 | 2011-03-22 | Microsoft Corporation | Input method for surface of interactive display |
US20070091434A1 (en) * | 2005-10-21 | 2007-04-26 | Hewlett-Packard Development Company, L.P. | Luminance adjustment |
US7614753B2 (en) * | 2005-10-31 | 2009-11-10 | Hewlett-Packard Development Company, L.P. | Determining an adjustment |
US8060840B2 (en) * | 2005-12-29 | 2011-11-15 | Microsoft Corporation | Orientation free user interface |
JP3953500B1 (en) * | 2006-02-07 | 2007-08-08 | シャープ株式会社 | Image projection method and projector |
US8180114B2 (en) * | 2006-07-13 | 2012-05-15 | Northrop Grumman Systems Corporation | Gesture recognition interface system with vertical display |
US9696808B2 (en) * | 2006-07-13 | 2017-07-04 | Northrop Grumman Systems Corporation | Hand-gesture recognition method |
CN101135930A (en) * | 2007-06-30 | 2008-03-05 | 上海序参量科技发展有限公司 | Projection-based food ordering system and realization method thereof |
CN101346023A (en) * | 2007-07-10 | 2009-01-14 | 精碟科技股份有限公司 | Illuminating apparatus |
US7978928B2 (en) * | 2007-09-18 | 2011-07-12 | Seiko Epson Corporation | View projection for dynamic configurations |
US8139110B2 (en) * | 2007-11-01 | 2012-03-20 | Northrop Grumman Systems Corporation | Calibration of a gesture recognition interface system |
US9377874B2 (en) * | 2007-11-02 | 2016-06-28 | Northrop Grumman Systems Corporation | Gesture recognition light and video image projector |
US20090213093A1 (en) * | 2008-01-07 | 2009-08-27 | Next Holdings Limited | Optical position sensor using retroreflection |
CN101630112B (en) * | 2008-07-14 | 2011-04-27 | 英华达股份有限公司 | Projector and operation method thereof |
CN101639204A (en) * | 2008-07-30 | 2010-02-03 | 和硕联合科技股份有限公司 | Illumination device and brightness control method thereof |
JP4793422B2 (en) * | 2008-10-10 | 2011-10-12 | ソニー株式会社 | Information processing apparatus, information processing method, information processing system, and information processing program |
KR101526995B1 (en) * | 2008-10-15 | 2015-06-11 | 엘지전자 주식회사 | Mobile terminal and method for controlling display thereof |
JP5444963B2 (en) * | 2008-11-26 | 2014-03-19 | セイコーエプソン株式会社 | projector |
US20100128112A1 (en) * | 2008-11-26 | 2010-05-27 | Samsung Electronics Co., Ltd | Immersive display system for interacting with three-dimensional content |
US8013904B2 (en) * | 2008-12-09 | 2011-09-06 | Seiko Epson Corporation | View projection matrix based high performance low latency display pipeline |
CN101749560B (en) * | 2008-12-12 | 2013-05-08 | 财团法人工业技术研究院 | Lighting system |
WO2010095952A1 (en) * | 2009-02-19 | 2010-08-26 | 3D Perception As | Method and device for measuring at least one of light intensity and colour in at least one modulated image |
US20100245264A1 (en) * | 2009-03-31 | 2010-09-30 | Arima Lasers Corp. | Optical Detection Apparatus and Method |
CN101858496B (en) * | 2009-04-07 | 2012-07-18 | 绎立锐光科技开发(深圳)有限公司 | Light source and control method thereof as well as projection system with same |
CN101895783A (en) | 2009-05-18 | 2010-11-24 | 华晶科技股份有限公司 | Detection device for stability of digital video camera and digital video camera |
JP5430254B2 (en) * | 2009-07-01 | 2014-02-26 | キヤノン株式会社 | Image display apparatus and control method thereof |
GB0920754D0 (en) * | 2009-11-27 | 2010-01-13 | Compurants Ltd | Inamo big book 1 |
JP5527327B2 (en) * | 2009-10-30 | 2014-06-18 | 日本電気株式会社 | Light emitting device, light source device, and projection display device |
TWI387700B (en) | 2009-11-13 | 2013-03-01 | I Chiun Precision Ind Co Ltd | Led lamp having projector device |
TWI439788B (en) * | 2010-01-04 | 2014-06-01 | Ind Tech Res Inst | System and method for projection correction |
CN101887207B (en) | 2010-03-08 | 2011-12-28 | 圆展科技股份有限公司 | Real object projector with image auxiliary data searching and display functions and method thereof |
IT1399161B1 (en) * | 2010-03-26 | 2013-04-11 | Seco S R L | LIGHTING DEVICE EQUIPPED WITH MEANS OF RECEPTION AND DIFFUSION OF MULTIMEDIA CONTENT. |
CN101833731A (en) * | 2010-04-30 | 2010-09-15 | 翁荣森 | Intelligent clothing matching system and method aiming at sale terminals |
US8751049B2 (en) * | 2010-05-24 | 2014-06-10 | Massachusetts Institute Of Technology | Kinetic input/output |
US8692178B2 (en) * | 2010-06-11 | 2014-04-08 | Industrial Technology Research Institute | Photosensitive control system, and method of operating thereof |
JP5477185B2 (en) * | 2010-06-17 | 2014-04-23 | セイコーエプソン株式会社 | Multi-projection system, projector, and image projection control method |
CN101907954B (en) * | 2010-07-02 | 2012-06-13 | 中国科学院深圳先进技术研究院 | Interactive projection system and interactive projection method |
CN102375614A (en) * | 2010-08-11 | 2012-03-14 | 扬明光学股份有限公司 | Output and input device as well as man-machine interaction system and method thereof |
US8434685B1 (en) * | 2010-09-14 | 2013-05-07 | Amazon Technologies, Inc. | Accessory devices configured to display supplemental content |
GB2486445B (en) * | 2010-12-14 | 2013-08-14 | Epson Norway Res And Dev As | Camera-based multi-touch interaction apparatus system and method |
CN102109745A (en) * | 2011-01-21 | 2011-06-29 | 鸿富锦精密工业(深圳)有限公司 | Projector with brightness adjustment function and method |
CN102096529A (en) * | 2011-01-27 | 2011-06-15 | 北京威亚视讯科技有限公司 | Multipoint touch interactive system |
JP2012177768A (en) * | 2011-02-25 | 2012-09-13 | Sanyo Electric Co Ltd | Projection type video display device |
US8669966B2 (en) * | 2011-02-25 | 2014-03-11 | Jonathan Payne | Touchscreen displays incorporating dynamic transmitters |
JP2012208926A (en) * | 2011-03-15 | 2012-10-25 | Nikon Corp | Detection device, input device, projector and electronic apparatus |
CN102169277A (en) | 2011-04-18 | 2011-08-31 | 鸿富锦精密工业(深圳)有限公司 | Projector |
TWI476364B (en) * | 2011-05-09 | 2015-03-11 | Lin Cho Yi | Detecting method and apparatus |
US20160070410A1 (en) * | 2011-05-09 | 2016-03-10 | Cho-Yi Lin | Display apparatus, electronic apparatus, hand-wearing apparatus and control system |
TWI473497B (en) * | 2011-05-18 | 2015-02-11 | Chip Goal Electronics Corp | Object tracking apparatus, interactive image display system using object tracking apparatus, and methods thereof |
KR20120129664A (en) * | 2011-05-20 | 2012-11-28 | 삼성전자주식회사 | Projector and method for controlling of projector |
US8520114B2 (en) * | 2011-06-01 | 2013-08-27 | Global Oled Technology Llc | Apparatus for displaying and sensing images |
TW201337686A (en) * | 2012-03-13 | 2013-09-16 | Uc Logic Technology Corp | Optical touch device |
GB201110159D0 (en) * | 2011-06-16 | 2011-07-27 | Light Blue Optics Ltd | Touch sensitive display devices |
CN102221887B (en) * | 2011-06-23 | 2016-05-04 | 康佳集团股份有限公司 | Interactive projection system and method |
US8587635B2 (en) * | 2011-07-15 | 2013-11-19 | At&T Intellectual Property I, L.P. | Apparatus and method for providing media services with telepresence |
CN103164023A (en) * | 2011-12-15 | 2013-06-19 | 西安天动数字科技有限公司 | Interactive graffiti entertainment system |
JP2013125166A (en) * | 2011-12-15 | 2013-06-24 | Seiko Epson Corp | Lighting system |
JP6083172B2 (en) * | 2011-12-26 | 2017-02-22 | セイコーエプソン株式会社 | Lighting device |
US9207812B2 (en) * | 2012-01-11 | 2015-12-08 | Smart Technologies Ulc | Interactive input system and method |
CN202502335U (en) * | 2012-01-20 | 2012-10-24 | 光峰光电(无锡)有限公司 | Short-focus optical interactive projector system |
WO2013109282A1 (en) * | 2012-01-20 | 2013-07-25 | Empire Technology Development Llc | Mirror array display system |
EP2635022A1 (en) * | 2012-02-29 | 2013-09-04 | Flir Systems AB | A method and system for performing alignment of a projection image to detected infrared (IR) radiation information |
JP2013191005A (en) * | 2012-03-14 | 2013-09-26 | Hitachi Solutions Ltd | Digitizer device |
GB201205303D0 (en) * | 2012-03-26 | 2012-05-09 | Light Blue Optics Ltd | Touch sensing systems |
CN202587539U (en) * | 2012-05-23 | 2012-12-05 | 重庆雷士实业有限公司 | Voice lighting control device |
US9294746B1 (en) * | 2012-07-09 | 2016-03-22 | Amazon Technologies, Inc. | Rotation of a micro-mirror device in a projection and camera system |
CN102799317B (en) * | 2012-07-11 | 2015-07-01 | 联动天下科技(大连)有限公司 | Smart interactive projection system |
JP6051648B2 (en) * | 2012-07-23 | 2016-12-27 | セイコーエプソン株式会社 | Projector and control method thereof |
TWI472701B (en) * | 2012-07-23 | 2015-02-11 | Shinyoptics Corp | Projection table lamp |
US8922486B2 (en) * | 2012-07-24 | 2014-12-30 | Christie Digital Systems Usa, Inc. | Method, system and apparatus for determining locations in a projected image |
JP6011143B2 (en) | 2012-08-10 | 2016-10-19 | セイコーエプソン株式会社 | Lighting device |
GB2506106A (en) * | 2012-08-14 | 2014-03-26 | Light Blue Optics Ltd | Touch sensing systems using a pair of beam deflectors controlled in tandem |
TWM445115U (en) * | 2012-08-20 | 2013-01-11 | zhi-rong Chen | Multi-functional micro projection lamp |
CN202815404U (en) * | 2012-08-24 | 2013-03-20 | 陈志荣 | Multifunctional micro projection lamp |
US9197870B1 (en) * | 2012-09-12 | 2015-11-24 | Amazon Technologies, Inc. | Automatic projection focusing |
US9619084B2 (en) * | 2012-10-04 | 2017-04-11 | Corning Incorporated | Touch screen systems and methods for sensing touch screen displacement |
JP6089551B2 (en) * | 2012-10-09 | 2017-03-08 | セイコーエプソン株式会社 | Lighting device |
TWM452294U (en) | 2012-10-23 | 2013-05-01 | Brightek Optoelectronic Co Ltd | Lamp having projection function |
JP6167511B2 (en) * | 2012-12-04 | 2017-07-26 | セイコーエプソン株式会社 | Document camera and document camera control method |
CN103024324B (en) * | 2012-12-10 | 2016-06-22 | Tcl通力电子(惠州)有限公司 | A kind of short-throw projection system |
KR102090269B1 (en) * | 2012-12-14 | 2020-03-17 | 삼성전자주식회사 | Method for searching information, device, and computer readable recording medium thereof |
CN103869961A (en) * | 2012-12-18 | 2014-06-18 | 联想(北京)有限公司 | Method and system for interacting projector and video camera |
JP2014130170A (en) * | 2012-12-27 | 2014-07-10 | Ricoh Co Ltd | Image projection apparatus and control method |
US8879782B2 (en) * | 2013-01-17 | 2014-11-04 | Disney Enterprises, Inc. | Projector light bulb |
TWM454356U (en) * | 2013-01-21 | 2013-06-01 | Darfon Electronics Corp | Display apparatus, illumination apparatus and vehicle having projector device |
US9524059B2 (en) * | 2013-03-15 | 2016-12-20 | Texas Instruments Incorporated | Interaction detection using structured light images |
JP2014202951A (en) * | 2013-04-05 | 2014-10-27 | 船井電機株式会社 | Image projection device and operation matter detection method |
FR3004249B1 (en) * | 2013-04-09 | 2016-01-22 | Vit | SYSTEM FOR ACQUIRING THREE DIMENSIONAL IMAGES |
KR102169012B1 (en) * | 2013-04-18 | 2020-10-23 | 삼성디스플레이 주식회사 | Eye-glasses which attaches projector and method of controlling thereof |
US9609262B2 (en) * | 2013-06-27 | 2017-03-28 | Intel Corporation | Device for adaptive projection |
JP6225532B2 (en) * | 2013-07-22 | 2017-11-08 | セイコーエプソン株式会社 | Projector and projector control method |
TWI489353B (en) * | 2013-09-09 | 2015-06-21 | Wistron Corp | Optical coordinate input device |
CN104573597B (en) * | 2013-10-10 | 2018-12-11 | 腾讯科技(深圳)有限公司 | A kind of two-dimensional code identification method and device |
US9143720B2 (en) * | 2013-10-31 | 2015-09-22 | Htc Corporation | Handheld electronic device and image projection method of the same |
JP6229972B2 (en) * | 2013-11-05 | 2017-11-15 | パナソニックIpマネジメント株式会社 | Lighting device |
JP2015090405A (en) * | 2013-11-05 | 2015-05-11 | パナソニックIpマネジメント株式会社 | Lighting system |
EP3072032B1 (en) * | 2013-11-21 | 2020-01-01 | Hewlett-Packard Development Company, L.P. | Projection screen for specularly reflecting infrared light |
WO2015091821A1 (en) * | 2013-12-18 | 2015-06-25 | Flir Systems Ab | Processing infrared images based on swipe gestures |
CN104750235B (en) * | 2013-12-27 | 2018-03-27 | 联想(北京)有限公司 | A kind of data processing method and electronic equipment |
TW201528048A (en) * | 2014-01-03 | 2015-07-16 | Egismos Technology Corp | Image-based virtual interactive device and method thereof |
JP6387644B2 (en) * | 2014-01-21 | 2018-09-12 | セイコーエプソン株式会社 | Position detection device, position detection system, and position detection method |
KR102077677B1 (en) * | 2014-01-21 | 2020-02-14 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US9602787B2 (en) * | 2014-02-15 | 2017-03-21 | Robert Warren Blaser, JR. | Ceiling medallion projection system |
TWI504931B (en) * | 2014-02-19 | 2015-10-21 | Coretronic Corp | Projection system and projection method thereof |
CN103809880B (en) * | 2014-02-24 | 2017-02-08 | 清华大学 | Man-machine interaction system and method |
CN203930682U (en) * | 2014-04-11 | 2014-11-05 | 周光磊 | Multi-point touch and recognition system for capturing gesture motion in three-dimensional space |
TWM484404U (en) * | 2014-04-11 | 2014-08-21 | Eue Medical Technology | Imaging projection system equipment application |
CN103914152B (en) | 2014-04-11 | 2017-06-09 | 周光磊 | Multi-point touch and recognition method and system for capturing gesture motion in three-dimensional space |
JPWO2015159548A1 (en) * | 2014-04-18 | 2017-04-13 | Necソリューションイノベータ株式会社 | Projection control apparatus, projection control method, and projection control program |
US20170038912A1 (en) * | 2014-04-18 | 2017-02-09 | Nec Solution Innovators, Ltd. | Information providing device |
CN105474071A (en) * | 2014-05-27 | 2016-04-06 | 联发科技股份有限公司 | Projection processor for projective display system |
CN203868778U (en) | 2014-06-03 | 2014-10-08 | 深圳市奇脉电子技术有限公司 | LED lamp with projector |
KR102303115B1 (en) * | 2014-06-05 | 2021-09-16 | 삼성전자 주식회사 | Method For Providing Augmented Reality Information And Wearable Device Using The Same |
TWI522722B (en) * | 2014-06-06 | 2016-02-21 | 中強光電股份有限公司 | Light source device and adjusting method thereof |
EP3156763B1 (en) * | 2014-06-13 | 2019-02-06 | Nikon Corporation | Shape measurement device |
CN204100990U (en) * | 2014-07-03 | 2015-01-14 | 冯晓锋 | A single-camera stereo vision sensor device based on mirror imaging |
CN104202865B (en) * | 2014-07-29 | 2016-08-03 | 福建歌航电子信息科技有限公司 | A kind of intelligent lighting device |
US10623649B2 (en) * | 2014-07-31 | 2020-04-14 | Hewlett-Packard Development Company, L.P. | Camera alignment based on an image captured by the camera that contains a reference marker |
CN104133599A (en) * | 2014-08-01 | 2014-11-05 | 上海斐讯数据通信技术有限公司 | Terminal device and method allowing projection surface to be operated |
CN104181762B (en) * | 2014-08-15 | 2016-10-05 | 苏州佳世达光电有限公司 | Optical projection system |
JP2016050972A (en) * | 2014-08-29 | 2016-04-11 | ソニー株式会社 | Control device, control method, and program |
TWI521358B (en) * | 2014-09-03 | 2016-02-11 | | Intelligent networking lamp |
JP6624541B2 (en) * | 2014-09-12 | 2019-12-25 | パナソニックIpマネジメント株式会社 | Light projection device and illumination device using the same |
CN104281335A (en) * | 2014-09-17 | 2015-01-14 | 上海创幸计算机科技有限公司 | Multi-point touch control interactive large screen system and control method of multi-point touch control interactive large screen system |
US10275092B2 (en) * | 2014-09-24 | 2019-04-30 | Hewlett-Packard Development Company, L.P. | Transforming received touch input |
TWI549516B (en) * | 2014-10-03 | 2016-09-11 | 宏碁股份有限公司 | Electronic device and method for projecting screen adjustment |
JP6278494B2 (en) * | 2014-10-20 | 2018-02-14 | Necディスプレイソリューションズ株式会社 | Infrared light adjustment method and position detection system |
CN105589552B (en) * | 2014-10-30 | 2018-10-12 | 联想(北京)有限公司 | Projection interactive method based on gesture and projection interactive device |
TWI531954B (en) | 2014-11-14 | 2016-05-01 | 中強光電股份有限公司 | Touch and gesture control system and touch and gesture control method |
CN105652571B (en) * | 2014-11-14 | 2018-09-07 | 中强光电股份有限公司 | Projection arrangement and its optical projection system |
US9347828B1 (en) * | 2014-11-27 | 2016-05-24 | Hui Zhao | Method for detecting ambient light brightness and apparatus for achieving the method |
CN104656890A (en) * | 2014-12-10 | 2015-05-27 | 杭州凌手科技有限公司 | Virtual reality intelligent projection gesture interaction all-in-one machine |
CN105828493B (en) * | 2015-01-06 | 2018-09-11 | 中强光电股份有限公司 | Lighting system and its multi-mode lamps and lanterns |
WO2016145430A1 (en) * | 2015-03-12 | 2016-09-15 | Vita-Mix Management Corporation | Display system for blending systems |
WO2016166869A1 (en) * | 2015-04-16 | 2016-10-20 | 日立マクセル株式会社 | Illumination apparatus |
TWM520160U (en) * | 2015-08-11 | 2016-04-11 | 李卓澔 | Interactive projection device |
CN105260021A (en) * | 2015-10-15 | 2016-01-20 | 深圳市祈锦通信技术有限公司 | Intelligent interactive projection system |
CN205071408U (en) | 2015-10-30 | 2016-03-02 | 南京信息工程大学 | Intelligent desk lamp |
CN105430311A (en) * | 2015-11-16 | 2016-03-23 | 上海尚镜信息科技有限公司 | Infrared tracking projecting system and method |
TWI653563B (en) * | 2016-05-24 | 2019-03-11 | 仁寶電腦工業股份有限公司 | Projection touch image selection method |
US9964840B1 (en) * | 2016-10-19 | 2018-05-08 | Wuhai Gao | Multifunctional desk lamp that provides both functions of lighting and image projection |
WO2018155235A1 (en) * | 2017-02-24 | 2018-08-30 | ソニー株式会社 | Control device, control method, program, and projection system |
JP6275312B1 (en) * | 2017-06-02 | 2018-02-07 | キヤノン株式会社 | Projection apparatus, control method therefor, and program |
TWI656362B (en) * | 2018-03-26 | 2019-04-11 | 仁寶電腦工業股份有限公司 | Electronic device and object rework method thereof |
- 2017
- 2017-02-03 TW TW106103751A patent/TWI653563B/en active
- 2017-02-10 TW TW106104396A patent/TWI641985B/en active
- 2017-02-10 TW TW106104397A patent/TWI682688B/en active
- 2017-02-10 TW TW107122767A patent/TWI653909B/en active
- 2017-02-23 TW TW106106024A patent/TWI630530B/en active
- 2017-02-23 TW TW106106023A patent/TWI635324B/en active
- 2017-03-21 CN CN201710169460.0A patent/CN107426554B/en active Active
- 2017-03-23 CN CN201710179067.XA patent/CN107422581B/en active Active
- 2017-03-23 CN CN201710179028.XA patent/CN107422586B/en active Active
- 2017-03-23 CN CN201710178212.2A patent/CN107426555B/en active Active
- 2017-03-23 CN CN201710179368.2A patent/CN107426556B/en active Active
- 2017-03-24 CN CN201710180467.2A patent/CN107422925B/en active Active
- 2017-03-24 CN CN201710180369.9A patent/CN107422924A/en active Pending
- 2017-03-28 CN CN201710190890.0A patent/CN107426887B/en active Active
- 2017-03-28 CN CN201710190913.8A patent/CN107422587B/en active Active
- 2017-03-28 CN CN201710190899.1A patent/CN107426888B/en active Active
- 2017-03-28 CN CN201710190914.2A patent/CN107426557B/en active Active
- 2017-03-28 CN CN201710190900.0A patent/CN107422593B/en active Active
- 2017-03-28 CN CN201710190889.8A patent/CN107422576B/en active Active
- 2017-03-28 CN CN201710190906.8A patent/CN107426889B/en active Active
- 2017-03-28 CN CN201710190902.XA patent/CN107426469B/en active Active
- 2017-03-28 CN CN201710190903.4A patent/CN107426503B/en active Active
- 2017-03-28 CN CN201710190888.3A patent/CN107426886B/en active Active
- 2017-03-30 CN CN201710200682.4A patent/CN107422949B/en active Active
- 2017-03-30 CN CN201710200683.9A patent/CN107422950B/en active Active
- 2017-05-15 US US15/594,676 patent/US10750144B2/en active Active
- 2017-05-15 US US15/594,674 patent/US10481475B2/en active Active
- 2017-05-15 US US15/594,677 patent/US10338460B2/en active Active
- 2017-05-15 US US15/594,671 patent/US10719001B2/en active Active
- 2017-05-16 US US15/595,961 patent/US11048332B2/en active Active
- 2017-05-16 US US15/595,962 patent/US20170344190A1/en not_active Abandoned
- 2017-05-16 US US15/595,965 patent/US10437140B2/en active Active
- 2017-05-23 TW TW106117065A patent/TWI682689B/en active
- 2017-05-23 TW TW106117064A patent/TWI645244B/en active
- 2019
- 2019-05-31 US US16/427,363 patent/US11385720B2/en active Active
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10437140B2 (en) | Projection device with camera module | |
TWI483143B (en) | Hybrid pointing device | |
US10534436B2 (en) | Multi-modal gesture based interactive system and method using one single sensing system | |
US20140139668A1 (en) | Projection capture system and method | |
JP2010277122A (en) | Optical position detection apparatus | |
US10013068B2 (en) | Information processing apparatus including a mirror configured to reflect an image and a projector and an image capturing unit arranged below the mirror | |
US7675022B2 (en) | Reflecting optical trace detecting module having an optical path diverting element | |
TWI531954B (en) | Touch and gesture control system and touch and gesture control method | |
US20110193969A1 (en) | Object-detecting system and method by use of non-coincident fields of light | |
US20110242053A1 (en) | Optical touch screen device | |
JP2010160772A (en) | Electronic apparatus with virtual input device | |
US8780084B2 (en) | Apparatus for detecting a touching position on a flat panel display and a method thereof | |
US9784407B2 (en) | Optical touch system, light curtain generating module and adjusting structure | |
TWI436242B (en) | Movement detection device | |
TWI439906B (en) | Sensing system | |
TWI582672B (en) | An optical touch device and touch detecting method using the same | |
TWI587196B (en) | Optical touch system and optical detecting method for touch position | |
JP2010282463A (en) | Touch panel device | |
JP2018085553A (en) | Projector system | |
TWI543046B (en) | Optical touch-control system | |
TWI496054B (en) | Optical touch control device, optical touch control and displacement detecing device, adjustable light guiding device, optical touch control method, and optical touch control and displacement detecing method | |
CN102646003B (en) | Sensing system | |
TWI484379B (en) | Optical detecting device | |
US9189106B2 (en) | Optical touch panel system and positioning method thereof | |
JP2009238167A (en) | Position detection apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: COMPAL ELECTRONICS, INC., TAIWAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, WEI-JUN;CHIU, WEN-YI;WU, TING-WEI;AND OTHERS;REEL/FRAME:042386/0125 Effective date: 20170508 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |