WO2016157804A1 - Projector and Projector Control Method - Google Patents
Projector and Projector Control Method
- Publication number
- WO2016157804A1 (application PCT/JP2016/001602)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- unit
- projection
- indicator
- size
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G09G5/38—Control arrangements or circuits for visual indicators characterised by the display of a graphic pattern, with means for controlling the display position
- G03B21/14—Projectors or projection-type viewers; details thereof
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/03542—Light pens for emitting or receiving light
- G06F3/03545—Pens or stylus
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers characterised by opto-electronic transducing means
- G06F3/04845—GUI interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06F3/0488—GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
- H04N9/3179—Projection devices for colour picture display; video signal processing therefor
- H04N9/3185—Geometric adjustment, e.g. keystone or convergence
- H04N9/3194—Testing of projection devices, including sensor feedback
- G09G2320/08—Arrangements within a display terminal for setting, manually or automatically, display parameters of the display terminal
Description
- The present invention relates to a projector and a projector control method.
- A projector according to the present invention includes: a projection unit that projects a projection image including a first object onto a projection surface; a projection size detection unit that detects the size of the region onto which the projection image is projected; a detection unit that detects an operation of an indicator on the projection surface; and a movement amount adjustment unit that, when the operation of the indicator detected by the detection unit is an operation of moving the first object, varies the movement amount of the first object according to the size of the region onto which the projection image is projected.
- According to this configuration, even though the size of the region onto which the projection image is projected is not constant, the projector can set the amount by which the position of the first object is moved, in response to an operation of moving the first object, to a value corresponding to that size.
- In the projector of the present invention, the movement amount adjustment unit may set the movement amount of the first object to a first movement amount when the size of the region onto which the projection image is projected is a first size and a first operation of moving the first object is performed, and to a second movement amount larger than the first movement amount when the size of the region is a second size larger than the first size and the same first operation is performed.
- In other words, the movement amount adjustment unit may increase the movement amount of the first object as the size of the region onto which the projection image is projected becomes larger, for the same operation of the indicator. According to this configuration, the larger the size of the region onto which the projection image is projected, the larger the movement amount of the first object for the same operation of the indicator, which makes it easier for the operator (user) to move the first object.
- In the projector of the present invention, the operation of moving the first object may be an operation of continuously shifting from a state in which the indicator moves while in contact with the projection surface to a state in which the indicator moves without contact, and the movement amount adjustment unit may calculate the movement amount of the first object by multiplying the movement distance of the indicator after it has shifted to the non-contact state by a coefficient corresponding to the size of the region onto which the projection image is projected.
- According to this configuration, an operation intentionally performed by the operator with the intention of moving the first object can be determined as the operation of moving the first object. Furthermore, using the movement amount coefficient, the movement amount adjustment unit can set the movement amount to an appropriate value according to the size of the image projection area.
- The projector of the present invention may include an imaging unit that captures the projection surface; the projection size detection unit causes the projection unit to project a specific pattern image onto the projection surface, causes the imaging unit to capture the projection surface on which the pattern image is projected, and detects the size of the region onto which the projection image is projected based on the imaging result of the imaging unit.
- According to this configuration, no work by the user is required to detect the size of the region onto which the projection image is projected, which improves convenience for the user.
- A projector control method of the present invention is a method of controlling a projector that includes a projection unit projecting a projection image including a first object onto a projection surface, the method including: detecting the size of the region onto which the projection image is projected; detecting an operation of an indicator on the projection surface; and, when the detected operation of the indicator is an operation of moving the first object, varying the movement amount of the first object according to the size of the region onto which the projection image is projected.
- FIG. 1 is a diagram showing a usage state of the projector. FIG. 2 is a functional block diagram of the projector and the first indicator. FIG. 3 is a flowchart showing the operation of the projector. FIG. 4 is a diagram used for describing a movement gesture. FIG. 5 is a diagram used for describing the movement amount.
- FIG. 1 is a diagram illustrating an installation state of the projector 100.
- The projector 100 is installed directly above or obliquely above the screen SC (projection surface) and projects an image (projection image) obliquely downward toward the screen SC.
- The screen SC is a flat plate or a curtain fixed to a wall surface or standing on a floor surface.
- The invention is not limited to this example; a wall surface itself can also be used as the screen SC.
- In that case, the projector 100 may be attached to an upper part of the wall surface used as the screen SC.
- The projector 100 is connected to an image supply device such as a PC (personal computer), a video playback device, a DVD playback device, or a Blu-ray (registered trademark) Disc playback device.
- The projector 100 projects an image onto the screen SC based on an analog image signal or digital image data supplied from the image supply device. The projector 100 may also read image data stored in the built-in storage unit 60 (FIG. 2) or in an externally connected storage medium, and display an image on the screen SC based on that image data.
- The projector 100 detects an operator's operation on the screen SC.
- For the operation, the pen-shaped first indicator 70 or the second indicator 80, which is a finger of the operator, is used.
- Operations on the screen SC include an operation of designating (pointing to) a fixed position on the screen SC with the first indicator 70 or the second indicator 80, and an operation of continuously pointing to different positions on the screen SC.
- The operation of designating (pointing to) a fixed position on the screen SC is an operation of keeping the first indicator 70 or the second indicator 80 in contact with a certain position on the screen SC for a certain period of time.
- The operation of continuously pointing to different positions on the screen SC is an operation of drawing characters, figures, or the like by moving the first indicator 70 or the second indicator 80 while keeping it in contact with the screen SC.
- The projector 100 detects the operation performed by the operator with the first indicator 70 or the second indicator 80 and reflects the detected operation in the projected image on the screen SC. The projector 100 may also operate as a pointing device by detecting the indicated position and outputting the coordinates of the indicated position on the screen SC. These coordinates can further be used to perform GUI (Graphical User Interface) operations on the projector 100.
- FIG. 2 is a configuration diagram showing the configuration of the projector 100.
- the projector 100 includes an I / F (interface) unit 11 and an image I / F (interface) unit 12 as interfaces connected to external devices.
- the I / F unit 11 and the image I / F unit 12 may include a connector for wired connection, and may include an interface circuit corresponding to this connector.
- the I / F unit 11 and the image I / F unit 12 may include a wireless communication interface. Examples of the connector and interface circuit for wired connection include those based on wired LAN, IEEE 1394, USB, and the like. Examples of the wireless communication interface include those that comply with a wireless LAN, Bluetooth (registered trademark), or the like.
- the image I / F unit 12 may be an image data interface such as an HDMI (registered trademark) interface.
- the image I / F unit 12 may include an interface through which audio data is input.
- The I/F unit 11 is an interface for transmitting and receiving various data to and from an external device such as a PC.
- The I/F unit 11 inputs and outputs data relating to image projection, data for setting the operation of the projector 100, and the like.
- The control unit 30, described later, has a function of transmitting and receiving data to and from external devices via the I/F unit 11.
- The image I/F unit 12 is an interface through which digital image data is input.
- The projector 100 of the present embodiment projects an image based on the digital image data input via the image I/F unit 12.
- The projector 100 may also have a function of projecting an image based on an analog image signal;
- in this case, the image I/F unit 12 includes an analog image interface and converts the analog image signal into digital image data.
- The projector 100 includes a projection unit 20 that forms an optical image.
- The projection unit 20 includes a light source unit 21, a light modulation device 22, and a projection optical system 23.
- The light source unit 21 includes a light source such as a xenon lamp, an ultra-high pressure mercury lamp, an LED (Light Emitting Diode), or a laser light source.
- The light source unit 21 may also include a reflector and an auxiliary reflector that guide the light emitted from the light source to the light modulation device 22.
- It may further include, on the path to the light modulation device 22, a lens group (not shown) for enhancing the optical characteristics of the projection light, a polarizing plate, or a light control element that reduces the amount of light emitted from the light source.
- The light modulation device 22 includes, for example, three transmissive liquid crystal panels corresponding to the three primary colors of RGB, and modulates the light passing through the liquid crystal panels to generate image light.
- The light from the light source unit 21 is separated into three color lights of R, G, and B, and each color light is incident on the corresponding liquid crystal panel.
- The color lights modulated by passing through the liquid crystal panels are combined by a combining optical system such as a cross dichroic prism and emitted to the projection optical system 23.
- The projection optical system 23 includes a lens group that guides the image light modulated by the light modulation device 22 toward the screen SC and forms an image on the screen SC. The projection optical system 23 may also include a zoom mechanism for enlarging or reducing the display image on the screen SC, and a focus adjustment mechanism for adjusting the focus. When the projector 100 is a short-focus type, the projection optical system 23 may include a concave mirror that reflects the image light toward the screen SC.
- The projection unit 20 is connected to a light source driving unit 45 that turns on the light source unit 21 under the control of the control unit 30, and to a light modulation device driving unit 46 that operates the light modulation device 22 under the control of the control unit 30.
- The light source driving unit 45 may have a function of adjusting the light amount of the light source unit 21 by switching the light source unit 21 on and off.
- The projector 100 includes an image processing system that processes the image projected by the projection unit 20.
- The image processing system includes the control unit 30 that controls the projector 100, the storage unit 60, the operation detection unit 17, the image processing unit 40, the light source driving unit 45, and the light modulation device driving unit 46.
- A frame memory 41 is connected to the image processing unit 40, and a position detection unit 50 is connected to the control unit 30; these units may also be included in the image processing system.
- The control unit 30 controls each unit of the projector 100 by executing a predetermined control program 61.
- The storage unit 60 stores, in a nonvolatile manner, data processed by the control unit 30 and other data.
- The image processing unit 40 processes the image data input via the image I/F unit 12 under the control of the control unit 30 and outputs an image signal to the light modulation device driving unit 46.
- The processing executed by the image processing unit 40 includes discrimination between 3D (stereoscopic) and 2D (planar) images, resolution conversion, frame rate conversion, distortion correction, digital zoom, color tone correction, luminance correction, and the like.
- The image processing unit 40 executes the processing designated by the control unit 30 and, as necessary, performs the processing using parameters input from the control unit 30. It can, of course, also execute a combination of a plurality of these processes.
- The image processing unit 40 is connected to the frame memory 41.
- The image processing unit 40 expands the image data input from the image I/F unit 12 into the frame memory 41 and executes the various processes described above on the expanded image data.
- The image processing unit 40 reads the processed image data from the frame memory 41, generates R, G, and B image signals corresponding to the image data, and outputs them to the light modulation device driving unit 46.
- The light modulation device driving unit 46 is connected to the liquid crystal panels of the light modulation device 22.
- The light modulation device driving unit 46 drives the liquid crystal panels based on the image signals input from the image processing unit 40 and draws an image on each panel.
- The operation detection unit 17 is connected to a remote control light receiving unit 18 and an operation panel 19, which function as input devices, and detects operations made via the remote control light receiving unit 18 and the operation panel 19.
- The remote control light receiving unit 18 receives an infrared signal transmitted, in response to a button operation, by the remote controller (not shown) used by the operator of the projector 100.
- The remote control light receiving unit 18 decodes the received infrared signal, generates operation data indicating the operation content on the remote controller, and outputs it to the control unit 30.
- The operation panel 19 is provided on the exterior housing of the projector 100 and includes various switches and indicator lamps.
- The operation detection unit 17, under the control of the control unit 30, turns the indicator lamps of the operation panel 19 on and off as appropriate according to the operation state and setting state of the projector 100.
- When a switch of the operation panel 19 is operated, operation data corresponding to the operated switch is output from the operation detection unit 17 to the control unit 30.
- The position detection unit 50 detects a position indicated by at least one of the first indicator 70 and the second indicator 80.
- The position detection unit 50 includes an imaging unit 51, a transmission unit 52, an imaging control unit 53, an object detection unit 54, and a coordinate calculation unit 55.
- The imaging unit 51 captures an image of a range that includes the screen SC and its periphery as the imaging range.
- The imaging unit 51 can execute imaging with infrared light and imaging with visible light.
- For example, the imaging unit 51 may include an infrared imaging element that captures infrared light, a visible light imaging element that captures visible light, and interface circuits for these imaging elements; alternatively, a single imaging element that captures both visible light and infrared light may be used.
- The imaging unit 51 may also include a filter that blocks part of the light incident on the imaging element; for example, when infrared light is to be received by the imaging element, a filter that mainly transmits light in the infrared region may be arranged in front of the imaging element.
- The imaging direction and imaging range (angle of view) of the imaging unit 51 when imaging with infrared light face the same, or substantially the same, direction as the projection optical system 23 and cover the range in which the projection optical system 23 projects an image on the screen SC.
- Likewise, the imaging direction and imaging range when imaging with visible light face the same, or substantially the same, direction as the projection optical system 23 and cover the range in which the projection optical system 23 projects an image on the screen SC.
- The imaging unit 51 outputs captured image data captured with infrared light and captured image data captured with visible light.
- The imaging control unit 53 controls the imaging unit 51 according to the control of the control unit 30 and causes the imaging unit 51 to execute imaging.
- The imaging control unit 53 acquires the captured image data of the imaging unit 51 and outputs it to the object detection unit 54.
- In the following, captured image data captured by the imaging unit 51 with visible light is referred to as "visible light captured image data",
- and captured image data captured by the imaging unit 51 with infrared light is referred to as "infrared captured image data".
- When the first indicator 70 is present in the imaging range, the first indicator 70 appears in the visible light captured image data. Likewise, when the operator's finger serving as the second indicator 80 is present in the imaging range, the second indicator 80 appears in the visible light captured image data. An image of the infrared light emitted by the first indicator 70 appears in the infrared captured image data.
- The transmission unit 52 transmits an infrared signal to the first indicator 70 according to the control of the imaging control unit 53.
- The transmission unit 52 has a light source such as an infrared LED and turns the light source on and off according to the control of the imaging control unit 53.
- The object detection unit 54 detects the second indicator 80 by detecting an image of the operator's finger in the visible light captured image data.
- The object detection unit 54 detects the second indicator 80 by the following method, for example. First, the object detection unit 54 detects a person area, in which a person appears, in the visible light captured image data.
- The person area is an area of the captured image that includes the image of a person.
- A generally known method can be used for the detection of the person area by the object detection unit 54.
- For example, the object detection unit 54 detects edges in the input visible light captured image data and detects an area that matches the shape of a person as the person area.
- Alternatively, the object detection unit 54 may detect an area whose color information (luminance, chromaticity, etc.) changes within a predetermined time, and detect, as the person area, such an area whose size is greater than or equal to a predetermined value and whose movement range is within a predetermined range (a sketch of this follows below).
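- As a rough illustration of this second person-area method, the following sketch (assuming OpenCV; the threshold values are invented for illustration and are not from the patent) keeps regions whose pixel values change between two visible light frames and whose size is above a minimum:

```python
import cv2

MIN_AREA_PX = 5000  # hypothetical minimum size for a person area

def person_area_candidates(prev_frame, frame):
    """Return bounding rectangles of regions whose colour information
    changed between two visible light captured frames."""
    diff = cv2.absdiff(cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    # Size filter; a full implementation would also check that the
    # movement range of each region stays within a predetermined range.
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= MIN_AREA_PX]
```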
- Next, the object detection unit 54 detects, within the detected person area, an area close to a predetermined shape or feature of a finger as the area of the second indicator 80.
- The operator's finger detected by the object detection unit 54 may be one or more fingers, the entire hand, or a part of the hand including a finger.
- The object detection unit 54 specifies the tip (fingertip) of the finger from the detected area of the second indicator 80 and detects the position of the specified fingertip as the indicated position.
- The object detection unit 54 calculates the coordinates of the indicated position of the second indicator 80 as coordinates in the captured image data.
- The object detection unit 54 also detects the distance between the detected indicated position of the second indicator 80 and the screen SC.
- The object detection unit 54 obtains the distance between the detected fingertip (indicated position) and the screen SC based on the visible light captured image data.
- For example, the object detection unit 54 detects an image of the finger and an image of the finger's shadow in the captured image data, and obtains the distance between the fingertip and the screen SC based on the distance between the detected images.
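- One way to read this finger/shadow method: since the projection light arrives from a fixed direction, the shadow of the fingertip separates from the fingertip image as the finger lifts off the surface, so the pixel distance between the two images grows roughly in proportion to the fingertip-to-screen distance. A toy sketch under that assumption (the scale constant is hypothetical and would come from the calibration geometry):

```python
import math

SHADOW_SCALE_MM_PER_PX = 0.8  # hypothetical, derived from calibration

def fingertip_distance_mm(fingertip_px, shadow_px):
    """Estimate the distance between the fingertip (indicated position)
    and the screen SC from the separation of the fingertip image and its
    shadow image in the visible light captured image data."""
    sep = math.hypot(fingertip_px[0] - shadow_px[0],
                     fingertip_px[1] - shadow_px[1])
    return sep * SHADOW_SCALE_MM_PER_PX

print(fingertip_distance_mm((640, 400), (652, 408)))  # about 11.5
```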
- The method of detecting the distance between the indicated position of the second indicator 80 and the screen SC is not limited to the exemplified method.
- For example, another imaging unit may be provided that captures a predetermined range from the surface of the screen SC with visible light, with its optical axis parallel to the surface of the screen SC.
- The captured image data based on the imaging result of this other imaging unit is output to the object detection unit 54.
- The object detection unit 54 then detects the position of the fingertip as the indicated position from the input captured image data in the same manner as described above, and obtains the distance between the indicated position (fingertip) and the screen SC based on the distance, in the captured image data, between the detected indicated position and the image corresponding to the surface of the screen SC.
- The object detection unit 54 also detects the coordinates of the indicated position of the first indicator 70 based on the infrared captured image data.
- That is, the object detection unit 54 detects the image of infrared light in the captured image data captured by the imaging unit 51 with infrared light, and detects the coordinates of the position indicated by the first indicator 70 in the captured image data. The details of how the first indicator 70 is identified from the captured image data of the imaging unit 51 will be described later.
- The object detection unit 54 also determines whether or not the tip portion 71 of the first indicator 70 is in contact with the screen SC, and generates touch information indicating whether or not the tip portion 71 is in contact with the screen SC. The method of determining whether or not the tip portion 71 of the first indicator 70 is in contact with the screen SC will also be described later.
- The object detection unit 54 also detects the distance between the tip portion 71 (indicated position) of the first indicator 70 and the screen SC.
- For example, as in the method exemplified for the distance between the indicated position of the second indicator 80 and the screen SC, the object detection unit 54 detects an image of the first indicator 70 and an image of the shadow of the first indicator 70 in the visible light captured image data, and obtains the distance between the tip portion 71 and the screen SC based on the distance between the detected images.
- Alternatively, again as exemplified for the second indicator 80, the object detection unit 54 may obtain the distance between the indicated position of the first indicator 70 and the screen SC based on the captured image data input from another imaging unit that captures a predetermined range from the surface of the screen SC with visible light, with an optical axis parallel to the surface of the screen SC.
- The coordinate calculation unit 55 converts the coordinates of the indicated positions into coordinates of the indicated positions in the display image on the screen SC.
- The coordinates of the indicated positions of the first indicator 70 and the second indicator 80 detected by the object detection unit 54 are coordinates in the captured image data.
- Based on the result of calibration, the coordinate calculation unit 55 calculates, from the coordinates of the indicated position detected by the object detection unit 54, the coordinates of the indicated position on a coordinate axis virtually provided on the display image of the screen SC.
- The coordinates in the captured image data are affected by various factors such as the distance between the projector 100 and the screen SC, the zoom ratio of the projection optical system 23, the installation angle of the projector 100, and the distance between the imaging unit 51 and the screen SC.
- The coordinate calculation unit 55 therefore calculates the coordinates of the indicated position in the display image of the screen SC from the coordinates of the indicated position in the captured image data, based on the result of a calibration performed in advance.
- In the calibration, a predetermined pattern image is projected from the projection unit 20 onto the screen SC, and the displayed pattern image is captured by the imaging unit 51.
- From the captured pattern image, the correspondence (coordinate conversion parameters) between coordinates in the captured image data and coordinates on the display image of the screen SC is derived.
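- This coordinate conversion can be implemented as a planar homography between camera coordinates and display coordinates. Below is a minimal sketch, assuming OpenCV and assuming the four corners of the pattern image have already been located in the captured image; the point values are illustrative only:

```python
import numpy as np
import cv2

# Corners of the projected pattern as located in the captured image data
# (camera pixel coordinates), e.g. by pattern matching.
camera_pts = np.array([[102, 87], [1180, 95], [1175, 690], [110, 698]],
                      dtype=np.float32)

# The same corners in the display image (panel pixel coordinates), known
# because the projector drew the pattern image itself.
display_pts = np.array([[0, 0], [1280, 0], [1280, 800], [0, 800]],
                       dtype=np.float32)

# Coordinate conversion parameters: a 3x3 homography derived once
# during calibration.
H, _ = cv2.findHomography(camera_pts, display_pts)

def to_display_coords(camera_xy):
    """Convert an indicated position detected in the captured image data
    into coordinates on the display image of the screen SC."""
    src = np.array([[camera_xy]], dtype=np.float32)
    return cv2.perspectiveTransform(src, H)[0][0]

print(to_display_coords((640, 400)))  # roughly the centre of the panel
```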
- For the second indicator 80, the coordinate calculation unit 55 outputs to the control unit 30 the coordinates of the indicated position of the second indicator 80 and the information indicating the distance between the indicated position of the second indicator 80 detected by the object detection unit 54 and the screen SC
- (hereinafter referred to as "second separation distance information").
- For the first indicator 70, the coordinate calculation unit 55 outputs to the control unit 30 the coordinates of the indicated position of the first indicator 70, the information indicating the distance between the indicated position of the first indicator 70 detected by the object detection unit 54 and the screen SC
- (hereinafter referred to as "first separation distance information"),
- and the touch information.
- The first indicator 70 includes a control unit 73, a transmission/reception unit 74, an operation switch 75, and a power supply unit 76, and these units are housed in the shaft portion 72 (FIG. 1).
- The control unit 73 is connected to the transmission/reception unit 74 and the operation switch 75 and detects the on/off state of the operation switch 75.
- The transmission/reception unit 74 includes a light source such as an infrared LED and a light receiving element that receives infrared light.
- The operation switch 75 is a switch that is turned on or off depending on whether or not the tip portion 71 of the first indicator 70 is pressed against a surface.
- The power supply unit 76 has a dry battery or a secondary battery as a power supply and supplies power to the control unit 73, the transmission/reception unit 74, and the operation switch 75.
- The first indicator 70 may include a power switch that turns the power supply from the power supply unit 76 on and off.
- The control unit 30 controls the imaging control unit 53 so that a synchronization signal is transmitted from the transmission unit 52. That is, the imaging control unit 53 turns on the light source of the transmission unit 52 at a predetermined cycle according to the control of the control unit 30.
- The transmission/reception unit 74 of the first indicator 70 receives the infrared light emitted periodically by the transmission unit 52 of the projector 100.
- In synchronization with the timing of this infrared light, the control unit 73 turns on (causes to emit light) the light source of the transmission/reception unit 74 in a preset lighting pattern unique to the first indicator 70. The control unit 73 also switches the lighting pattern of the transmission/reception unit 74 according to the operation state of the operation switch 75. The object detection unit 54 of the projector 100 can therefore determine the operation state of the first indicator 70, that is, whether or not the tip portion 71 is pressed against the screen SC, based on the data of a plurality of captured images.
- The control unit 73 repeatedly executes the above pattern while power is supplied from the power supply unit 76. In other words, the transmission unit 52 periodically transmits a synchronization infrared signal to the first indicator 70, and the first indicator 70 transmits a preset infrared signal in synchronization with the infrared signal transmitted by the transmission unit 52.
- The imaging control unit 53 performs control that matches the imaging timing of the imaging unit 51 to the timing at which the first indicator 70 lights. This imaging timing is determined based on the timing at which the imaging control unit 53 turns on the transmission unit 52.
- The object detection unit 54 can thus identify the pattern in which the first indicator 70 lights according to whether or not the image of the light of the first indicator 70 appears in each frame of captured image data of the imaging unit 51.
- The object detection unit 54 determines whether or not the tip portion 71 of the first indicator 70 is pressed against the screen SC based on the data of the plurality of captured images, and generates the touch information.
- The lighting pattern of the first indicator 70 can include a pattern unique to each individual first indicator 70, or a pattern common to a plurality of first indicators 70 combined with a pattern unique to each individual. In that case, when the captured image data includes images of infrared light emitted from a plurality of first indicators 70, the object detection unit 54 can distinguish each image as the image of a different first indicator 70.
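- The synchronization scheme lends itself to a simple frame-sequence decoder: the projector knows at which camera frames a pen should be lit, so the observed on/off sequence of an infrared blob identifies the pen and the state of its operation switch 75. The patterns below are purely hypothetical; the actual encoding and frame counts are not specified at this level of detail:

```python
# Hypothetical 4-frame lighting patterns (True = lit). The first frames
# identify the individual pen; the last frame encodes whether the tip
# portion 71 is pressed (the state of the operation switch 75).
PATTERNS = {
    ("pen1", True):  (True, False, True, True),
    ("pen1", False): (True, False, True, False),
    ("pen2", True):  (True, True, False, True),
    ("pen2", False): (True, True, False, False),
}

def classify_blob(observed):
    """Match the on/off sequence of one infrared blob, sampled at the
    camera frames following a sync pulse, against the known patterns."""
    for (pen, touching), pattern in PATTERNS.items():
        if tuple(observed) == pattern:
            return pen, touching
    return None, False  # unknown emitter, e.g. ambient infrared noise

print(classify_blob([True, True, False, True]))  # -> ('pen2', True)
```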
- The control unit 30 reads and executes the control program 61 stored in the storage unit 60, thereby realizes the functions of a projection control unit 31, a projection size detection unit 32, a detection unit 33, and a movement amount adjustment unit 34, and controls each unit of the projector 100.
- The projection control unit 31 acquires the content of the operation performed by the operator with the remote controller.
- The projection control unit 31 controls the image processing unit 40, the light source driving unit 45, and the light modulation device driving unit 46 according to the operation performed by the operator, and projects an image onto the screen SC. The projection control unit 31 also controls the image processing unit 40 to execute the discrimination between 3D (stereoscopic) and 2D (planar) images, resolution conversion, frame rate conversion, distortion correction, digital zoom, color tone correction, luminance correction, and the like described above. Furthermore, the projection control unit 31 controls the light source driving unit 45 in accordance with the processing of the image processing unit 40 and adjusts the light amount of the light source unit 21.
- The projection size detection unit 32 performs the following processing at power-on of the projector 100, or in a calibration executed in response to a user instruction, to detect the size of the image projection area, that is, the area on the screen SC in which the projector 100 can form pixels and project an image.
- The size of the image projection area corresponds to the "size of the region on which the projection image is projected" described above. In the calibration, the projection size detection unit 32 first acquires the image data of a specific pattern image (hereinafter referred to as "specific pattern image data").
- The specific pattern image data is stored in advance in a predetermined storage area of the storage unit 60.
- The projection size detection unit 32 controls the image processing unit 40 based on the acquired specific pattern image data, and also controls the light source driving unit 45, the light modulation device driving unit 46, and other mechanisms,
- so that the specific pattern image is projected onto the screen SC.
- The specific pattern image is, for example, an image that has, at the four corners of the maximum pixel-formable area (the area corresponding to the image projection area), images of a predetermined shape indicating the four corners.
- Next, the projection size detection unit 32 controls the imaging control unit 53 so that the imaging unit 51 captures an image of the screen SC.
- Captured image data based on the imaging result of the imaging unit 51 is output from the imaging control unit 53 to the projection size detection unit 32.
- The projection size detection unit 32 analyzes the captured image data input from the imaging control unit 53 and detects the size of the image projection area, for example by the following method. First, the projection size detection unit 32 identifies the image data indicating the four corners in the captured image data by a method such as pattern matching. Next, the projection size detection unit 32 detects the separation distance, in the captured image data, between two corner images separated in the vertical direction.
- Taking into account image processing performed when projecting the specific pattern image, such as keystone correction, and settings related to the projection optical system 23, such as its zoom ratio, the projection size detection unit 32 then detects the vertical length of the image projection area based on the detected separation distance. In the same manner, the projection size detection unit 32 detects the horizontal length of the image projection area based on the distance, in the captured image data, between two corner images separated in the left-right direction.
- The combination of the vertical length and the horizontal length of the image projection area detected in this way corresponds to the size of the image projection area.
- Here, the vertical length and the horizontal length of the image projection area mean physical lengths that can be expressed in a predetermined unit.
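- As a rough sketch of this size detection: once the four corner marks are located in the captured image, the pixel separations are scaled into physical lengths. The scale factors below stand in for the corrections mentioned above (keystone correction, zoom ratio); how they are derived is not detailed in the description, so they appear here as given inputs:

```python
import math

def projection_size(corners_px, mm_per_px_v, mm_per_px_h):
    """Estimate the physical size of the image projection area.

    corners_px: the four corner marks located in the captured image
        data, keyed 'tl', 'tr', 'bl', 'br' -> (x, y) in pixels.
    mm_per_px_*: scale factors accounting for the zoom ratio and
        keystone correction (assumed known from the projector settings).
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Vertical length: separation of two vertically separated corners.
    height_mm = dist(corners_px["tl"], corners_px["bl"]) * mm_per_px_v
    # Horizontal length: separation of two horizontally separated corners.
    width_mm = dist(corners_px["tl"], corners_px["tr"]) * mm_per_px_h
    return width_mm, height_mm

corners = {"tl": (100, 80), "tr": (1180, 90),
           "bl": (105, 700), "br": (1175, 705)}
print(projection_size(corners, 2.5, 2.5))  # (width_mm, height_mm)
```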
- For the second indicator 80, the detection unit 33 acquires the coordinates of the indicated position of the second indicator 80 output by the coordinate calculation unit 55 and the second separation distance information (the information indicating the distance between the indicated position of the second indicator 80 and the screen SC).
- When a predetermined operation is performed, the detection unit 33 detects this based on the acquired information. For example, when a GUI icon for instructing execution of a predetermined process is operated with the second indicator 80, the detection unit 33 detects this based on the acquired information.
- Similarly, for the first indicator 70, the detection unit 33 acquires the coordinates of the indicated position of the first indicator 70 output by the coordinate calculation unit 55, the first separation distance information (the information indicating the distance between the indicated position of the first indicator 70 and the screen SC), and the touch information.
- When a predetermined operation is performed, the detection unit 33 detects this based on the acquired information. For example, when a GUI icon for instructing execution of a predetermined process is operated with the first indicator 70, the detection unit 33 detects this based on the acquired information.
- Furthermore, when a predetermined gesture instructing movement of an object image (first object) on the screen SC (hereinafter referred to as a "movement gesture") is performed with the first indicator 70 or the second indicator 80, the detection unit 33 detects this.
- An object image refers to an image whose display position on the screen SC can be moved by a movement gesture (described later).
- For example, a GUI provided by a function of the projector 100, a window based on image data input from an external device via the I/F unit 11, and an image drawn by the operator with the second indicator 80 or the first indicator 70 correspond to object images whose display position can be moved by a movement gesture.
- The size of the image projection area on the screen SC described above varies depending on the model of the projector 100, the separation distance between the screen SC and the projector 100, projector settings, and other factors, and is therefore not constant.
- In the following, the first indicator 70 and the second indicator 80 are referred to collectively as the indicator S. An object image displayed in the image projection area can be moved by bringing the indicator S into contact with the object image and then moving the indicator S in a predetermined direction while keeping it in contact with the screen SC; the object image moves following the movement of the indicator S.
- When the left-right width of the image projection area is within the operator's reach, the operator can easily move an object image located at one end of the image projection area in the left-right direction to the other end by the method described above.
- However, when the left-right width of the image projection area is considerably larger than the range the operator can reach with both arms spread, moving an object image located at one end of the image projection area in the left-right direction to the other end by this method may require the operator to walk, and the operation may not be easy for the operator. In view of this, the projector 100 executes the following processing regarding the movement of an object image.
- FIG. 3 is a flowchart showing the operation of the projector 100.
- In the following, it is assumed that image data corresponding to an object image has been developed in the frame memory 41 and that the object image is displayed in the image projection area of the screen SC.
- When the first indicator 70 is present in the imaging range of the imaging unit 51, the detection unit 33 acquires the coordinates of the indicated position of the first indicator 70 (hereinafter referred to as "first indicated position coordinates") and the first separation distance information at a predetermined cycle.
- Likewise, when the second indicator 80 is present in the imaging range of the imaging unit 51, the detection unit 33 acquires the coordinates of the indicated position of the second indicator 80 (hereinafter referred to as "second indicated position coordinates") and the second separation distance information at a predetermined cycle (step SA1).
- Next, the detection unit 33 determines whether or not a movement gesture has been performed with the indicator S based on the information acquired in step SA1 (step SA2). That is, the detection unit 33 determines whether or not a movement gesture has been performed with the first indicator 70 based on the successive first indicated position coordinates and the first separation distance information acquired at the predetermined cycle in step SA1, and likewise determines whether or not a movement gesture has been performed with the second indicator 80 based on the successive second indicated position coordinates and the second separation distance information acquired at the predetermined cycle.
- The process of step SA2 will now be described in detail.
- FIG. 4 is a diagram for explaining a movement gesture.
- FIG. 4A shows the trajectory of the indicated position of the indicator S when a movement gesture is performed, as seen from above the screen SC looking down along the surface of the screen SC.
- FIG. 4B shows the trajectory of the indicated position of the indicator S when the same movement gesture as in FIG. 4A is performed, as seen along the optical axis of the projection optical system 23 (an axis extending obliquely downward toward the screen SC) in the installation state of the projector 100 illustrated in FIG. 1.
- FIG. 4C shows the trajectory of the indicated position of the indicator S when the same movement gesture as in FIG. 4A is performed, as seen from the front of the screen SC.
- The movement gesture is a gesture that includes a first movement, in which the indicated position of the indicator S moves in a first direction while the indicator S is in contact with the screen SC, and, continuing from the first movement, a second movement, in which the indicated position of the indicator S moves in a second direction while the indicator S is not in contact with the screen SC.
- In FIG. 4, the trajectory K indicates the trajectory of the indicated position of the indicator S in a movement gesture starting at T1 and ending at T2.
- The trajectory K1 indicates the trajectory of the indicated position of the indicator S moving toward the first direction H1 in the first movement,
- and the trajectory K2 indicates the trajectory of the indicated position of the indicator S moving toward the second direction H2 in the second movement.
- The detection unit 33 determines that the second movement has been made on condition that the trajectory of the indicated position of the indicator S satisfies the following condition.
- In the following, the point at which the indicator S shifts from moving in contact with the screen SC to the non-contact state is referred to as the "transition point". A three-dimensional area of a predetermined range extending from the transition point, within a first angle ("θ1" in FIG. 4A) in the direction orthogonal to the surface of the screen SC and within a second angle ("θ2" in the example of FIG. 4C) in the direction parallel to the surface of the screen SC, is referred to as the "movable area" (the hatched area in FIG. 4).
- The size of the movable area is defined by the length of a virtual straight line (L1 in FIG. 4) extending in the first direction from the transition point.
- The condition for determining the second movement is that the trajectory of the indicator S starting from the transition point has an angle within the first angle in the direction orthogonal to the surface of the screen SC and an angle within the second angle in the direction parallel to the surface of the screen SC; in other words, that the trajectory is within the movable area.
- When this condition is not satisfied, the detection unit 33 does not determine that the second movement has been made.
- For example, the trajectory K3 of the indicator S indicated by the broken line in FIG. 4A has an angle in the direction orthogonal to the surface of the screen SC that exceeds the first angle (θ1), and the detection unit 33 therefore does not determine trajectory K3 to be the second movement.
- The reason the second movement is determined only when the trajectory of the indicator S is within the movable area is as follows: a gesture intentionally performed by the operator with the intention of moving the object image in the manner described later should be determined as a movement gesture. The gesture determined as a movement gesture, in which the first movement and the second movement described above are continuous, is an operation unlikely to occur accidentally, and by determining the second movement only when the trajectory of the indicator S is within the movable area, a gesture intentionally performed by the operator can be determined as a movement gesture.
- Moreover, the gesture determined as a movement gesture is reminiscent of the action performed when sliding a physical paper medium in a given direction, so the operator can easily imagine the action to be performed in order for it to be determined as a movement gesture.
- Returning to the description of step SA2: the detection unit 33 detects the trajectory of the indicated position of the first indicator 70 based on the first indicated position coordinates and the first separation distance information continuously input at the predetermined cycle.
- Specifically, in a coordinate system whose x-axis is a virtual axis extending in the left-right direction parallel to the surface of the screen SC, whose y-axis is a virtual axis extending in the vertical direction parallel to the surface of the screen SC, and whose coordinate in the direction orthogonal to the surface of the screen SC is given by the separation distance,
- the detection unit 33 plots the successive indicated positions of the first indicator 70 acquired at the predetermined cycle and connects the plotted points,
- detecting the resulting line (hereinafter referred to as the "trajectory line") as the trajectory of the indicated position of the first indicator 70.
- The detection unit 33 determines whether or not the trajectory line in the coordinate system described above includes a line corresponding to the first movement and, continuous with it, a line corresponding to the second movement.
- In this determination, the detection unit 33 calculates the area corresponding to the movable area in the coordinate system described above and determines, based on the positional relationship between the calculated area and a given line, whether or not the trajectory of the indicated position of the first indicator 70 corresponding to that line satisfies the condition described above;
- a line that does not satisfy the condition is not determined to be a line corresponding to the second movement.
- When the detection unit 33 determines that the trajectory line in the coordinate system described above includes a line corresponding to the first movement and, continuous with it, a line corresponding to the second movement, the detection unit 33 determines that a movement gesture has been performed. In step SA2, the detection unit 33 determines whether or not a movement gesture has been performed with the second indicator 80 in the same manner as for the first indicator 70.
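- A movement-gesture detector along these lines can be sketched as follows. Samples are assumed to arrive as (x, y, separation distance) tuples at a fixed cycle; the contact threshold, the angles θ1 and θ2, and all other values are illustrative parameters, not values from the patent:

```python
import math

CONTACT_MM = 3.0           # hypothetical contact threshold
THETA1 = math.radians(30)  # first angle (out of the screen plane)
THETA2 = math.radians(30)  # second angle (within the screen plane)

def detect_movement_gesture(samples):
    """samples: successive indicated positions as (x, y, dist) tuples,
    where dist is the separation distance from the screen SC.
    Returns (first_movement, second_movement) or None."""
    # Transition point: last sample at which the indicator touches the screen.
    touching = [i for i, (_x, _y, d) in enumerate(samples) if d <= CONTACT_MM]
    if not touching or touching[-1] >= len(samples) - 1:
        return None                  # never touched, or never lifted off
    t = touching[-1]
    first, second = samples[:t + 1], samples[t:]
    if len(first) < 2:
        return None                  # no movement while in contact

    # First direction H1: in-plane direction of the contact movement.
    (x0, y0, _), (x1, y1, _) = first[0], first[-1]
    h1 = math.atan2(y1 - y0, x1 - x0)

    # Every non-contact sample must stay inside the movable area.
    tx, ty, td = second[0]
    for x, y, d in second[1:]:
        planar = math.hypot(x - tx, y - ty)
        if math.atan2(d - td, planar) > THETA1:
            return None              # left the plane too steeply (cf. K3)
        ang = math.atan2(y - ty, x - tx)
        if abs((ang - h1 + math.pi) % (2 * math.pi) - math.pi) > THETA2:
            return None              # strayed sideways out of the area
    return first, second
```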
- When it is determined that a movement gesture has not been performed (step SA2: NO), the movement amount adjustment unit 34 ends the process.
- On the other hand, when it is determined that a movement gesture has been performed (step SA2: YES), the movement amount adjustment unit 34 executes the following process (step SA3). That is, the movement amount adjustment unit 34 acquires the indicator movement distance, which is the length in the first direction of the trajectory of the indicated position of the indicator S in the second movement.
- Here, when the length in the first direction of the trajectory of the indicated position of the indicator S in the second movement
- exceeds the length of the movable area in the first direction,
- the movement amount adjustment unit 34 sets the length of the movable area in the first direction as the value of the indicator movement distance.
- For example, in FIG. 4, the trajectory K2 is the trajectory of the indicated position of the indicator S in the second movement, and the end point T2 of the trajectory K2 is beyond the movable area.
- In this case, the movement amount adjustment unit 34 sets the length L1 of the movable area in the first direction as the value of the indicator movement distance.
- On the other hand, the trajectory K4 indicated by the two-dot chain line has its end point T3 within the movable area.
- In this case, the movement amount adjustment unit 34 sets the length L2 of the trajectory K4 in the first direction H1 as the value of the indicator movement distance.
- In step SA3, the movement amount adjustment unit 34 calculates the value of the indicator movement distance based on the trajectory line in the coordinate system described above.
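- The indicator movement distance of step SA3 can then be computed from the non-contact part of the trajectory line: the displacement is projected onto the first direction and clamped to the length of the movable area, as in the K2/K4 examples above. A small sketch with illustrative parameter names:

```python
import math

def indicator_movement_distance(second, h1, movable_len):
    """Length in the first direction H1 of the second-movement
    trajectory, capped at the movable area length (L1 in FIG. 4).

    second: non-contact samples [(x, y, dist), ...]
    h1: the first direction, as an angle in radians
    movable_len: length of the movable area in the first direction
    """
    (x0, y0, _), (x1, y1, _) = second[0], second[-1]
    # Component of the displacement along the first direction.
    along_h1 = (x1 - x0) * math.cos(h1) + (y1 - y0) * math.sin(h1)
    return min(max(along_h1, 0.0), movable_len)
```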
- Next, the movement amount adjustment unit 34 acquires a movement amount coefficient (step SA4).
- In a predetermined storage area of the storage unit 60, a table is stored in advance that associates each size of the image projection area (each combination of the vertical length and the horizontal length of the image projection area) with a movement amount coefficient.
- Information indicating the size of the image projection area detected by the projection size detection unit 32 during calibration is also stored in a predetermined storage area of the storage unit 60.
- The movement amount adjustment unit 34 acquires the size of the image projection area stored in the storage unit 60, refers to the table described above, and acquires the movement amount coefficient associated in the table with the acquired size of the image projection area.
- In this table, the larger the size of the image projection area, the larger the value of the movement amount coefficient associated with it.
- How the movement amount coefficient is used, and why the size of the image projection area and the value of the movement amount coefficient have a positive correlation, will be described later.
- the method of acquiring the movement amount coefficient is not limited to the one described above.
- for example, a mathematical formula that calculates the movement amount coefficient from the size of the image projection area may be set in advance, and the movement amount adjustment unit 34 may calculate the movement amount coefficient using that formula.
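- The following sketch illustrates both acquisition methods; the table values and the fallback formula (coefficient proportional to the diagonal of the image projection area) are assumptions for illustration, not values disclosed in the patent.

```python
# Hypothetical table: (vertical, horizontal) size of the image projection
# area -> movement amount coefficient; larger areas map to larger coefficients.
COEFFICIENT_TABLE = {
    (750, 1000): 1.0,
    (1500, 2000): 2.0,
    (2250, 3000): 3.0,
}

def movement_amount_coefficient(size, reference_diagonal=1250.0):
    """Step SA4: look the coefficient up in the table; for sizes not listed,
    fall back to an assumed formula that keeps the positive correlation."""
    if size in COEFFICIENT_TABLE:
        return COEFFICIENT_TABLE[size]
    vertical, horizontal = size
    return (vertical ** 2 + horizontal ** 2) ** 0.5 / reference_diagonal
```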
- next, the movement amount adjustment unit 34 calculates a movement amount based on the movement amount coefficient acquired in step SA4 (step SA5).
- the process of step SA5 will be described in detail below.
- FIG. 5A is a diagram used to explain the movement amount.
- FIG. 5(A1) shows, together with the operator, a state in which the object image G1 is displayed at the left end of the image projection area Q1 of the screen SC.
- FIG. 5(A2) shows, together with the operator, a state in which the operator has performed a movement gesture from the state of FIG. 5(A1) and the position of the object image G1 has moved to the right end of the image projection area Q1.
- when a movement gesture (first operation) is performed, the object image moves in the first direction by the movement amount calculated by the movement amount adjustment unit 34 through the process described below.
- the movement amount is the distance between the position of the object image before the movement and its position after the movement when a movement gesture is performed and the object image is moved accordingly.
- in the example of FIG. 5A, the distance between the position of the object image G1 shown in FIG. 5(A1) and its position shown in FIG. 5(A2) corresponds to the movement amount.
- in step SA5, the movement amount adjustment unit 34 calculates the movement amount by the following formula, based on the indicator movement distance acquired in step SA3 and the movement amount coefficient acquired in step SA4.
- Movement amount = Indicator movement distance × Movement amount coefficient
- as described above, the size of the image projection area and the movement amount coefficient are related such that the larger the image projection area, the larger the corresponding movement amount coefficient. Therefore, for the same indicator movement distance, the movement amount obtained by the above formula increases as the size of the image projection area increases. This produces the following effect.
- FIG. 5(B1) shows, together with the operator, a state in which the object image G1 is displayed at the left end of the image projection area Q2, which is smaller than the image projection area Q1.
- FIG. 5(B2) shows a state in which the operator has performed a movement gesture of the same mode as in FIG. 5A from the state of FIG. 5(B1), and the position of the object image G1 has moved to the right end of the image projection area Q2.
- here, movement gestures of the same mode means that, between the gestures, the first direction (the direction of the locus of the indication position of the indicator S in the first movement) and the second direction (the direction of the locus of the indication position of the indicator S in the second movement) match, and that the indicator movement distances match.
- since the movement amount coefficient increases with the size of the image projection area, the movement amount is larger for the larger image projection area when movement gestures of the same mode are performed.
- that is, as shown in the figures, the movement amount (second movement amount) of the object image when the gesture is performed on the larger image projection area Q1 is larger than the movement amount (first movement amount) when a gesture of the same mode is performed on the smaller image projection area Q2.
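- The effect can be made concrete with the sketch functions above (hypothetical values; the units are arbitrary):

```python
def movement_amount(indicator_distance: float, coefficient: float) -> float:
    """Step SA5: Movement amount = Indicator movement distance x coefficient."""
    return indicator_distance * coefficient

# The same gesture mode (indicator movement distance 120) moves the object
# image twice as far on the larger projection area as on the smaller one:
d = 120.0
small = movement_amount(d, movement_amount_coefficient((750, 1000)))   # 120.0
large = movement_amount(d, movement_amount_coefficient((1500, 2000)))  # 240.0
```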
- in step SA6, the projection control unit 31 controls the image processing unit 40 to move the development position of the image data of the object image in the frame memory 41, so that the display position, within the image projection area, of the object image targeted by the movement gesture moves in the first direction by the movement amount calculated by the movement amount adjustment unit 34.
- as a result, in response to the movement gesture, the position of the object image moves by a movement amount corresponding to the size of the image projection area.
- in step SA6, when the object image is an image based on image data supplied from an external device, the projection control unit 31 may perform the following processing.
- that is, the projection control unit 31 transmits, to the external device, information indicating the coordinates of the object image after the movement associated with the movement gesture.
- based on the received information, the external device updates, for example, the information it uses to manage the display position of the object image, setting it to a value corresponding to the actual display position of the object image.
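- The patent does not specify a message format for this notification; the JSON payload and the send() transport in the sketch below are invented for illustration only.

```python
import json

def notify_external_device(send, object_id: str, new_x: int, new_y: int) -> None:
    """After step SA6, report the post-movement coordinates of an object image
    supplied by an external device, so that the device can update the
    display-position information it manages to match the actual display."""
    payload = json.dumps({"object": object_id, "x": new_x, "y": new_y})
    send(payload.encode("utf-8"))  # send() is any byte-oriented transport
```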
- as described above, the projector 100 includes the projection unit 20, which projects a projection image including an object image (first object) onto the screen SC (projection surface),
- the projection size detection unit 32, which detects the size of the image projection area (the display area in which the projection image is projected), and
- the detection unit 33, which detects operations of the indicator S (the first indicator 70 and the second indicator 80) on the screen SC.
- the projector 100 further includes the movement amount adjustment unit 34, which makes the movement amount (second movement amount) of the object image when the operation of the indicator S is performed while the image projection area has a second size larger than the first size (the same operation as that performed in the case of the first size) larger than the movement amount (first movement amount) of the object image when the operation of the indicator S (first operation) is performed while the image projection area has the first size.
- in other words, the movement amount adjustment unit 34 increases the movement amount of the object image for the same operation of the indicator S as the size of the image projection area increases. With this configuration, the larger the image projection area, the larger the movement amount of the object image when a movement gesture of the same mode is performed, which makes it easier for the operator to move the object image.
- in the above embodiment, the movement gesture is an operation that shifts continuously from a state in which the indicator S moves in contact with the screen SC to a state in which it moves without contact. The movement amount adjustment unit 34 then calculates the movement amount of the object image by multiplying the movement distance of the indicator S after it has shifted to the non-contact state (the indicator movement distance) by a coefficient corresponding to the size of the image projection area (the movement amount coefficient). With this configuration, a gesture performed intentionally by the operator in order to move the object image can be determined to be a movement gesture.
- moreover, the gesture determined to be a movement gesture is reminiscent of the action of sliding a physical sheet of paper in a given direction, so it is easy for the user to imagine what to do for the operation to be recognized as a movement gesture.
- in addition, by using the movement amount coefficient, the movement amount adjustment unit 34 can set the movement amount to a value appropriate to the size of the image projection area.
- in the above embodiment, the projection size detection unit 32 causes the projection unit 20 to project a specific pattern image onto the screen SC, causes the photographing unit 51 to photograph the screen SC on which the specific pattern image is projected, and detects the size of the image projection area based on the result of the photographing. With this configuration, no user work is required to detect the size of the image projection area, which improves convenience for the user.
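- A simplified sketch of the idea follows; a real implementation would map camera pixels to screen coordinates with a homography obtained during calibration, and the scale constant here is an assumption.

```python
def projection_area_size(pattern_px_width: float, pattern_px_height: float,
                         mm_per_camera_pixel: float) -> tuple:
    """Estimate the (horizontal, vertical) size of the image projection area
    in millimetres from the extent, in camera pixels, of the photographed
    specific pattern image."""
    return (pattern_px_width * mm_per_camera_pixel,
            pattern_px_height * mm_per_camera_pixel)
```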
- the first indicator 70 is not limited to a pen-type indicator, and the second indicator 80 is not limited to the operator's finger.
- for example, a laser pointer or an indicator bar may be used as the first indicator 70, and its shape and size are not limited.
- in the above embodiment, the position detection unit 50 specifies the positions of the first indicator 70 and the second indicator 80 by photographing the screen SC with the photographing unit 51, but the invention is not limited to this.
- for example, the photographing unit 51 is not limited to one that is provided in the main body of the projector 100 and photographs in the projection direction of the projection optical system 23.
- the photographing unit 51 may be configured separately from the main body of the projector 100, and may photograph the screen SC from the side or the front.
- in the above embodiment, a mode was described in which the user operates the first indicator 70 and the second indicator 80 on the screen SC on which an image is projected (displayed) by the front projection type projector 100.
- alternatively, the instruction operation may be performed on a display screen displayed by a display device other than the projector 100.
- as a display device other than the projector 100, a rear projection type projector, a liquid crystal display, or an organic EL (electroluminescence) display can be used.
- a plasma display, a CRT (cathode ray tube) display, an SED (surface-conduction electron-emitter display), or the like can also be used.
- in the above embodiment, the synchronization signal is described as being transmitted to the first indicator 70 as an infrared signal, but the signal used for synchronization is not limited to an infrared signal.
- for example, the synchronization signal may be transmitted by radio wave communication or ultrasonic wireless communication. This configuration can be realized by providing the projector 100 with a transmission unit 52 that transmits a signal by radio wave communication or ultrasonic wireless communication, and providing the first indicator 70 with a corresponding reception unit.
- in the above embodiment, the light modulation device 22 that modulates light emitted from the light source was described using, as an example, a configuration with three transmissive liquid crystal panels corresponding to the RGB colors, but the invention is not limited to this.
- for example, a configuration using three reflective liquid crystal panels may be adopted, or a system combining one liquid crystal panel with a color wheel may be used.
- alternatively, a system using three digital mirror devices (DMD), or a DMD system combining one digital mirror device with a color wheel, may be used.
- when only one liquid crystal panel or one DMD is used as the light modulation device, a member corresponding to a combining optical system such as a cross dichroic prism is unnecessary.
- besides liquid crystal panels and DMDs, any light modulation device capable of modulating light emitted from the light source can be employed.
- each functional unit of the projector 100 illustrated in FIG. 2 represents a functional configuration, and its specific implementation is not particularly limited. That is, hardware corresponding to each functional unit need not necessarily be implemented individually; it is of course possible to realize the functions of a plurality of functional units by having a single processor execute a program.
- part of the functions realized by software in the above embodiment may be realized by hardware, and part of the functions realized by hardware may be realized by software.
- in addition, the specific detailed configuration of the other parts of the projector 100 can be changed arbitrarily without departing from the gist of the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Hardware Design (AREA)
- Geometry (AREA)
- Controls And Circuits For Display Device (AREA)
- Projection Apparatus (AREA)
- Position Input By Displaying (AREA)
- Transforming Electric Information Into Light Information (AREA)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/560,380 US20180075821A1 (en) | 2015-03-30 | 2016-03-18 | Projector and method of controlling projector |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2015-068094 | 2015-03-30 | ||
| JP2015068094A JP6665415B2 (ja) | 2015-03-30 | 2015-03-30 | プロジェクター、及び、プロジェクターの制御方法 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016157804A1 (ja) | 2016-10-06 |
Family ID=57004905
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2016/001602 Ceased WO2016157804A1 (ja) | 2015-03-30 | 2016-03-18 | プロジェクター、及び、プロジェクターの制御方法 |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US20180075821A1 (en) |
| JP (1) | JP6665415B2 (ja) |
| WO (1) | WO2016157804A1 (ja) |
Families Citing this family (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2019078845A (ja) * | 2017-10-23 | 2019-05-23 | セイコーエプソン株式会社 | プロジェクターおよびプロジェクターの制御方法 |
| JP6868665B2 (ja) * | 2019-07-03 | 2021-05-12 | 三菱電機インフォメーションシステムズ株式会社 | データ入力装置、データ入力方法及びデータ入力プログラム |
| JP7709444B2 (ja) * | 2020-08-28 | 2025-07-16 | 富士フイルム株式会社 | 制御装置、制御方法、制御プログラム、及び投影システム |
| CN114363595B (zh) * | 2021-12-31 | 2024-06-04 | 上海联影医疗科技股份有限公司 | 一种投影装置及检查设备 |
| CN116074592A (zh) * | 2021-10-29 | 2023-05-05 | 中强光电股份有限公司 | 投影系统及其控制方法 |
Family Cites Families (10)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2002003316A1 (en) * | 2000-07-05 | 2002-01-10 | Smart Technologies Inc. | Camera-based touch system |
| US20070013873A9 (en) * | 2004-04-29 | 2007-01-18 | Jacobson Joseph M | Low cost portable computing device |
| KR20080040930A (ko) * | 2006-11-06 | 2008-05-09 | 삼성전자주식회사 | 컴퓨터 시스템 및 그 제어방법 |
| US8297757B2 (en) * | 2008-10-29 | 2012-10-30 | Seiko Epson Corporation | Projector and projector control method |
| JP5488306B2 (ja) * | 2010-07-29 | 2014-05-14 | 船井電機株式会社 | プロジェクタ |
| US20120044140A1 (en) * | 2010-08-19 | 2012-02-23 | Sanyo Electric Co., Ltd. | Information display system and program, and optical input system, projection-type images and display apparatus |
| US20150160741A1 (en) * | 2012-06-20 | 2015-06-11 | 3M Innovative Properties Company | Device allowing tool-free interactivity with a projected image |
| JP2014089522A (ja) * | 2012-10-29 | 2014-05-15 | Kyocera Corp | 電子機器及び制御プログラム並びに電子機器の動作方法 |
| US10394434B2 (en) * | 2013-02-22 | 2019-08-27 | Samsung Electronics Co., Ltd. | Apparatus and method for recognizing proximity motion using sensors |
| US9785243B2 (en) * | 2014-01-30 | 2017-10-10 | Honeywell International Inc. | System and method for providing an ergonomic three-dimensional, gesture based, multimodal interface for use in flight deck applications |
- 2015
  - 2015-03-30 JP JP2015068094A patent/JP6665415B2/ja active Active
- 2016
  - 2016-03-18 US US15/560,380 patent/US20180075821A1/en not_active Abandoned
  - 2016-03-18 WO PCT/JP2016/001602 patent/WO2016157804A1/ja not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2003280813A (ja) * | 2002-03-25 | 2003-10-02 | Ejikun Giken:Kk | ポインティングデバイス、ポインタ制御装置、ポインタ制御方法及びその方法を記録した記録媒体 |
| JP2005148661A (ja) * | 2003-11-19 | 2005-06-09 | Sony Corp | 情報処理装置、および情報処理方法 |
| WO2010032402A1 (ja) * | 2008-09-16 | 2010-03-25 | パナソニック株式会社 | データ表示装置、集積回路、データ表示方法、データ表示プログラム及び記録媒体 |
| JP2012220678A (ja) * | 2011-04-07 | 2012-11-12 | Seiko Epson Corp | カーソル表示システム |
| JP2014029656A (ja) * | 2012-06-27 | 2014-02-13 | Soka Univ | 画像処理装置および画像処理方法 |
Also Published As
| Publication number | Publication date |
|---|---|
| JP2016188892A (ja) | 2016-11-04 |
| JP6665415B2 (ja) | 2020-03-13 |
| US20180075821A1 (en) | 2018-03-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6064319B2 (ja) | プロジェクター、及び、プロジェクターの制御方法 | |
| US9817301B2 (en) | Projector, projection system, and control method of projector | |
| JP6326895B2 (ja) | 位置検出装置、位置検出システム、及び、位置検出装置の制御方法 | |
| US9992466B2 (en) | Projector with calibration using a plurality of images | |
| JP6349838B2 (ja) | 位置検出装置、位置検出システム、及び、位置検出装置の制御方法 | |
| JP6387644B2 (ja) | 位置検出装置、位置検出システム、及び、位置検出方法 | |
| WO2016157804A1 (ja) | プロジェクター、及び、プロジェクターの制御方法 | |
| JP6051828B2 (ja) | 表示装置、及び、表示装置の制御方法 | |
| JP6350175B2 (ja) | 位置検出装置、プロジェクター、及び、位置検出方法 | |
| EP2916201B1 (en) | Position detecting device and position detecting method | |
| WO2016157803A1 (ja) | 表示装置、表示装置の制御方法、書画カメラ、及び、書画カメラの制御方法 | |
| JP6269801B2 (ja) | プロジェクター、及び、プロジェクターの制御方法 | |
| JP6405836B2 (ja) | 位置検出装置、プロジェクター、及び、位置検出方法 | |
| JP6806220B2 (ja) | 電子デバイス、及び電子デバイスの制御方法 | |
| JP6634904B2 (ja) | 電子デバイス、及び電子デバイスの制御方法 | |
| JP6439398B2 (ja) | プロジェクター、及び、プロジェクターの制御方法 | |
| JP6111682B2 (ja) | 表示装置、および、表示装置の制御方法 | |
| JP6988985B2 (ja) | 電子デバイス、及び電子デバイスの制御方法 | |
| JP6524741B2 (ja) | 位置検出装置、表示装置、位置検出装置の制御方法、及び、表示装置の制御方法 | |
| JP2014120022A (ja) | 表示装置、及び、表示装置の制御方法 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16771697 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 15560380 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 16771697 Country of ref document: EP Kind code of ref document: A1 |