WO2017130504A1 - Image Projection Apparatus (画像投影装置) - Google Patents
Image projection apparatus
- Publication number
- WO2017130504A1 (application PCT/JP2016/082469)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- projection
- unit
- finger
- projection screen
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
- G06F3/0426—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47B—TABLES; DESKS; OFFICE FURNITURE; CABINETS; DRAWERS; GENERAL DETAILS OF FURNITURE
- A47B21/00—Tables or desks for office equipment, e.g. typewriters, keyboards
- A47B21/007—Tables or desks for office equipment, e.g. typewriters, keyboards with under-desk displays, e.g. displays being viewable through a transparent working surface of the table or desk
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/54—Accessories
- G03B21/56—Projection screens
- G03B21/60—Projection screens characterised by the nature of the surface
- G03B21/62—Translucent screens
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1615—Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
- G06F1/1616—Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1637—Details related to the display arrangement, including those related to the mounting of the display in the housing
- G06F1/1639—Details related to the display arrangement, including those related to the mounting of the display in the housing the display being based on projection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
-
- A—HUMAN NECESSITIES
- A47—FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
- A47B—TABLES; DESKS; OFFICE FURNITURE; CABINETS; DRAWERS; GENERAL DETAILS OF FURNITURE
- A47B21/00—Tables or desks for office equipment, e.g. typewriters, keyboards
- A47B21/007—Tables or desks for office equipment, e.g. typewriters, keyboards with under-desk displays, e.g. displays being viewable through a transparent working surface of the table or desk
- A47B2021/0076—Tables or desks for office equipment, e.g. typewriters, keyboards with under-desk displays, e.g. displays being viewable through a transparent working surface of the table or desk the screen being incorporated in the desk top
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
Definitions
- The present invention relates to an image projection apparatus with which a user gives input instructions by performing touch operations on an image projected on a projection screen.
- Various techniques have been proposed for detecting a touch operation when a user touches an image that a projector projects onto a projection screen (see, for example, the prior art described in Patent Document 1).
- In the technique of Patent Document 1, the image projected on the projection screen is captured by a camera installed near the projector, and the difference between the original image supplied to the projector and the image captured by the camera is used to detect the user's finger and its shadow.
- For the touch area on the captured image corresponding to the touch area on the projected image, the area ratio between the shadow region of the finger and the region of the finger itself is computed, and it is determined that the finger has touched the screen when this area ratio falls below a predetermined value.
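The shadow-to-finger area-ratio criterion above can be sketched in a few lines. This is an illustrative reconstruction, not code from Patent Document 1; the boolean-mask inputs, the 0.2 threshold, and all names are assumptions:

```python
import numpy as np

def shadow_touch_test(finger_mask, shadow_mask, ratio_threshold=0.2):
    """Prior-art style touch test: as the fingertip approaches the screen,
    its shadow shrinks under it, so shadow_area / finger_area drops below
    a threshold. Masks are boolean arrays; the threshold is illustrative."""
    finger_area = int(np.count_nonzero(finger_mask))
    shadow_area = int(np.count_nonzero(shadow_mask))
    if finger_area == 0:
        return False  # no finger segmented in this frame
    return (shadow_area / finger_area) < ratio_threshold

# Toy example: a 100-pixel finger blob with a 5-pixel residual shadow.
finger = np.zeros((20, 20), dtype=bool)
finger[5:15, 5:15] = True           # 100 pixels
shadow = np.zeros((20, 20), dtype=bool)
shadow[15:16, 5:10] = True          # 5 pixels -> ratio 0.05 < 0.2
```

As the patent goes on to argue, this single scalar cannot distinguish *which* gesture was performed, only that a touch-like state was reached.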
- Touch operations include various operations such as a tap, a double tap, a long press (long tap), a drag, a flick, and a scroll.
- The technique described in Patent Document 1, however, cannot accurately distinguish these various operations. For example, in a double-tap operation the user taps the image twice with the finger while keeping the finger close to the projection screen. Because Patent Document 1 judges a touch only by whether the area ratio between the finger's shadow and the finger falls below a predetermined value, it is difficult to clearly identify whether the finger actually left and re-touched the screen, and hence which operation was performed.
- The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an image projection apparatus that can accurately identify the content of a touch operation when the user performs various touch operations on the image projected on the projection screen.
- To achieve the above object, an image projection apparatus of the present invention comprises: a projection screen; a projection unit that projects and displays a predetermined image on the projection screen; an imaging unit that has a focusing function and that captures the image projected on the projection screen to obtain image data; a reference data generation unit that, based on the image data obtained by the imaging unit, generates reference data specifying the position and size of the projected image within the imaging range; an image data extraction unit that extracts, from the image data obtained by the imaging unit, image data in which a finger or pointing tool with which the user is operating on the projected image is present and in which that finger or pointing tool is in focus; a position data generation unit that, based on the image data extracted by the image data extraction unit, generates position data specifying the position of the finger or pointing tool within the imaging range; an operation determination unit that determines the content of the operation performed with the finger or pointing tool, based on the image data extracted by the image data extraction unit; and an input control unit that recognizes the content of the input instruction corresponding to the operation with the finger or pointing tool, based on the operation content determined by the operation determination unit, the position data generated by the position data generation unit, and the reference data generated by the reference data generation unit, and that controls the projection unit so as to project an image corresponding to the recognized input instruction onto the projection screen.
- It is preferable that the imaging unit be adjusted so that it is in focus within a range extending a predetermined distance in front of and behind the projection screen, along the direction perpendicular to the screen.
- In the image projection apparatus of the present invention, the imaging unit has a focusing function, and the image data extraction unit extracts, from the image data obtained by the imaging unit, only those frames in which the finger or pointing tool with which the user is operating on the projected image is present and in focus.
- Therefore, if the imaging unit is adjusted so as to be in focus within a short range in front of and behind the projection screen, the image data extracted by the image data extraction unit contains only frames in which the finger or pointing tool is in the immediate vicinity of the projection screen, so the operation determination unit can accurately determine the content of the operation from the extracted image data.
- The input control unit then recognizes the content of the input instruction corresponding to the operation, based on the operation content determined by the operation determination unit, the position data of the finger or pointing tool generated by the position data generation unit, and the reference data generated by the reference data generation unit, and controls the projection unit so as to project an image according to the recognized input instruction onto the projection screen. Accordingly, when the user performs various touch operations on the image projected on the projection screen, the image projection apparatus of the present invention can accurately recognize the input instruction corresponding to each touch operation.
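The step in which position data is interpreted through the reference data, to find where on the projected image the operation landed, can be sketched as follows. The axis-aligned rectangle model of the reference data and all names here are illustrative assumptions, not the patent's concrete method:

```python
def camera_to_image_coords(point, ref_rect, image_size):
    """Map a fingertip position in camera pixels to coordinates on the
    projected image. Assumes the reference data locates the projected
    image as an axis-aligned rectangle (x, y, w, h) in the camera frame;
    a real system with perspective distortion would need a homography."""
    x, y = point
    rx, ry, rw, rh = ref_rect
    u = (x - rx) / rw * image_size[0]   # horizontal position on the image
    v = (y - ry) / rh * image_size[1]   # vertical position on the image
    return (u, v)
```

Given such a mapping, a tap determined by the operation determination unit at camera position (150, 100) can be resolved to a specific key or icon on the projected screen image.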
- In the image projection apparatus of the present invention, a frame may be drawn around each image projected on the projection screen, or predetermined marks may be drawn at the four corners of each projected image, and the reference data generation unit may recognize the position of the frame or marks within the imaging range based on the image data obtained by the imaging unit, and use data on the recognized frame or mark positions as the reference data.
- In this way, the reference data generation unit can easily generate reference data on the position and size of the image.
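One simple way to turn four corner marks into reference data is to locate the mark pixels in a binary image and group them by quadrant. This is a hedged sketch under the assumption that exactly one mark sits in each quadrant of the imaging range; a real implementation would segment each mark blob individually:

```python
import numpy as np

def corner_marks_to_reference(mark_mask):
    """Derive reference data from a binary image containing four corner
    marks: assign each mark pixel to a quadrant relative to the overall
    centroid, then take the per-quadrant centroid as that corner."""
    ys, xs = np.nonzero(mark_mask)
    pts = np.stack([xs, ys], axis=1).astype(float)   # (x, y) pairs
    cx, cy = pts[:, 0].mean(), pts[:, 1].mean()
    corners = {}
    for name, sx, sy in [("tl", -1, -1), ("tr", 1, -1),
                         ("bl", -1, 1), ("br", 1, 1)]:
        sel = ((pts[:, 0] - cx) * sx > 0) & ((pts[:, 1] - cy) * sy > 0)
        corners[name] = tuple(pts[sel].mean(axis=0))
    return corners
```

The four corner coordinates fix both the position and the size of the projected image within the imaging range, which is exactly what the reference data must specify.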
- In the image projection apparatus of the present invention, the projection screen may display an image by being irradiated with projection light from its back side; in this case, the projection unit projects the image from the back side of the projection screen, and the imaging unit acquires image data by capturing the image from the back side of the projection screen.
- It is desirable that the image data extraction unit recognize, in the image data captured through the projection screen, a shape (or a shape and color) corresponding to a finger or pointing tool, and treat the image data containing the recognized shape, or shape and color, as image data in which the finger or pointing tool operating on the image is present.
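A minimal color-based version of this recognition step might look as follows. The RGB thresholds are illustrative assumptions for finger-like skin tones seen through a translucent screen, not values from the patent, and a full implementation would also match fingertip shape:

```python
import numpy as np

def find_finger_pixels(rgb):
    """Rough color rule for finger-like pixels in an RGB frame: red
    channel bright and dominant over green and blue. Purely illustrative."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return ((r > 95) & (g > 40) & (b > 20)
            & (r > g) & (r > b)
            & ((r - np.minimum(g, b)) > 15))

def finger_present(rgb, min_pixels=30):
    """Treat a frame as 'finger present' when enough pixels match."""
    return int(np.count_nonzero(find_finger_pixels(rgb))) >= min_pixels
```

Frames for which `finger_present` is false would simply be skipped by the image data extraction unit.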
- In the image projection apparatus of the present invention, the projection screen may instead display an image by being irradiated with projection light from its front side. In this case, the projection unit projects the image onto the front of the projection screen, and the imaging unit captures the image from the front side of the projection screen to acquire image data.
- When the user irradiates the projection screen with laser light from a laser pointer, the image data extraction unit may recognize the shape and/or color of the laser light in the image data obtained by the imaging unit, and acquire the image data in which the recognized shape and color of the laser light are present as the image data in which the finger or pointing tool operated by the user on the image is present.
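Locating a laser spot by its color is usually simpler than finger detection, because the dot is small, saturated, and much brighter than the projected content. The sketch below assumes a red pointer and illustrative thresholds:

```python
import numpy as np

def locate_laser_dot(rgb, min_red=200, dominance=80):
    """Return the centroid (x, y) of pixels whose red channel is bright
    and strongly dominates green and blue, or None if no spot is found.
    Thresholds are illustrative, not taken from the patent."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mask = (r >= min_red) & (r - g >= dominance) & (r - b >= dominance)
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return (float(xs.mean()), float(ys.mean()))
```

The returned centroid would then play the same role as the fingertip position data in the rear projection configuration.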
- It is desirable that the image projection apparatus further comprise a screen installation table to which the projection screen is attached so that the distance between the projection screen and the projection unit and the distance between the projection screen and the imaging unit are kept substantially constant. With such a screen installation table, the user can easily set the positional relationship between the projection screen and the projection unit and between the projection screen and the imaging unit.
- The image projection apparatus of the present invention can accurately recognize the input instruction corresponding to a touch operation when the user performs various touch operations on the image projected on the projection screen.
- FIG. 1 is a schematic configuration diagram of a computer terminal including an image projection apparatus according to an embodiment of the present invention.
- FIG. 2 is a schematic block diagram of a computer terminal provided with the image projection apparatus of this embodiment.
- FIG. 3(a) is a schematic perspective view of the screen installation table,
- FIG. 3(b) is a schematic side view showing the screen installation table being folded, and
- FIG. 3(c) is a schematic plan view of the screen installation table in the folded state.
- FIG. 4 is a diagram showing an example of an image on the character input screen.
- FIG. 5 is a diagram schematically showing a finger portion whose shape or color has changed when the user touches the projection screen with the finger.
- FIG. 6 is a diagram illustrating an example of an image of a character input screen in which the character key image is highlighted when the user performs a tap operation on the character key image.
- FIG. 7 is a flowchart for explaining the procedure of input processing in the image projection apparatus of this embodiment.
- FIG. 8 is a schematic configuration diagram of a computer terminal including a front projection type image projection apparatus which is a modification of the present invention.
- FIG. 9A is a schematic perspective view of a smartphone provided with the image projection apparatus of the present invention, and FIG. 9B is a schematic side view of the smartphone.
- FIG. 10A is a schematic perspective view of a wristwatch provided with the image projection apparatus of the present invention, and FIG. 10B is a diagram showing a state when a projection screen is installed in the wristwatch.
- FIG. 11(a) is a diagram showing an example of an automobile provided with the rear projection type image projection apparatus of the present invention, and
- FIG. 11(b) is a diagram showing an example of an automobile provided with the front projection type image projection apparatus of the present invention.
- FIG. 12 is a diagram showing an example of a desk provided with the image projection apparatus of the present invention.
- FIG. 13 is a diagram showing an example of a compact mirror provided with the image projection apparatus of the present invention.
- FIG. 1 is a schematic configuration diagram of a computer terminal including an image projection apparatus according to an embodiment of the present invention
- FIG. 2 is a schematic block diagram of the computer terminal including the image projection apparatus of the present embodiment.
- When the user performs an operation with a finger or pointing tool on the image projected on the projection screen, the image projection apparatus of the present invention recognizes the content of the input instruction corresponding to that operation, and then projects an image corresponding to the recognized input instruction onto the projection screen.
- the computer terminal 1 includes a computer main body 10, a liquid crystal display unit 20, a touch panel 30, and an image projection apparatus 100 according to the present embodiment.
- the liquid crystal display unit 20 is provided on the upper surface of the casing of the computer main body 10.
- a touch panel 30 is provided on the screen of the liquid crystal display unit 20.
- The touch panel 30 detects the position at which a touch operation is performed on the screen of the liquid crystal display unit 20, and outputs contact position information indicating the detected position to the computer main body 10 via a connection unit (not shown). Various screens such as a menu screen, an application screen, and a character input screen are displayed on the screen of the liquid crystal display unit 20. A (touch panel) display other than liquid crystal may be used instead of the liquid crystal display unit 20.
- the image projection apparatus 100 of this embodiment plays a role as a display device and an input device in the computer terminal 1.
- As the image projection apparatus 100, a rear projection type apparatus that projects an image from the back side of the projection screen is used.
- the image projection apparatus 100 includes a projection screen 110, a projection unit 120, an imaging unit 130, a control unit 140, a storage unit 150, and a screen installation table 160.
- the projection unit 120, the imaging unit 130, the control unit 140, and the storage unit 150 are provided in the casing of the computer main body 10.
- the control unit 140 controls the image projection apparatus 100, but also performs overall control of the computer terminal 1 in the present embodiment. That is, the control unit 140 also serves as a control unit of the computer main body 10.
- the control unit that controls the image projection apparatus 100 and the control unit of the computer main body 10 may be provided separately.
- As the projection screen 110, a transmissive screen, that is, a rear projection screen that displays an image by being irradiated with projection light from its back side, is used.
- The projection screen 110 is a single hard-type (rigid) screen and is fixed to the screen installation table 160.
- a folding type is used as the screen installation table 160.
- FIG. 3(a) is a schematic perspective view of the screen installation table 160,
- FIG. 3(b) is a schematic side view showing the screen installation table 160 being folded, and
- FIG. 3(c) is a schematic plan view of the screen installation table 160 in the folded state.
- the screen installation table 160 is for installing the projection screen 110 and the computer main body 10 so that the projection screen 110 and the computer main body 10 are arranged in a predetermined positional relationship.
- the screen installation table 160 includes two base portions 161 a and 161 b and two pole portions 162 and 162.
- FIG. 3 shows a screen installation table 160 on which only the projection screen 110 is installed.
- Each of the base portions 161a and 161b is a substantially rectangular plate-like member, and the two base portions 161a and 161b are connected so as to be foldable. In one base portion 161a, the two pole portions 162, 162 are attached to the end opposite the side connected to the other base portion 161b.
- Each pole portion 162 is a rod-like member, and a groove is formed along its central axis on the side surface facing the other pole portion.
- the projection screen 110 is attached to the screen installation table 160 by being inserted between the grooves of the two pole portions 162 and 162. Further, as shown in FIG. 1, the computer main body 10 is placed on the other base portion 161b.
- a recess 163 is formed at a predetermined location on the surface of the base portion 161b. The recess 163 is used as a mark indicating a place where the computer main body 10 is installed.
- Simply by inserting the projection screen 110 between the two pole portions 162, 162 and placing the computer main body 10 on the recess 163 of the base portion 161b, the user can easily set the positional relationship between the projection screen 110 and the computer main body 10 so that the distance between the projection screen 110 and the projection unit 120 and the distance between the projection screen 110 and the imaging unit 130 are substantially constant. Further, by folding the two base portions 161a and 161b as shown in FIG. 3(b), the screen installation table 160 can be made flat as shown in FIG. 3(c).
- the projection unit 120 projects and displays an image of a predetermined screen on the projection screen 110 from the back side of the projection screen 110.
- the projection screen 110 serves as a display device for the computer terminal 1.
- the projection screen 110 displays images for various screens such as a menu screen and an operation screen on which various icons are displayed, and a character input screen for inputting characters.
- FIG. 4 is a diagram showing an example of an image on the character input screen.
- the character input screen 200 includes a keyboard image 210 and a display area 220 for displaying input characters and the like.
- the keyboard image 210 is provided with a plurality of character key images associated with each character (including symbols) and a plurality of function key images with a specific function.
- a frame is drawn on each screen image projected onto the projection screen 110 by the projection unit 120.
- For example, the image of the character input screen 200 shown in FIG. 4 includes a rectangular outer frame 230 that surrounds the keyboard image 210 and the display area 220.
- the user can give various instructions to the control unit 140 by performing predetermined operations on the image of the screen projected on the projection screen 110 with a finger or an indicator.
- In the present embodiment, a pointing stick is used as the pointing tool.
- Specifically, the user gives an instruction either by bringing a finger or pointing tool into contact, from the front side of the projection screen 110, with the screen image projected on the projection screen 110, or by performing a predetermined operation in the vicinity of the projected screen image without bringing the finger or pointing tool into contact with it.
- the control unit 140 recognizes the content of the instruction and controls the projection unit 120 to project a screen image corresponding to the recognized content of the instruction onto the projection screen 110.
- the predetermined operations include the various operations performed as touch operations on a normal touch panel, such as a tap operation, a double tap operation, a long press operation (long tap), a drag operation, a flick operation, and a scroll operation. Note that each operation in the present embodiment means the operation corresponding to a touch operation on a normal touch panel, regardless of whether it is performed by actually bringing a finger or the like into contact with the image.
- the imaging unit 130 captures an image projected on the projection screen 110 and acquires image data.
- the imaging unit 130 includes a camera unit 131, an image processing unit 132, and a camera control unit 133.
- the camera unit 131 has a lens and an image sensor. As shown in FIG. 1, the camera unit 131 images the projection screen 110 from the back side.
- the range (imaging range) that can be captured by the camera unit 131 covers the entire projection screen 110. Further, in the present embodiment, the camera unit 131 also captures, via the transmissive projection screen 110, the finger or indicator that the user is operating on the image projected on the projection screen 110.
- the image processing unit 132 performs correction processing for the color and gradation of the captured image, and performs image processing such as compression of the image data for the image data captured by the camera unit 131.
- the camera control unit 133 controls the image processing unit 132 and controls exchange of image data with the control unit 140.
- the image processing unit 132 may be provided in the control unit 140 instead of the imaging unit 130.
- the imaging unit 130 has a function of focusing.
- the camera control unit 133 includes an autofocus control unit 133a.
- the autofocus control unit 133a adjusts (controls) the camera unit 131 so that the projection screen 110 is in focus.
- when the imaging operation starts, the autofocus control unit 133a performs focusing processing so that the focus is at a position separated from the camera unit 131 by a predetermined distance.
- the autofocus control unit 133a adjusts the focus so that subjects within a predetermined distance in front of and behind the projection screen 110, along the direction perpendicular to the screen, are in focus.
- the focus range is limited to a narrow range.
- as an example, consider a case in which the autofocus control unit 133a adjusts the focus so that subjects within 10 mm in front of and behind the projection screen 110 are in focus.
- the imaging unit 130 images in-focus subjects; if a subject is located farther from the projection screen 110 than the depth of field allows, it cannot be imaged in focus.
- as the autofocus method, an active method, in which the subject is irradiated with infrared rays or ultrasonic waves and the distance is detected from the time until the reflected wave returns and from the irradiation angle, may be used, or a passive method, such as a phase difference detection method or a contrast detection method that measures distance using the image captured through the lens of the camera unit 131, may be used.
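The contrast detection method mentioned above can be sketched as follows: the lens is swept through candidate positions and the position whose captured image shows the highest local contrast is kept. This is only an illustrative Python sketch, not the apparatus's implementation; the `capture_at` callback and the simple horizontal-difference contrast score are assumptions.

```python
def sharpness(image):
    """Contrast score for a grayscale image (list of pixel rows):
    sum of squared differences between horizontally adjacent pixels.
    A sharper (in-focus) image has stronger local contrast."""
    return sum(
        (row[x + 1] - row[x]) ** 2
        for row in image
        for x in range(len(row) - 1)
    )

def best_focus_position(capture_at, positions):
    """Sweep candidate lens positions and keep the one whose captured
    image has the highest contrast score."""
    return max(positions, key=lambda p: sharpness(capture_at(p)))
```

In a real contrast-detection loop the sweep would be incremental (hill climbing) rather than exhaustive, but the scoring principle is the same.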
- Image data obtained by imaging with the imaging unit 130 is sent to the control unit 140 and stored in the storage unit 150 by the control unit 140.
- the imaging unit 130 has a still image shooting function and a moving image shooting function, and the control unit 140 can acquire still image data or moving image data as image data as necessary.
- the control unit 140 controls the entire image projection apparatus 100.
- the control unit 140 controls the projection unit 120 in order to project the screen onto the projection screen 110 or controls imaging by the imaging unit 130.
- the control unit 140 includes a display control unit 141, a reference data generation unit 142, an image data extraction unit 143, a position data generation unit 144, an operation determination unit 145, and an input control unit 146.
- the display control unit 141 controls the liquid crystal display unit 20 and the projection unit 120. Specifically, when the user performs a touch operation on the liquid crystal display unit 20, the display control unit 141 recognizes the content of the instruction given by the touch operation based on the contact position information sent from the touch panel 30. If the recognized instruction is to display a predetermined screen on the liquid crystal display unit 20, the display control unit 141 controls the liquid crystal display unit 20 to display that screen. If the recognized instruction is to display a predetermined screen image on the projection screen 110, the display control unit 141 controls the projection unit 120 to display that screen image on the projection screen 110.
- the display control unit 141 performs a process of adjusting the image so that the image is clearly displayed on the projection screen 110 when the projection unit 120 starts the process of projecting the image.
- the display control unit 141 may also control the liquid crystal display unit 20 to display on it the same image as the image displayed on the projection screen 110.
- the reference data generation unit 142 generates reference data related to the image projected on the projection screen 110 based on the image data obtained by the imaging unit 130.
- This reference data is data for specifying the position and size of an image in the imaging range.
- the position data of the four corners of the outer frame 230 of the screen can be used as the reference data.
- an XY coordinate system in which the left-right direction is the X-axis direction and the up-down direction is the Y-axis direction is set within the imaging range of the imaging unit 130.
- the reference data generation unit 142 recognizes the position of the outer frame 230 of the screen based on the image data obtained by the imaging unit 130, and acquires the data on the recognized position of the outer frame 230, that is, the XY coordinates of the four corners of the outer frame 230 in the XY coordinate system, as reference data.
- the reference data generated by the reference data generation unit 142 is stored in the storage unit 150.
- in this way, the reference data generation unit 142 can easily generate reference data specifying the position and size of the image.
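Once the four corner coordinates are known, they can be reduced to the bounds of the projected screen and used to express any point in the imaging range relative to the screen. A minimal Python sketch under the assumption that the outer frame is axis-aligned in the XY coordinate system (the dictionary layout is illustrative, not from the patent):

```python
def make_reference_data(corners):
    """corners: the four (x, y) points of the outer frame 230 in
    imaging-range XY coordinates. Returns the axis-aligned bounds
    that serve as reference data for the screen's position and size."""
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    return {"left": min(xs), "top": min(ys),
            "width": max(xs) - min(xs), "height": max(ys) - min(ys)}

def to_screen_coords(ref, point):
    """Map an imaging-range point to screen-relative coordinates
    (0..1 on each axis) using the reference data."""
    x, y = point
    return ((x - ref["left"]) / ref["width"],
            (y - ref["top"]) / ref["height"])
```

If the projection were skewed, a perspective (homography) mapping from the four corners would replace the simple axis-aligned normalization.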
- the image data extraction unit 143 extracts, from the image data obtained by the imaging unit 130, image data in which a finger or an indicator operated by the user on the image projected on the projection screen 110 is present and in which that finger or indicator is in focus.
- for this extraction, a general image recognition method is used. Since the focus range is limited to a narrow range around the projection screen 110, the image data extracted by the image data extraction unit 143 includes only image data in which the finger or the pointing tool exists in the vicinity of the projection screen 110.
- the extracted image data is stored in the storage unit 150.
- the image data extraction unit 143 generates data (time data) on the time at which each extracted image data was captured, and stores the generated time data in the storage unit 150 in association with the image data.
- subsequent processing is performed based on the image data extracted by the image data extraction unit 143.
- specifically, when the user performs an operation on the image from the surface side of the projection screen 110 with a finger or an indicator, the image data extraction unit 143 recognizes, from the image data captured via the projection screen 110, the shape, or the shape and color, corresponding to the finger or pointing tool, and acquires the image data in which the recognized shape, or shape and color, is present as image data in which the finger or pointing tool operated by the user exists. Further, when the shape or color of the finger or indicator changes upon touching the projection screen 110, the image data extraction unit 143 may determine the presence of the finger or indicator by recognizing the changed shape or color.
- that is, when the user performs an operation by bringing a finger or an indicator into contact with the image from the surface side of the projection screen 110, the image data extraction unit 143 may recognize, from the image data captured through the projection screen 110, the change in shape, the change in color, or the change in both shape and color of the finger or indicator, and acquire the image data in which the recognized change exists as image data in which the finger or pointing tool operated by the user is present.
- as the pointing tool, one whose tip is formed of a soft material and is easily deformed when touching the projection screen 110 can be used.
- FIG. 5 schematically shows a finger portion whose shape or color has changed when the user touches the projection screen 110 with a finger.
- the image data extraction unit 143 can determine that a finger is present when it recognizes the changed shape or color of such a finger portion.
- the image data extraction unit 143 may also determine whether or not a predetermined graphic is present in the image data, and determine that a finger or an indicator is present if the graphic is found. Since recognizing such a graphic is easier than recognizing the finger itself, the process of determining the presence of the finger or the like can be performed easily and quickly.
- rear projection screens include not only transparent screens but also translucent and milky-white screens.
- with a translucent or milky-white projection screen, it is difficult to determine whether the finger or the indicator is in focus.
- this problem can be addressed by, for example, the following method: when a translucent, milky-white, or other such projection screen is used, at least one image in which the finger or indicator is in contact with the projection screen is acquired in advance, and these images are preset as images in which the finger or indicator should be judged to be in focus.
- the image data extraction unit 143 can then accurately determine whether the finger or pointing tool is in focus by comparing the actually captured image data with the preset images.
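The comparison with the preset images could, for instance, use a simple per-pixel difference score: a captured frame close enough to any preset "touching" image is judged to show an in-focus finger or indicator. A sketch under that assumption (the grayscale-grid representation and the threshold value are illustrative):

```python
def mean_abs_diff(a, b):
    """Average absolute per-pixel difference between two same-size
    grayscale images represented as lists of pixel rows."""
    n = sum(len(row) for row in a)
    return sum(abs(p - q)
               for ra, rb in zip(a, b)
               for p, q in zip(ra, rb)) / n

def looks_in_focus(captured, reference_images, threshold=20):
    """True if the captured frame is close enough to any preset image
    of a finger/indicator touching the screen."""
    return any(mean_abs_diff(captured, ref) <= threshold
               for ref in reference_images)
```

A production system would normalize brightness and compare only the region around the candidate finger, but the accept/reject structure would be the same.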
- the image data extraction unit 143 may also perform, on the image data obtained by the imaging unit 130, a process of excluding the data of the image currently projected on the projection screen 110, and determine the presence of the finger or pointing tool based on the processed image data. Thereby, the process of determining the presence of the finger or pointing tool can be performed easily and quickly.
- the position data generation unit 144 generates position data specifying the position of the finger or pointing tool in the imaging range of the imaging unit 130 based on the image data extracted by the image data extraction unit 143. Specifically, the position data generation unit 144 acquires, as XY coordinates, the position where the finger or pointing tool exists in the XY coordinate system set within the imaging range (for example, the center position of the tip of the finger or the center position of the tip of the pointing tool). The acquired position data is stored in the storage unit 150 in association with the image data.
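Taking "the center position of the tip" as the centroid of the pixels classified as the fingertip gives one simple way to produce the XY coordinates. An illustrative Python sketch, assuming a binary mask of fingertip pixels has already been obtained (the mask representation is not from the patent):

```python
def finger_position(mask):
    """mask: 2-D grid of 0/1 values marking pixels classified as the
    finger or indicator tip. Returns the centroid (x, y) of those
    pixels in imaging-range XY coordinates, or None if no pixel is set."""
    pts = [(x, y) for y, row in enumerate(mask)
           for x, v in enumerate(row) if v]
    if not pts:
        return None
    return (sum(x for x, _ in pts) / len(pts),
            sum(y for _, y in pts) / len(pts))
```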
- the operation determination unit 145 determines what kind of operation the operation by the finger or the pointing tool is based on the image data extracted by the image data extraction unit 143.
- the image data extracted by the image data extraction unit 143 includes only the image data in which the finger or the pointing tool exists in the vicinity of the projection screen 110.
- the operation determination unit 145 determines the content of the operation by examining, for a series of image data obtained in time series, whether and how the finger or pointing tool has moved, based on the position data and time data associated with each image data.
- for example, when image data in which the position of the finger or pointing tool hardly changes is extracted only for a short time, the operation determination unit 145 determines that the finger or pointing tool is performing a tap operation. When, after such image data is extracted for a short time, image data in which the finger or pointing tool exists again at substantially the same position is extracted within a short time, it determines that a double tap operation is being performed. Further, when image data in which the position of the finger or pointing tool hardly changes is extracted for a certain time or more, it determines that a long press operation is being performed.
- in this way, by determining the content of the operation based on the image data extracted by the image data extraction unit 143, the operation determination unit 145 can accurately recognize which of the various operations, such as a tap operation, a double tap operation, a long press operation, or a drag operation, is being performed with the finger or pointing tool. Data on the recognized operation content is stored in the storage unit 150.
- in the present embodiment, the imaging unit 130 is adjusted so that subjects within 10 mm in front of and behind the projection screen 110 are in focus, so the user can perform a tap operation or the like simply by positioning a finger or indicator within about 10 mm of the projection screen 110, without actually touching it. Since the image data extraction unit 143 extracts only image data in which the finger or pointing tool exists within this range of about 10 mm from the projection screen 110, the operation determination unit 145 can accurately identify the content of the operation by determining it based on the image data extracted by the image data extraction unit 143.
- when the user performs an operation by bringing a finger or indicator into contact with the projection screen 110, the image data extraction unit 143 can recognize the change in shape or color at the time of the touch and extract the image data in which the recognized change exists. In this case, since the user operates by touching the projection screen 110 with the finger or pointing tool, the user can operate as if operating an existing touch panel, and the operation determination unit 145 can likewise accurately identify the content of the operation by determining it based on the image data extracted by the image data extraction unit 143.
- the input control unit 146 recognizes the content of the input instruction corresponding to the operation with the finger or pointing tool based on the data on the operation content determined by the operation determination unit 145, the position data generated by the position data generation unit 144, and the reference data generated by the reference data generation unit 142, and controls the projection unit 120 to project an image corresponding to the recognized input instruction onto the projection screen 110.
- for example, when a menu screen is projected, the input control unit 146 can recognize the range in which the menu screen exists within the imaging range of the imaging unit 130 based on the reference data on the screen image.
- since the input control unit 146 knows the configuration of the menu screen in advance, it can recognize the position and range of each icon on the menu screen. Therefore, for example, when the user double-taps an icon on the menu screen with a finger, the input control unit 146 can examine which icon's region on the menu screen contains the finger position obtained from the finger position data.
- the input control unit 146 then recognizes that the input instruction corresponding to the operation is an instruction to display the screen image related to the operated icon, and performs a process of displaying that screen image on the projection screen 110.
- likewise, when the character input screen 200 is projected, the input control unit 146 can recognize the range in which the character input screen 200 exists within the imaging range of the imaging unit 130 based on the reference data on the screen image. In addition, since the input control unit 146 knows the configuration of the character input screen 200 in advance, it can also recognize the range of the keyboard image 210 and the region of each character key image on the character input screen 200. Therefore, for example, when the user taps a character key image on the keyboard image 210 with a finger, the input control unit 146 can identify the operated character key by examining which character key image's region in the keyboard image 210 contains the finger position obtained from the finger position data, and can identify that the operation is a tap operation by examining the content of the operation.
- the input control unit 146 then recognizes that the input instruction corresponding to the operation is an input instruction for the character corresponding to the operated character key, and performs a process of displaying that character in the display area 220.
- when the input control unit 146 recognizes that the user has tapped a character key image and has thereby given an instruction to input the character corresponding to the operated character key, it may perform a process of displaying on the projection screen 110 an image of the character input screen 200 in which only the operated character key image is highlighted. As a result, the user can easily confirm which character key was operated.
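The icon and character-key identification described above amounts to hit-testing: normalizing the finger position using the four-corner reference data and looking it up in the known screen layout. An illustrative Python sketch; the reference-data dictionary and the screen-relative key layout are assumed representations, not the patent's data structures.

```python
def hit_key(ref, key_layout, finger_xy):
    """ref: reference data giving the bounds of the projected screen
    within the imaging range (from the four corners of the outer frame).
    key_layout: {label: (x0, y0, x1, y1)} regions in screen-relative
    0..1 coordinates, known in advance from the screen configuration.
    finger_xy: finger position in imaging-range XY coordinates.
    Returns the label of the key/icon under the finger, or None."""
    fx = (finger_xy[0] - ref["left"]) / ref["width"]
    fy = (finger_xy[1] - ref["top"]) / ref["height"]
    for label, (x0, y0, x1, y1) in key_layout.items():
        if x0 <= fx < x1 and y0 <= fy < y1:
            return label
    return None
```

Combining the returned label with the operation content (tap, double tap, and so on) yields the input instruction.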
- FIG. 6 shows an example of an image of the character input screen 200 on which the character key image is highlighted when the user taps the character key image.
- the storage unit 150 stores various programs and data.
- the programs stored in the storage unit 150 include, for example, a screen display processing program for performing screen display processing, such as screen switching, when an operation for selecting an icon is performed on the menu screen, and a character input processing program for performing character input processing when an operation for selecting a character key is performed on the character input screen 200.
- the data stored in the storage unit 150 includes, for example, image data for various screens. Further, the storage unit 150 is also used as a working memory.
- FIG. 7 is a flowchart for explaining a procedure of input processing in the image projection apparatus 100 of the present embodiment.
- the user, for example, operates the touch panel 30 to instruct that a predetermined screen image be displayed on the projection screen 110.
- the control unit 140 controls the projection unit 120 to display an image of the screen on the projection screen 110 (S1).
- the control unit 140 starts an imaging operation by the imaging unit 130 (S2).
- the control unit 140 causes the camera control unit 133 to execute a focusing process (S3).
- specifically, the autofocus control unit 133a of the camera control unit 133 controls the camera unit 131 so that the focus is at a position separated from the camera unit 131 by a predetermined distance set in advance.
- this focusing process can be performed by another method.
- for example, if the projection screen 110 itself is provided with an infrared-reflecting coating, if a frame is provided on the projection screen 110, or if reflecting portions are provided at the four corners of the projection screen 110, the camera control unit 133 can use the autofocus function to detect the distance to the projection screen 110 and control the camera unit 131 so that the focus is at a position separated by the detected distance.
- a reflection part may be provided on the screen installation table 160.
- the display control unit 141 controls the projection unit 120 to perform projection image adjustment processing so that the image is clearly displayed on the projection screen 110 (S4). Specifically, in the projection image adjustment processing, the display control unit 141 determines, based on the image data captured by the imaging unit 130, whether the currently displayed image appears clearly in the image data. If it does, the display control unit 141 recognizes that the image is clearly displayed on the projection screen 110. Further, based on the distance information to the projection screen 110 obtained using the autofocus function of the imaging unit 130 and the image data of the projection screen 110 captured by the imaging unit 130, the display control unit 141 can also recognize the positional relationship between the projection unit 120 and the projection screen 110 and control the projection unit 120 to project the image onto the projection screen 110 with a size, shape, angle, and so on according to the recognized positional relationship.
- thereafter, to give an input instruction to the computer main body 10, the user performs a predetermined operation with a finger or an indicator on the screen image projected on the projection screen 110, from the surface side of the projection screen 110.
- the user's operation is imaged by the imaging unit 130 via the projection screen 110, and the obtained image data is sent to the image processing unit 132.
- the image processing unit 132 performs predetermined image processing on the image data (S5), and the image data subjected to the image processing is sent to the control unit 140.
- next, the reference data generation unit 142 generates reference data on the image of the screen projected on the projection screen 110 based on the image data obtained by the imaging unit 130 (S6). Specifically, the reference data generation unit 142 acquires the position data (XY coordinates) of the four corners of the outer frame 230 of the screen, as shown in FIG. 4, and uses the acquired position data of the four corners as the reference data. This reference data is temporarily stored in the storage unit 150.
- next, the image data extraction unit 143 extracts, from the image data obtained by the imaging unit 130, image data in which a finger or indicator operated by the user on the screen image projected on the projection screen 110 is present and in which that finger or indicator is in focus (S7). Thereby, only the image data in which the finger or pointing tool exists in the vicinity of the projection screen 110 is acquired.
- the extracted image data is temporarily stored in the storage unit 150.
- the image data extraction unit 143 generates time data regarding the time at which the image data was captured for the extracted image data, and stores the generated time data in the storage unit 150 in association with the image data.
- next, based on the image data extracted by the image data extraction unit 143, the position data generation unit 144 generates position data (XY coordinates) specifying the position of the finger or pointing tool (for example, the center position of the tip of the finger or of the pointing tool) (S8).
- the generated position data is temporarily stored in the storage unit 150 in association with the image data.
- next, the operation determination unit 145 determines what kind of operation the operation by the finger or pointing tool is, based on the image data extracted by the image data extraction unit 143 (S9). For example, the operation determination unit 145 determines the operation content from the plurality of image data extracted by the image data extraction unit 143, using the position data and time data associated with each image data. Data on the recognized operation content is temporarily stored in the storage unit 150.
- the input control unit 146 includes data relating to the content of the operation determined by the operation determination unit 145, position data generated by the position data generation unit 144, and reference data generated by the reference data generation unit 142. Based on this, the content of the input instruction corresponding to the operation with the finger or the pointing tool is recognized (S10), and the projection unit 120 is configured to project the screen image corresponding to the recognized content of the input instruction onto the projection screen 110. Control (S11).
- for example, when the user double-taps an icon on the menu screen with a finger, the input control unit 146 can identify the icon that is the target of the operation based on the finger position data and can identify the operation based on the data on the operation content, and can therefore recognize that the input instruction corresponding to the operation is an instruction to display the screen related to the operated icon.
- similarly, when the user taps a character key image with a finger, the input control unit 146 can identify the operated character key based on the finger position data, and can recognize that the input instruction corresponding to the operation is an input instruction for the character corresponding to the operated character key.
- after step S11, the control unit 140 determines whether an instruction to end image projection has been received from the user (S12). If such an instruction has been received, the input processing shown in FIG. 7 ends. Otherwise, the process returns to step S5 and the input processing continues. Note that the user can give the instruction to end image projection by, for example, operating the touch panel 30 or the projection screen 110.
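The repeated portion of the flowchart (steps S5 through S12) can be sketched as a simple capture-process-project loop. This is only an illustrative Python sketch of the control flow; the `capture`, `process`, and `should_end` callbacks are assumptions standing in for the imaging unit, the extraction/determination/recognition pipeline, and the end-instruction check.

```python
def input_loop(capture, process, should_end):
    """One pass of steps S5-S11 per captured frame, repeated until the
    user asks to end projection (S12). Yields each recognized input
    instruction so the caller can project the corresponding screen."""
    while not should_end():                 # S12: end of projection?
        frame = capture()                   # S5: captured + processed image data
        instruction = process(frame)        # S6-S10: reference data, extraction,
                                            # position, operation, instruction
        if instruction is not None:
            yield instruction               # S11: project the matching screen
```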
- as described above, in the image projection apparatus of the present embodiment, the imaging unit has a focusing function, and the image data extraction unit extracts, from the image data obtained by the imaging unit, image data in which a finger or pointing tool operated by the user on the image projected on the projection screen is present and in which that finger or pointing tool is in focus.
- therefore, the image data extracted by the image data extraction unit includes only image data in which the finger or pointing tool exists in the vicinity of the projection screen.
- the operation determination unit determines the content of the operation by the finger or pointing tool based on the image data extracted by the image data extraction unit, and can thus accurately identify the content of the operation.
- the input control unit recognizes the content of the input instruction corresponding to the operation by the finger or pointing tool based on the data on the operation content obtained by the operation determination unit, the position data of the finger or pointing tool generated by the position data generation unit, and the reference data generated by the reference data generation unit, and controls the projection unit to project an image corresponding to the recognized input instruction onto the projection screen. Therefore, when the user performs various touch operations on the image projected on the projection screen, the image projection apparatus of the present embodiment can accurately recognize the input instruction corresponding to the touch operation.
- FIG. 8 shows a schematic configuration diagram of a computer terminal 1a including a front projection type image projection apparatus 100a which is a modification of the present invention.
- as shown in FIG. 8, the front projection type image projection apparatus 100a also includes a projection screen 1100, a projection unit 120, an imaging unit 130, a control unit (not shown), a storage unit (not shown), and a screen installation base (not shown).
- as the projection screen 1100, a front projection screen that displays an image when projection light is irradiated from its surface side is used.
- front projection screens are mainly classified into three types: diffusion screens, reflection screens, and retroreflective screens.
- a diffusion screen has the property that incident light is diffusely reflected in all directions, and has the feature of a very wide viewing angle.
- a reflection screen has the property that incident light is reflected at a reflection angle equal to the angle of incidence, and a retroreflective screen has the property that reflected light returns in the same direction as the incident light.
- in the image projection apparatus 100a, the projection unit 120 projects an image from the surface side of the projection screen 1100, and the imaging unit 130 images the projection screen 1100 from the surface side and acquires image data.
- the control unit in the image projection apparatus 100a of this modification, like the control unit in the image projection apparatus 100 of the above-described embodiment, includes a display control unit, a reference data generation unit, an image data extraction unit, an operation determination unit, a position data generation unit, and an input control unit. These units are the same as those in the image projection apparatus 100 of the above-described embodiment except that the projection unit 120, the imaging unit 130, and the screen installation base are arranged on the surface side of the projection screen 1100.
- when the user performs an operation with a finger or an indicator on the image projected on the projection screen 1100, the imaging unit 130 images the state of the operation from the surface side of the projection screen 1100.
- in this modification, the imaging unit 130 is adjusted so that subjects within 15 mm in front of and behind the projection screen 1100 are in focus.
- the input control unit recognizes the content of the input instruction corresponding to the operation by the finger or pointing tool based on the data on the operation content obtained by the operation determination unit, the position data of the finger or pointing tool generated by the position data generation unit, and the reference data on the screen stored in the storage unit, and controls the projection unit 120 to project an image corresponding to the recognized input instruction onto the projection screen 1100.
- in this modification, the image data obtained by the imaging unit 130 includes image data in which part of the image projected on the projection screen 1100 is hidden by the finger or pointing tool. The operation performed with the finger or pointing tool can also be recognized by superimposing the image currently projected on the projection screen 1100 on the image data in which part of it is hidden.
- in the above-described embodiment, the image projection apparatus includes one projection unit, but the image projection apparatus may include a plurality of projection units. In this case, one screen can be divided and projected onto the projection screen using the plurality of projection units.
- the projection unit may project onto the projection screen an image in which the amount of blue or purple light is reduced. Thereby, an image that is gentle on the user's eyes can be displayed.
- the image projection apparatus includes one imaging unit, but the image projection apparatus may include a plurality of imaging units.
- In the above-described embodiment, the projection screen is a single rigid sheet, but the projection screen may instead be a folding type, a roll-up type, or the like.
- a laser pointer that emits laser light may be used as the pointing tool.
- For example, the operation determination unit determines that an operation of turning the laser beam on twice is a double-tap operation, that an operation of moving the laser beam at or below a predetermined speed while it is on is a scroll operation, that an operation of moving the laser beam above a predetermined speed while it is on is a flick operation, and that an operation of moving the laser beam after turning it on a predetermined number of times is a drag operation.
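These mappings can be sketched as a small classifier. The thresholds, the on-count required for a drag, and the single-on "tap" case are illustrative assumptions; the patent specifies only the qualitative rules above:

```python
def classify_laser_gesture(on_count, speed, speed_threshold=50.0, drag_on_count=3):
    """Map laser-pointer observations to the operations described above.

    on_count: how many times the laser was switched on within the gesture window.
    speed:    spot movement speed while the laser is on (0 if the spot is static).
    speed_threshold and drag_on_count are illustrative tuning values.
    """
    if speed == 0:
        # Static spot: two on-events form a double tap (one is treated as a tap here).
        return "double_tap" if on_count == 2 else "tap"
    if on_count >= drag_on_count:
        return "drag"    # moved after turning the laser on a predetermined number of times
    if speed <= speed_threshold:
        return "scroll"  # moving at or below the predetermined speed
    return "flick"       # moving above the predetermined speed
```

The operation determination unit would feed such a classifier with on/off events and spot displacements recovered from successive captured frames.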
- When the user operates the image projected on the projection screen by irradiating the projection screen with laser light from the laser pointer, the image data extraction unit recognizes the shape and/or color of the laser light from the laser pointer in the image data obtained by the imaging unit, and acquires the image data in which the recognized shape and/or color of the laser light is present as the image data in which the finger or pointing tool with which the user operated the image is present.
- As for recognition of the color of the laser light, for example, the color (wavelength) of the laser light may be preset, and the image data extraction unit may regard the laser light as present when the preset color is recognized.
- The cross-sectional shape of the laser light emitted from the laser pointer is not limited to a circle; it may be a star, a cross, a triangle, or the like. A distinctive shape makes it easier for the image data extraction unit to determine the presence of the pointing tool.
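A minimal sketch of the preset-color test (illustrative only; the target color, tolerance, and minimum pixel count are assumptions, and a real implementation would also match the cross-sectional shape):

```python
import numpy as np

def laser_present(rgb_frame, target_color=(255, 0, 0), tolerance=40, min_pixels=5):
    """Regard the laser spot as present when enough pixels of the captured
    frame lie within `tolerance` of the preset laser color on every channel."""
    diff = np.abs(rgb_frame.astype(np.int16) - np.array(target_color, dtype=np.int16))
    close = (diff <= tolerance).all(axis=-1)  # per-pixel colour match
    return close.sum() >= min_pixels
```

Requiring a minimum number of matching pixels helps reject single-pixel sensor noise of the preset color.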
- In the above-described embodiment, the projection unit and the imaging unit are provided in the housing of the computer main body, but they may instead be provided in a housing separate from that of the computer main body. In this case, the housing provided with the projection unit and the imaging unit and the housing of the computer main body are each provided with a communication unit and a communication control unit, so that the projection unit and the imaging unit can exchange data with the control unit wirelessly.
- the projecting unit and / or the imaging unit may be provided in a housing different from the housing in which the control unit is provided.
- In the above-described embodiment, an example was described in which a frame is drawn in each image projected onto the projection screen by the projection unit; however, the frame does not necessarily have to be drawn. Instead, predetermined marks may be drawn at the four corners of each image projected onto the projection screen by the projection unit. In this case, the reference data generation unit recognizes the positions of the marks of the image within the imaging range based on the image data obtained by the imaging unit, and uses the data on the recognized mark positions as the reference data.
- the screen installation table and the liquid crystal display unit can be omitted.
- In the above-described embodiment and modification, the imaging unit is adjusted so as to be in focus within a range of 10 mm or 15 mm in front of and behind the projection screen along the vertical direction, but the in-focus range is not limited to this; it may extend up to about 50 mm in front of and behind the projection screen along the vertical direction.
- In general, the imaging unit is adjusted so as to be in focus within a range extending from the projection screen to positions separated by a predetermined distance in front of and behind it along the vertical direction. When the predetermined distance is small, the identification accuracy of the operation content improves, but operation becomes harder for the user; when the predetermined distance is large, operability for the user improves, but the identification accuracy of the operation content decreases. For this reason, the predetermined distance is desirably about 5 mm to 25 mm for a rear projection type image projection apparatus and about 10 mm to 50 mm for a front projection type image projection apparatus.
- In the case of a rear projection type apparatus, the image data extraction unit must extract image data in which a finger or pointing tool operating the projection screen is present and in which the surface of the finger or pointing tool facing the projection screen is in focus. For this reason, the predetermined distance is preferably about 5 mm to 25 mm.
- In the case of a front projection type apparatus, since the imaging unit captures the image from the surface side of the projection screen, the image data extraction unit must extract image data in which a finger or pointing tool operating the projection screen is present and in which the surface of the finger or pointing tool on the side opposite to the side facing the projection screen is in focus. Considering that the thickness of the index finger at the nail is about 5 mm to 15 mm, the predetermined distance may be about 10 mm to 40 mm.
- The predetermined distance may also be about 50 mm; for both the rear projection type and the front projection type image projection apparatus, it is desirable that the predetermined distance be at most about 100 mm.
- In the above description, the imaging unit is adjusted so as to be in focus within a range extending a predetermined distance in front of and behind the projection screen along the vertical direction, but the center of the in-focus range does not necessarily have to be on the projection screen. For example, a position separated from the projection screen by a distance x/2 on the rear side along the vertical direction may be set as the center of the in-focus range, and the imaging unit may be adjusted so as to be in focus within a range extending a distance x/2 in front of and behind that center; in this case, the in-focus range extends from the projection screen to a position separated by the distance x on the rear side along the vertical direction. Similarly, a position separated from the projection screen by a distance x/2 on the front side along the vertical direction may be set as the center of the in-focus range, and the imaging unit adjusted so as to be in focus within a range extending a distance x/2 in front of and behind that center; in this case, the in-focus range extends from the projection screen to a position separated by the distance x on the front side along the vertical direction.
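The arithmetic of this shifted in-focus range can be written out explicitly (an illustrative sketch; the sign convention, with positive values behind the screen and the screen plane at 0, is an assumption):

```python
def focus_interval(center_offset, width):
    """In-focus interval along the screen normal, as (near, far) relative to
    the screen plane at 0. Positive values lie behind the screen.

    center_offset: signed distance of the center of the in-focus range
                   from the screen plane.
    width:         total depth of the in-focus range (the x in the text).
    """
    half = width / 2.0
    return center_offset - half, center_offset + half
```

Setting the center x/2 behind the screen gives an interval from the screen plane to x behind it; setting it x/2 in front gives an interval from x in front of the screen to the screen plane, matching the two cases described above.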
- the image projection apparatus of the present invention can be incorporated into various devices.
- the image projection apparatus of the present invention can be provided in a mobile terminal such as a smartphone or a mobile phone, a wristwatch, an automobile, a television, or the like.
- FIG. 9A is a schematic perspective view of a smartphone provided with the image projection apparatus of the present invention
- FIG. 9B is a schematic side view of the smartphone.
- In the smartphone 300, the projection unit 120 and the imaging unit 130 are provided adjacent to each other at the upper end as shown in FIG. 9(a), and the projection screen 110 is attached to the lower part of the smartphone 300 in an inclined state as shown in FIG. 9(b).
- the projection screen 110 is a rear projection screen for displaying an image by irradiating projection light from the back side thereof.
- the control unit and the storage unit of the image projection apparatus of the present invention are built in the smartphone 300.
- When the projection unit 120 projects the image of the menu screen from the back side of the projection screen 110, the user gives input instructions by operating the menu screen from the front side of the projection screen 110 with a finger or pointing tool.
- FIG. 10A is a schematic perspective view of a wristwatch provided with the image projection apparatus of the present invention
- FIG. 10B is a diagram showing a state when a projection screen is installed in the wristwatch.
- Only the main body of the wristwatch 400 is shown; the watch band is omitted.
- a wristwatch 400 to which the image projection apparatus of the present invention is applied has a function as an information terminal.
- the control unit and storage unit of the image projection apparatus of the present invention are built in the wristwatch 400.
- the projection unit 120 and the imaging unit 130 are provided at the upper end of the surface of the wristwatch 400 as shown in FIG.
- The projection screen 110a is of a roll type and is stored in the screen storage unit 170.
- The screen storage unit 170 is provided at the left end of the wristwatch 400, and hooks 180, 180 for hooking the projection screen 110a are provided at the right end of the wristwatch 400.
- The user pulls the projection screen 110a out of the screen storage unit 170 and hooks it on the hooks 180, 180, thereby setting up the projection screen 110a.
- When the projection unit 120 projects the image of the menu screen from the back side of the projection screen 110a, the user gives input instructions by operating the menu screen from the front side of the projection screen 110a with a finger or pointing tool.
- For example, a configuration is conceivable in which the projection unit, imaging unit, control unit, and storage unit of the image projection apparatus of the present invention are built into the engine room of an automobile and the image projected by the projection unit is projected onto the windshield (projection screen), or in which they are built into the dashboard of an automobile and the image projected by the projection unit is projected onto the control panel (projection screen). In these cases, the image projection apparatus of the present invention is used as a rear projection type image projection apparatus. FIG. 11(a) shows an example of an automobile provided with the rear projection type image projection apparatus of the present invention.
- Alternatively, a configuration is conceivable in which the projection unit, imaging unit, control unit, and storage unit of the image projection apparatus of the present invention are built into the dashboard or the interior ceiling of an automobile and the image projected by the projection unit is projected onto the windshield (projection screen). In this case, the image projection apparatus of the present invention is used as a front projection type image projection apparatus. FIG. 11(b) shows an example of an automobile provided with the front projection type image projection apparatus of the present invention.
- FIG. 12 shows an example of a desk 500 provided with the image projection apparatus of the present invention.
- the projection screen 110 is provided on the top plate 510 of the desk 500, and the projection unit, the imaging unit, the control unit, and the storage unit are disposed below the top plate 510.
- FIG. 13 shows an example of a compact mirror 600 provided with the image projection apparatus of the present invention.
- FIG. 13(a) is a schematic perspective view of the compact mirror when the mirror is used, FIG. 13(b) is a schematic side view of the compact mirror when the mirror is used, FIG. 13(c) is a schematic perspective view of the compact mirror at the time of image projection, and FIG. 13(d) is a schematic side view of the compact mirror at the time of image projection.
- the compact mirror 600 has a function as an information terminal and includes an upper lid 610, a lower lid 620, a mirror 630, and the image projection apparatus of the present invention, as shown in FIG.
- the projection unit 120, the imaging unit 130, the control unit, and the storage unit of the image projection apparatus of the present invention are built in the upper lid 610.
- the projection screen 110 is disposed on the back side of the mirror 630.
- the image projection apparatus of the present invention can be used by tilting the mirror 630 forward and separating the projection screen 110 from the upper lid 610.
- As described above, in the image projection apparatus of the present invention, the imaging unit has a focusing function, and the image data extraction unit extracts, from the image data obtained by the imaging unit, image data in which a finger or pointing tool with which the user operated the image projected on the projection screen is present and in which that finger or pointing tool is in focus. For this reason, if, for example, the imaging unit is adjusted so as to be in focus within a range extending a short distance in front of and behind the projection screen along the vertical direction, the image data extracted by the image data extraction unit is limited to images captured in the vicinity of the projection screen, and the operation determination unit can accurately identify the content of the operation by determining the content of the operation by the finger or pointing tool based on the image data extracted by the image data extraction unit. Therefore, the image projection apparatus of the present invention is suitable for use in various devices, machines, or tools such as computer terminals, portable terminals, wristwatches, automobiles, and televisions.
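One common way to decide whether a candidate finger region is in focus (the patent does not prescribe a method; this is a standard sharpness heuristic, and the threshold is an illustrative assumption to be tuned per camera) is to threshold the variance of the region's Laplacian response:

```python
import numpy as np

# 3x3 Laplacian kernel: responds strongly to sharp edges, weakly to blur.
LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=np.float64)

def laplacian_variance(gray):
    """Sharpness score of a grayscale region: variance of its Laplacian.
    High values mean strong edges, i.e. the region is in focus."""
    h, w = gray.shape
    out = np.zeros((h - 2, w - 2))
    for dy in range(3):          # correlate with the 3x3 kernel
        for dx in range(3):
            out += LAPLACIAN[dy, dx] * gray[dy:dy + h - 2, dx:dx + w - 2]
    return float(out.var())

def in_focus(gray, threshold=100.0):
    """Illustrative threshold; a real system would calibrate it per camera."""
    return laplacian_variance(gray) >= threshold
```

Image data whose finger region fails this test would be discarded by the image data extraction unit, so that only operations performed within the in-focus range near the projection screen reach the operation determination unit.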
Abstract
Description
10 computer main body
20 liquid crystal display unit
30 touch panel
100, 100a image projection apparatus
110, 110a, 1100 projection screen
120 projection unit
130 imaging unit
131 camera unit
132 image processing unit
133 camera control unit
133a autofocus control unit
140 control unit
141 display control unit
142 reference data generation unit
143 image data extraction unit
144 position data generation unit
145 operation determination unit
146 input control unit
150 storage unit
160 screen installation base
161a, 161b base portions
162 pole portion
163 recess
170 screen storage unit
180 hook
200 character input screen
210 keyboard image
220 display area
230 outer frame
300 smartphone
400 wristwatch
500 desk
510 top plate
600 compact mirror
610 upper lid
620 lower lid
630 mirror
Claims (11)
- An image projection apparatus comprising: a projection screen; a projection unit that projects and displays a predetermined image on the projection screen; an imaging unit that has a focusing function and that captures the image projected on the projection screen to acquire image data; a reference data generation unit that generates, based on the image data obtained by the imaging unit, reference data specifying the position and size, within the imaging range, of the image projected on the projection screen; an image data extraction unit that extracts, from the image data obtained by the imaging unit, image data in which a finger or pointing tool with which a user has performed an operation on the image projected on the projection screen is present and in which the finger or pointing tool is in focus; a position data generation unit that generates, based on the image data extracted by the image data extraction unit, position data specifying the position of the finger or pointing tool within the imaging range; an operation determination unit that determines the content of the operation by the finger or pointing tool based on the image data extracted by the image data extraction unit; and an input control unit that recognizes the content of an input instruction corresponding to the operation by the finger or pointing tool, based on the content of the operation determined by the operation determination unit, the position data generated by the position data generation unit, and the reference data generated by the reference data generation unit, and that controls the projection unit so as to project an image corresponding to the recognized content of the input instruction onto the projection screen.
- The image projection apparatus according to claim 1, wherein the imaging unit is adjusted so as to be in focus within a range extending from the projection screen to positions separated by a predetermined distance in front of and behind it along the vertical direction.
- The image projection apparatus according to claim 1 or 2, wherein a frame is drawn in each image projected on the projection screen, or predetermined marks are drawn at the four corners of each image projected on the projection screen, and the reference data generation unit recognizes, based on the image data obtained by the imaging unit, the position of the frame or the marks of the image within the imaging range and uses data on the recognized position of the frame or the marks as the reference data.
- The image projection apparatus according to claim 1, 2, or 3, wherein the projection screen displays an image by being irradiated with projection light from its back side, the projection unit projects the image from the back side of the projection screen, and the imaging unit captures the image from the back side of the projection screen to acquire image data.
- The image projection apparatus according to claim 4, wherein, when a user performs an operation on the image with a finger or pointing tool from the front side of the projection screen, the image data extraction unit recognizes, in the image data captured through the projection screen, a shape, or a shape and color, corresponding to the finger or pointing tool, and acquires the image data in which the recognized shape, or shape and color, is present as the image data in which the finger or pointing tool with which the user performed the operation on the image is present.
- The image projection apparatus according to claim 4, wherein, when a user performs an operation on the image by bringing a finger or pointing tool into contact with the image from the front side of the projection screen, the image data extraction unit recognizes, in the image data captured through the projection screen, a change in shape, a change in color, or a change in shape and color of the finger or pointing tool at the time of the contact, and acquires the image data in which the recognized change in shape, change in color, or change in shape and color of the finger or pointing tool is present as the image data in which the finger or pointing tool with which the user performed the operation on the image is present.
- The image projection apparatus according to claim 1, 2, or 3, wherein the projection screen displays an image by being irradiated with projection light from its front side, the projection unit projects the image from the front side of the projection screen, and the imaging unit captures the image from the front side of the projection screen to acquire image data.
- The image projection apparatus according to claim 1, 2, 3, 4, 5, 6, or 7, wherein, when a laser pointer that emits laser light is used as the pointing tool and the user performs an operation on the image projected on the projection screen by irradiating the projection screen with laser light from the laser pointer, the image data extraction unit recognizes the shape and/or color of the laser light from the laser pointer in the image data obtained by the imaging unit and acquires the image data in which the recognized shape and/or color of the laser light is present as the image data in which the finger or pointing tool with which the user performed the operation on the image is present.
- The image projection apparatus according to claim 1, 2, 3, 4, 5, 6, 7, or 8, further comprising a screen installation base for mounting the projection screen such that the distance between the projection screen and the projection unit and the distance between the projection screen and the imaging unit are each held substantially constant.
- The image projection apparatus according to any one of claims 1 to 9, wherein the reference data generation unit, the image data extraction unit, the position data generation unit, the operation determination unit, and the input control unit are provided in one housing, and the projection unit and/or the imaging unit is provided in a housing separate from that housing.
- A computer terminal, wristwatch, portable terminal, or automobile comprising the image projection apparatus according to any one of claims 1 to 10.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017563699A JP6858352B2 (ja) | 2016-01-25 | 2016-11-01 | 画像投影装置 |
US16/072,436 US11513637B2 (en) | 2016-01-25 | 2016-11-01 | Image projection device |
EP16888089.6A EP3410277B1 (en) | 2016-01-25 | 2016-11-01 | Image projection device |
US17/962,707 US11928291B2 (en) | 2016-01-25 | 2022-10-10 | Image projection device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016-011926 | 2016-01-25 | ||
JP2016011926 | 2016-01-25 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/072,436 A-371-Of-International US11513637B2 (en) | 2016-01-25 | 2016-11-01 | Image projection device |
US17/962,707 Continuation US11928291B2 (en) | 2016-01-25 | 2022-10-10 | Image projection device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017130504A1 true WO2017130504A1 (ja) | 2017-08-03 |
Family
ID=59397953
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/082469 WO2017130504A1 (ja) | 2016-01-25 | 2016-11-01 | 画像投影装置 |
Country Status (4)
Country | Link |
---|---|
US (2) | US11513637B2 (ja) |
EP (1) | EP3410277B1 (ja) |
JP (1) | JP6858352B2 (ja) |
WO (1) | WO2017130504A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109660779A (zh) * | 2018-12-20 | 2019-04-19 | 歌尔科技有限公司 | 基于投影的触控点定位方法、投影设备及存储介质 |
JP2020091639A (ja) * | 2018-12-05 | 2020-06-11 | 日本信号株式会社 | 指示特定システム |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11513637B2 (en) * | 2016-01-25 | 2022-11-29 | Hiroyuki Ikeda | Image projection device |
CN113709706B (zh) * | 2021-08-17 | 2024-03-26 | 中汽创智科技有限公司 | 一种车辆座舱的显示控制方法、装置、系统及终端 |
US12026912B2 (en) * | 2021-09-13 | 2024-07-02 | Wen Li | Apparatus and methodology for aiming point and aiming device 6DOF detection using image registration |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007257639A (ja) * | 2006-03-20 | 2007-10-04 | Samsung Electronics Co Ltd | 映像パターンを利用したポインティング入力装置、方法、およびシステム |
US20120262420A1 (en) * | 2011-04-15 | 2012-10-18 | Sobel Irwin E | Focus-based touch and hover detection |
JP2015014882A (ja) * | 2013-07-04 | 2015-01-22 | ソニー株式会社 | 情報処理装置、操作入力検出方法、プログラム、および記憶媒体 |
JP2015106111A (ja) * | 2013-12-02 | 2015-06-08 | 株式会社リコー | プロジェクションシステム |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5982352A (en) * | 1992-09-18 | 1999-11-09 | Pryor; Timothy R. | Method for providing human input to a computer |
WO2001052230A1 (en) | 2000-01-10 | 2001-07-19 | Ic Tech, Inc. | Method and system for interacting with a display |
US8035612B2 (en) * | 2002-05-28 | 2011-10-11 | Intellectual Ventures Holding 67 Llc | Self-contained interactive video display system |
GB0116805D0 (en) * | 2001-07-10 | 2001-08-29 | Britannic Overseas Trading Co | A human-computer interface with a virtual touch sensitive screen |
WO2006086508A2 (en) * | 2005-02-08 | 2006-08-17 | Oblong Industries, Inc. | System and method for genture based control system |
EP1983402A4 (en) * | 2006-02-03 | 2013-06-26 | Panasonic Corp | INPUT DEVICE AND ITS METHOD |
JP5510907B2 (ja) | 2009-12-01 | 2014-06-04 | 学校法人東京電機大学 | タッチ位置入力装置及びタッチ位置入力方法 |
US20110164191A1 (en) * | 2010-01-04 | 2011-07-07 | Microvision, Inc. | Interactive Projection Method, Apparatus and System |
US20120194738A1 (en) * | 2011-01-26 | 2012-08-02 | Yongjing Wang | Dual mode projection docking device for portable electronic devices |
US9030425B2 (en) * | 2011-04-19 | 2015-05-12 | Sony Computer Entertainment Inc. | Detection of interaction with virtual object from finger color change |
CN102768573A (zh) * | 2011-05-06 | 2012-11-07 | 鸿富锦精密工业(深圳)有限公司 | 投影机监控系统及方法 |
JP5756729B2 (ja) | 2011-10-05 | 2015-07-29 | 日本電信電話株式会社 | ジェスチャ認識装置及びそのプログラム |
CN103309482A (zh) * | 2012-03-12 | 2013-09-18 | 富泰华工业(深圳)有限公司 | 电子设备及其触摸控制方法与触摸控制装置 |
JP5966535B2 (ja) * | 2012-04-05 | 2016-08-10 | ソニー株式会社 | 情報処理装置、プログラム及び情報処理方法 |
JP5278576B2 (ja) | 2012-04-27 | 2013-09-04 | カシオ計算機株式会社 | ジェスチャー認識装置、ジェスチャー認識方法及びそのプログラム |
JP6075122B2 (ja) * | 2013-03-05 | 2017-02-08 | 株式会社リコー | システム、画像投影装置、情報処理装置、情報処理方法およびプログラム |
CN104063143B (zh) * | 2013-06-06 | 2015-09-09 | 腾讯科技(深圳)有限公司 | 文件管理方法、装置及电子设备 |
US9146618B2 (en) * | 2013-06-28 | 2015-09-29 | Google Inc. | Unlocking a head mounted device |
JP6597625B2 (ja) * | 2014-10-03 | 2019-10-30 | ソニー株式会社 | 投射型表示装置 |
US10656746B2 (en) * | 2015-09-18 | 2020-05-19 | Sony Corporation | Information processing device, information processing method, and program |
US11513637B2 (en) * | 2016-01-25 | 2022-11-29 | Hiroyuki Ikeda | Image projection device |
-
2016
- 2016-11-01 US US16/072,436 patent/US11513637B2/en active Active
- 2016-11-01 JP JP2017563699A patent/JP6858352B2/ja active Active
- 2016-11-01 WO PCT/JP2016/082469 patent/WO2017130504A1/ja active Application Filing
- 2016-11-01 EP EP16888089.6A patent/EP3410277B1/en active Active
-
2022
- 2022-10-10 US US17/962,707 patent/US11928291B2/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007257639A (ja) * | 2006-03-20 | 2007-10-04 | Samsung Electronics Co Ltd | 映像パターンを利用したポインティング入力装置、方法、およびシステム |
US20120262420A1 (en) * | 2011-04-15 | 2012-10-18 | Sobel Irwin E | Focus-based touch and hover detection |
JP2015014882A (ja) * | 2013-07-04 | 2015-01-22 | ソニー株式会社 | 情報処理装置、操作入力検出方法、プログラム、および記憶媒体 |
JP2015106111A (ja) * | 2013-12-02 | 2015-06-08 | 株式会社リコー | プロジェクションシステム |
Non-Patent Citations (1)
Title |
---|
See also references of EP3410277A4 * |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020091639A (ja) * | 2018-12-05 | 2020-06-11 | 日本信号株式会社 | 指示特定システム |
JP7182442B2 (ja) | 2018-12-05 | 2022-12-02 | 日本信号株式会社 | 指示特定システム |
CN109660779A (zh) * | 2018-12-20 | 2019-04-19 | 歌尔科技有限公司 | 基于投影的触控点定位方法、投影设备及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
EP3410277B1 (en) | 2022-07-06 |
JPWO2017130504A1 (ja) | 2018-11-22 |
US11928291B2 (en) | 2024-03-12 |
US20230071534A1 (en) | 2023-03-09 |
EP3410277A4 (en) | 2019-07-24 |
US20190034033A1 (en) | 2019-01-31 |
JP6858352B2 (ja) | 2021-04-14 |
US11513637B2 (en) | 2022-11-29 |
EP3410277A1 (en) | 2018-12-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7176881B2 (en) | Presentation system, material presenting device, and photographing device for presentation | |
US11928291B2 (en) | Image projection device | |
CN109564495B (zh) | 显示装置、存储介质、显示方法及控制装置 | |
US8350896B2 (en) | Terminal apparatus, display control method, and display control program | |
CN105706028B (zh) | 投影型影像显示装置 | |
US9432661B2 (en) | Electronic device, image display method, and image display program | |
US11029766B2 (en) | Information processing apparatus, control method, and storage medium | |
CN116126170A (zh) | 显示装置及控制装置 | |
EP2687959A1 (en) | Detection device, input device, projector, and electronic apparatus | |
US20140218300A1 (en) | Projection device | |
JP2009031334A (ja) | プロジェクタ及びプロジェクタの投射方法 | |
US9544556B2 (en) | Projection control apparatus and projection control method | |
CN107407959B (zh) | 基于姿势的三维图像的操纵 | |
CN111052063B (zh) | 电子装置及其控制方法 | |
WO2018010440A1 (zh) | 一种投影画面调整方法、装置和投影终端 | |
JP6381361B2 (ja) | データ処理装置、データ処理システム、データ処理装置の制御方法、並びにプログラム | |
JP2020077209A (ja) | 画像読取り装置及び方法 | |
JP2013157845A (ja) | 電子鏡およびそのプログラム | |
US11429200B2 (en) | Glasses-type terminal | |
JP4871226B2 (ja) | 認識装置および認識方法 | |
JP5229928B1 (ja) | 注視位置特定装置、および注視位置特定プログラム | |
JP5646532B2 (ja) | 操作入力装置、操作入力方法、及び、プログラム | |
JP2010015127A (ja) | 情報表示装置 | |
JP2007310789A (ja) | インタフェースデバイス | |
JP5118663B2 (ja) | 情報端末装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16888089 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2017563699 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2016888089 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2016888089 Country of ref document: EP Effective date: 20180827 |