WO2016056102A1 - Image projection device - Google Patents

Image projection device

Info

Publication number
WO2016056102A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
unit
image
projection
emission angle
Prior art date
Application number
PCT/JP2014/077071
Other languages
French (fr)
Japanese (ja)
Inventor
将史 山本
瀬尾 欣穂
浦田 浩之
Original Assignee
日立マクセル株式会社 (Hitachi Maxell, Ltd.)
Priority date
Filing date
Publication date
Application filed by 日立マクセル株式会社 (Hitachi Maxell, Ltd.)
Priority to PCT/JP2014/077071
Publication of WO2016056102A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means

Definitions

  • the present invention relates to an image projection apparatus.
  • As background art in this technical field, there is US Patent Application Publication No. 2014/152843 (Patent Document 1). This publication describes “providing a document camera capable of an intuitive drawing operation on a subject, and a method for controlling the document camera, when realizing pseudo-writing on a photographed image.”
  • In the technology disclosed in Patent Document 1, it is necessary to install a projector and a separate device, called a document camera, that detects a user's operation on the video screen projected from the projector. For this reason, when using the configuration disclosed in Patent Document 1 in various places, the projector and the document camera each need to be prepared and installed, which places a large burden on the user.
  • To solve this problem, it is conceivable that a projector serving as a video projection device has a built-in function for detecting the user's operation, so that the user's operation on the screen projected from the projector can be detected by installing only one projector.
  • Specifically, the projector incorporates a light detection sensor, and the function of detecting the user's operation is realized by the light detection sensor.
  • The light detection sensor emits laser light that scans in the vicinity of the video screen projected from the projector, and receives the laser light reflected from an object such as the finger of the user performing the operation.
  • Because the light detection sensor detects the reflected light from the object, the operation can be detected at the timing at which the user operates the video screen.
  • However, the angle of the laser light emitted from the light detection sensor may change depending on the installation state, such as the inclination of the projector, when a projector incorporating such a light detection sensor is installed.
  • When the emission angle of the laser light changes, the timing at which the user performs an operation may deviate from the timing at which the operation is detected, or the operation may not be detected at all even though the user has performed it. The operability of the video screen projected by the projector is therefore degraded.
  • The present invention has been made to solve such problems, and its purpose is to improve the operability of the projected video screen in a video projection device that projects a video screen and detects operations on the projected video screen.
  • In order to solve the above problems, the present invention adopts, for example, the configurations described in the claims.
  • The present application includes a plurality of components that solve the above problems.
  • To give one example, the emission angle of the detection light used to detect an object on the video screen projected onto the projection surface is controlled.
  • FIG. 1 is a diagram illustrating an overview of a video projector 1 according to an embodiment of the invention.
  • the XYZ axes shown in FIG. 1 indicate from which direction the image screen projected by the image projection apparatus 1 is viewed. Specifically, the horizontal direction with respect to the video screen indicates the X-axis direction, the vertical direction indicates the Y-axis direction, and the vertical direction indicates the Z-axis direction.
  • the image projection apparatus 1 shown in FIG. 1 is a view seen from the X-axis direction, that is, the horizontal direction with respect to the image screen. In the subsequent drawings, the image projection apparatus 1 is shown together with the similar XYZ axes.
  • the image projection apparatus 1 is installed on an installation surface 102 that is a desk.
  • The image light generated inside the apparatus is enlarged by the projection lens 404 and then reflected by the reflection mirror 405, so that the video screen 101 is projected onto the installation surface 102.
  • That is, in the present embodiment, the installation surface 102 serves as the projection surface.
  • the focus of the projected image is adjusted by the focus ring 103.
  • the reflection mirror 405 is configured to be foldable and is housed so that the reflection surface faces the image projection device 1 when not in use.
  • the image projection apparatus 1 includes a light detection sensor 200 that emits light and detects reflected light of an object.
  • the light detection sensor 200 detects the position of the operator's finger and the movement of the finger across the detection boundary 104.
  • the image projection apparatus 1 recognizes the position and movement of the finger detected by the light detection sensor 200 as operations similar to operations performed on the screen of a smartphone, tablet, etc. such as tap, flick, swipe, pinch-in, and pinch-out. Project an image according to the recognized operation. Details of the image projection processing corresponding to the position and movement of the finger detected by the light detection sensor 200 will be described later.
  • the “position of the object” is described as the “position of the operator's finger”.
  • FIGS. 2 and 3 are diagrams illustrating an example of detecting the position and movement of the operator's finger by the light detection sensor 200 of the video projection device 1.
  • FIG. 2 shows the image projection apparatus 1 as viewed from the Y-axis direction, that is, the direction perpendicular to the video screen, and FIG. 3 shows the image projection apparatus 1 as viewed from the X-axis direction, that is, the direction horizontal to the video screen.
  • The light detection sensor 200 emits laser light over the range of the video screen 101, and when the operator touches the video screen 101 with a finger 105 to operate it like a touch panel (for example, touches a button displayed on the video screen 101), the sensor detects the light reflected from the finger 105.
  • The light detection sensor 200 that has detected the reflected light calculates the distance from the light detection sensor 200 to the finger based on the detected reflected light, and recognizes which position on the video screen 101 has been touched based on the calculated distance and the emission direction of the laser light.
  • the light detection sensor 200 functions as a light detection unit that emits detection light for detecting an operation on the projection surface and detects reflected light of the detection light from an object on the video screen 101 projected on the projection surface.
  • The object on the video screen 101 is, for example, the operator's finger or a pointing stick, and it may be in contact with the projection surface or slightly separated from it (for example, by several millimeters).
  • In order to detect the position and movement of the finger 105 at the timing at which the operator touches the video screen 101, the laser light emitted from the light detection sensor 200 scans parallel to the installation surface 102 at a position very close to it. For example, it is desirable that the vertical distance y1 between the laser light and the installation surface 102 be 20 mm or less.
  • detecting the position and movement of the finger 105 is simply referred to as “detecting the finger 105”.
  • the expression “in parallel with the installation surface 102” also includes the meaning of “very close to parallel with the installation surface 102”.
  • FIG. 4 is a diagram illustrating a mode of laser light emitted from the light detection sensor 200 when the image projection apparatus 1 is installed to be inclined with respect to the installation surface 102.
  • FIG. 4A shows the case where the angle formed by the image projection apparatus 1 and the installation surface 102 is larger than 90°, and FIG. 4B shows the case where the angle is smaller than 90°.
  • As described above, the laser light emitted from the light detection sensor 200 needs to scan parallel to the installation surface 102 at a position very close to it.
  • However, when the image projection apparatus 1 is tilted as shown in FIG. 4, there are cases in which the light detection sensor 200 does not detect the finger 105 at the timing at which the operator touches the video screen 101.
  • In the case shown in FIG. 4A, the laser light emitted from the light detection sensor 200 does not scan parallel to the installation surface 102; instead, the vertical distance between the laser light and the installation surface 102 increases with distance from the light detection sensor 200. For this reason, the finger 105 crosses the laser light before the timing at which the operator touches the video screen 101, the light detection sensor 200 detects the finger 105 from the reflected light at that earlier moment, and the detection timing of the finger 105 deviates from the operation timing.
  • As a result, the timing at which the operator tries to operate the video screen 101 and the timing at which the video screen 101 reacts to the detection of the finger 105 by the light detection sensor 200 are shifted, so the operator feels that the operability is low. Further, since the video screen 101 reacts before the operator touches it, the operator does not get the sensation of actually operating the video screen 101 and again feels that the operability is low.
  • In the case shown in FIG. 4B, the laser light emitted from the light detection sensor 200 again does not scan parallel to the installation surface 102; it is emitted toward the installation surface 102. The laser light therefore does not reach positions on the video screen 101 far from the video projection device 1, and the finger 105 may not be detected there.
  • In this case the finger 105 is simply not detected by the light detection sensor 200, and the operability again deteriorates.
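  • As an illustrative sketch of why this matters (the 1° angle error and the distances below are assumed example values; only the 20 mm bound comes from the description above), the vertical gap between the scan plane and the installation surface grows roughly linearly with distance from the sensor:

```python
import math

def vertical_offset(distance_mm: float, tilt_error_deg: float) -> float:
    """Vertical gap between the scan plane and the installation surface
    at a given horizontal distance from the sensor, for a small tilt error."""
    return distance_mm * math.tan(math.radians(tilt_error_deg))

# Assumed example: a 1 degree emission-angle error.
for d in (250, 500, 750, 1000, 1250):  # horizontal distance from the sensor, mm
    gap = vertical_offset(d, 1.0)
    print(f"{d:5d} mm -> gap {gap:5.1f} mm ({'within' if gap <= 20 else 'exceeds'} the 20 mm budget)")
```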
  • Therefore, one of the key points of the present embodiment is to control the emission angle of the laser light according to the installation state of the image projection apparatus 1 with respect to the installation surface 102, which serves as the projection surface in the present embodiment.
  • Next, the functional configuration of the image projection apparatus 1 according to the present embodiment will be described.
  • FIG. 5 is a block diagram illustrating the functional configuration of the image projection apparatus 1 according to this embodiment.
  • the video projection device 1 includes a light detection sensor 200, a light emission angle control unit 240, an operation detection unit 300, a video projection unit 400, and a distance sensor 500.
  • the operation detection unit 300 includes a distance information acquisition unit 310, a coordinate information acquisition unit 301, and an operation signal generation unit 302.
  • the image projection unit 400 includes an image control unit 401, a light source unit 402, a light control unit 403, a projection lens 404, and a reflection mirror 405.
  • The external device 6 is a general information processing device connected to the video projection device 1, such as a PC (Personal Computer) or a portable terminal device such as a smartphone, and supplies a video signal to the video projection device 1.
  • The external device 6 is not limited to a PC or a portable terminal device; it may be any device that supplies a video signal to the video projection device 1, such as a card-type storage medium inserted into a card interface provided in the video projection device 1.
  • the video control unit 401 outputs a control signal to the light source unit 402 and the light control unit 403 according to the video signal supplied from the external device 6.
  • the light source unit 402 is a light source such as a halogen lamp, an LED (Light Emitting Diode), or a laser, and adjusts the amount of light according to a control signal input from the video control unit 401. Note that when the light source unit 402 includes three colors of R (Red), G (Green), and B (Blue), the light amount may be independently controlled according to the video signal.
  • The light control unit 403 includes optical system components such as a mirror, a lens, a prism, and an imager (for example, a display device such as a liquid crystal panel), and generates an optical image based on the video signal supplied from the external device 6, using the light emitted from the light source unit 402.
  • the projection lens 404 enlarges the image output from the light control unit 403.
  • the reflection mirror 405 reflects the light emitted from the projection lens 404 and projects the video screen 101 on the installation surface 102.
  • the reflection mirror 405 uses an aspherical mirror, and when projecting an image screen of the same size, the projection distance can be shortened compared to a general image projection apparatus.
  • the image projection unit 400 using the reflection mirror 405 has been described as an example, but other configurations may be used as long as the image projection can be realized.
  • FIG. 6 is a block diagram illustrating the configuration of the light detection sensor 200.
  • the light detection sensor 200 includes a light receiving unit 201, a light emitting unit 202, a light emission driving unit 203, a mirror unit 210, and a mirror driving unit 220.
  • the light emission drive unit 203 controls the light emission intensity, the light emission frequency, and the like of the light emitting unit 202 that emits light (laser light).
  • FIG. 7 is a diagram illustrating the configuration of the mirror unit 210.
  • the mirror unit 210 includes a mirror 211, a permanent magnet 212, and a central axis 213.
  • the mirror 211 vibrates or rotates around the central axis 213 depending on the polarity of the permanent magnet 212 and the polarity of a mirror driving unit 220 described later.
  • the mirror unit 210 reflects the light emitted from the light emitting unit 202 in a desired direction by the mirror 211 that vibrates or rotates around the central axis 213. This reflected light is a laser beam for detecting the finger 105 shown in FIG.
  • FIG. 8 is a diagram illustrating a configuration of the mirror driving unit 220.
  • the mirror driver 220 includes a coil 221, a drive current generator 222, a mirror controller 223, and a mirror position detector 224.
  • the coil 221 is used as an electromagnet that changes the polarity and magnetic force according to the direction and amount of current.
  • the drive current generator 222 generates a current amount and a drive frequency according to the control signal output from the mirror controller 223, and supplies the coil 221 with a current.
  • the coil 221 switches the polarity according to the amount of current supplied from the drive current generator 222 and the timing.
  • the mirror position detection unit 224 detects the position of the mirror unit 210 based on a change in the output of the drive current generation unit 222. Specifically, for example, the mirror position detection unit 224 detects the position of the mirror unit 210 using the back electromotive force generated by the change in magnetic flux exerted on the coil 221 by the permanent magnet 212, and detects the detected position information. Output to the mirror control unit 223.
  • the mirror control unit 223 outputs the position information (hereinafter referred to as “mirror position information”) of the mirror unit 210 input from the mirror position detection unit 224 to the distance information acquisition unit 310 described later.
  • the mirror control unit 223 may change the control signal according to the position information of the mirror unit 210 output from the mirror position detection unit 224. Since the resonance frequency changes with temperature, when the mirror unit 210 operates by resonance, the detection range of the finger 105 by the reflected light from the mirror unit 210, that is, the laser beam shown in FIG. 3, also changes with temperature. Therefore, when the mirror unit 210 operates by resonance, the mirror control unit 223 desirably controls the resonance frequency according to the position information of the mirror unit 210 to make the detection range of the finger 105 constant.
  • the light reflected by the mirror unit 210 is reflected or scattered when it hits the finger 105 of the operator who operates the video screen 101.
  • the light receiving unit 201 receives a part of light reflected or scattered by hitting the finger 105, photoelectrically converts the received light, and outputs an electric signal to the distance information acquisition unit 310 described later.
  • The light emission angle control unit 240 controls the emission angle of the laser light emitted from the light detection sensor 200 so that the laser light is not emitted at the angles shown in FIG. 4.
  • Specifically, the light emission angle control unit 240 controls the emission angle of the laser light based on the output information from the distance sensor 500. This is one of the key points of the present embodiment, and details will be described later.
  • The distance information acquisition unit 310 acquires information on the distance from the light detection sensor 200 to the operator's finger 105, based on the electrical signal (detection signal) input from the light receiving unit 201 of the light detection sensor 200 and the mirror position information input from the mirror driving unit 220 of the light detection sensor 200.
  • FIG. 9 is a block diagram illustrating the configuration of the distance information acquisition unit 310.
  • the distance information acquisition unit 310 includes an amplification unit 311, a pulse generation unit 312, and a distance measurement unit 313.
  • the amplification unit 311 amplifies the signal input from the light receiving unit 201 of the light detection sensor 200 and outputs the amplified signal to the pulse generation unit 312.
  • the pulse generation unit 312 pulses the signal amplified by the amplification unit 311 and outputs the pulse to the distance measurement unit 313.
  • the distance measurement unit 313 measures the time difference between the light emission timing of the light detection sensor 200 and the pulse input from the pulse generation unit 312. Then, the distance measurement unit 313 uses the mirror position information input from the mirror driving unit 220 of the light detection sensor 200 and the measured time difference to detect the finger detected from the light detection sensor 200 by TOF (Time-Of-Flight). The distance to 105 is calculated. The distance measurement unit 313 outputs the calculated distance information to the coordinate information acquisition unit 301.
  • the light emission timing of the light detection sensor 200 is, for example, the timing at which the light emission unit 202 is driven by the light emission drive unit 203.
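  • The following is a minimal sketch of the TOF relation used by the distance measurement unit 313 (the function name and the example time value are assumptions for illustration; the description above does not spell out the formula): the one-way distance is half the measured round-trip time multiplied by the speed of light.

```python
SPEED_OF_LIGHT_MM_PER_NS = 299.792458  # speed of light in mm per nanosecond

def tof_distance_mm(time_diff_ns: float) -> float:
    """Distance from the sensor to the reflecting object.
    The light travels out and back, so the one-way distance is half the round trip."""
    return SPEED_OF_LIGHT_MM_PER_NS * time_diff_ns / 2.0

# Assumed example: a 4 ns round trip corresponds to roughly 0.6 m.
print(tof_distance_mm(4.0))  # ~599.6 mm
```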
  • the coordinate information acquisition unit 301 converts the distance information input from the distance information acquisition unit 310 into coordinate information on the video screen 101 and outputs it to the operation signal generation unit 302.
  • the coordinate information on the video screen 101 is, for example, an X coordinate in the horizontal direction of the video screen 101 and a Y coordinate in the vertical direction.
  • the operation signal generation unit 302 generates an operation signal from the coordinate information input from the coordinate information acquisition unit 301 and outputs the operation signal to the external device 6. Specifically, for example, when the input coordinate information indicates one point on the video screen 101, the operation signal generation unit 302 generates an operation signal indicating that the position has been touched.
  • the external device 6 supplies a video signal corresponding to the operation signal input from the operation signal generation unit 302 to the video projection unit 400. Specifically, for example, it is assumed that a button for transitioning from the currently displayed video screen 101 to another video screen is displayed on the video screen 101 and the operator touches the button.
  • In this case, the external device 6 receives from the operation signal generation unit 302 an operation signal indicating that the one point on the video screen 101 corresponding to the button has been touched, and displays the other video screen that is transitioned to by pressing the button.
  • In this way, an operation signal is generated according to the position and movement of the operator's finger with respect to the video screen 101, and the video projected by the video projection device 1 is controlled accordingly.
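  • As a hedged sketch of this chain from a detected reflection to an operation signal (the coordinate convention, the names, and the simple "tap" signal below are assumptions for illustration, not taken from the embodiment):

```python
import math
from dataclasses import dataclass

@dataclass
class OperationSignal:
    kind: str   # e.g. "tap"
    x: float    # horizontal coordinate on the video screen, mm
    y: float    # depth coordinate on the video screen, mm

def to_screen_coordinates(distance_mm: float, scan_angle_deg: float) -> tuple[float, float]:
    """Convert (measured distance, mirror scan angle) into X/Y coordinates,
    assuming the laser scans in a plane just above the screen and the angle
    is measured from the sensor's forward direction."""
    a = math.radians(scan_angle_deg)
    return distance_mm * math.sin(a), distance_mm * math.cos(a)

def make_tap_signal(distance_mm: float, scan_angle_deg: float) -> OperationSignal:
    x, y = to_screen_coordinates(distance_mm, scan_angle_deg)
    return OperationSignal(kind="tap", x=x, y=y)

# Assumed example: a reflection detected 400 mm away at a 10 degree scan angle.
print(make_tap_signal(400.0, 10.0))
```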
  • Next, adjustment of the emission angle of the laser light emitted from the light detection sensor 200 by the light emission angle control unit 240 and the distance sensor 500 will be described. The emission angle of the laser light is adjusted so that the laser light scans parallel to the installation surface 102 even when the image projection apparatus 1 is tilted with respect to the installation surface 102 as shown in FIG. 4. This is one of the key points of the present embodiment.
  • FIG. 10 is a diagram illustrating the configuration of an inclination adjustment mechanism that adjusts the inclination of the light detection sensor 200.
  • FIG. 10A is a view of the image projection apparatus 1 viewed from the Z-axis direction, that is, the direction perpendicular to the image screen.
  • FIGS. 10B to 10D are diagrams showing a specific structural example of the tilt adjustment mechanism as seen from the X-axis direction, that is, the direction horizontal to the video screen.
  • the tilt adjustment mechanism of the light detection sensor 200 includes a tilt adjustment unit 231, a movable unit 232, a fixed unit 233, and a rotation shaft 234.
  • the inclination adjusting unit 231 is a mechanism that rotates the rotating shaft 234, and is, for example, an adjustment ring that can be manually operated, a screw that is adjusted by a screwdriver, or the like.
  • the rotation shaft 234 is rotated by the inclination adjusting unit 231, the movable unit 232 is moved, and the inclination of the light detection sensor 200 installed on the movable unit 232 is changed.
  • In this example, the movable unit 232 moves so as to rise, and the inclination of the light detection sensor 200 is adjusted from the state indicated by the dotted line to the state indicated by the one-dot chain line.
  • As a result, the laser light emitted from the light detection sensor 200 is emitted further in the depression direction than before the adjustment. Therefore, by adjusting the inclination of the light detection sensor 200 in this way, the laser light can be adjusted to scan parallel (or closer to parallel) to the installation surface 102 even when the angle formed by the image projection apparatus 1 and the installation surface 102 is larger than 90°, as shown in FIG. 4A.
  • the tilt adjustment mechanism is configured to be able to electrically monitor the amount of movement of the rotating shaft 234 by the tilt adjusting unit 231, that is, the amount of tilt adjustment.
  • the tilt adjustment mechanism is configured such that the resistance value changes like a variable resistor depending on the adjustment amount, and the voltage value changes depending on the resistance value.
  • the light emission angle control unit 240 controls the inclination adjustment of the light detection sensor 200 by such an inclination adjustment mechanism. That is, the tilt adjustment mechanism functions as a light emission angle adjustment unit that can manually adjust the emission angle of the detection light from the light detection sensor 200.
  • the rotating shaft 234 may be configured to move the movable portion 232 by combining the Z-axis direction axis and the Y-axis direction axis with a gear.
  • The tilt adjustment mechanism is not limited to the configuration shown in FIG. 10; as long as a mirror is used, it may be, for example, a two-dimensional scan mirror.
  • FIG. 11 is a block diagram illustrating a functional configuration of the light emission angle control unit 240.
  • the light emission angle control unit 240 includes a reference information storage unit 241, a distance information acquisition unit 242, an inclination calculation unit 243, an adjustment completion determination unit 244, and an adjustment completion notification unit 245.
  • FIG. 12 is a diagram illustrating the reference information stored in the reference information storage unit 241.
  • FIG. 13 is a diagram illustrating an installation mode of the image projection apparatus 1 serving as a reference.
  • The reference information includes the distance r, the height H, and the angle θ when the image projection apparatus 1 is installed such that the angle between the image projection apparatus 1 and the installation surface 102 is 90°, as shown in FIG. 13A.
  • the installation position (installation state) of the image projection apparatus 1 shown in FIG. 13A is referred to as a “reference position” (reference installation state).
  • the distance r is a linear distance from the distance sensor 500 measured by the distance sensor 500 to the video screen 101. That is, the distance sensor 500 functions as a projection plane distance measurement unit that measures a projection plane distance that is a distance from the distance sensor 500 to the projection plane.
  • the height H is a height from the bottom surface of the image projection apparatus 1 to the distance sensor 500. That is, the height H is a measurement unit distance that is a distance from the bottom surface of the image projection device 1 to the projection surface distance measurement unit.
  • The angle θ is the angle formed between the image projection apparatus 1 and the emission direction of the light emitted from the distance sensor 500 for distance measurement. That is, the angle θ is obtained from the distance r and the height H by formula (1).
  • In this embodiment, the distance sensor 500 measures the distance to the video screen 101 by receiving the reflected light of the light it emits; however, any other method may be used as long as the distance to the video screen 101 can be measured. Further, in the present embodiment, the case where the angle θ is calculated in advance and stored in the reference information storage unit 241 is described as an example; alternatively, the angle θ need not be stored in the reference information storage unit 241 and may be calculated from the distance r and the height H whenever necessary.
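  • Formula (1) itself is not reproduced in this text; one plausible reading of the geometry in FIG. 13 (an assumption of the following sketch, not a statement of the patent) is that the vertical component of the emission direction equals H, giving cos θ = H / r:

```python
import math

def reference_angle_deg(height_h_mm: float, distance_r_mm: float) -> float:
    """Angle between the device and the distance sensor's emission direction.

    Assumed geometry: the sensor sits at height H above the projection surface and
    measures the straight-line distance r to a point on the video screen, so the
    vertical component of the emission direction is H and cos(theta) = H / r."""
    return math.degrees(math.acos(height_h_mm / distance_r_mm))

# Assumed example values: H = 150 mm, r = 450 mm  ->  theta ~ 70.5 degrees.
print(reference_angle_deg(150.0, 450.0))
```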
  • FIG. 14 is a diagram illustrating an installation mode of the image projection device 1.
  • In the installation mode shown in FIG. 14A, the image projection device 1 is installed at an angle with respect to the reference image projection device 1 indicated by the dotted line. In this mode, the straight-line distance from the distance sensor 500 to the video screen 101 is r′, the height from the bottom surface of the image projection apparatus 1 to the distance sensor 500 is H′, and the angle between the image projection apparatus 1 and the light emission direction from the distance sensor 500 is θ′.
  • The distance information acquisition unit 242 acquires the straight-line distance r′ from the distance sensor 500 to the video screen 101, as measured by the distance sensor 500, and outputs it to the inclination calculation unit 243.
  • The inclination calculation unit 243 calculates the amount of inclination based on the distance r′ input from the distance information acquisition unit 242 and the reference information stored in the reference information storage unit 241, and outputs the calculated tilt amount to the adjustment completion determination unit 244.
  • The adjustment completion determination unit 244 determines whether the inclination adjustment by the tilt adjustment mechanism has been completed, based on the tilt amount input from the inclination calculation unit 243. Specifically, the adjustment completion determination unit 244 refers to the adjustment amount monitored by the tilt adjustment mechanism and determines whether the light detection sensor 200 has been adjusted so that the laser light from the light detection sensor 200 of the image projection device 1, which is tilted from the reference installation mode by the calculated amount, scans parallel to the installation surface 102.
  • When the adjustment completion determination unit 244 determines that the inclination adjustment has been completed, it outputs the determination result to the adjustment completion notification unit 245. In response to the determination result from the adjustment completion determination unit 244, the adjustment completion notification unit 245 notifies the operator who operates the tilt adjustment mechanism that the adjustment has been completed.
  • the adjustment completion notification unit 245 outputs an operation signal to the external device 6 so as to display the characters “OK” on the video screen 101. Further, the adjustment completion notification unit 245 may output an operation signal to the external device 6 so that the characters “NG” are displayed before the adjustment is completed. In addition, the adjustment completion notification unit 245 may output a sound indicating the completion of adjustment from a speaker. In addition, the adjustment completion notification unit 245 may display a gauge indicating the remaining adjustment amount until the adjustment is completed on the video screen 101. By such notification, when the inclination adjusting unit 231 is a manual adjustment ring, a screw, or the like, the operator who adjusts the inclination can grasp the timing of completion of the adjustment.
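  • A minimal sketch of how the tilt amount and the adjustment-completion check could be computed (assuming the arccos geometry from the previous sketch, that H is roughly unchanged by a small tilt, and an assumed tolerance; none of these constants come from the description):

```python
import math

def tilt_from_measured_distance(height_h_mm: float, r_ref_mm: float, r_now_mm: float) -> float:
    """Tilt of the device relative to the reference installation, in degrees.
    Assumes cos(theta) = H / r and that H is essentially unchanged by a small tilt."""
    theta_ref = math.degrees(math.acos(height_h_mm / r_ref_mm))
    theta_now = math.degrees(math.acos(height_h_mm / r_now_mm))
    return theta_now - theta_ref

def adjustment_complete(required_tilt_deg: float, monitored_adjustment_deg: float,
                        tolerance_deg: float = 0.2) -> bool:
    """True when the adjustment read back from the tilt mechanism (e.g. via the
    variable-resistor voltage) cancels the calculated device tilt."""
    return abs(required_tilt_deg - monitored_adjustment_deg) <= tolerance_deg

# Assumed example: H = 150 mm, reference r = 450 mm, measured r' = 480 mm.
required = tilt_from_measured_distance(150.0, 450.0, 480.0)
print(round(required, 2), adjustment_complete(required, required))  # ~1.26 deg, True -> display "OK"
```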
  • the inclination of the light detection sensor 200 is adjusted according to the inclination of the image projection apparatus 1 with respect to the installation surface 102.
  • Thereby, even when the image projection apparatus 1 is installed inclined with respect to the installation surface 102, the laser light emitted from the light detection sensor 200 can be controlled to scan parallel to the video screen 101. Therefore, the timing at which the operator operates the video screen 101 and the response timing to the operation are aligned, and the operability with respect to the projected video screen can be improved.
  • In the above description, the case where the inclination of the light detection sensor 200 itself is adjusted by the tilt adjustment mechanism shown in FIG. 10 in order to adjust the emission angle of the laser light for detecting the operator's finger 105 has been described as an example.
  • Instead, individual units constituting the light detection sensor 200 may be adjusted.
  • FIG. 15 is a diagram exemplifying a mode in which each unit constituting the light detection sensor 200 is adjusted.
  • the light detection sensor 200 includes an inclination adjustment mirror 205 in addition to the light emitting unit 202 and the mirror unit 210 described above.
  • the laser light emitted from the light emitting unit 202 is reflected by the mirror unit 210 and the tilt adjustment mirror 205 and is emitted to the outside of the image projection apparatus 1.
  • the operator adjusts the inclination of the inclination adjustment mirror 205 by an inclination adjustment unit (not shown) that adjusts the inclination of the inclination adjustment mirror 205.
  • The light emission angle control unit 240 notifies the operator whether the tilt adjustment of the tilt adjustment mirror 205 has been completed, based on the calculated tilt of the video projection device 1. Even with such a configuration, the laser light emitted from the light detection sensor 200 can be controlled to scan parallel to the video screen 101, so the operability with respect to the projected video screen can be improved.
  • Besides the tilt adjustment mirror 205, the light emitting unit 202 or the mirror unit 210 may be adjusted, or the emission angle of the laser light may be controlled by adjusting a plurality of components, including the tilt adjustment mirror 205, in combination.
  • the adjustment of the laser beam emission angle according to the inclination with respect to the installation surface 102 of the image projection apparatus 1 using one distance sensor 500 has been described as an example.
  • By using two distance sensors 500, it is also possible to adjust the emission angle of the laser light according to the horizontal inclination of the image projection apparatus 1.
  • In this case, the light emission angle control unit 240 measures the elevation angle at two horizontally separated positions and controls the emission angle of the laser light according to each elevation angle.
  • The distance sensor 500 is preferably installed at a position as far as possible from the surface onto which the video screen 101 is projected (the installation surface 102 when projecting onto a desk), and preferably emits light such that the angle θ shown in FIG. 13 becomes small. When the angle θ is small, the component of the emitted light perpendicular to the installation surface 102 is large and the reflected light is strong, so the distance sensor 500 can easily receive the reflected light and measure the distance.
  • For this reason, the distance sensor 500 is installed at a position adjacent to the projection lens 404, which is located as far as possible from the surface onto which the video screen 101 is projected (for example, below the focus ring 103 in the video projection apparatus 1 shown in FIG. 1). That is, the distance sensor 500 is provided at a position adjacent to the projection lens 404 with respect to the projection surface.
  • the inclination calculation unit 243 has been described as an example in which the inclination is calculated using the distance sensor 500. However, this is an example, and when it is assumed that the installation surface 102 is parallel to the floor surface, the inclination calculation unit 243 may calculate the inclination using a gyro sensor.
  • In the present embodiment, the case where the tilt adjustment unit 231 is provided above the light detection sensor 200 (that is, on the side opposite to the video screen 101 with respect to the light emitted from the light detection sensor 200) has been described.
  • If the tilt adjustment unit 231 were provided below the light detection sensor 200 (that is, on the video screen 101 side with respect to the emitted light), it might block the light emitted from the light detection sensor 200. Therefore, it is desirable that the tilt adjustment unit 231 be provided above the light detection sensor 200.
  • However, the tilt adjustment unit 231 may be provided below the light detection sensor 200 if, for example, the tilt of the light detection sensor 200 can be adjusted without blocking the light emitted from the light detection sensor 200.
  • The tilt of the light detection sensor 200 may be adjustable not only in the elevation angle direction but also in the azimuth angle direction.
  • In the present embodiment, the case where the tilt adjustment unit 231 is moved manually has been described as an example. When the tilt adjustment unit 231 is an adjustment ring or a screw as described above, it is provided outside the image projection device 1.
  • The tilt adjustment unit 231 may also be driven automatically, for example by a stepping motor; in that case, the tilt adjustment unit 231 operates by supplying the stepping motor with an electric signal corresponding to the tilt calculated by the inclination calculation unit 243.
  • Since the distance sensor 500 can measure the distance to the projection surface, it can be used not only to calculate the inclination of the image projection apparatus 1 but also to adjust the image projected onto the projection surface based on the measured distance to the projection surface.
  • In the first embodiment described above, the light emission angle control unit 240 controls the emission angle of the laser light from the light detection sensor 200 using the distance sensor 500.
  • the light emission angle control unit 240 in the second embodiment controls the emission angle of the laser light from the light detection sensor 200 using a camera. In the second embodiment, only the configuration different from the first embodiment will be described, and the description of the same configuration will be omitted.
  • FIG. 16 is a block diagram illustrating a functional configuration of the light emission angle control unit 240 according to the second embodiment.
  • In the light emission angle control unit 240 according to the second embodiment, the distance information acquisition unit 242 and the inclination calculation unit 243 of the first embodiment are replaced with a pixel number acquisition unit 246, a tilt direction determination unit 247, and an adjustment instruction unit 248.
  • the video projection device 1 includes a camera 510 that functions as a shape recognition device that recognizes the shape of the video screen 101.
  • FIG. 17 is a diagram illustrating the shape of the video screen 101 recognized by the camera 510.
  • FIG. 17A is a diagram exemplifying the shape of the video screen 101 in the installation mode of the video projection device 1 serving as the reference shown in FIG.
  • The reference information storage unit 241 stores, as reference information, the length (number of pixels) A1 of the side of the video screen 101 shown in FIG. 17A that is close to the camera 510 (hereinafter referred to as “side A”) among the horizontal sides of its shape, and the length (number of pixels) B1 of the other horizontal side (hereinafter referred to as “side B”).
  • FIG. 17B illustrates the shape of the video screen 101 when the video projection apparatus 1 is installed tilted in the elevation direction with respect to the installation surface 102 (for example, in the installation mode shown in FIG. 4A).
  • As shown in FIG. 17B, the length A2 of side A of the video screen 101 is longer than the length A1 shown in FIG. 17A, and the length B2 is shorter than B1. That is, when the image projection apparatus 1 is installed tilted in the elevation direction with respect to the installation surface, the relationship B2/A2 < B1/A1 holds.
  • FIG. 17C illustrates the shape of the video screen 101 when the video projection device 1 is installed tilted in the depression direction with respect to the installation surface 102 (for example, in the installation mode shown in FIG. 4B). As shown in FIG. 17C, the length A3 of side A of the video screen 101 is shorter than the length A1 shown in FIG. 17A, and the length B3 is longer than B1. That is, when the image projection apparatus 1 is installed tilted in the depression direction with respect to the installation surface, the relationship B3/A3 > B1/A1 holds.
  • The pixel number acquisition unit 246 acquires the number of pixels (length) of side A and the number of pixels (length) of side B from the video screen 101 recognized by the camera 510, and outputs them to the tilt direction determination unit 247.
  • The tilt direction determination unit 247 determines whether the image projection apparatus 1 is tilted in the elevation angle direction or the depression angle direction with respect to the installation surface 102, based on the pixel numbers of sides A and B input from the pixel number acquisition unit 246 and the reference information (pixel number A1 and pixel number B1) stored in the reference information storage unit 241.
  • Specifically, the tilt direction determination unit 247 determines the tilt direction of the image projection device 1 based on the relationship, shown in FIG. 17, between the reference pixel numbers A1 and B1 and the acquired pixel numbers (for example, A2 and B2), and outputs the determination result to the adjustment instruction unit 248.
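  • A hedged sketch of this determination using the side-length ratios described above (the tolerance value and the return labels are assumptions for illustration):

```python
def tilt_direction(a_ref: int, b_ref: int, a_now: int, b_now: int,
                   tolerance: float = 0.01) -> str:
    """Judge the tilt direction from the pixel lengths of the two horizontal sides.

    Follows the relationships in the text: a B/A ratio smaller than the reference
    ratio means a tilt in the elevation direction, larger means the depression
    direction. The tolerance and return labels are assumptions for this sketch."""
    ratio_ref = b_ref / a_ref
    ratio_now = b_now / a_now
    if ratio_now < ratio_ref - tolerance:
        return "elevation"   # e.g. instruct: rotate the rotation shaft 234 clockwise
    if ratio_now > ratio_ref + tolerance:
        return "depression"  # e.g. instruct: rotate the rotation shaft the other way
    return "level"           # within tolerance: adjustment complete

# Assumed pixel counts: reference A1=1000, B1=900; measured A2=1050, B2=860.
print(tilt_direction(1000, 900, 1050, 860))  # -> "elevation"
```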
  • The adjustment instruction unit 248 instructs the operator how to perform the adjustment with the tilt adjustment mechanism based on the determination result. Specifically, for example, the adjustment instruction unit 248 outputs an operation signal to the external device 6 so that the video screen 101 displays in which direction the rotation shaft 234 shown in FIG. 10 should be rotated. For example, when the determination result indicates a tilt in the elevation angle direction, the adjustment instruction unit 248 instructs the operator to rotate the rotation shaft 234 clockwise.
  • During the adjustment, the pixel number acquisition unit 246 acquires the numbers of pixels of side A and side B from the video screen 101 recognized by the camera 510 and outputs them to the adjustment completion determination unit 244.
  • When the number of pixels of each side input from the pixel number acquisition unit 246 matches the number of pixels of each side stored in the reference information storage unit 241 (or when the difference in the number of pixels falls within a predetermined range), the adjustment completion determination unit 244 determines that the adjustment has been completed and notifies the adjustment completion notification unit 245 accordingly.
  • the inclination of the light detection sensor 200 is adjusted according to the shape of the video screen 101 recognized by the camera 510.
  • Thereby, the laser light emitted from the light detection sensor 200 can be controlled to scan parallel to the video screen 101 even when the image projection apparatus 1 is installed inclined with respect to the installation surface 102. Therefore, the timing at which the operator operates the video screen 101 and the response timing to the operation are aligned, and the operability with respect to the projected video screen 101 can be improved.
  • In the second embodiment, the case where the light emission angle control unit 240 determines the inclination in the elevation angle direction based on the lengths of the horizontal sides A and B of the shape of the video screen 101 and controls the emission angle of the laser light according to the determination result has been described as an example. In addition, the light emission angle control unit 240 may determine the inclination in the azimuth direction based on the lengths of the vertical sides C and D of the shape of the video screen 101 and control the emission angle of the laser light according to that determination result. In this way, the emission angle of the laser light can be controlled with higher accuracy in accordance with the inclination of the image projection device 1, and the operability with respect to the projected video screen 101 can be further improved.
  • the camera 510 may be installed at any position as long as the shape of the video screen 101 can be recognized.
  • Embodiment 3 of the present invention will be described below.
  • the inclination calculation unit 243 according to the third embodiment of the present invention calculates the inclination with respect to the installation surface 102 of the image projection apparatus 1 using the light detection sensor 200 instead of the distance sensor 500 according to the first embodiment.
  • In the third embodiment, only the configuration different from the first embodiment will be described, and the description of the same configuration will be omitted.
  • FIG. 18 is a diagram illustrating a mode in which the inclination is calculated by the light detection sensor 200.
  • the light detection sensor 200 according to the third embodiment is configured to be able to emit laser light while shifting in the Y-axis direction.
  • Such a configuration is realized, for example, by controlling the inclination of the light detection sensor 200 as illustrated in FIG. 10 and described in the first embodiment.
  • such a configuration may be realized by adjusting each part of the light detection sensor 200 as illustrated in FIG. 15 and described in the first embodiment.
  • the mirror unit 210 may be a two-dimensional scan mirror.
  • When the laser light emitted while being shifted in the Y-axis direction strikes the installation surface 102, the light detection sensor 200 receives the reflected light.
  • the inclination calculation unit 243 calculates the inclination based on the emission angle of the laser light when the light detection sensor 200 receives the reflected light and the distance information from the light detection sensor 200 to the installation surface 102. That is, the light detection sensor 200 emits measurement light for measuring a projection plane distance that is a distance from the light detection sensor 200 to the projection plane.
  • Specifically, the inclination calculation unit 243 uses, as a reference, the emission angle of the laser light and the distance information obtained when laser light emitted while being shifted in the Y-axis direction at the reference position shown in FIG. 13 is reflected, and calculates the deviation from that reference as the inclination.
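  • The following sketch illustrates one way such a calculation could work, under the assumptions that the sensor's height above the installation surface is the same as at the reference position and that the sensor reports the downward emission angle together with the measured distance when the surface return is received (these assumptions, and the example values, are not from the description):

```python
import math

def tilt_from_sensor_sweep(ref_angle_deg: float, ref_distance_mm: float,
                           now_angle_deg: float, now_distance_mm: float) -> float:
    """Tilt relative to the reference installation, in degrees.

    Assumed geometry:
        height = ref_distance * sin(ref_angle)            (reference installation)
        height = now_distance * sin(now_angle + tilt)     (tilted installation)
    so the tilt follows by solving the second equation for the same height."""
    height = ref_distance_mm * math.sin(math.radians(ref_angle_deg))
    return math.degrees(math.asin(height / now_distance_mm)) - now_angle_deg

# Assumed example: reference surface return at 5.0 degrees down / 500 mm; after
# installation the return at the same 5.0 degree emission angle is only 430 mm away.
print(tilt_from_sensor_sweep(5.0, 500.0, 5.0, 430.0))  # ~ +0.8 degrees toward the surface
```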
  • FIG. 19 is a sequence diagram of the light emission angle control process and the finger operation detection process.
  • Since the light detection sensor 200 itself is used for the tilt calculation, the light emission angle control process by the light emission angle control unit 240 and the detection process of the finger 105 by the operation detection unit 300 are performed at different timings.
  • the light emission angle control unit 240 first calculates the tilt using the laser light of the light detection sensor 200. Next, the light emission angle control unit 240 controls the tilt adjustment of the light detection sensor 200 based on the calculated tilt. When the adjustment is completed, the operation detection unit 300 detects the operation of the operator's finger 105 using the laser beam of the light detection sensor 200.
  • As described above, also in the third embodiment, the tilt of the light detection sensor 200 is adjusted according to the tilt of the video projection device 1 with respect to the installation surface 102, as in the first and second embodiments, so the operability with respect to the projected video screen can be improved.
  • In addition, since it is not necessary to mount the distance sensor 500 or the camera 510 in the image projection apparatus 1, the apparatus can be made smaller.
  • the light emission angle control process in the third embodiment is performed only once as an initial setting after the image projection apparatus 1 is powered on, for example.
  • the light emission angle control process may be performed periodically at regular intervals. In any case, the operation detection process of the finger 105 by the operation detection unit 300 is not performed while the light emission angle control process is being performed.
  • FIG. 20 is a diagram illustrating an aspect of adjusting the laser beam emission angle using the jig 520.
  • When the laser light from the light detection sensor 200 is scanned in the direction in which it strikes the jig 520, the reflectance becomes higher than when the laser light is scanned in other directions. That is, the jig 520 functions as a light reflecting portion whose reflectance differs depending on the position at which the light emitted from the light detection sensor 200 strikes.
  • The light emission angle control unit 240 calculates the reflectance of the laser light emitted from the light detection sensor 200 and determines that the inclination adjustment of the light detection sensor 200 is complete when the reflectance is equal to or greater than a predetermined value. When it determines that the adjustment has been completed, the light emission angle control unit 240 notifies the adjustment state, such as the completion of the inclination adjustment of the light detection sensor 200, with an image or sound, like the adjustment completion notification unit 245 in the first embodiment.
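  • A minimal sketch of this reflectance check (the reflectance definition as a received-to-emitted power ratio and the 0.5 threshold are assumptions standing in for the "predetermined value" above):

```python
def reflectance(received_power: float, emitted_power: float) -> float:
    """Apparent reflectance of whatever the beam currently hits (assumed definition:
    ratio of received to emitted optical power, in arbitrary units)."""
    return received_power / emitted_power

def jig_adjustment_complete(received_power: float, emitted_power: float,
                            threshold: float = 0.5) -> bool:
    """True when the beam is hitting the highly reflective jig 520, i.e. the scan
    plane has reached the intended height and angle."""
    return reflectance(received_power, emitted_power) >= threshold

# Assumed readings: the return jumps from 0.1 to 0.7 of the emitted power
# once the beam lands on the jig.
print(jig_adjustment_complete(0.7, 1.0))  # -> True: notify "adjustment complete"
```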
  • As described above, also in the fourth embodiment, the tilt of the light detection sensor 200 is adjusted according to the tilt of the video projection device 1 with respect to the installation surface 102, as in the first and second embodiments, so the operability with respect to the projected video screen can be improved.
  • the position where the jig 520 is placed may be displayed on the video screen 101.
  • Although the case where the jig 520 in the fourth embodiment has a rectangular parallelepiped shape has been described as an example, the jig 520 may have a cylindrical shape so that the laser light is easily reflected at any position on the video screen 101.
  • In the fourth embodiment, the case where the inclination of the light detection sensor 200 is adjusted according to the inclination in the elevation angle direction with respect to the installation surface 102 of the image projection apparatus 1 has been described as an example. The inclination of the light detection sensor 200 may also be adjusted according to the inclination in the azimuth angle direction.
  • the jig 520 may be housed in the video projection device 1. With such a configuration, it becomes easy to adjust the inclination of the light detection sensor 200 at any time.
  • When the operation detection unit 300 detects an operation, a sound output control unit (not shown) may emit a sound from the speaker of the image projection apparatus 1, or the image projection unit 400 may change the display on the video screen 101.
  • For example, when the operation detection unit 300 detects a touch operation on a volume adjustment button displayed on the video screen 101, the volume adjustment button is temporarily displayed enlarged or reduced. That is, the sound output control unit and the video projection unit 400 function as a detection notification unit that notifies the operator, in a manner the operator can recognize, that the reflected light from the finger 105 has been detected.
  • Since the video projection apparatus 1 thus notifies the operator by some method, such as sound or display, that the operation has been detected, the operator can easily recognize the response to the operation.
  • When the video screen 101 is displayed in space rather than on a desk or a wall, it is difficult for the operator to grasp whether an operation has been detected, because there is no sensation of touching the video screen 101.
  • With the configuration described above, even when the video screen 101 is displayed in space, it becomes easier for the operator to feel that the video screen 101 has been operated, and the operability with respect to the projected video screen 101 is further improved.
  • In the above description, the size of the video screen 101 is constant, and the detection range of the finger 105 by the laser light emitted from the light detection sensor 200 is also constant, matching the size of the video screen 101.
  • When the size of the video screen 101 can be changed, however, a mismatch arises: for example, when the video screen 101 becomes larger than its initial size, it becomes larger than the detection range of the finger 105, and positions in the detection range no longer correspond to positions on the video screen 101.
  • As a result, an operation position on the video screen 101 may fall outside the detection range so that the operation cannot be detected, or, even if it is within the detection range, it may be erroneously detected as an operation at a different position.
  • For example, suppose a 1 cm square button is displayed on the video screen 101 at its initial size, so that a 2 cm square button is displayed when the video screen 101 is twice its initial size. In this case, the operation detection unit 300 may fail to detect the touch operation even though the 2 cm square button is touched.
  • Therefore, when the size of the video screen 101 is changed, the video projection device 1 may change the detection range of the finger 105 according to the changed size. Specifically, for example, the position of a size adjustment ring (not shown) that adjusts the size of the video screen 101 is fed back electrically, the light emitting unit 202 adjusts its intensity according to the feedback, and the amplifying unit 311 adjusts its gain. Alternatively, when the size of the video screen 101 is adjusted electrically, the light emitting unit 202 may adjust its intensity and the amplifying unit 311 may adjust its gain according to the electrical signal used for the adjustment. Thereby, the scanning range of the light emitted from the light detection sensor 200 is changed.
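  • As a hedged sketch of such a detection-range adjustment (the quadratic scaling rule and the unit-less base settings are assumptions; the embodiment only states that the intensity and the gain are adjusted according to the screen size):

```python
from dataclasses import dataclass

@dataclass
class DetectionSettings:
    laser_intensity: float  # drive level of the light emitting unit 202 (arbitrary units)
    receiver_gain: float    # gain of the amplifying unit 311 (arbitrary units)

def settings_for_screen_scale(scale: float,
                              base: DetectionSettings = DetectionSettings(1.0, 1.0)
                              ) -> DetectionSettings:
    """Scale the detection range with the projected screen size.

    Assumption for this sketch: doubling the screen size roughly doubles the maximum
    detection distance, and the required intensity/gain grow with the square of that
    distance (inverse-square falloff of the return signal). The real device would
    derive `scale` from the size-adjustment ring position or the electrical zoom signal."""
    factor = scale ** 2
    return DetectionSettings(base.laser_intensity * factor, base.receiver_gain * factor)

# Assumed example: the video screen is enlarged to twice its initial size.
print(settings_for_screen_scale(2.0))  # intensity x4, gain x4
```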
  • the apparatus is further downsized.
  • In the above embodiments, the image projection apparatus 1 that displays the video screen 101 on a desk has been described as an example.
  • However, the image projection apparatus 1 may be any projector, such as an ultra-short-throw projector, in which the light from the light detection sensor 200 can be scanned near the projection surface.
  • the video projection device 1 may be installed near a wall surface or suspended from a ceiling to display the video screen 101 on the wall surface, or may display the video screen 101 in a space.
  • the case where the operator's finger 105 is used to operate the video screen 101 has been described as an example. However, this is an example, and a touch pen, a pointing stick, or the like is used instead of the finger 105. Any mode that can operate on the video screen 101 may be used.
  • FIG. 21 is a diagram illustrating an installation state of the image projection device 1 in which the height of the laser light is controlled. As shown in FIG. 21, when a part of the image projection device 1 is placed on a step that is higher than the installation surface 102, the vertical distance between the video screen 101 and the laser light may exceed the desired distance even if only the emission angle of the laser light is controlled. Conversely, when the installation surface 102 is at a position higher than the bottom surface of the image projection device 1, the laser light may not be able to scan parallel to the installation surface 102.
  • the light emission angle control unit 240 also controls the height of the laser light. Specifically, for example, the light emission angle control unit 240 calculates a difference from the reference height using the distance sensor 500, and emits the light from the light detection sensor 200 based on the calculated height difference. Controls the height of the laser beam.
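  • A minimal sketch of this height correction (assuming the correction is simply the difference between the reference height and the currently measured height; the 15 mm step is an assumed example value):

```python
def height_correction_mm(reference_height_mm: float, measured_height_mm: float) -> float:
    """Amount by which to raise (+) or lower (-) the scan plane of the laser light.

    Assumption for this sketch: the distance sensor 500 (or an equivalent measurement)
    gives the current height of the sensor above the surface the beam must skim, and
    the correction is the difference from the reference height."""
    return reference_height_mm - measured_height_mm

# Assumed example: part of the device sits on a 15 mm step, so the beam plane must
# be lowered by 15 mm to stay within the 20 mm budget above the video screen.
print(height_correction_mm(150.0, 165.0))  # -> -15.0 mm
```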
  • The present invention is not limited to the embodiments described above and includes various modifications.
  • the above-described embodiments have been described in detail for easy understanding of the present invention, and are not necessarily limited to those having all the configurations described.
  • each of the above-described configurations, functions, processing units, and the like may be realized by hardware by designing a part or all of them with, for example, an integrated circuit.
  • Each of the above-described configurations, functions, and the like may be realized by software by interpreting and executing a program that realizes each function by the processor.
  • Information such as programs, tables, and files that realize each function can be stored in a storage medium, storage, or the like.
  • the control lines and information lines indicate what is considered necessary for the explanation, and not all the control lines and information lines on the product are necessarily shown. Actually, it may be considered that almost all the components are connected to each other.

Abstract

Provided is an image projection device that projects an image screen and detects operations on the projected image screen, wherein operability with respect to the projected image screen is improved. This image projection device projects, onto a projection surface, an image generated on the basis of an image signal, and is characterized by including: a light detection unit that emits detection light for detecting an object on the image screen projected on the projection surface, and detects light among the detection light reflected by the object on the image screen; an operation signal generation unit that generates an operation signal for the image screen on the basis of a detection signal of said reflected light; and a light emission angle control unit that controls the emission angle of the detection light.

Description

Image projection device
The present invention relates to an image projection apparatus.
As background art in this technical field, there is US Patent Application Publication No. 2014/152843 (Patent Document 1). This publication describes "providing a document camera that enables an intuitive drawing operation on a subject when realizing pseudo writing on a captured image, and a method for controlling the document camera."
US Patent Application Publication No. 2014/152843
In the technology disclosed in Patent Document 1, it is necessary to install a projector and, as a separate device, a document camera that detects the user's operation on the video screen projected from the projector. For this reason, when the configuration disclosed in Patent Document 1 is used in various places, the projector and the document camera must each be prepared and installed, which places a large burden of work on the user.
To solve such a problem, it is conceivable for a projector serving as an image projection device to incorporate a function of detecting the user's operation, so that the user's operation on the screen projected from the projector can be detected by installing only the projector. Specifically, the projector incorporates a light detection sensor, and the function of detecting the user's operation is realized by the light detection sensor.
The light detection sensor emits laser light that scans in the vicinity of the video screen projected from the projector, and receives the reflected light of the laser light from an object such as the finger of the user performing an operation. Because the light detection sensor detects the reflected light from the object, the operation can be detected in accordance with the timing at which the user operates the video screen.
However, the angle of the laser light emitted from the light detection sensor may change depending on the installation state, such as the inclination, of a projector incorporating such a light detection sensor. When the emission angle of the laser light changes, the timing at which the user performs an operation and the timing at which the operation is detected may be shifted, or the operation may not be detected even though the user has performed it. As a result, the operability of the video screen projected by the projector is lowered.
The present invention has been made to solve such problems, and an object of the present invention is to improve the operability of a projected video screen in an image projection device that projects a video screen and detects operations on the projected video screen.
In order to solve the above problems, the present invention adopts, for example, the configurations described in the claims. The present application includes a plurality of means for solving the above problems; to give one example, the emission angle of detection light for detecting an object on a video screen projected onto a projection surface is controlled.
According to the present invention, in an image projection device that projects a video screen and detects operations on the projected video screen, the operability of the projected video screen can be improved. Problems, configurations, and effects other than those described above will become clear from the following description of embodiments.
FIG. 1 is a diagram illustrating the external appearance of an image projection apparatus according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating a manner in which the position and movement of an operator's finger are detected by the light detection sensor of the image projection apparatus according to the embodiment of the present invention.
FIG. 3 is a diagram illustrating a manner in which the position and movement of an operator's finger are detected by the light detection sensor of the image projection apparatus according to the embodiment of the present invention.
FIG. 4 is a diagram illustrating the laser light emitted by the light detection sensor when the image projection apparatus is installed at an inclination with respect to the installation surface.
FIG. 5 is a block diagram illustrating the functional configuration of the image projection apparatus according to the embodiment of the present invention.
FIG. 6 is a block diagram illustrating the configuration of the light detection sensor according to the embodiment of the present invention.
FIG. 7 is a diagram illustrating the configuration of the mirror unit according to the embodiment of the present invention.
FIG. 8 is a diagram illustrating the configuration of the mirror driving unit according to the embodiment of the present invention.
FIG. 9 is a block diagram illustrating the configuration of the distance information acquisition unit according to the embodiment of the present invention.
FIG. 10 is a diagram illustrating the configuration of a tilt adjustment mechanism that adjusts the tilt of the light detection sensor according to the embodiment of the present invention.
FIG. 11 is a block diagram illustrating the functional configuration of the light emission angle control unit according to the embodiment of the present invention.
FIG. 12 is a diagram illustrating reference information stored in the reference information storage unit according to the embodiment of the present invention.
FIG. 13 is a diagram illustrating a reference installation mode of the image projection apparatus according to the embodiment of the present invention.
FIG. 14 is a diagram illustrating an installation mode of the image projection apparatus according to the embodiment of the present invention.
FIG. 15 is a diagram illustrating a manner of adjusting the individual units constituting the light detection sensor according to the embodiment of the present invention.
FIG. 16 is a block diagram illustrating the functional configuration of the light emission angle control unit according to the embodiment of the present invention.
FIG. 17 is a diagram illustrating the shape of the video screen recognized by the camera according to the embodiment of the present invention.
FIG. 18 is a diagram illustrating a manner of calculating the inclination using the light detection sensor according to the embodiment of the present invention.
FIG. 19 is a sequence diagram illustrating the light emission angle control process and the finger operation detection process according to the embodiment of the present invention.
FIG. 20 is a diagram illustrating a manner of adjusting the laser light emission angle using a jig according to the embodiment of the present invention.
FIG. 21 is a diagram illustrating an installation mode of the image projection apparatus according to the embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. In the present embodiment, a desktop projection projector that projects an image onto a desk serving as the projection surface is assumed, and an image projection apparatus that controls the projected image based on the position and movement of the operator's finger will be described as an example.
[Example 1]
Example 1 of the present invention will be described below. FIG. 1 is a diagram illustrating an overview of an image projection apparatus 1 according to an embodiment of the present invention. The XYZ axes shown in FIG. 1 indicate the direction from which the video screen projected by the image projection apparatus 1 is viewed. Specifically, the X-axis direction is the horizontal direction of the video screen, the Y-axis direction is the vertical direction, and the Z-axis direction is the direction perpendicular to the video screen. The image projection apparatus 1 shown in FIG. 1 is viewed from the X-axis direction, that is, from the horizontal direction of the video screen. In the subsequent drawings, the image projection apparatus 1 is shown together with the same XYZ axes.
As shown in FIG. 1, the image projection apparatus 1 is installed on an installation surface 102, which is a desktop. Video light generated inside the apparatus is enlarged by a projection lens 404 and then reflected by a reflection mirror 405 to project a video screen 101 onto the installation surface 102. That is, in the present embodiment, the installation surface 102 is the projection surface. The focus of the projected image is adjusted with a focus ring 103. The reflection mirror 405 is configured to be foldable, and when not in use it is stored with its reflection surface facing the image projection apparatus 1.
The image projection apparatus 1 includes a light detection sensor 200 that emits light and detects light reflected by an object. For example, the light detection sensor 200 detects the position and movement of an operator's finger that crosses a detection boundary 104. The image projection apparatus 1 recognizes the position and movement of the finger detected by the light detection sensor 200 as operations similar to those performed on the screen of a smartphone or tablet, such as tap, flick, swipe, pinch-in, and pinch-out, and projects an image corresponding to the recognized operation. Details of the image projection processing corresponding to the position and movement of the finger detected by the light detection sensor 200 will be described later. Hereinafter, in the present embodiment, the "position of the object" is described as the "position of the operator's finger".
FIGS. 2 and 3 are diagrams illustrating a manner in which the position and movement of the operator's finger are detected by the light detection sensor 200 of the image projection apparatus 1. The image projection apparatus 1 shown in FIG. 2 is viewed from the Y-axis direction, that is, the vertical direction with respect to the video screen, and the image projection apparatus 1 shown in FIG. 3 is viewed from the X-axis direction, that is, the horizontal direction with respect to the video screen.
As shown in FIGS. 2 and 3, the light detection sensor 200 emits laser light over the range of the video screen 101. When the operator touches and operates the video screen 101 with a finger 105 as if it were a touch panel (for example, touches a button displayed on the video screen 101), the sensor detects the light reflected from the finger 105. Having detected the reflected light, the light detection sensor 200 calculates the distance from the light detection sensor 200 to the finger based on the detected reflected light, and recognizes which position on the video screen 101 was touched based on the calculated distance and the emission direction of the laser light.
That is, the light detection sensor 200 functions as a light detection unit that emits detection light for detecting an operation on the projection surface and detects reflected light of the detection light from an object on the video screen 101 projected onto the projection surface. The object on the video screen 101 is, for example, the operator's finger or a pointing stick, and it may be in contact with the projection surface or slightly separated from it (for example, by a few millimeters) without contact.
As shown in FIG. 3, the laser light emitted from the light detection sensor 200 scans a position extremely close to the installation surface 102, parallel to the installation surface 102, so that the position and movement of the finger 105 are detected at the timing when the operator touches the video screen 101. For example, it is desirable that the vertical distance y1 between the laser light and the installation surface 102 be 20 mm or less. Hereinafter, "detecting the position and movement of the finger 105" is simply referred to as "detecting the finger 105". In addition, the expression "parallel to the installation surface 102" hereinafter also includes the meaning "extremely close to parallel to the installation surface 102".
FIG. 4 is a diagram illustrating the laser light emitted from the light detection sensor 200 when the image projection apparatus 1 is installed at an inclination with respect to the installation surface 102. FIG. 4(a) shows the case where the angle formed between the image projection apparatus 1 and the installation surface 102 is larger than 90°, and FIG. 4(b) shows the case where the angle formed between the image projection apparatus 1 and the installation surface 102 is smaller than 90°.
As described with reference to FIG. 3, in order to detect the finger 105 at the timing when the operator touches the video screen 101, the laser light emitted from the light detection sensor 200 needs to scan a position extremely close to the installation surface 102, parallel to the installation surface 102. However, as shown in FIG. 4, when the image projection apparatus is installed at an inclination with respect to the installation surface 102, the light detection sensor 200 may fail to detect the finger 105 at the timing when the operator touches the video screen 101.
For example, as shown in FIG. 4(a), when the angle formed between the image projection apparatus 1 and the installation surface 102 is larger than 90°, the laser light emitted from the light detection sensor 200 does not scan parallel to the installation surface 102, and the vertical distance between the laser light and the installation surface 102 increases with distance from the light detection sensor 200. As a result, the finger 105 crosses the laser light before the operator actually touches the video screen 101, and the light detection sensor 200 detects the finger 105 from its reflected light, so the timing of the touch by the operator and the timing at which the light detection sensor 200 detects the finger 105 deviate from each other.
In such a case, the timing at which the operator tries to operate the video screen 101 and the timing at which the video screen 101 reacts to the detection of the finger 105 by the light detection sensor 200 are shifted, so the operator feels that the operability is low. Furthermore, since the video screen 101 reacts before the operator touches it, the operator has difficulty getting the sense of having operated the video screen 101 and feels that the operability is low.
Also, for example, as shown in FIG. 4(b), when the angle formed between the image projection apparatus 1 and the installation surface 102 is smaller than 90°, the laser light emitted from the light detection sensor 200 does not scan parallel to the installation surface 102 but is emitted toward the installation surface 102. As a result, the laser light may not reach positions on the video screen 101 far from the image projection apparatus 1, and the finger 105 may not be detected. In such a case, the finger 105 is not detected by the light detection sensor 200 even though the operator operates the video screen 101, and the operability deteriorates.
Such problems can likewise arise not only from the inclination of the image projection apparatus 1 with respect to the installation surface 102 but also from other aspects of the installation state of the image projection apparatus 1 with respect to the installation surface 102, such as when the installation surface 102 serving as the projection surface is itself inclined. Therefore, one of the main points of the present embodiment is to control the emission angle of the laser light according to the state of the installation position of the image projection apparatus 1 with respect to the installation surface 102, which is the projection surface in the present embodiment. The functional configuration of the image projection apparatus 1 according to the present embodiment is described below.
FIG. 5 is a block diagram illustrating the functional configuration of the image projection apparatus 1 according to the present embodiment. As shown in FIG. 5, the image projection apparatus 1 includes the light detection sensor 200, a light emission angle control unit 240, an operation detection unit 300, an image projection unit 400, and a distance sensor 500. The operation detection unit 300 includes a distance information acquisition unit 310, a coordinate information acquisition unit 301, and an operation signal generation unit 302. The image projection unit 400 includes an image control unit 401, a light source unit 402, a light control unit 403, the projection lens 404, and the reflection mirror 405.
The external device 6 is a general information processing device such as a PC (Personal Computer) or a mobile terminal device such as a smartphone connected to the image projection apparatus 1, and it supplies a video signal to the image projection apparatus 1. The external device 6 is not limited to a PC or a mobile terminal device, and may be any device that supplies a video signal to the image projection apparatus 1, such as a card-shaped storage medium inserted into a card interface provided in the image projection apparatus 1.
First, the configuration of the image projection unit 400 will be described. The image control unit 401 outputs control signals to the light source unit 402 and the light control unit 403 in accordance with the video signal supplied from the external device 6. The light source unit 402 is a light source such as a halogen lamp, an LED (Light Emitting Diode), or a laser, and adjusts the amount of light in accordance with the control signal input from the image control unit 401. When the light source unit 402 includes the three colors R (Red), G (Green), and B (Blue), the amount of light of each color may be controlled independently in accordance with the video signal.
The light control unit 403 includes optical components such as a mirror, a lens, a prism, and an imager (for example, a display device such as a liquid crystal panel), and uses the light emitted from the light source unit 402 to generate an optical image based on the video signal supplied from the external device 6. The projection lens 404 enlarges the image output from the light control unit 403. The reflection mirror 405 reflects the light emitted from the projection lens 404 and projects the video screen 101 onto the installation surface 102.
The reflection mirror 405 is an aspherical mirror, and when a video screen of the same size is projected, the projection distance can be made shorter than that of a general image projection apparatus. In the present embodiment, the image projection unit 400 using the reflection mirror 405 has been described as an example, but any other configuration capable of projecting an image may be used.
Next, the configuration of the light detection sensor 200 will be described. FIG. 6 is a block diagram illustrating the configuration of the light detection sensor 200. As shown in FIG. 6, the light detection sensor 200 includes a light receiving unit 201, a light emitting unit 202, a light emission driving unit 203, a mirror unit 210, and a mirror driving unit 220. The light emission driving unit 203 controls the emission intensity, emission frequency, and the like of the light emitting unit 202, which emits light (laser light).
FIG. 7 is a diagram illustrating the configuration of the mirror unit 210. As shown in FIG. 7, the mirror unit 210 includes a mirror 211, a permanent magnet 212, and a central axis 213. The mirror 211 vibrates or rotates around the central axis 213 depending on the polarity of the permanent magnet 212 and the polarity of the mirror driving unit 220 described later. The mirror unit 210 reflects the light emitted from the light emitting unit 202 in a desired direction by means of the mirror 211 vibrating or rotating around the central axis 213. This reflected light is the laser light for detecting the finger 105 shown in FIG. 3.
FIG. 8 is a diagram illustrating the configuration of the mirror driving unit 220. As shown in FIG. 8, the mirror driving unit 220 includes a coil 221, a drive current generation unit 222, a mirror control unit 223, and a mirror position detection unit 224. The coil 221 is used as an electromagnet whose polarity and magnetic force change according to the direction and amount of current.
The drive current generation unit 222 generates a current amount and a drive frequency according to the control signal output from the mirror control unit 223, and supplies the current to the coil 221. The coil 221 switches its polarity according to the amount and timing of the current supplied from the drive current generation unit 222.
The mirror position detection unit 224 detects the position of the mirror unit 210 from changes in the output of the drive current generation unit 222. Specifically, for example, the mirror position detection unit 224 detects the position of the mirror unit 210 using the back electromotive force generated by changes in the magnetic flux that the permanent magnet 212 exerts on the coil 221, and outputs the detected position information to the mirror control unit 223. The mirror control unit 223 outputs the position information of the mirror unit 210 input from the mirror position detection unit 224 (hereinafter referred to as "mirror position information") to the distance information acquisition unit 310 described later.
When the mirror unit 210 operates by resonance, the mirror control unit 223 may change the control signal according to the position information of the mirror unit 210 output from the mirror position detection unit 224. Since the resonance frequency changes with temperature, when the mirror unit 210 operates by resonance, the detection range of the finger 105 covered by the light reflected from the mirror unit 210, that is, the laser light shown in FIG. 3, also changes with temperature. Therefore, when the mirror unit 210 operates by resonance, it is desirable that the mirror control unit 223 control the resonance frequency according to the position information of the mirror unit 210 so as to keep the detection range of the finger 105 constant.
As described above with reference to FIGS. 2 and 3, the light reflected by the mirror unit 210 is reflected or scattered when it hits the finger 105 of the operator operating the video screen 101. The light receiving unit 201 receives part of the light reflected or scattered by the finger 105, photoelectrically converts the received light, and outputs an electrical signal to the distance information acquisition unit 310 described later.
As described above, when the image projection apparatus 1 is inclined with respect to the installation surface 102, the light emission angle control unit 240 controls the emission angle of the laser light emitted from the light detection sensor 200 so that the laser light is not emitted at the angles shown in FIG. 4. In Example 1, the light emission angle control unit 240 controls the emission angle of the laser light based on output information from the distance sensor 500. This is one of the main points of the present embodiment, and details will be described later.
Next, the configuration of the operation detection unit 300 will be described. The distance information acquisition unit 310 acquires distance information from the light detection sensor 200 to the detected finger 105 of the operator based on the electrical signal (detection signal) input from the light receiving unit 201 of the light detection sensor 200 and the mirror position information input from the mirror driving unit 220 of the light detection sensor 200.
FIG. 9 is a block diagram illustrating the configuration of the distance information acquisition unit 310. As shown in FIG. 9, the distance information acquisition unit 310 includes an amplification unit 311, a pulse generation unit 312, and a distance measurement unit 313. The amplification unit 311 amplifies the signal input from the light receiving unit 201 of the light detection sensor 200 and outputs the amplified signal to the pulse generation unit 312. The pulse generation unit 312 converts the signal amplified by the amplification unit 311 into pulses and outputs them to the distance measurement unit 313.
The distance measurement unit 313 measures the time difference between the light emission timing of the light detection sensor 200 and the pulse input from the pulse generation unit 312. The distance measurement unit 313 then calculates the distance from the light detection sensor 200 to the detected finger 105 by TOF (Time-Of-Flight) using the mirror position information input from the mirror driving unit 220 of the light detection sensor 200 and the measured time difference, and outputs the calculated distance information to the coordinate information acquisition unit 301. The light emission timing of the light detection sensor 200 is, for example, the timing at which the light emitting unit 202 is driven by the light emission driving unit 203.
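As a concrete illustration of the TOF relation referred to above, the short Python sketch below converts the measured time difference into a one-way distance. The halving of the round trip and the speed-of-light constant are standard TOF relations; the function name and the example value are illustrative, and the actual distance measurement unit 313 would additionally use the mirror position information.

def tof_distance(time_difference_s, c=299_792_458.0):
    # The measured time difference covers the round trip from the light
    # emitting unit 202 to the finger 105 and back to the light receiving
    # unit 201, so the one-way distance is half the travelled path.
    return c * time_difference_s / 2.0

# Example: a round-trip time difference of 3.3 ns corresponds to roughly 0.5 m
d = tof_distance(3.3e-9)  # ≈ 0.495 m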
The coordinate information acquisition unit 301 converts the distance information input from the distance information acquisition unit 310 into coordinate information on the video screen 101 and outputs it to the operation signal generation unit 302. The coordinate information on the video screen 101 uses, for example, the horizontal direction of the video screen 101 as the X coordinate and the vertical direction as the Y coordinate. The operation signal generation unit 302 generates an operation signal from the coordinate information input from the coordinate information acquisition unit 301 and outputs it to the external device 6. Specifically, for example, when the input coordinate information indicates one point on the video screen 101, the operation signal generation unit 302 generates an operation signal indicating that the position has been touched.
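The exact mapping used by the coordinate information acquisition unit 301 is not specified; the sketch below shows one plausible polar-to-Cartesian conversion from the measured distance and the scan angle derived from the mirror position information, with the light detection sensor 200 at the origin. The offsets, axis conventions, and names are illustrative assumptions.

import math

def to_screen_coordinates(distance, scan_angle_rad, x_offset=0.0, y_offset=0.0):
    # scan_angle_rad: horizontal angle of the emitted laser light
    # (0 rad = straight ahead of the sensor toward the video screen 101).
    x = distance * math.sin(scan_angle_rad) + x_offset
    y = distance * math.cos(scan_angle_rad) + y_offset
    return x, y

# Example: a finger detected 0.4 m away, 10 degrees to the right of center
x, y = to_screen_coordinates(0.4, math.radians(10))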
The external device 6 supplies the image projection unit 400 with a video signal corresponding to the operation signal input from the operation signal generation unit 302. Specifically, for example, assume that a button for transitioning from the currently displayed video screen 101 to another video screen is displayed on the video screen 101, and the operator touches that button.
In this case, when the operation signal indicating that one point on the video screen 101 has been touched is input from the operation signal generation unit 302, the external device 6 supplies the image projection unit 400 with a video signal for displaying the other video screen to which the display transitions when the button is pressed. In this way, an operation signal is generated according to the position and movement of the operator's finger on the video screen 101, and the image projected by the image projection apparatus 1 is controlled.
Next, the adjustment of the emission angle of the laser light emitted from the light detection sensor 200 by the light emission angle control unit 240 and the distance sensor 500 will be described. The purpose of adjusting the emission angle of the laser light is to detect the operator's finger 105 at the appropriate timing even when the image projection apparatus 1 is inclined with respect to the installation surface 102 as described above with reference to FIG. 4, and this is one of the main points of the present embodiment.
The light emission angle control unit 240 controls the tilt adjustment of the light detection sensor 200 according to the inclination of the image projection apparatus 1 with respect to the installation surface 102. First, the tilt adjustment mechanism of the light detection sensor 200 will be described. FIG. 10 is a diagram illustrating the configuration of the tilt adjustment mechanism that adjusts the tilt of the light detection sensor 200. FIG. 10(a) is a view of the image projection apparatus 1 seen from the Z-axis direction, that is, the direction perpendicular to the video screen, and FIGS. 10(b) to 10(d) are diagrams showing specific configuration examples of the tilt adjustment mechanism seen from the X-axis direction, that is, the horizontal direction with respect to the video screen.
As shown in FIG. 10(b), the tilt adjustment mechanism of the light detection sensor 200 includes a tilt adjustment unit 231, a movable unit 232, a fixed unit 233, and a rotation shaft 234. The tilt adjustment unit 231 is a mechanism that rotates the rotation shaft 234, and is, for example, an adjustment ring that can be operated manually or a screw adjusted with a screwdriver or the like. When the rotation shaft 234 is rotated by the tilt adjustment unit 231, the movable unit 232 moves, and the tilt of the light detection sensor 200 mounted on the movable unit 232 changes.
For example, as shown in FIG. 10(c), when the rotation shaft 234 rotates clockwise, the movable unit 232 moves so as to rise, and the tilt of the light detection sensor 200 is adjusted from the state indicated by the dotted line to the state indicated by the one-dot chain line. By adjusting the tilt of the light detection sensor 200 in this way, the laser light emitted from the light detection sensor 200 is emitted further in the depression-angle direction than in the case shown in FIG. 10(b). Therefore, with this tilt adjustment of the light detection sensor 200, when the angle formed between the image projection apparatus 1 and the installation surface 102 is larger than 90° as shown in FIG. 4(a), the laser light can be adjusted so as to scan parallel to (closer to parallel to) the installation surface 102.
Also, for example, as shown in FIG. 10(d), when the rotation shaft 234 rotates counterclockwise, the movable unit 232 moves so as to lower, and the tilt of the light detection sensor 200 is adjusted from the state indicated by the dotted line to the state indicated by the one-dot chain line. By adjusting the tilt of the light detection sensor 200 in this way, the laser light emitted from the light detection sensor 200 is emitted further in the elevation-angle direction than in the case shown in FIG. 10(b). Therefore, with this tilt adjustment of the light detection sensor 200, when the angle formed between the image projection apparatus 1 and the installation surface 102 is smaller than 90° as shown in FIG. 4(b), the laser light can be adjusted so as to scan parallel to (closer to parallel to) the installation surface 102.
The tilt adjustment mechanism is also configured so that the amount of movement of the rotation shaft 234 by the tilt adjustment unit 231, that is, the tilt adjustment amount, can be monitored electrically. Specifically, for example, the tilt adjustment mechanism is configured so that its resistance value changes with the adjustment amount, like a variable resistor, and the voltage value changes with the resistance value. The light emission angle control unit 240 controls the tilt adjustment of the light detection sensor 200 by means of such a tilt adjustment mechanism. That is, the tilt adjustment mechanism functions as a light emission angle adjustment unit with which the emission angle of the detection light from the light detection sensor 200 can be adjusted manually.
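The disclosure only states that the adjustment amount is monitored as a voltage that varies with a resistance; the sketch below shows one hypothetical linear mapping from such a voltage reading to an adjustment angle. The calibration constants, voltage range, and function name are assumptions for illustration.

def adjustment_angle_from_voltage(voltage, v_min=0.0, v_max=3.3,
                                  angle_min_deg=-10.0, angle_max_deg=10.0):
    # Linear interpolation between the two calibrated end positions of the
    # rotation shaft 234; a real device would use measured calibration data.
    ratio = (voltage - v_min) / (v_max - v_min)
    return angle_min_deg + ratio * (angle_max_deg - angle_min_deg)

# Example: a mid-scale voltage corresponds to the neutral (0 degree) position
angle = adjustment_angle_from_voltage(1.65)  # = 0.0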
The rotation shaft 234 may also be configured to move the movable unit 232 by combining an axis in the Z-axis direction and an axis in the Y-axis direction with gears or the like. Furthermore, the tilt adjustment mechanism is not limited to the configuration shown in FIG. 10, and a two-dimensional scan mirror may be used if a mirror is employed.
FIG. 11 is a block diagram illustrating the functional configuration of the light emission angle control unit 240. As shown in FIG. 11, the light emission angle control unit 240 includes a reference information storage unit 241, a distance information acquisition unit 242, an inclination calculation unit 243, an adjustment completion determination unit 244, and an adjustment completion notification unit 245.
FIG. 12 is a diagram illustrating the reference information stored in the reference information storage unit 241. FIG. 13 is a diagram illustrating a reference installation mode of the image projection apparatus 1. The reference information includes the distance r, the height H, and the angle θ obtained when the image projection apparatus 1 is installed such that the angle formed between the image projection apparatus 1 and the installation surface 102 is 90°, as shown in FIG. 13(a). Hereinafter, the installation position (installation state) of the image projection apparatus 1 shown in FIG. 13(a) is referred to as the "reference position" (reference installation state).
As shown in FIG. 13(b), the distance r is the straight-line distance from the distance sensor 500 to the video screen 101 measured by the distance sensor 500. That is, the distance sensor 500 functions as a projection surface distance measurement unit that measures the projection surface distance, which is the distance from the distance sensor 500 to the projection surface.
The height H is the height from the bottom surface of the image projection apparatus 1 to the distance sensor 500. That is, the height H is the measurement unit distance, which is the distance from the bottom surface of the image projection apparatus 1 to the projection surface distance measurement unit. The angle θ is the angle formed between the image projection apparatus 1 and the emission direction of the light emitted from the distance sensor 500 for distance measurement. That is, the angle θ is obtained from the distance r and the height H by the following equation (1).
 
θ = cos⁻¹(H / r)   (1)
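As an illustrative check with assumed values (not taken from the specification): if the distance sensor 500 sits H = 0.30 m above the bottom surface and measures a straight-line distance of r = 0.60 m to the video screen 101, equation (1) gives θ = cos⁻¹(0.30/0.60) = 60°.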
The distance sensor 500 in the present embodiment receives the reflected light of the emitted light to measure the distance to the video screen 101, but other methods may be used as long as the distance to the video screen 101 can be measured. In the present embodiment, the case where the angle θ is calculated in advance and stored in the reference information storage unit 241 is described as an example; however, the angle θ need not be stored in the reference information storage unit 241 and may instead be obtained from the distance r and the height H as needed.
FIG. 14 is a diagram illustrating an installation mode of the image projection apparatus 1. In the case shown in FIG. 14(a), the image projection apparatus 1 is installed at an inclination with respect to the reference image projection apparatus 1 indicated by the dotted line. As shown in FIG. 14(b), the straight-line distance from the distance sensor 500 to the video screen 101 in the installation mode of FIG. 14(a) is denoted r′, the height from the bottom surface of the image projection apparatus 1 to the distance sensor 500 is denoted H′, and the angle formed between the image projection apparatus 1 and the emission direction of the light from the distance sensor 500 is denoted θ′.
In the installation mode shown in FIG. 14(a), the distance information acquisition unit 242 acquires the straight-line distance r′ from the distance sensor 500 to the video screen 101 measured by the distance sensor 500 and outputs it to the inclination calculation unit 243. The inclination calculation unit 243 calculates the amount of inclination based on the distance r′ input from the distance information acquisition unit 242 and the reference information stored in the reference information storage unit 241. As shown in FIG. 14(b), the amount of inclination is the amount of change Δθ (= θ′ − θ) from the angle θ in the reference installation mode to the angle θ′.
From the relationship shown in FIG. 14(b), the following equation (2) holds.

θ′ = cos⁻¹(H′ / r′)   (2)
From equations (1) and (2), Δθ is obtained by the following equation (3).

Δθ = θ′ − θ = cos⁻¹(H′ / r′) − cos⁻¹(H / r)   (3)
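A minimal Python sketch of the tilt calculation performed by the inclination calculation unit 243, assuming the reconstructed forms of equations (1) to (3) above and treating H′ as equal to the stored height H because the position of the distance sensor 500 within the housing does not change; the function name and the example values are illustrative assumptions.

import math

def tilt_amount(r_reference, h_reference, r_measured):
    theta_reference = math.acos(h_reference / r_reference)  # equation (1)
    theta_measured = math.acos(h_reference / r_measured)    # equation (2), with H' = H
    return theta_measured - theta_reference                 # equation (3)

# Example with assumed values: H = 0.30 m, reference r = 0.60 m, measured r' = 0.65 m
delta_theta = tilt_amount(0.60, 0.30, 0.65)
print(math.degrees(delta_theta))  # positive when the measured distance is longer than the reference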
The inclination calculation unit 243 outputs the calculated Δθ to the adjustment completion determination unit 244. Based on the Δθ input from the inclination calculation unit 243, the adjustment completion determination unit 244 determines whether the tilt adjustment by the tilt adjustment mechanism has been completed. Specifically, the adjustment completion determination unit 244 refers to the adjustment amount monitored by the tilt adjustment mechanism and determines whether the light detection sensor 200 has been adjusted so that the laser light from the light detection sensor 200 of the image projection apparatus 1, which is inclined by Δθ from the reference installation mode, scans parallel to the installation surface 102.
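A sketch of how the monitored adjustment amount could be compared with the calculated tilt Δθ; the specification only states that the two quantities are compared, so the sign convention and the tolerance below are assumptions.

def tilt_adjustment_complete(delta_theta_rad, monitored_adjustment_rad,
                             tolerance_rad=0.005):
    # Regarded as complete when the mechanical adjustment cancels the device
    # tilt, i.e. the laser light again scans parallel to the installation
    # surface 102 within the allowed tolerance.
    return abs(monitored_adjustment_rad - delta_theta_rad) <= tolerance_rad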
When the adjustment completion determination unit 244 determines that the tilt adjustment has been completed, it outputs the determination result to the adjustment completion notification unit 245. Upon receiving the determination result from the adjustment completion determination unit 244, the adjustment completion notification unit 245 notifies the operator operating the tilt adjustment mechanism, in a manner the operator can recognize, that the adjustment has been completed.
Specifically, for example, the adjustment completion notification unit 245 outputs an operation signal to the external device 6 so that the characters "OK" are displayed on the video screen 101. The adjustment completion notification unit 245 may also output an operation signal to the external device 6 so that the characters "NG" are displayed before the adjustment is completed. Alternatively, the adjustment completion notification unit 245 may output a sound indicating the completion of the adjustment from a speaker, or may display on the video screen 101 a gauge indicating the remaining adjustment amount until completion. With such notifications, when the tilt adjustment unit 231 is a manual adjustment ring, screw, or the like, the operator adjusting the tilt can grasp the timing at which the adjustment is complete.
As described above, in the present embodiment, the tilt of the light detection sensor 200 is adjusted according to the inclination of the image projection apparatus 1 with respect to the installation surface 102. With this configuration, even when the image projection apparatus 1 is installed at an inclination with respect to the installation surface 102, the laser light emitted from the light detection sensor 200 can be controlled to scan parallel to the video screen 101. Therefore, the timing at which the operator operates the video screen 101 and the timing at which the screen reacts to the operation coincide, and the operability of the projected video screen can be improved.
In the above embodiment, the case where the tilt of the light detection sensor 200 is adjusted by the tilt adjustment mechanism shown in FIG. 10 in order to adjust the emission angle of the laser light for detecting the operator's finger 105 has been described as an example. In addition to the tilt adjustment of the light detection sensor 200, or instead of it, the individual units constituting the light detection sensor 200 may be adjusted.
FIG. 15 is a diagram illustrating a manner of adjusting the individual units constituting the light detection sensor 200. As shown in FIG. 15, the light detection sensor 200 includes a tilt adjustment mirror 205 in addition to the light emitting unit 202 and the mirror unit 210 described above. The laser light emitted from the light emitting unit 202 is reflected by the mirror unit 210 and the tilt adjustment mirror 205 and emitted to the outside of the image projection apparatus 1. In the configuration shown in FIG. 15, the operator adjusts the tilt of the tilt adjustment mirror 205 with a tilt adjustment unit (not shown) provided for that purpose.
The light emission angle control unit 240 notifies whether the tilt adjustment of the tilt adjustment mirror 205 has been completed based on the tilt Δθ of the image projection apparatus 1. With this configuration as well, the laser light emitted from the light detection sensor 200 can be controlled to scan parallel to the video screen 101, so the operability of the projected video screen can be improved.
In the present embodiment, the light emitting unit 202 or the mirror unit 210 may be adjusted, or the emission angle of the laser light from the light detection sensor 200 may be controlled by adjusting a combination of these components, including the tilt adjustment mirror 205.
In the above embodiment, the adjustment of the laser light emission angle according to the inclination of the image projection apparatus 1 with respect to the installation surface 102 using one distance sensor 500 has been described as an example. In addition, by using two distance sensors 500, the emission angle of the laser light can also be adjusted according to the inclination of the image projection apparatus 1 in the horizontal direction. In this case, the light emission angle control unit 240 measures the elevation/depression angle at two points in the horizontal direction and controls the emission angle of the laser light according to each angle.
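The specification does not detail how the two readings are combined. One hedged possibility, sketched below, is to evaluate the tilt independently for each sensor and use the mean as the front-to-back tilt and the difference as an indicator of sideways (horizontal) tilt; the whole decomposition, including all names, is an illustrative assumption.

import math

def tilt_for_sensor(r_reference, h, r_measured):
    # Same relation as equations (1)-(3), evaluated per sensor.
    return math.acos(h / r_measured) - math.acos(h / r_reference)

def two_sensor_tilt(r_ref_left, r_ref_right, h, r_left, r_right):
    d_left = tilt_for_sensor(r_ref_left, h, r_left)
    d_right = tilt_for_sensor(r_ref_right, h, r_right)
    pitch = (d_left + d_right) / 2.0   # tilt toward or away from the screen
    roll_indicator = d_left - d_right  # nonzero when the device leans sideways
    return pitch, roll_indicator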
The distance sensor 500 is desirably installed at a position as far as possible from the surface onto which the video screen 101 is projected (the installation surface 102 in the case of projecting onto a desk), and it desirably emits its light so that the angle θ shown in FIG. 13(b) and elsewhere becomes as small as possible. When the angle θ is small, the component of the emitted light perpendicular to the installation surface 102 becomes larger and the reflected light becomes stronger, so the distance sensor 500 can more easily receive the reflected light and measure the distance.
For example, the distance sensor 500 is installed at a position close to the projection lens 404, which is located as far as possible from the surface onto which the video screen 101 is projected (for example, in the image projection apparatus 1 shown in FIG. 1, at a position adjacent to the lower side of the focus ring 103). That is, the distance sensor 500 is provided at a position adjacent to the projection lens 404 rather than to the projection surface.
In the above embodiment, the case where the inclination calculation unit 243 calculates the inclination using the distance sensor 500 has been described as an example. However, this is only an example, and when the installation surface 102 can be assumed to be parallel to the floor, the inclination calculation unit 243 may calculate the inclination using a gyro sensor.
In the above embodiment, the case where the tilt adjustment unit 231 is provided on the upper side of the light detection sensor 200 (that is, on the side opposite to the video screen 101 across the light emitted from the light detection sensor 200) has been described as an example. If the tilt adjustment unit 231 were provided on the lower side of the light detection sensor 200 (that is, on the video screen 101 side across the light emitted from the light detection sensor 200), the tilt adjustment unit 231 might block the light emitted from the light detection sensor 200. For this reason, the tilt adjustment unit 231 is desirably provided on the upper side of the light detection sensor 200.
 However, this is only an example. As long as the tilt adjustment unit 231 can adjust the tilt of the light detection sensor 200 without blocking the light emitted from it, the tilt adjustment unit 231 may, for example, be provided below the light detection sensor 200, and it may act on the light detection sensor 200 not only in the elevation/depression direction but also in the azimuth direction.
 In the embodiment described above, the tilt adjustment unit 231 is moved manually. In this case, because the tilt adjustment unit 231 is an adjustment ring, a screw, or the like as described above, it is provided on the outside of the image projection device 1. With this configuration there is no need to mount a tilt-adjustment motor or similar mechanism inside the housing, so the device can be made smaller. This configuration is not essential, however, and the tilt adjustment unit 231 may instead be driven electrically. In that case, the tilt adjustment unit 231 is operated by, for example, supplying a stepping motor with an electric signal corresponding to the tilt calculated by the tilt calculation unit 243.
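 As a rough sketch of the motorized variant only (Python; the step resolution, function names, and sign convention are assumptions, not part of the specification), the calculated tilt could be turned into a stepping-motor command as follows.

def tilt_to_steps(tilt_deg: float, deg_per_step: float = 0.1) -> int:
    """Convert a calculated tilt angle into a signed number of motor steps."""
    # A positive tilt (elevation) is compensated by turning the adjustment
    # shaft one way; a negative tilt (depression) by turning it the other way.
    return round(tilt_deg / deg_per_step)

def drive_stepper(steps: int) -> None:
    """Stand-in for the electric signal actually sent to the stepping motor."""
    direction = "clockwise" if steps >= 0 else "counter-clockwise"
    print(f"rotate {abs(steps)} steps {direction}")

# Example: a 1.5-degree elevation tilt reported by the tilt calculation unit 243
drive_stepper(tilt_to_steps(1.5))  # rotate 15 steps clockwise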
 Since the distance sensor 500 can measure the distance to the projection surface, it can be used not only to calculate the tilt of the image projection device 1 but also to adjust the image projected onto the projection surface based on the measured distance.
 [Example 2]
 Example 2 of the present invention will be described below. In Example 1, the light emission angle control unit 240 controls the emission angle of the laser light from the light detection sensor 200 using the distance sensor 500. In Example 2, the light emission angle control unit 240 controls the emission angle of the laser light from the light detection sensor 200 using a camera. Only the configuration that differs from Example 1 is described below; descriptions of the same configuration are omitted.
 FIG. 16 is a block diagram illustrating the functional configuration of the light emission angle control unit 240 in Example 2. As shown in FIG. 16, the light emission angle control unit 240 in Example 2 has a configuration in which the distance information acquisition unit 242 and the tilt calculation unit 243 of Example 1 are replaced by a pixel count acquisition unit 246, a tilt direction determination unit 247, and an adjustment instruction unit 248.
 The image projection device 1 according to Example 2 also includes a camera 510 that functions as a shape recognition device for recognizing the shape of the video screen 101. FIG. 17 illustrates shapes of the video screen 101 as recognized by the camera 510. FIG. 17(a) illustrates the shape of the video screen 101 in the reference installation state of the image projection device 1 shown in FIG. 13(a).
 The reference information storage unit 241 in Example 2 stores, as reference information, the length (in pixels) A1 of the horizontal side of the video screen 101 shown in FIG. 17(a) that is closer to the camera 510 (hereinafter, "side A") and the length B1 of the other horizontal side (hereinafter, "side B").
 FIG. 17(b) illustrates the shape of the video screen 101 when the image projection device 1 is installed tilted in the elevation direction with respect to the installation surface 102 (for example, in the installation state shown in FIG. 4(a)). As shown in FIG. 17(b), the length A2 of side A of the video screen 101 is longer than the length A1 of side A shown in FIG. 17(a), while the length B2 is shorter than B1. That is, when the image projection device 1 is installed tilted in the elevation direction with respect to the installation surface, the relationship B2/A2 < B1/A1 holds.
 FIG. 17(c) illustrates the shape of the video screen 101 when the image projection device 1 is installed tilted in the depression direction with respect to the installation surface 102 (for example, in the installation state shown in FIG. 4(b)). As shown in FIG. 17(c), the length A3 of side A of the video screen 101 is shorter than the length A1 of side A shown in FIG. 17(a), while the length B3 is longer than B1. That is, when the image projection device 1 is installed tilted in the depression direction with respect to the installation surface, the relationship B3/A3 > B1/A1 holds.
 The pixel count acquisition unit 246 acquires the number of pixels (length) of side A and the number of pixels (length) of side B from the video screen 101 recognized by the camera 510 and outputs them to the tilt direction determination unit 247. Based on the pixel counts of sides A and B received from the pixel count acquisition unit 246 and the reference information (pixel counts A1 and B1) stored in the reference information storage unit 241, the tilt direction determination unit 247 determines whether the image projection device 1 is tilted in the elevation direction or in the depression direction with respect to the installation surface 102.
 Specifically, the tilt direction determination unit 247 determines the tilt direction of the image projection device 1 from the relationship, described above with reference to FIG. 17, between the reference pixel counts A1 and B1 and the acquired pixel counts (for example, A2 and B2), and outputs the determination result to the adjustment instruction unit 248.
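 The determination described above can be pictured with the following Python sketch; the tolerance value and function name are illustrative assumptions, since the specification itself only states the inequality relations.

def tilt_direction(a_px: int, b_px: int, a1_px: int, b1_px: int,
                   tolerance: float = 0.01) -> str:
    """Judge the tilt direction from the side lengths (in pixels) of the video screen.

    a_px, b_px:   measured lengths of side A (near the camera) and side B
    a1_px, b1_px: reference lengths stored in the reference information storage unit 241
    """
    ratio = b_px / a_px
    ref = b1_px / a1_px
    if ratio < ref - tolerance:
        return "elevation"   # B/A smaller than the reference -> tilted upward
    if ratio > ref + tolerance:
        return "depression"  # B/A larger than the reference -> tilted downward
    return "level"

# Example with hypothetical pixel counts
print(tilt_direction(a_px=1020, b_px=880, a1_px=1000, b1_px=940))  # elevation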
 Based on the determination result, the adjustment instruction unit 248 instructs the operator how to make the adjustment with the tilt adjustment mechanism. Specifically, for example, the adjustment instruction unit 248 outputs to the external device 6 an operation signal that causes the video screen 101 to display in which direction the rotation shaft 234 shown in FIG. 10 should be turned. For example, when the determination result indicates a tilt in the elevation direction, the adjustment instruction unit 248 instructs the operator to turn the rotation shaft 234 clockwise.
 While the tilt of the light detection sensor 200 is being adjusted according to the instruction from the adjustment instruction unit 248, the pixel count acquisition unit 246 acquires the pixel counts of sides A and B from the video screen 101 recognized by the camera 510 and outputs them to the adjustment completion determination unit 244. When the pixel count of each side received from the pixel count acquisition unit 246 matches the pixel count of that side stored in the reference information storage unit 241 (or when the difference in pixel counts falls within a predetermined value), the adjustment completion determination unit 244 notifies the adjustment completion notification unit 245 that the adjustment is complete.
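 The completion check can likewise be sketched as a simple tolerance comparison (Python; the 2-pixel tolerance is an assumed value, the specification only speaks of a predetermined difference).

def adjustment_complete(a_px: int, b_px: int, a1_px: int, b1_px: int,
                        max_diff_px: int = 2) -> bool:
    """True when both measured side lengths match the stored reference
    within a predetermined number of pixels."""
    return abs(a_px - a1_px) <= max_diff_px and abs(b_px - b1_px) <= max_diff_px

print(adjustment_complete(1001, 939, 1000, 940))  # True: both sides within 2 px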
 As described above, in Example 2 the tilt of the light detection sensor 200 is adjusted according to the shape of the video screen 101 recognized by the camera 510. With this configuration, even when the image projection device 1 is installed tilted with respect to the installation surface 102, the laser light emitted from the light detection sensor 200 can be controlled to scan parallel to the video screen 101. As a result, the timing at which the operator touches the video screen 101 coincides with the timing at which the scan responds, and the operability of the projected video screen 101 is improved.
 In Example 2 above, the light emission angle control unit 240 was described as determining the tilt in the elevation/depression direction based on the lengths of the horizontal sides A and B of the video screen 101 and controlling the laser beam emission angle according to the determination result. In addition, the light emission angle control unit 240 may determine the tilt in the azimuth direction based on the lengths of the vertical sides C and D of the video screen 101 and control the laser beam emission angle according to that determination as well. With such a configuration, the emission angle of the laser light can be controlled more precisely according to the tilt of the image projection device 1, further improving the operability of the projected video screen 101. The camera 510 may be installed at any position from which the shape of the video screen 101 can be recognized.
 [Example 3]
 Example 3 of the present invention will be described below. The tilt calculation unit 243 in Example 3 calculates the tilt of the image projection device 1 with respect to the installation surface 102 using the light detection sensor 200 instead of the distance sensor 500 of Example 1. Only the configuration that differs from Example 1 is described below; descriptions of the same configuration are omitted.
 FIG. 18 illustrates a manner in which the tilt is calculated using the light detection sensor 200. As shown in FIG. 18, the light detection sensor 200 in Example 3 is configured so that it can emit laser light while shifting it in the Y-axis direction. Such a configuration is realized, for example, by controlling the tilt of the light detection sensor 200 as described in Example 1 with reference to FIG. 10. Alternatively, it may be realized by adjusting the individual components of the light detection sensor 200 as described in Example 1 with reference to FIG. 15. In this case, the mirror unit 210 may be a two-dimensional scanning mirror.
 As shown in FIG. 18, when the laser light emitted while being shifted in the Y-axis direction strikes the installation surface 102 (indicated by a six-pointed star in FIG. 18), the light detection sensor 200 receives the reflected light. The tilt calculation unit 243 in Example 3 calculates the tilt based on the emission angle of the laser light at the moment the light detection sensor 200 receives the reflected light and on the distance information from the light detection sensor 200 to the installation surface 102. In other words, the light detection sensor 200 emits measurement light for measuring the projection surface distance, i.e., the distance from the light detection sensor 200 to the projection surface.
 Specifically, for example, the tilt calculation unit 243 uses as a reference the emission angle and the distance information obtained when the laser light emitted while being shifted in the Y-axis direction strikes the installation surface 102 at the reference position shown in FIG. 13(a), and calculates the deviation from this reference as the tilt.
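 One way to picture this calculation is the following Python sketch; the geometry and variable names are assumptions made for illustration, since the specification only states that the deviation of the emission angle and distance from the reference values is treated as the tilt.

import math

def tilt_from_scan(emit_angle_deg: float, distance_mm: float,
                   ref_angle_deg: float, ref_distance_mm: float) -> float:
    """Estimate the device tilt from the laser emission angle and measured distance
    at the moment the reflected light is received, relative to the reference values
    recorded in the installation state of FIG. 13(a)."""
    # Vertical drop from the sensor to the point where the beam meets the
    # installation surface, in the current and in the reference state.
    drop_now = distance_mm * math.sin(math.radians(emit_angle_deg))
    drop_ref = ref_distance_mm * math.sin(math.radians(ref_angle_deg))
    # The change in drop over roughly the reference horizontal reach is read
    # as the tilt of the device relative to the installation surface.
    reach = ref_distance_mm * math.cos(math.radians(ref_angle_deg))
    return math.degrees(math.atan2(drop_now - drop_ref, reach))

# Hypothetical values: reference hit at 5 deg / 300 mm, current hit at 7 deg / 305 mm
print(round(tilt_from_scan(7.0, 305.0, 5.0, 300.0), 1))  # about 2.1 degrees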
 FIG. 19 illustrates a sequence diagram of the light emission angle control process and the finger operation detection process. In Example 3, because the light detection sensor 200 itself is used for the tilt calculation, the light emission angle control process by the light emission angle control unit 240 and the detection of the finger 105 by the operation detection unit 300 are performed at different times.
 For example, as shown in FIG. 19, the light emission angle control unit 240 first calculates the tilt using the laser light of the light detection sensor 200. Next, the light emission angle control unit 240 controls the tilt adjustment of the light detection sensor 200 based on the calculated tilt. When the adjustment is complete, the operation detection unit 300 detects the operation of the operator's finger 105 using the laser light of the light detection sensor 200.
 With this configuration as well, as in Examples 1 and 2, the tilt of the light detection sensor 200 is adjusted according to the tilt of the image projection device 1 with respect to the installation surface 102, so the operability of the projected video screen 101 is improved. Moreover, in Example 3 there is no need to mount the distance sensor 500 or the camera 510 in the image projection device 1, so the device can be made smaller.
 The light emission angle control process in Example 3 is performed, for example, only once as an initial setting after the image projection device 1 is powered on. Alternatively, the light emission angle control process may be performed periodically at fixed intervals. In either case, while the light emission angle control process is being performed, the operation detection process for the finger 105 by the operation detection unit 300 is not performed.
 [Example 4]
 Example 4 of the present invention will be described below. In Example 4, the emission angle of the laser light is adjusted using a jig whose surface has a region with a higher light reflectance than the rest of the surface. FIG. 20 illustrates how the laser beam emission angle is adjusted using the jig 520. As shown in FIG. 20, when the laser light scans in the desired direction for detecting the position and movement of the finger 105 at the moment the operator touches the video screen 101, the reflectance of the laser light at the jig 520 is higher than when the laser light scans in any other direction. In other words, the jig 520 functions as a light reflecting portion whose reflectance differs depending on the position at which the light emitted from the light detection sensor 200 strikes it.
 When the operator adjusts the tilt of the light detection sensor 200 with the tilt adjustment unit 231, the direction of the laser light emitted from the light detection sensor 200 changes. The light emission angle control unit 240 calculates the reflectance of the laser light emitted from the light detection sensor 200 and determines that the tilt adjustment of the light detection sensor 200 is complete when the reflectance is equal to or greater than a predetermined value. When it determines that the adjustment is complete, the light emission angle control unit 240 notifies the operator of the adjustment state, for example that the tilt adjustment of the light detection sensor 200 has been completed, by video or sound, in the same manner as the adjustment completion notification unit 245 in Example 1.
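 A minimal sketch of this completion test (Python; the threshold and the way the reflectance is estimated are assumptions, not values given in the specification):

def reflectance(received_power: float, emitted_power: float) -> float:
    """Crude reflectance estimate: ratio of received to emitted laser power."""
    return received_power / emitted_power

def jig_adjustment_complete(received_power: float, emitted_power: float,
                            threshold: float = 0.6) -> bool:
    """Judge the tilt adjustment complete once the reflectance from the
    high-reflectance portion of the jig 520 reaches a predetermined value."""
    return reflectance(received_power, emitted_power) >= threshold

print(jig_adjustment_complete(received_power=0.72, emitted_power=1.0))  # True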
 With this configuration as well, as in Examples 1 and 2, the tilt of the light detection sensor 200 is adjusted according to the tilt of the image projection device 1 with respect to the installation surface 102, so the operability of the projected video screen 101 is improved.
 The position at which the jig 520 should be placed may be displayed on the video screen 101. Although the jig 520 in Example 4 is described as having a rectangular parallelepiped shape, it may instead be cylindrical so that the laser light is easily reflected at any position on the video screen 101. In Example 4, the tilt of the light detection sensor 200 was described as being adjusted according to the tilt of the image projection device 1 in the elevation/depression direction with respect to the installation surface 102, but the tilt of the light detection sensor 200 may additionally be adjusted according to the tilt in the azimuth direction. The jig 520 may also be designed so that it can be stored inside the image projection device 1; such a configuration makes it easy to adjust the tilt of the light detection sensor 200 at any time.
 In Examples 1 to 4 above, when an operation by the operator's finger 105 is detected, a sound output control unit (not shown) may additionally cause a sound to be emitted from the speaker of the image projection device 1, or the image projection unit 400 may change the display on the video screen 101. For example, when the operation detection unit 300 detects a touch operation on a volume adjustment button displayed on the video screen 101, the volume adjustment button is momentarily enlarged or reduced. In other words, the sound output control unit and the image projection unit 400 function as a detection notification unit that notifies the operator, in a manner the operator can recognize, that the reflected light from the finger 105 has been detected.
 In this way, when the image projection device 1 detects an operation performed by the operator, it notifies the operator that the operation has been detected by some means such as sound or display, making it easier for the operator to recognize the response to the operation. This is particularly helpful when the video screen 101 is displayed in space rather than on a desk or wall, because the operator has no tactile sensation of touching the video screen 101 and finds it difficult to tell whether the operation has been detected. With the configuration described above, even when the video screen 101 is displayed in space, the operator can more easily get the sense of having touched it, further improving the operability of the projected video screen 101.
 In Examples 1 to 4 above, the size of the video screen 101 is fixed, and the detection range of the finger 105 covered by the laser light emitted from the light detection sensor 200 is likewise fixed to match the size of the video screen 101. If the size of the video screen 101 can be changed, for example if it becomes larger than its initial size, the video screen 101 becomes larger than the detection range of the finger 105, and positions in the detection range no longer correspond to positions on the video screen 101.
 As a result, an operation at a position on the video screen 101 outside the detection range may not be detected, or an operation within the detection range may be misdetected as an operation at a different position. For example, if a 1 cm square button is displayed on the video screen 101 at its initial size, the button becomes 2 cm square when the size of the video screen 101 is doubled. In such a case, if the detection range remains at the initial size, the operation detection unit 300 does not detect the touch operation even though the 2 cm square button is being touched.
 Therefore, when the size of the video screen 101 can be changed, the image projection device 1 may change the detection range of the finger 105 to match the changed size. Specifically, for example, the position of a size adjustment ring (not shown) that adjusts the size of the video screen 101 is fed back electrically, and in response to this feedback the light emitting unit 202 adjusts its intensity and the amplification unit 311 adjusts its gain. Alternatively, when the size of the video screen 101 is adjusted electrically, the light emitting unit 202 may adjust its intensity and the amplification unit 311 may adjust its gain according to the electric signal used for the adjustment. In this way, the scanning range of the light emitted from the light detection sensor 200 is changed.
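 As an illustration only (Python; the linear and quadratic scaling factors are assumptions, the specification merely states that the intensity and the gain are adjusted), the feedback could be applied as follows.

def scaled_detection_settings(screen_scale: float,
                              base_intensity: float = 1.0,
                              base_gain: float = 1.0) -> dict:
    """Scale the emission intensity of the light emitting unit 202 and the gain
    of the amplification unit 311 with the projected screen size.

    screen_scale: ratio of the current screen size to the initial size
                  (e.g. 2.0 when the screen is twice its initial size)
    """
    # A larger screen means longer light paths and weaker returns, so both the
    # emitted intensity and the receiver gain are raised; the factors below are
    # illustrative only.
    return {
        "intensity": base_intensity * screen_scale,
        "gain": base_gain * screen_scale ** 2,
    }

print(scaled_detection_settings(2.0))  # {'intensity': 2.0, 'gain': 4.0}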
 With this configuration, even when the size of the video screen 101 is changed, no mismatch arises between the operation position on the video screen 101 and the detected position of the operating finger 105, so the operability of the projected video screen 101 is improved. Also, in Example 4 there is no need to mount the distance sensor 500 or the camera 510 in the image projection device 1, so the device can be made smaller.
 In Examples 1 to 4 above, the image projection device 1 that displays the video screen 101 on a desk was described as an example. However, this is only an example; the image projection device 1 may be any projector, such as an ultra-short-throw projector, in which the light from the light detection sensor 200 can scan very close to the projection surface. For example, the image projection device 1 may be installed near a wall or suspended from a ceiling to display the video screen 101 on the wall, or it may display the video screen 101 in space.
 In Examples 1 to 4 above, the case in which the operator operates the video screen 101 with a finger 105 was described as an example, but this is only an example; any means capable of operating on the video screen 101 may be used, such as a touch pen or a pointing stick instead of the finger 105.
 In Examples 1 to 4 above, the light emission angle control unit 240 was described as controlling the emission angle of the laser light. In addition, the light emission angle control unit 240 may control the height of the laser light. FIG. 21 illustrates an installation state of the image projection device 1 in which the height of the laser light is controlled. As shown in FIG. 21, part of the image projection device 1 rests on a step one level higher than the installation surface 102, so controlling only the emission angle of the laser light may leave the vertical distance between the video screen 101 and the laser light larger than desired. Conversely, when the installation surface 102 is higher than the bottom surface of the image projection device 1, the laser light may be unable to scan parallel to the installation surface 102.
 In such a case, the light emission angle control unit 240 also controls the height of the laser light. Specifically, for example, the light emission angle control unit 240 uses the distance sensor 500 to calculate the difference from the reference height and controls the height of the laser light emitted from the light detection sensor 200 based on the calculated height difference.
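 A sketch of the height correction (Python; the names and the sign convention are assumptions made for illustration):

def laser_height_offset_mm(measured_distance_mm: float,
                           reference_distance_mm: float) -> float:
    """Difference between the measured and the reference sensor-to-surface distance.

    A positive value (the device sitting higher than in the reference state,
    e.g. partly on a step) means the detection light should be emitted that
    much lower so that it again travels just above the installation surface 102.
    """
    return measured_distance_mm - reference_distance_mm

print(laser_height_offset_mm(120.0, 100.0))  # 20.0 -> emit the beam 20 mm lower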
 The present invention is not limited to the examples described above and includes various modifications. For example, the examples above have been described in detail to explain the present invention clearly, and the invention is not necessarily limited to configurations having all of the described components. Part of the configuration of one example can be replaced with the configuration of another example, and the configuration of one example can be added to the configuration of another example. Furthermore, for part of the configuration of each example, other configurations can be added, removed, or substituted.
 Each of the above configurations, functions, processing units, and the like may be realized partly or wholly in hardware, for example by designing them as integrated circuits. Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing a program that implements the respective function. Information such as the programs, tables, and files that implement each function can be stored in a storage medium, in storage, or the like. The control lines and information lines shown are those considered necessary for the explanation; not all control lines and information lines of an actual product are necessarily shown. In practice, almost all components may be considered to be interconnected.
DESCRIPTION OF SYMBOLS
1 image projection device
200 light detection sensor
240 light emission angle control unit
241 reference information storage unit
242 distance information acquisition unit
243 tilt calculation unit
244 adjustment completion determination unit
245 adjustment completion notification unit
300 operation detection unit
400 image projection unit
500 distance sensor
510 camera
520 jig

Claims (13)

  1.  An image projection device that projects an image generated based on an image signal onto a projection surface, the image projection device comprising:
     a light detection unit that emits detection light for detecting an object on a video screen projected onto the projection surface and detects reflected light of the detection light from the object on the video screen;
     an operation signal generation unit that generates an operation signal for the video screen based on a detection signal of the reflected light; and
     a light emission angle control unit that controls an emission angle of the detection light.
  2.  The image projection device according to claim 1, wherein the light emission angle control unit controls the emission angle of the detection light so that the emission direction of the detection light approaches being parallel to the projection surface, according to an installation state of the image projection device with respect to the projection surface.
  3.  The image projection device according to claim 1, further comprising a tilt adjustment unit that adjusts a tilt of the light detection unit, wherein
     the light detection unit emits the detection light whose emission angle has been adjusted through the tilt adjustment by the tilt adjustment unit, and
     the light emission angle control unit determines completion of the tilt adjustment of the light detection unit based on an adjustment amount by the tilt adjustment unit.
  4.  The image projection device according to claim 3, wherein the tilt adjustment unit is provided on an outside of the image projection device so as to be manually operable.
  5.  The image projection device according to claim 3, wherein the light emission angle control unit
     calculates an amount of tilt of the image projection device with respect to the projection surface from a reference installation state, which is a predetermined installation state of the image projection device, and
     determines completion of the tilt adjustment of the light detection unit based on the calculated amount of tilt and the adjustment amount by the tilt adjustment unit.
  6.  The image projection device according to claim 5, further comprising a projection surface distance measurement unit, which is a measurement unit for measuring a distance and measures a projection surface distance that is a distance from the measurement unit to the projection surface, wherein
     the light emission angle control unit calculates the amount of tilt based on the measured projection surface distance and a measurement unit distance, which is a distance from a bottom surface of the image projection device to the projection surface distance measurement unit, and on the projection surface distance and the measurement unit distance in the reference installation state.
  7.  The image projection device according to claim 6, further comprising a projection lens that enlarges the image generated from the image signal, wherein
     the projection surface distance measurement unit is provided at a position adjacent to the projection lens rather than to the projection surface.
  8.  The image projection device according to claim 5, wherein
     the light detection unit emits measurement light for measuring a projection surface distance, which is a distance from the light detection unit to the projection surface, and
     the amount of tilt is calculated based on the measured projection surface distance and the projection surface distance in the reference installation state.
  9.  The image projection device according to claim 3, further comprising a shape recognition device that recognizes a shape of the video screen, wherein
     the light emission angle control unit determines completion of the tilt adjustment of the light detection unit based on the shape of the video screen in a reference installation state, which is a predetermined installation state of the image projection device, the recognized shape, and the adjustment amount by the tilt adjustment unit.
  10.  The image projection device according to claim 3, wherein
     the light detection unit receives reflected light from a light reflecting portion whose reflectance differs depending on a position at which the detection light strikes it, and
     the light emission angle control unit determines completion of the tilt adjustment of the light detection unit based on the reflectance of the reflected light from the light reflecting portion and the adjustment amount by the tilt adjustment unit.
  11.  The image projection device according to claim 1, further comprising a detection notification unit that notifies, in a manner recognizable by an operator, that the reflected light from the object has been detected.
  12.  The image projection device according to claim 1, wherein the light detection unit changes a scanning range of the detection light according to a size of the video screen.
  13.  An image projection device that projects an image generated based on an image signal onto a projection surface, the image projection device comprising:
     a light detection unit that emits detection light for detecting an object on a video screen projected onto the projection surface and detects reflected light of the detection light from the object on the video screen;
     an operation signal generation unit that generates an operation signal for the video screen based on a detection signal of the reflected light; and
     a light emission angle adjustment unit capable of manually adjusting an emission angle of the detection light.
PCT/JP2014/077071 2014-10-09 2014-10-09 Image projection device WO2016056102A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/077071 WO2016056102A1 (en) 2014-10-09 2014-10-09 Image projection device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/077071 WO2016056102A1 (en) 2014-10-09 2014-10-09 Image projection device

Publications (1)

Publication Number Publication Date
WO2016056102A1 true WO2016056102A1 (en) 2016-04-14

Family

ID=55652758

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/077071 WO2016056102A1 (en) 2014-10-09 2014-10-09 Image projection device

Country Status (1)

Country Link
WO (1) WO2016056102A1 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0822374A (en) * 1994-07-07 1996-01-23 Canon Inc Information processor
JP2002032195A (en) * 2000-07-19 2002-01-31 Fujitsu General Ltd Optical scanning-type touch panel and method for adjusting optical axis thereof
JP2003091358A (en) * 2001-09-19 2003-03-28 Ricoh Co Ltd Coordinate input device
JP2009258569A (en) * 2008-04-21 2009-11-05 Ricoh Co Ltd Electronic device
JP2014132435A (en) * 2012-12-07 2014-07-17 Ricoh Co Ltd Coordinate detector and electronic information board system

Similar Documents

Publication Publication Date Title
JP5277703B2 (en) Electronics
US9740338B2 (en) System and methods for providing a three-dimensional touch screen
US9753192B2 (en) Reflector, adjustment method, and position detection apparatus
EP2237563A2 (en) Device and method for displaying an image
JP2010244484A (en) Image display device, image display method and image display program
JP6134804B2 (en) Video projection device
TW201106707A (en) Projector
JP5974189B2 (en) Projection-type image display apparatus and projection-type image display method
US10168897B2 (en) Touch input association
US10114512B2 (en) Projection system manager
WO2016035231A1 (en) User interface device and projector device
CN107077195B (en) Display object indicator
WO2017060943A1 (en) Optical ranging device and image projection apparatus
WO2013111376A1 (en) Interface device and method for driving interface device
JP2018164251A (en) Image display device and method for controlling the same
US11431959B2 (en) Object capture and illumination
US20150279336A1 (en) Bidirectional display method and bidirectional display device
US20150185321A1 (en) Image Display Device
US9946333B2 (en) Interactive image projection
WO2016056102A1 (en) Image projection device
JP6349886B2 (en) Image projection apparatus, control method for image projection apparatus, and control program for image projection apparatus
US20180113567A1 (en) Projector, projection system and image projection method
JP5713401B2 (en) User interface device for generating projected image signal for pointer projection, image projection method and program
US20170285874A1 (en) Capture and projection of an object image
JP2019213168A (en) Projection apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14903630

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 14903630

Country of ref document: EP

Kind code of ref document: A1