WO2016031038A1 - Video display system and projection-type video display device - Google Patents

Video display system and projection-type video display device

Info

Publication number
WO2016031038A1
Authority
WO
WIPO (PCT)
Prior art keywords
projection
display device
image
unit
sensor
Application number
PCT/JP2014/072696
Other languages
French (fr)
Japanese (ja)
Inventor
俊彦 松澤
公舟 市川
信二 小野寺
宣孝 堀田
Original Assignee
日立マクセル株式会社 (Hitachi Maxell, Ltd.)
Application filed by 日立マクセル株式会社 (Hitachi Maxell, Ltd.)
Priority to PCT/JP2014/072696
Publication of WO2016031038A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 Details of television systems
    • H04N5/74 Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • The present invention relates to a projection-type image display device having an interactive function, and to technology related thereto.
  • Patent Document 1 discloses a method of scanning infrared laser light in a plane parallel to the image projection plane and detecting the light reflected back by an operation article. The position of the operation article in the scanning plane is obtained from the time difference between emission and detection of the laser beam.
  • It is also conceivable to control the projection video display device from the information processing device side by connecting the projection video display device to an information processing device such as a personal computer to form a video display system.
  • In the conventional configuration, however, the interactive function described above is performed entirely on the projection type video display device side. For example, calibration data relating to the detection position of the operation article and information relating to obstacles cannot be handled on the information processing apparatus side, which hinders efficient control.
  • In addition, a projection type image display device having the conventional interactive function does not sufficiently support preparation for installation or use.
  • An object of the present invention is to enable a projection type video display device and video display system having an interactive function to be prepared for installation or use more suitably.
  • The present invention relates, for example, to a video display system including a projection video display device that displays video and an information processing device.
  • The projection video display device includes: a projection optical system that projects a display video onto a projection surface; a sensor capable of imaging light emitted or reflected by an operation article that has touched the projection surface; a first interactive function unit that obtains the position of the operation article based on the image captured by the sensor and controls the display of video according to that position; and a first communication unit capable of transmitting and receiving information to and from the information processing device.
  • The information processing device includes: a second communication unit capable of transmitting and receiving information to and from the projection video display device; and a second interactive function unit that calculates the position of the operation article from the image captured by the sensor, or from information generated based on that image, received from the projection video display device, and that can control the video display of the projection video display device according to that position.
  • The projection video display device can display a menu screen for selecting whether the interactive function for the displayed video is performed in a first operation mode operated by the first interactive function unit of the projection video display device, or in a second operation mode operated by the second interactive function unit of the information processing apparatus.
  • The present invention also provides, for example, a projection-type image display device used in an image display system that includes a light-emitting unit that emits laser light and a laser light adjustment unit that blocks or reflects the laser light.
  • The projection type image display device includes a projection optical system that projects a display image onto a projection surface, and a sensor that can capture the laser light emitted from the light emitting unit and reflected by an operation article that has contacted the projection surface.
  • An interactive function unit obtains the position of the operation article based on the image captured by the sensor and controls the display of the image according to that position.
  • A captured image obtained by imaging the area including the projection surface with the sensor can be displayed on the projection surface by the projection optical system.
  • FIG. 1 is a configuration diagram showing a video display system in Embodiment 1.
  • FIG. 2 is a block diagram showing an internal configuration of the projection display apparatus 100.
  • A block diagram showing the internal configuration of the laser sheet generator.
  • FIG. 4 is a diagram illustrating an example of a captured image displayed on the screen 1.
  • FIG. 6 is a diagram illustrating an internal configuration of an information processing apparatus 400 according to a second embodiment.
  • A diagram showing an example of the mode selection screen for the interactive function. An explanatory diagram for the case where the calibration operation is performed automatically, and an explanatory diagram for the case where it is performed manually.
  • FIG. 1 is a configuration diagram illustrating a video display system according to the first embodiment.
  • The video display system is configured by connecting a video output device 200, a laser sheet generator 300, and an information processing device 400 to a projection video display device (projector) 100.
  • The connection between the devices may be wired or wireless.
  • Projection type image display device (projector) 100 projects and displays an image on screen 1 which is a projection surface.
  • The display screen 20 of the screen 1 includes, in addition to a video area 21 for displaying an image, an operation icon area 22 for displaying operation icons used by the interactive function.
  • The operation icons include, for example, “mouse operation”, “line drawing” and “eraser” as drawing functions, and an icon for calling up the operation menu of the main body of the projection display apparatus 100.
  • The operation menu includes a function of switching the video display content of the projection type video display device, a function of adjusting other operations (such as audio output) of the projection type video display device, and the like.
  • The user draws lines 23 and 24 by bringing the light-emitting pen 2 or the finger 3, serving as the operation article, into contact with the screen 1 (video area 21).
  • The video output device 200 outputs video data to be displayed in the video area 21 to the projection video display device 100.
  • As the video output device 200, various video devices such as a personal computer or a DVD player can be used.
  • A mobile terminal such as a tablet computer or a smartphone may also be used.
  • The connection between the video output device 200 and the projection video display device 100 may be wired or wireless, but a wireless communication connection is particularly preferable when the video output device 200 is a tablet computer or a mobile terminal such as a smartphone.
  • The laser sheet generator 300 irradiates laser light in a non-visible light band in a plane near the display surface of the screen 1.
  • The projection display apparatus 100 detects the user's operation (contact position) by detecting the laser light reflected back by the operation article (finger 3), or the light emitted by the light-emitting pen 2.
  • Laser light in the non-visible light band does not hinder the viewer from viewing the displayed image.
  • Infrared laser light is particularly desirable among laser light in the non-visible light band.
  • The information processing apparatus 400 is, for example, a personal computer, a tablet computer, or a mobile terminal such as a smartphone, and exchanges various control signals, including those for the interactive function, with the projection display apparatus 100. The display operation of the projection display apparatus 100 can thereby be controlled from the information processing apparatus 400 side.
  • The information processing apparatus 400 and the video output device 200 may be the same apparatus.
  • FIG. 2 is a diagram for explaining the method of detecting the contact of a user's finger.
  • (a) is a view of the screen from the side, and (b) is a view of the screen from the front.
  • The screen 1 is attached to the wall 9, and the projection type image display device 100 is installed above it.
  • The projection-type image display device 100 projects an image obliquely from the projection optical system 101 toward the screen 1 and displays it on the display screen 20.
  • The projection type image display apparatus 100 is provided with a sensor 150, such as an infrared camera, that images the screen 1 and detects laser light reflected (scattered) by the user's finger 3.
  • The laser sheet generator 300 is installed at the upper side of the screen 1, below the projection display apparatus 100, and irradiates infrared laser light in a plane parallel and close to the display surface of the screen 1.
  • The laser light irradiation area has the shape of a sheet with a thin cross section and is placed close to the screen 1 so that the gap d is several mm or less.
  • In the figure the irradiation area is shown in gray; hereinafter, the irradiation area of the laser light is referred to as the “laser sheet” 350.
  • The sensor 150 of the projection display apparatus 100 acquires a captured image including the reflected light 30.
  • The projection display apparatus 100 analyzes the captured image acquired by the sensor 150 to determine the contact position of the finger 3, determines the operation content based on the position of the user's finger, and executes interactive functions such as drawing display.
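  • The patent text does not specify how the captured image is analyzed; the following Python sketch (the function name, threshold, and frame size are illustrative assumptions) shows one minimal way a contact position could be extracted from an infrared frame: threshold the image and take the centroid of the bright pixels left by the scattered laser light.

```python
import numpy as np

def find_touch_position(ir_frame, threshold=200):
    """Locate the brightest reflection blob in an infrared sensor frame.

    ir_frame: 2-D uint8 array from the infrared camera (sensor 150).
    Returns the (row, col) centroid of pixels at or above threshold,
    or None if no reflection is present.
    """
    mask = ir_frame >= threshold          # pixels lit by scattered laser light
    if not mask.any():
        return None                       # nothing is touching the laser sheet
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# A finger touching the screen produces a small bright spot in the frame:
frame = np.zeros((480, 640), dtype=np.uint8)
frame[100:104, 200:204] = 255             # simulated reflection from a fingertip
print(find_touch_position(frame))         # → (101.5, 201.5)
```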
  • FIG. 3 is a block diagram showing the internal configuration of the projection display apparatus 100.
  • The projection optical system 101 is an optical system that projects an image onto the screen 1 and includes lenses and/or mirrors.
  • The display element 102 is an element that generates the image to be projected, such as a transmissive liquid crystal panel, a reflective liquid crystal panel, or a DMD (Digital Micromirror Device, registered trademark) panel.
  • The display element driving unit 103 sends a drive signal corresponding to the video signal to the display element 102.
  • The light source 105 generates illumination light for projection, and may be a high-pressure mercury lamp, a xenon lamp, an LED light source, a laser light source, or the like.
  • The power source 106 supplies power to the light source 105.
  • The illumination optical system 104 condenses the illumination light generated by the light source 105, makes it more uniform, and irradiates the display element 102 with it.
  • An operation signal input unit 107 is an operation button on the apparatus main body or the light receiving unit of a remote controller, and inputs operation signals from the user.
  • The nonvolatile memory 108 stores data for various operations, display icons, and calibration data for the interactive function.
  • The memory 109 stores video data to be projected and control data for the apparatus.
  • The control unit 110 controls the operation of each unit in the apparatus. In particular, it executes the interactive function by controlling the sensor 150 and the interactive function unit 120.
  • The sensor 150 is a camera that images the front surface of the screen 1 and can detect reflected light from the operation article by detecting the infrared light component.
  • By setting the cut-off wavelength of the sensor's optical filter within the visible light wavelength range (for example, in the middle of the red visible range), the sensor can also capture some visible light components in addition to infrared light, that is, the projected image on the display screen.
  • The interactive function unit 120 performs interactive operations, such as writing characters and figures in the video area 21, in response to the user operating the light-emitting pen or a finger.
  • The coordinate calculation unit 121 analyzes the infrared image acquired from the sensor 150, recognizes the light emitted from the light-emitting pen or the scattered light from the laser sheet, and calculates its position (the position operated by the user).
  • The application unit 122 executes applications that can be operated with the light-emitting pen or a finger, such as an application that synthesizes the video area 21 and the operation icon area 22, an application that performs drawing processing based on user operations, and an application that operates on video input from the video output device 200.
  • The imaging range of the sensor 150 and the range of the image projected on the screen 1 generally do not coincide. Therefore, when the position operated (drawn) by the user is calculated by the coordinate calculation unit 121, coordinates in the imaging range of the sensor 150 must be converted to coordinate positions in the image projected on the screen 1. For this purpose, the interactive function unit 120 includes a calibration unit 123 that performs this conversion and creates the conversion table data (calibration data) used for it.
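  • The patent does not give the mathematics of this conversion; a standard choice, sketched here under that assumption, is a 3×3 projective transform (homography) fitted from four point correspondences collected during calibration. The function names and sample coordinates below are illustrative.

```python
import numpy as np

def fit_homography(sensor_pts, screen_pts):
    """Fit the 3x3 projective transform mapping sensor-image coordinates
    to projected-screen coordinates from exactly four correspondences,
    the kind of conversion-table data a calibration step could produce."""
    A, b = [], []
    for (x, y), (u, v) in zip(sensor_pts, screen_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def sensor_to_screen(H, x, y):
    """Convert one sensor-pixel coordinate to screen coordinates."""
    u, v, w = H @ (x, y, 1.0)
    return u / w, v / w

# Hypothetical calibration: the trapezoid seen by the sensor maps to the
# rectangular projected image (compare the distorted image 20' in FIG. 6A).
H = fit_homography([(100, 50), (540, 60), (600, 420), (60, 430)],
                   [(0, 0), (1280, 0), (1280, 800), (0, 800)])
print(sensor_to_screen(H, 100, 50))   # first calibration point maps back to (0, 0)
```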
  • The video input unit 131 connects to the external video output device 200 and inputs video data.
  • The communication unit 132 connects to the laser sheet generator 300 or the information processing apparatus 400 and inputs and outputs various control signals.
  • The power supply unit 133 supplies power and a control signal to the laser sheet generator 300.
  • FIG. 4 is a block diagram showing the internal configuration of the laser sheet generator 300.
  • The power input unit 301 receives power from the projection display apparatus 100 and supplies it to the laser light source 304.
  • The laser light source 304 generates laser light in the infrared region (for example, near a wavelength of 850 nm).
  • The laser sheet forming unit 305 is an optical system that converts the laser beam generated by the laser light source 304 into a thin sheet (the laser sheet 350) and emits it.
  • An existing optical system such as a cylindrical lens may be used.
  • An adjustment mechanism for adjusting the formation position of the laser sheet 350 so that it is parallel to the screen 1 may be provided.
  • Alternatively, the laser sheet generator 300 may acquire power without passing through the projection display apparatus 100.
  • When the laser sheet generator 300 is configured most simply, it is sufficient that the laser light source 304 generates infrared laser light using the power input from the power input unit 301 and that the laser sheet forming unit 305 converts it into the laser sheet; in that case the control signal input unit 302 and the control unit 303 shown in the figure are unnecessary.
  • In this embodiment, however, a control signal input unit 302 that receives from the projection display apparatus 100 a control signal for controlling laser emission, and a control unit 303 that controls the laser light source 304 according to the input control signal, are provided.
  • For example, a control signal is transmitted from the communication unit 132 of the projection display apparatus 100 to the laser sheet generator 300, the laser light source 304 is turned off for a minute interval that does not affect operation, and is then immediately turned back on.
  • Meanwhile, an infrared image is captured by the sensor 150 of the projection display apparatus 100, and a statistic of its luminance, such as the average value or the median, is acquired.
  • A control signal corresponding to the acquired average value, median, or the like may then be transmitted from the communication unit 132 to control the subsequent light emission intensity of the laser light source 304.
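  • As a sketch of this feedback loop (the target level, gain, and clamping below are my own illustrative choices; the patent only says that a luminance statistic is acquired and the emission intensity controlled accordingly):

```python
import numpy as np

def laser_power_adjustment(ir_frame, target_median=120.0, gain=0.5):
    """Derive a relative laser-power correction from one infrared frame:
    compute a luminance statistic (here the median) and move the power
    toward the level that would bring it to the target. Returns a
    multiplicative factor, clamped so a single frame cannot swing the
    power too far."""
    median = float(np.median(ir_frame))
    if median <= 0.0:
        return 1.0                        # blank frame: leave power unchanged
    factor = 1.0 + gain * (target_median - median) / target_median
    return min(max(factor, 0.5), 1.5)

# A frame that is too bright calls for reduced power, a dim one for more:
print(laser_power_adjustment(np.full((4, 4), 240.0)))   # 0.5  (dim the laser)
print(laser_power_adjustment(np.full((4, 4), 60.0)))    # 1.25 (brighten it)
```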
  • When the interactive function has not been used for a while, the communication unit 132 may transmit a control signal to the laser sheet generator 300 to turn off the laser light source 304. This makes it possible to avoid generating the laser sheet unnecessarily while the interactive function is idle.
  • Since the laser light source 304 cannot then be returned to the ON state by a finger operation, a message or icon indicating how to resume may be displayed in the image on the screen 1.
  • For example, a display such as “Please press the ⁇ button on the remote control to resume finger operation” may be shown.
  • When this resume operation is performed, a control signal may be transmitted from the communication unit 132 to the laser sheet generator 300 to turn the laser light source 304 back on.
  • As described above, by setting the cut-off wavelength of the optical filter within the visible light wavelength region, the sensor 150 can also capture part of the visible light. Therefore, if visible light is present, the operator's arm and the like can be imaged even without scattering of the laser light. In addition to recognizing the light emission of the light-emitting pen or the scattered light of the laser sheet as in (2) above, it is therefore also possible to recognize whether there is a moving object in the sensor image and to switch the laser light source 304 ON or OFF with a control signal from the communication unit 132.
  • For example, when the coordinate calculation unit 121 of the projection display apparatus 100 has not recognized light emission of the light-emitting pen or scattered light of the laser sheet for a first predetermined time (for example, 30 minutes) or longer, and a moving object analysis unit provided in the projection display apparatus 100 has analyzed the sensor image and found no moving object for a second predetermined time (for example, 40 minutes) or longer, the communication unit 132 may transmit a control signal to the laser sheet generator 300 to turn off the laser light source 304.
  • The laser light source 304 may be returned to the ON state as in (2) above.
  • Alternatively, since the moving object analysis unit can analyze the presence or absence of a moving object in the sensor image even without the laser sheet, when a moving object is recognized again in the sensor image, a control signal may be transmitted from the communication unit 132 to the laser sheet generator 300 to turn the laser light source 304 back on.
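  • The two timeouts and the wake-on-motion behavior can be summarized as a small state machine; the class name and the use of seconds below are illustrative, not from the patent:

```python
class LaserSheetController:
    """Idle-timeout logic for the laser light source (304): turn the laser
    off after T1 without pen/finger recognition AND T2 without any moving
    object in the sensor image, and back on when motion reappears.
    Times are in seconds; the defaults mirror the 30/40-minute examples."""

    def __init__(self, touch_timeout=30 * 60, motion_timeout=40 * 60):
        self.touch_timeout = touch_timeout
        self.motion_timeout = motion_timeout
        self.last_touch = 0.0
        self.last_motion = 0.0
        self.laser_on = True

    def update(self, now, touch_seen, motion_seen):
        """Call once per analyzed frame; returns the desired laser state."""
        if touch_seen:
            self.last_touch = now
        if motion_seen:
            self.last_motion = now
        if self.laser_on:
            if (now - self.last_touch >= self.touch_timeout and
                    now - self.last_motion >= self.motion_timeout):
                self.laser_on = False     # send OFF to laser sheet generator 300
        elif motion_seen:
            self.laser_on = True          # motion recognized again: turn back on
        return self.laser_on
```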
  • Furthermore, a control signal may be transmitted from the communication unit 132 to the laser sheet generator 300 so that the laser light source 304 blinks at high speed in synchronization with the imaging frame rate of the sensor 150.
  • In this way, a sensor image with the laser sheet ON and a sensor image with the laser sheet OFF can be acquired alternately, frame by frame. If the recognition of light emitted from the light-emitting pen or scattered light from the laser sheet is performed on the difference between consecutive laser-sheet-ON and laser-sheet-OFF sensor images, the influence of external light, which appears in the laser-sheet-OFF image, can be excluded, reducing erroneous operation due to external light.
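  • A minimal sketch of this difference processing (array sizes and threshold are assumptions): ambient light appears in both the laser-ON and laser-OFF frames and cancels in the difference, leaving only the laser reflection.

```python
import numpy as np

def reflection_only(frame_on, frame_off, threshold=30):
    """Difference of consecutive frames captured with the laser sheet ON
    and OFF (the source blinks in sync with the sensor frame rate).
    External light is present in both frames and cancels; only the
    laser reflection survives the thresholded difference."""
    diff = frame_on.astype(np.int16) - frame_off.astype(np.int16)
    return (diff > threshold).astype(np.uint8) * 255

# External light (a bright lamp) shows up in both frames; the fingertip
# reflection appears only in the laser-ON frame:
off = np.zeros((8, 8), dtype=np.uint8); off[0, 0] = 200   # lamp
on = off.copy(); on[4, 4] = 180                           # lamp + reflection
mask = reflection_only(on, off)
print(np.nonzero(mask))   # only the reflection pixel (4, 4) remains
```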
  • FIG. 5 is a diagram explaining an example of installing a light shielding plate against abnormal reflection.
  • (a) is a view of the screen from the side, and (b) is a view of the screen from the front.
  • In this example, an obstacle 4 protruding toward the display side exists at the lower end of the screen 1, and the obstacle 4 intersects the laser sheet 350.
  • When a whiteboard is used as the screen 1, a holder for writing instruments attached to the whiteboard may become such an obstacle 4.
  • Where the obstacle intersects the laser sheet, unnecessary reflected light 40 (hereinafter referred to as abnormal reflection) is generated, and part of it reaches the sensor 150 and is detected.
  • As a result, the projection display apparatus 100 misjudges the reflected light as an operation by the user's finger 3 and performs an incorrect operation.
  • To prevent this, the light shielding plate 5 is installed so that the reflected light 40 is not generated.
  • The light shielding plate 5 is installed in front of the obstacle 4 (upward in the figure) as viewed from the laser sheet generator 300, and is preferably installed outside the display screen 20 on which the image is displayed. The specific structure of the light shielding plate 5 is described later.
  • For the initial adjustment, the laser light source 304 of the laser sheet generator 300 is turned on and the front surface of the screen 1 is photographed by the sensor 150. The photographed image (hereinafter referred to as the sensor image) is then displayed on the screen 1, and the adjustment work described below is performed.
  • FIG. 6A is a diagram illustrating an example of the captured image (sensor image) displayed on the screen 1.
  • The sensor image is displayed on the display screen 20 of the screen 1 as shown in FIG. 6A.
  • In the sensor image, an image 20′ of the display screen 20 of the screen 1 and an image 300′ of the laser sheet generator 300 appear.
  • Because of the sensor installation position, the image 300′ of the laser sheet generator appears below the display screen 20, and the image 20′ of the display area appears as a trapezoid with the vertical and horizontal directions reversed.
  • For adjustment, an adjustment reflecting member (not shown) is placed at a predetermined position on the screen, the position and direction of its reflected light are obtained from the sensor image, and the user is notified whether the laser sheet 350 is formed parallel to the screen and at the predetermined distance from it.
  • The state in which the reflected light indicated by reference numeral 50 is imaged at the adjustment reflecting member may simply be displayed as-is to prompt the user to make adjustments.
  • Alternatively, guidance indicating how the user should adjust, based on the position information of the reflected light 50 acquired from the sensor image, may be shown on the display screen 20.
  • The user may then be prompted to adjust the installation position and posture of the laser sheet generator 300 using its adjustment mechanism, and this may be repeated until it is determined that the distance d between the laser sheet 350 and the screen is within a predetermined range and the sheet is parallel to the screen.
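  • The stopping condition of this adjustment loop can be sketched as follows; the idea of sampling the gap d at several screen positions and the numeric tolerances are illustrative assumptions:

```python
def sheet_alignment_ok(gap_samples_mm, max_gap=6.0, max_spread=1.0):
    """Decide whether the laser sheet passes the adjustment check: every
    measured gap d between sheet and screen must be within the allowed
    range ("several mm or less"), and the spread of the gaps must be
    small enough for the sheet to count as parallel to the screen."""
    if any(g < 0.0 or g > max_gap for g in gap_samples_mm):
        return False                      # sheet too far from (or behind) the screen
    return max(gap_samples_mm) - min(gap_samples_mm) <= max_spread

print(sheet_alignment_ok([2.0, 2.3, 1.8]))   # gaps small and uniform: aligned
print(sheet_alignment_ok([2.0, 8.0, 1.8]))   # one position too far: readjust
```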
  • FIG. 6B is a diagram showing an example in which an abnormal reflection confirmation message is displayed on the screen.
  • Here, a message 60 such as “Please check whether there is abnormal reflection” is displayed on the screen 1.
  • In addition, a message such as “If there is abnormal reflection, install a light shielding plate between the reflector and the light emitting part” is displayed.
  • The user confirms the presence or absence of abnormal reflection in the sensor image.
  • The “reflector” in the message refers to the obstacle 4, and the “light emitting part” refers to the laser sheet generator 300.
  • FIG. 6C is a diagram illustrating an example of the sensor image when abnormal reflection exists.
  • Here, an abnormal reflection 40 is recognized in the upper-left part of the sensor image.
  • In this case, the user installs the light shielding plate 5 between the reflector 4 and the light emitting part (laser sheet generator 300) (see FIG. 5).
  • FIG. 6D is a diagram illustrating an example of the sensor image after the light shielding plate is installed.
  • With the light shielding plate 5 installed in front of the reflector (obstacle) 4, the image of the light shielding plate appears as 5′.
  • The reflector is hidden behind the light shielding plate 5′ and cannot be seen.
  • Since the sensor image is displayed on the screen at the time of the initial adjustment, the user can easily find the position of the abnormal reflection from the sensor image. Furthermore, after the light shielding plate is installed, it can be confirmed from the sensor image that the abnormal reflection has been reliably eliminated.
  • FIG. 7A is a diagram illustrating basic shapes of the light shielding plate 5.
  • (a) shows an absorption type, and (b) shows a reflection type.
  • The absorption type in (a) absorbs the incident laser beam 31 at the light shielding surface 5a.
  • The light shielding surface 5a is coated with a light-absorbing film or an antireflection film to reduce its reflectance.
  • In this case the shape of the light shielding plate 5 is arbitrary; that is, the angle α between the installation surface 5b on the screen 1 or the wall 9 and the light shielding surface 5a is arbitrary and may be 90°.
  • The reflection type in (b) has a structure in which the light shielding plate 5 is a triangular prism and the laser light is reflected by the light shielding surface 5a.
  • The material of the light shielding plate 5 is not limited, but the angle α between the installation surface 5b on the screen 1 or the wall 9 and the light shielding surface (reflective surface) 5a needs to be smaller than 90° in consideration of the direction of the reflected laser light 32.
  • If the angle α is equal to 90°, the laser beam 32 reflected by the light shielding surface 5a returns parallel to the screen 1 or the wall 9 and is likely to reach the sensor 150.
  • If the angle α is larger than 90°, the laser light 32 reflected by the light shielding surface 5a irradiates the screen 1 (inside the display screen 20), which may be detected by the sensor 150 and cause a malfunction.
  • FIG. 7B is a diagram illustrating the selection of the light shielding surface angle of the reflective light shielding plate 5.
  • Here, the light shielding plate 5 is a triangular prism, and the angle α between the installation surface 5b on the screen 1 or the wall 9 and the light shielding surface 5a is smaller than 45°.
  • (a) is a side view of the entire installation, and (b) is an enlarged view of the light shielding plate 5.
  • Projection display devices are often used indoors, and projection display devices with interactive functions are particularly likely to be used in conference rooms and classrooms.
  • In conference rooms and classrooms there are many vertical surfaces perpendicular to the floor, such as walls and the sides and back panels of tables and desks. Considering these, the preferable value of the angle α is as follows.
  • With α smaller than 45°, the laser light 32 reflected by the light shielding surface 5a travels downward from the horizontal. Even if a table or desk exists in the traveling direction and the light is reflected again by a vertical surface 10, it returns to a position below the light shielding plate 5 (reference numeral 33). The return light therefore neither travels toward the laser sheet generator 300 nor irradiates the screen 1, and there is little possibility of erroneous detection by the sensor 150.
  • FIG. 7C is also a diagram illustrating the selection of the light shielding surface angle of the reflective light shielding plate 5.
  • Here, the demerit when the angle α between the installation surface 5b on the screen 1 or the wall 9 and the light shielding surface 5a is exactly 45° is described.
  • (a) is a side view of the entire installation, and (b) is an enlarged view of the light shielding plate 5.
  • With α equal to 45°, the laser beam 32 reflected by the light shielding surface 5a travels horizontally.
  • If a table or desk exists in the traveling direction, the light is re-reflected by the vertical surface 10, returns along substantially the same optical path, and enters the light shielding plate 5 (reference numeral 33). It is then reflected again by the light shielding surface 5a and returned toward the laser sheet generator 300 (reference numeral 34), where it may be detected by the sensor 150. The angle α = 45° between the installation surface 5b and the light shielding surface 5a is therefore the case that should most be avoided.
  • the angle ⁇ when the angle ⁇ is larger than 45 °, the laser light 32 reflected by the light shielding surface 5a travels upward from the horizontal direction. If a table, desk, or the like exists in the traveling direction and re-reflects on the vertical surface 10, it may return to a position above the light shielding plate 5 and irradiate the screen 1. Therefore, it is not preferable to make the angle ⁇ larger than 45 °.
  • Accordingly, in consideration of a general use environment, the angle θ is preferably smaller than 45°.
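The angle conditions above follow from simple mirror geometry: a reflecting surface tilted by θ deflects a ray by 2θ, so a laser beam travelling straight down along the screen leaves the light shielding surface at an elevation of 2θ − 90° relative to horizontal. The sketch below is only an illustration of that relationship; the function name and the straight-down incidence assumption are ours, not taken from the specification.

```python
def reflected_elevation_deg(theta_deg):
    """Elevation (degrees above horizontal) of the laser beam after
    reflection off the light shielding surface, assuming the incoming
    beam travels straight down along the screen.  A mirror tilted by
    theta deflects the ray by 2*theta, giving elevation = 2*theta - 90.
    """
    return 2.0 * theta_deg - 90.0

for theta in (40, 45, 50):
    e = reflected_elevation_deg(theta)
    direction = "downward" if e < 0 else ("horizontal" if e == 0 else "upward")
    print(f"theta = {theta} deg -> reflected beam {direction} ({e:+.0f} deg)")
```

For θ < 45° the beam heads downward (the safe case of FIG. 7B), at exactly 45° it retraces a horizontal path (the case to avoid), and for θ > 45° it heads upward toward the screen.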
  • FIG. 7D is a diagram illustrating selection of the light shielding surface angle of the reflective light shielding plate 5.
  • the installation surface of the light shielding plate 5 is selected from two surfaces.
  • The cross section of the plate is a substantially right triangle with internal angles θ1 (< 45°) and θ2 (> 45°), whose two sides other than the light shielding surface 5a serve as installation surfaces 5b and 5c. The installation surfaces 5b and 5c are then switched according to the situation.
  • (A) is an overall side view; the reflected light 32 is switched between state 1 and state 2 by switching the installation surface of the light shielding plate. In state 1 the reflected light 32 travels downward from the horizontal, and in state 2 it travels upward from the horizontal.
  • (B) is an enlarged view of the installation in state 1: by using the installation surface 5b, the angle with the light shielding surface 5a becomes θ1 (< 45°) and the reflected light 32 is directed downward.
  • (C) is an enlarged view of the installation in state 2: by using the installation surface 5c, the angle with the light shielding surface 5a becomes θ2 (> 45°) and the reflected light 32 is directed upward.
  • When necessary, the installation surface is changed from 5b to 5c to switch to state 2 of (c). This changes the direction of the reflected light and may eliminate the return light.
  • In this way, installation in either of the two reflecting states can be selectively realized with the single light shielding plate 5, which is practical.
  • The same effect can also be obtained by using the oblique side of the three surfaces of the light shielding plate 5 as the installation surface 5a and switching between the other two sides as the two light shielding surfaces (reflection surfaces) 5b and 5c.
  • the light shielding plate 5 described above may be referred to as a light shielding unit, a reflection unit, or a laser light adjustment unit.
  • Since the sensor image is displayed on the screen at the time of initial adjustment, the user can easily confirm from the sensor image whether there is any abnormal reflection that would hinder the interactive function.
  • Since it can also be confirmed from the sensor image that the abnormal reflection has been eliminated after the light shielding plate is installed, the adjustment work is easy and reliable.
  • Example 2 describes a case where the projection type video display apparatus 100 and the information processing apparatus 400 are connected to realize an interactive function on the information processing apparatus 400 side.
  • FIG. 8 is a diagram illustrating an internal configuration of the information processing apparatus 400 according to the second embodiment.
  • The information processing apparatus 400 is, for example, a personal computer, but may instead be a mobile terminal such as a tablet computer or a smartphone.
  • the video output unit 401 outputs video data to the video input unit 131 of the projection video display device 100.
  • the communication unit 402 inputs and outputs various control signals related to the interactive function with the communication unit 132 of the projection display apparatus 100.
  • the storage 403 stores an interactive function program, video data, and the like.
  • the interactive function program may be acquired in advance from an external device or an external server via the communication unit 402 and stored in the storage 403.
  • the display unit 404 displays various videos and operation screens.
  • The operation input unit 405 is a keyboard, a mouse, or the like that receives operation input from the user. By operating it, the projection display apparatus 100 can also be controlled.
  • the control unit 406 controls the operation of each unit in the information processing apparatus 400. Further, the interactive operation of the projection display apparatus 100 is controlled using the image signal from the sensor 150 of the projection display apparatus 100 and the interactive function unit 410.
  • the nonvolatile memory 407 stores data for various operations, display icons, and calibration data in the interactive function. These data may be stored in the storage 403.
  • The interactive function program stored in the storage 403 is loaded into the memory 408, and the control unit 406 cooperates with the memory 408 to realize the interactive function.
  • The interactive function unit 410 includes a coordinate calculation unit 411, which calculates the coordinates of the user's operation position based on information acquired from the projection display apparatus 100 via the communication unit 402, and a calibration unit 413, which performs processing for creating calibration data.
  • An application unit 412, an application program that can be operated using the coordinates calculated by the coordinate calculation unit 411 of the interactive function unit 410 as operation input, can also be loaded into the memory 408.
  • The application unit 412 is not limited to applications intended to be executed together with the projection display apparatus 100, such as an electronic whiteboard drawing application; it may also be a general personal computer application normally operated through the operation input unit 405 (keyboard, mouse, etc.) of the information processing apparatus 400. In this case, the application can be operated by replacing the operation coordinates designated by mouse operation of the operation input unit 405 with the coordinates calculated by the coordinate calculation unit 411.
  • the display content of the application unit 412 operated in this way is output from the video output unit 401 of the information processing device 400 to the projection video display device 100, and the projection video display device 100 displays it.
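The substitution of calculated coordinates for mouse coordinates can be pictured as a thin adapter that feeds the application the same kind of pointer event regardless of source. The sketch below is purely illustrative; none of these class or method names appear in the specification.

```python
class PointerAdapter:
    """Route pointer input to an application callback, whether it comes
    from a real mouse or from the interactive coordinate calculation."""

    def __init__(self, on_pointer):
        self.on_pointer = on_pointer  # the application's pointer handler

    def from_mouse(self, x, y):
        self.on_pointer(x, y)

    def from_interactive(self, screen_x, screen_y):
        # Coordinates already converted to display-screen coordinates by
        # the coordinate calculation unit take the same path the mouse
        # would use, so the application itself needs no changes.
        self.on_pointer(screen_x, screen_y)

received = []
adapter = PointerAdapter(lambda x, y: received.append((x, y)))
adapter.from_mouse(10, 20)
adapter.from_interactive(960, 540)
print(received)  # -> [(10, 20), (960, 540)]
```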
  • In this way, the information processing apparatus 400 can realize part or all of the interactive function.
  • The video displayed by the projection display apparatus 100 may be the video data output from the information processing apparatus 400 itself.
  • Alternatively, the projection video display device 100 may combine the video data output from the information processing device 400 with the video data output from the video output device 200 and display the result.
  • FIG. 9 is a diagram showing an example of an interactive function mode selection screen.
  • An “interactive function mode selection menu” 90 as shown in the figure is displayed on the display screen 20 projected by the projection display apparatus 100.
  • This menu is generated and displayed by the control unit 110 using a menu screen image or the like stored in the non-volatile memory 108 based on a user operation via the operation signal input unit 107 of the projection display apparatus 100.
  • a mode 91 for processing on the projection display apparatus side, a mode 92 for processing on the information processing apparatus side, and a mode 93 not using the interactive function are displayed, and the user selects a desired mode from these.
  • When the mode 92 for processing on the information processing apparatus side is selected, the interactive function is executed using the interactive function unit 410 loaded in the memory 408 of the information processing apparatus 400.
  • Enabling mode switching through the selection menu makes it possible to choose between preparing the information processing apparatus 400 so as to use a more advanced interactive function, and executing the interactive function on the projection display apparatus without preparing the information processing apparatus 400.
  • Since the projection display apparatus 100 and the information processing apparatus 400 each have an interactive function unit, either of them can execute the interactive function; the data necessary to prepare for its use, for example the calibration data, is shared between them to improve efficiency.
  • the calibration operation of the present embodiment will be described.
  • FIG. 10A is an explanatory diagram when the calibration operation is automatically executed.
  • The calibration operation is an operation to acquire data that allows the calibration unit 123 of the projection display apparatus 100, or the calibration unit 413 of the information processing apparatus 400, to convert coordinates in the captured image of the sensor 150 into coordinates on the display screen 20.
  • The procedure is as follows. Throughout the procedure, the operations of the information processing apparatus 400 and the projection video display apparatus 100 can be coordinated by exchanging control signals between the communication unit 402 of the information processing apparatus 400 and the communication unit 132 of the projection video display apparatus 100. First, the projection display apparatus 100 displays a visible light mark 70 (calibration image) at a predetermined position in the display screen 20. The shape, size, and color of the mark may be determined in advance, and the size and shape of the mark may be varied for each position.
  • the calibration video data including the mark 70 may be recorded in the nonvolatile memory 108 of the projection video display device 100, the storage 403 of the information processing device 400, or the like.
  • Next, the sensor 150 of the projection display apparatus 100 captures the calibration image displayed on the display screen 20, that is, each mark 70. Since the calibration image is generated with visible light, however, the sensor 150 must be configured to detect not only the laser light (infrared light) from the laser sheet generator 300 but also the visible light component, as described above.
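As a rough illustration of how a displayed mark could be located within the sensor image, the sketch below finds the centroid of bright pixels in a small grayscale image. This is a simplified stand-in of our own devising; the specification does not describe the detection algorithm at this level of detail.

```python
def mark_centroid(image, threshold):
    """Return the (x, y) centroid of pixels at or above `threshold`
    in a grayscale image given as a list of rows, or None if no pixel
    qualifies.  A crude stand-in for real calibration-mark detection."""
    sx = sy = n = 0
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value >= threshold:
                sx += x
                sy += y
                n += 1
    return (sx / n, sy / n) if n else None

# A 5x5 sensor image with a bright 2x2 mark centred on (2.5, 2.5)
img = [[0] * 5 for _ in range(5)]
for y in (2, 3):
    for x in (2, 3):
        img[y][x] = 255
print(mark_centroid(img, 128))  # -> (2.5, 2.5)
```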
  • FIG. 10B is an explanatory diagram when the calibration operation is executed manually.
  • the projection display apparatus 100 displays a visible light mark 80 (calibration image) at a predetermined position in the display screen 20.
  • the mark 80 indicates a position where the user should point using the light-emitting pen 2 or the finger 3.
  • the calibration image is recorded in the nonvolatile memory 108 of the projection display apparatus 100, the storage 403 of the information processing apparatus 400, or the like.
  • The projection display apparatus 100 uses a message or the like to instruct the user to point at the mark 80 at each position, one position at a time, using the light-emitting pen 2 or the finger 3.
  • When the user points to the position where the mark 80 is displayed, the light-emitting pen 2 emits light, or the laser beam is reflected by the finger 3, and this is detected by the sensor 150. A correspondence relationship between the position of each mark 80 in the displayed calibration image and the light detection position (pointing position) detected by the sensor 150 is then obtained, yielding calibration data for performing coordinate conversion between the two.
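The specification does not give the mathematical form of the calibration data. One minimal possibility, sketched below under our own assumption that the sensor-to-screen mapping is a per-axis linear map (scale and offset, no rotation), is to fit the map from two mark correspondences, for example two diagonally opposite marks:

```python
def fit_calibration(sensor_pts, screen_pts):
    """Fit per-axis linear maps X = ax*x + bx and Y = ay*y + by from
    two (sensor, screen) point correspondences."""
    (x0, y0), (x1, y1) = sensor_pts
    (X0, Y0), (X1, Y1) = screen_pts
    ax = (X1 - X0) / (x1 - x0)
    ay = (Y1 - Y0) / (y1 - y0)
    return ax, X0 - ax * x0, ay, Y0 - ay * y0

def sensor_to_screen(point, calib):
    """Convert a sensor-image coordinate to a display-screen coordinate."""
    ax, bx, ay, by = calib
    x, y = point
    return ax * x + bx, ay * y + by

# Marks seen at sensor (100, 80) and (540, 400) correspond to the
# screen corners (0, 0) and (1920, 1080); these values are invented.
calib = fit_calibration([(100, 80), (540, 400)], [(0, 0), (1920, 1080)])
print(sensor_to_screen((320, 240), calib))
```

A real installation with keystone distortion would instead need a perspective (homography) model fitted from four or more marks, which is why displaying marks at several positions, as described above, is useful.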
  • the calibration data acquired in this way is shared by both the projection display apparatus 100 and the information processing apparatus 400.
  • When acquired on the projection display apparatus 100 side, the calibration data is stored in, for example, the nonvolatile memory 108, and can naturally be used to execute the interactive function on the projection display apparatus 100 side.
  • Further, the calibration data is transmitted from the projection display apparatus 100 to the information processing apparatus 400 via the communication units of both devices at an arbitrary timing, and stored in the nonvolatile memory 407 or the storage 403. Accordingly, even when the interactive function unit 410 of the information processing apparatus 400 executes the interactive function, it can take over and use the received calibration data.
  • Conversely, when the calibration data acquisition processing is executed by the calibration unit 413 of the information processing apparatus 400, the acquired calibration data is transmitted to the projection display apparatus 100 via both communication units, and the projection display apparatus 100 can take over and use the received calibration data.
  • When the aspect ratio of the video input to the projection type video display device 100 is changed, when the output video aspect ratio from the projection type video display device 100 is changed, or when the display layout of the projection type video display device 100 is changed, as in a multi-screen display mode, the calibration data may no longer be usable as it is.
  • In that case, a message may be displayed in the video of the projection display apparatus 100, and the calibration data acquisition process may be executed again by the calibration unit of the apparatus selected on the menu screen of FIG. 9.
  • The reacquired calibration data may then be transmitted to the other device, and the other device may take over and use it, in the same manner as after the initial calibration data acquisition processing described above.
  • Alternatively, the calibration data acquisition process may be performed only once, by either the calibration unit 123 of the projection video display apparatus 100 or the calibration unit 413 of the information processing apparatus 400, and the acquired calibration data shared by both the projection video display device 100 and the information processing device 400 by the above-described method.
  • Then, when the interactive function is executed on the device side selected by the menu of FIG. 9 and the aspect ratio or display layout is changed, the calibration unit on the selected device side may execute a geometric conversion process on the original calibration data and use the converted calibration data for the subsequent interactive function processing.
  • With this approach, the calibration data acquisition process needs to be performed only once by one apparatus; even when the aspect ratio or display layout is changed, or when the device that processes the interactive function is switched by the menu of FIG. 9, the interactive function of the system can be used continuously without executing the calibration data acquisition process again, which is very convenient for the user.
  • Note that when there is an aspect ratio change or a display layout change, the control unit 110 of the projection display apparatus 100 must transmit control information identifying the contents of the change to the information processing apparatus 400 via the communication unit 132.
  • In the first method, the information processing apparatus 400 acquires calibration data by the calibration process as described above, and stores it in the storage 403 or the nonvolatile memory 407.
  • the sensor image itself captured by the sensor 150 of the projection display apparatus 100 is transmitted to the communication unit 402 of the information processing apparatus 400 via the communication unit 132.
  • The sensor image received by the communication unit 402 of the information processing apparatus 400 is analyzed by the coordinate calculation unit 411 of the interactive function unit 410 loaded in the memory 408; the light emission of the light-emitting pen or the scattered light of the laser sheet is recognized, and its coordinates within the imaging range of the sensor 150 (the position operated by the user) are calculated.
  • The coordinate calculation unit 411 then uses the calibration data stored in the storage 403 or the nonvolatile memory 407 to convert the coordinates of the user operation position within the imaging range of the sensor 150 into coordinates corresponding to the coordinates on the display screen 20 of the projection display apparatus 100.
  • The calibration data stored in the storage 403 or the nonvolatile memory 407 may be used after geometric correction according to the aspect ratio assumed at the time of calibration, the output aspect ratio of the information processing apparatus 400, the display aspect ratio of the projection display apparatus 100, the display layout of the projection type video display device, and so on.
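As an illustration of such a geometric correction, assume for the sketch (our assumption, not the specification's) that the calibration data is a per-axis linear map (ax, bx, ay, by) with X = ax·x + bx and Y = ay·y + by from sensor to screen coordinates. A change of display size from one resolution to another then amounts to scaling each screen-side axis by the ratio of the new extent to the old:

```python
def rescale_calibration(calib, old_size, new_size):
    """Rescale per-axis linear calibration data (ax, bx, ay, by) when
    the display resolution or aspect changes from old_size to new_size,
    so the calibration need not be reacquired."""
    ax, bx, ay, by = calib
    sx = new_size[0] / old_size[0]
    sy = new_size[1] / old_size[1]
    return ax * sx, bx * sx, ay * sy, by * sy

# A calibration made against a 1920x1080 output, reused after the
# output switches to 1280x720 (identity map used for illustration).
new = rescale_calibration((1.0, 0.0, 1.0, 0.0), (1920, 1080), (1280, 720))
print(new)
```

A display layout change such as a multi-screen mode would additionally add an offset term per sub-screen, but the idea is the same: transform the stored data rather than repeat the acquisition.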
  • the coordinates calculated by the coordinate calculation unit 411 are used for the operation of the application unit 412.
  • In the second method, the information processing apparatus 400 acquires calibration data by the calibration process as described above, and stores it in the storage 403 or the nonvolatile memory 407.
  • the sensor image captured by the sensor 150 of the projection display apparatus 100 is sent to the coordinate calculation unit 121 of the interactive function unit 120.
  • the coordinate calculation unit 121 analyzes the sensor image and calculates the coordinates in the imaging range of the sensor 150 for the position of light emitted from the light-emitting pen or the scattered light of the laser sheet (user operation position).
  • the coordinate information in the imaging range of the sensor 150 at the user operation position is transmitted to the communication unit 402 of the information processing apparatus 400 via the communication unit 132.
  • The coordinate calculation unit 411 uses the calibration data stored in the storage 403 or the nonvolatile memory 407 to convert the coordinate information of the user operation position within the imaging range of the sensor 150, received by the communication unit 402, into coordinates corresponding to the coordinates on the display screen 20 of the projection display apparatus 100. This conversion process is the same as in the first method. The coordinates calculated by the coordinate calculation unit 411 are then used for the operation of the application unit 412.
  • In the above, the example in which a calibration unit is provided in both the projection display apparatus 100 and the information processing apparatus 400 has been described.
  • Alternatively, only the projection display apparatus 100 may be configured to include a calibration unit.
  • In this case, the calibration data is always created by the calibration unit 123 of the projection display apparatus 100, and the created calibration data is transmitted to the information processing apparatus 400 via the communication unit 132 and shared and used by the projection type video display device 100 and the information processing device 400.
  • As described above, not only the projection display apparatus 100 but also the information processing apparatus 400 can execute part or all of the interactive function unit, so a system that is easy to use can be built in which both are used appropriately according to the situation. Moreover, the calibration data can be shared by both devices, which improves efficiency.
  • Example 2 is also effective in a system that uses only a light-emitting pen. In this case, the laser sheet generator 300 is not necessary.

Abstract

A video display system including a projection-type video display device and an information processing device, wherein the projection-type video display device has: a sensor capable of imaging light emitted or reflected by a manipulation object that is in contact with a projection surface; and a first interactive function unit which determines the position of the manipulation object on the basis of an image imaged by the sensor and controls the display of a video in accordance with the position. The information processing device has a second interactive function unit which calculates the position of the manipulation object on the basis of the image imaged by the sensor and received from the projection-type video display device and which is capable of controlling the display of the video in the projection-type video display device in accordance with the position. The projection-type video display device displays a menu window for selecting whether an interactive function relating to the displayed video is to be operated in a first operation mode by the first interactive function unit of the projection-type video display device or in a second operation mode by the second interactive function unit of the information processing device.

Description

Video display system and projection-type video display device
The present invention relates to a projection-type image display device having an interactive function and a technology related thereto.
In projection-type image display devices that project an image onto a screen or the like, devices equipped with an interactive function have been put into practical use so that users can give presentations comfortably. These allow the user to switch the displayed image and to draw characters and figures on the screen using a dedicated pen or a finger. Various methods have been proposed for detecting the position of the dedicated pen or finger, which is the operation article, relative to the display screen. For example, Patent Document 1 discloses a method of scanning infrared laser light in a plane parallel to and near the image projection surface, and detecting light that strikes the operation article and returns. The position of the operation article within the scanning plane is obtained from the time difference between emission and detection of the laser light.
JP 2009-258569 A
In a method that detects the position of the operation article by irradiating laser light and detecting the return light from the operation article, as in Patent Document 1, if an obstacle other than the operation article (dedicated pen, finger, etc.) exists in the detection region, the return light reflected by the obstacle is detected and causes malfunction. For example, when a whiteboard is used as the projection surface, any protrusion on the whiteboard surface generates unnecessary reflected light (abnormal reflection). It is therefore necessary to accurately inform the user whether abnormal reflection exists and, if it does, where it is located, so that the abnormal reflection can be dealt with.
Furthermore, it is possible to form a video display system in which the projection-type video display device is connected to an information processing device such as a personal computer, and to control the projection-type video display device from the information processing device side. However, the interactive function described above has conventionally been performed on the projection-type video display device side. For example, calibration data concerning the detection position of the operation article and information about obstacles could not be handled on the information processing device side, which hindered efficient control.
As described above, projection-type image display devices with a conventional interactive function have the problem that preparation for installation or use is not sufficiently easy.
An object of the present invention is to enable more suitable preparation for installation or use of a projection-type video display device and a video display system having an interactive function.
The present invention is, for example, a video display system including a projection-type video display device that displays video and an information processing device. The projection-type video display device has: a projection optical system that projects the display video onto a projection surface; a sensor capable of imaging light emitted or reflected by an operation article in contact with the projection surface; a first interactive function unit that obtains the position of the operation article based on the image captured by the sensor and controls the display of video according to that position; and a first communication unit capable of transmitting and receiving information to and from the information processing device. The information processing device has: a second communication unit capable of transmitting and receiving information to and from the projection-type video display device; and a second interactive function unit that calculates the position of the operation article from the image captured by the sensor, or from information generated based on that image, received from the projection-type video display device via the second communication unit, and that can control the display of video in the projection-type video display device according to that position. The projection-type video display device is configured to be able to display a menu screen for selecting whether the interactive function for the display video is performed in a first operation mode by the first interactive function unit of the projection-type video display device or in a second operation mode by the second interactive function unit of the information processing device.
The present invention is also, for example, a projection-type video display device for use in a video display system including the projection-type video display device that projects and displays video, a light-emitting unit that emits laser light, and a laser light adjustment unit that blocks or reflects the laser light. The projection-type video display device includes: a projection optical system that projects the display video onto a projection surface; a sensor capable of imaging reflected light produced when an operation article in contact with the projection surface reflects the laser light emitted by the light-emitting unit; and an interactive function unit that obtains the position of the operation article based on the image captured by the sensor and controls the display of video according to that position. As an initial adjustment mode, in order to adjust the installation position of the laser light adjustment unit, a captured image of a region including the projection surface, taken by the sensor, is displayed on the projection surface by the projection optical system.
According to the present invention, it is possible to prepare for installation or use more suitably.
A configuration diagram showing the video display system in Embodiment 1.
A diagram explaining the method of detecting a user's finger contact.
A block diagram showing the internal configuration of the projection display apparatus 100.
A block diagram showing the internal configuration of the laser sheet generator 300.
A diagram explaining an example of light shielding plate installation against abnormal reflection.
A diagram showing an example of a captured image displayed on the screen 1.
A diagram showing an example in which an abnormal reflection confirmation message is displayed on the screen.
A diagram showing an example of a sensor image when abnormal reflection exists.
A diagram showing an example of the sensor image after the light shielding plate is installed.
A diagram showing the basic shape of the light shielding plate 5.
A diagram explaining selection of the light shielding surface angle of the reflective light shielding plate 5.
A diagram explaining selection of the light shielding surface angle of the reflective light shielding plate 5.
A diagram explaining selection of the light shielding surface angle of the reflective light shielding plate 5.
A diagram showing the internal configuration of the information processing apparatus 400 in Embodiment 2.
A diagram showing an example of the interactive function mode selection screen.
An explanatory diagram of the case where the calibration operation is executed automatically.
An explanatory diagram of the case where the calibration operation is executed manually.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
FIG. 1 is a configuration diagram illustrating the video display system according to the first embodiment. The video display system is configured by connecting a video output device 200, a laser sheet generator 300, and an information processing device 400 to a projection-type video display device (projector) 100. The connection between the devices may be wired or wireless.
The projection-type video display device (projector) 100 projects and displays an image on the screen 1, which is the projection surface. The display screen 20 on the screen 1 includes, in addition to a video area 21 for displaying an image, an operation icon area 22 that displays operation icons for the interactive function. The operation icons include, for example, "mouse operation", the drawing functions "line drawing" and "eraser", and an icon for calling the operation menu of the projection display apparatus 100 itself. Other functions include switching the video display content of the projection-type video display device and adjusting its other operations (such as audio output). In FIG. 1, as an example of the interactive function, the user is performing line drawings 23 and 24 by bringing the light-emitting pen 2 or the finger 3, which are operation articles, into contact with the screen 1 (video area 21).
The video output device 200 outputs video data to be displayed in the video area 21 to the projection video display device 100. Various video devices, such as a personal computer or a DVD player, can be used as the video output device 200; a mobile terminal such as a tablet computer or a smartphone may also be used. The connection between the video output device 200 and the projection video display device 100 may be a wired or a wireless communication connection; a wireless communication connection is particularly preferable when the video output device 200 is a tablet computer or a mobile terminal such as a smartphone.
 The laser sheet generator 300 emits laser light in a non-visible band within a plane near the display surface of the screen 1. The projection-type image display device 100 detects the laser light scattered back by an operation object (finger 3), or the light from the light-emitting pen 2, and thereby detects the user's operation (contact position). Laser light in a non-visible band does not interfere with the viewer's viewing of the displayed image. Furthermore, since the user (operator) is assumed to be close to the screen during operation, infrared laser light is particularly desirable among the non-visible bands.
 The information processing device 400 is, for example, a personal computer, a tablet computer, or a mobile terminal such as a smartphone, and exchanges various control signals, including those for the interactive function, with the projection-type image display device 100. This allows the display operation of the projection-type image display device 100 to be controlled from the information processing device 400 side. The information processing device 400 and the video output device 200 may be the same device.
 FIG. 2 illustrates the method of detecting a user's finger contact. Part (a) shows the screen from the side, and part (b) shows it from the front.
 The screen 1 is mounted on a wall 9, and the projection-type image display device 100 is installed above it. The device 100 projects an image obliquely from the projection optical system 101 toward the screen 1 and displays it on the display screen 20. The device 100 is also provided with a sensor 150, such as an infrared camera, which photographs the screen 1 and detects laser light reflected (scattered) by the user's finger 3.
 The laser sheet generator 300 is installed above the screen 1, below the projection-type image display device 100, and emits infrared laser light within a plane parallel and close to the display surface of the screen 1. The irradiated region has the shape of a thin sheet in cross section, and the generator is positioned so that the gap d between the sheet and the screen 1 is several millimeters or less. In FIG. 2 the irradiated region is shown filled in gray; hereinafter this irradiated region is referred to as the "laser sheet" 350.
 When the user touches the screen 1 with the finger 3, part of the finger intersects the laser light, that is, the laser sheet 350, and the laser light is reflected (scattered). The reflected light is denoted by reference numeral 30. The sensor 150 of the projection-type image display device 100 acquires a captured image containing this reflected light 30. The device 100 analyzes the captured image, determines the contact position of the finger 3, judges the operation content from the position of the user's finger, and executes an interactive function such as drawing.
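The contact-position analysis described above is not specified in detail here, but a minimal sketch of one plausible approach is to threshold the infrared frame and take the centroid of each connected bright region as a touch point. All names and the threshold value below are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def find_touch_positions(ir_frame, threshold=200):
    """Return centroids (x, y) of bright scattered-light regions in an
    8-bit infrared frame, one centroid per connected blob above `threshold`."""
    mask = ir_frame >= threshold
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    h, w = mask.shape
    for y0 in range(h):
        for x0 in range(w):
            if mask[y0, x0] and not visited[y0, x0]:
                # flood-fill one 4-connected blob of bright pixels
                stack, pixels = [(y0, x0)], []
                visited[y0, x0] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((x, y))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                xs, ys = zip(*pixels)
                centroids.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return centroids
```

A production implementation would more likely use a vision library's connected-components routine, but the logic is the same: isolate the scattered-light spot and reduce it to a coordinate.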
 FIG. 3 is a block diagram showing the internal configuration of the projection-type image display device 100.
 The projection optical system 101 projects the image onto the screen 1 and includes a lens and/or a mirror. The display element 102 generates the image to be projected; a transmissive liquid crystal panel, a reflective liquid crystal panel, a DMD (Digital Micromirror Device, registered trademark) panel, or the like may be used. The display element driver 103 sends the display element 102 a drive signal corresponding to the video signal. The light source 105 generates the illumination light for projection; a high-pressure mercury lamp, xenon lamp, LED light source, laser light source, or the like may be used. The power supply 106 supplies power to the light source 105. The illumination optical system 104 collects the illumination light generated by the light source 105, makes it more uniform, and directs it onto the display element 102. The operation signal input unit 107 comprises the operation buttons on the device body and the light-receiving unit for a remote control, and receives operation signals from the user.
 The nonvolatile memory 108 stores data for the various interactive-function operations, display icons, and calibration data. The memory 109 stores the video data to be projected and control data for the device. The control unit 110 controls the operation of each unit in the device; in particular, it controls the sensor 150 and the interactive function unit 120 to execute the interactive function.
 The sensor 150 is a camera that photographs the front of the screen 1 and detects the infrared component of the light, thereby detecting light reflected by an operation object. By setting the cut-off wavelength of its optical filter within the visible range (for example, partway through the red visible region), the sensor can also capture part of the visible spectrum (that is, the projected image on the display screen) together with the infrared component.
 The interactive function unit 120 performs interactive operations, such as writing characters and figures into the video area 21, in response to the user operating the light-emitting pen or a finger. For this purpose, the coordinate calculator 121 analyzes the infrared image acquired from the sensor 150, recognizes the emission of the light-emitting pen or the scattered light from the laser sheet, and calculates its position (the position operated by the user). The application unit 122 executes applications operable with the light-emitting pen or a finger, such as an application that composites the video area 21 with the operation icon area 22 and performs drawing based on the user's operation, or an application that operates on the video input from the video output device 200. Here, the imaging range of the sensor 150 and the range of the image projected on the screen 1 (the optical image, on the screen 1, of the video area of the display element 102) will almost never coincide. Therefore, when the coordinate calculator 121 computes the position operated (drawn) by the user, the coordinates within the sensor's imaging range must be converted into coordinate positions within the projected image. The interactive function unit 120 therefore includes a calibration unit 123 that performs this conversion and creates the conversion table data (calibration data) used by it.
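The conversion between sensor coordinates and projected-image coordinates is a projective (perspective) mapping, since the camera views the screen obliquely. The patent only says a conversion table is created; as an illustrative sketch (function names are assumptions), such a mapping can be fitted from four calibration point pairs as a 3x3 homography:

```python
import numpy as np

def fit_homography(sensor_pts, image_pts):
    """Fit the 3x3 projective transform H mapping sensor coordinates to
    projected-image coordinates from four (or more) point correspondences."""
    rows = []
    for (x, y), (u, v) in zip(sensor_pts, image_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The flattened H is the null-space vector of this system
    # (right singular vector for the smallest singular value).
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    return vt[-1].reshape(3, 3)

def sensor_to_image(H, x, y):
    """Map one sensor-space point into projected-image coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```

In this sketch, the four corners of the trapezoidal image of the display area in the sensor picture (see FIG. 6A) would serve as the calibration points, and the fitted H would play the role of the stored calibration data.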
 The video input unit 131 connects to the external video output device 200 and receives video data. The communication unit 132 connects to, for example, the laser sheet generator 300 and the information processing device 400, and exchanges various control signals with them. The power supply unit 133 supplies power and a control signal to the laser sheet generator 300.
 FIG. 4 is a block diagram showing the internal configuration of the laser sheet generator 300.
 The power input unit 301 receives power from the projection-type image display device 100 and supplies it to the laser light source 304.
 The laser light source 304 generates laser light in the infrared region (for example, around 850 nm in wavelength). The sheet-forming unit 305 is an optical system that converts the laser light generated by the source 304 into a thin sheet (the laser sheet 350) and emits it. It may be built from existing optics such as a cylindrical lens, or from a scanning mechanism that sweeps the laser beam over a wide angle. Although not shown, an adjustment mechanism for aligning the laser sheet 350 parallel to the screen 1 may also be provided.
 In this embodiment, power is supplied to the laser sheet generator 300 from the projection-type image display device 100, but the generator 300 may instead be configured to obtain power without going through the device 100. In its simplest configuration, it is sufficient for the laser light source 304 to generate infrared laser light from the power received at the power input unit 301 and for the sheet-forming unit 305 to form it into the laser sheet 350; the control signal input unit 302 and control unit 303 shown in the figure are then unnecessary.
 However, equipping the laser sheet generator 300 with a control signal input unit 302, which receives control signals for laser emission from the projection-type image display device 100, and a control unit 303, which controls the laser light source 304 according to those signals, enables more advanced control. For example, the controls (1) to (4) below become possible.
 (1) For example, the communication unit 132 of the projection-type image display device 100 sends a control signal to the laser sheet generator 300 that switches the laser light source 304 OFF for an interval short enough not to affect operation, then immediately back ON. In synchronization with this, while the laser light source 304 is OFF, the sensor 150 captures an infrared image and the device obtains a statistic of its luminance, such as the average or median. According to that value, the communication unit 132 may send a control signal that sets the subsequent emission intensity of the laser light source 304. Repeating this control intermittently adjusts the laser emission intensity to account for ambient light: the laser intensity is raised only when the influence of ambient light is strong, so power is not wasted when it is weak.
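Control (1) amounts to a simple feedback mapping from the laser-OFF frame's luminance to a laser drive level. The mapping below is a hedged illustration only: the step thresholds, drive levels, and function name are assumptions, since the patent specifies no concrete values.

```python
def adjust_laser_power(ambient_level,
                       levels=(40, 80, 120, 200, 255),
                       thresholds=(32, 64, 96, 160)):
    """Map the mean (or median) luminance of a laser-OFF infrared frame
    to a laser drive level (0-255 here, purely illustrative).
    Brighter ambient infrared -> stronger laser, so the scattered light
    stays detectable; dim rooms get a weaker, power-saving setting."""
    for threshold, level in zip(thresholds, levels):
        if ambient_level < threshold:
            return level
    return levels[-1]  # strongest drive for the brightest environments
```

In the device, this value would be sent from the communication unit 132 to the control unit 303 each time the intermittent OFF-sample cycle runs.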
 (2) For example, when the coordinate calculator 121 of the projection-type image display device 100 has not recognized light-emitting-pen emission or laser-sheet scattering for a predetermined time (for example, one hour) or longer, the communication unit 132 may send a control signal to the laser sheet generator 300 that switches the laser light source 304 OFF. This prevents the laser sheet from being generated needlessly while the interactive function goes unused. Since the laser light source 304 cannot then be switched back ON by a finger operation, a message or icon describing how to restore it may be shown in the displayed image on the screen 1, for example "Press the ○○ button on the remote control to resume finger operation." When the corresponding restore signal arrives via the operation signal input unit 107 or the like, the communication unit 132 sends a control signal to the laser sheet generator 300 that switches the laser light source 304 back ON.
 (3) As noted above, by setting the cut-off wavelength of its optical filter within the visible range, the sensor 150 can capture part of the visible spectrum along with the infrared. Given some visible light, it can therefore image the operator's arm and the like even without any laser scattering. Accordingly, in addition to recognizing light-emitting-pen emission or laser-sheet scattering as in (2) above, the device may also recognize whether a moving object is present in the sensor image, and have the communication unit 132 send control signals that switch the laser light source 304 ON or OFF accordingly.
 That is, when the coordinate calculator 121 of the projection-type image display device 100 has not recognized light-emitting-pen emission or laser-sheet scattering for a first predetermined time (for example, 30 minutes) or longer, and a moving-object analyzer (not shown) in the device 100, which analyzes the sensor image for moving objects, has not recognized a moving object for a second predetermined time (for example, 40 minutes) or longer, the communication unit 132 may send a control signal to the laser sheet generator 300 that switches the laser light source 304 OFF. Even with no pen emission or laser-sheet scattering, a nearby moving object makes it likely that an operator is present and that a finger operation could occur at any moment; making the second predetermined time longer than the first therefore prevents the laser light source from being switched OFF unintentionally. The laser light source 304 may be restored as in (2) above, but in example (3) the moving-object analyzer can analyze the sensor image even without the laser sheet, so when a moving object is recognized again, the communication unit 132 can simply send a control signal to the laser sheet generator 300 that switches the laser light source 304 back ON.
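The two-timeout condition in (3) can be stated compactly: the laser goes OFF only when both the touch timeout and the longer motion timeout have elapsed. The following sketch uses the example durations from the text (30 and 40 minutes); the function and variable names are illustrative assumptions.

```python
FIRST_TIMEOUT_S = 30 * 60   # no pen emission / sheet scattering recognized
SECOND_TIMEOUT_S = 40 * 60  # no moving object recognized (deliberately longer)

def should_turn_laser_off(now_s, last_touch_s, last_motion_s):
    """Return True only when BOTH conditions hold: no touch recognized
    for the first timeout AND no motion recognized for the second,
    longer timeout. A recent moving object alone keeps the sheet ON,
    since an operator is likely still nearby."""
    no_touch = (now_s - last_touch_s) >= FIRST_TIMEOUT_S
    no_motion = (now_s - last_motion_s) >= SECOND_TIMEOUT_S
    return no_touch and no_motion
```

Restoration is the inverse: as soon as motion is recognized again, the device sends the ON control signal, with no user action required.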
 (4) The communication unit 132 may send a control signal to the laser sheet generator 300 so that the laser light source 304 blinks rapidly, synchronized with the imaging frame rate of the sensor 150. For example, sensor images with the laser sheet ON and with it OFF can be acquired on alternating frames. Performing the recognition of pen emission or laser-sheet scattering on the difference between consecutive ON and OFF sensor images removes the ambient-light contribution contained in the OFF image, reducing erroneous operation caused by ambient light.
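The ON/OFF frame-difference step in (4) can be sketched as a saturating subtraction: what survives in the difference image is, to first order, only the laser-derived light. The function name below is an assumption for illustration.

```python
import numpy as np

def scatter_only(frame_on, frame_off):
    """Subtract a laser-OFF infrared frame (ambient light only) from the
    consecutive laser-ON frame. Ambient contributions cancel; pen emission
    and laser-sheet scattering remain. Widen to int16 before subtracting
    so uint8 arithmetic cannot wrap around, then clip back to 0-255."""
    diff = frame_on.astype(np.int16) - frame_off.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```

Running the blob recognition of the coordinate calculator 121 on this difference image, rather than on the raw ON frame, is what suppresses false detections from sunlight or room lighting.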
 The variations (1) to (4) described above may each be implemented individually or in combination.
 Next, FIG. 5 illustrates an example of installing a light-shielding plate against abnormal reflection. As in FIG. 2, part (a) shows the screen from the side and part (b) from the front.
 Assume, as in FIG. 5, that an obstacle 4 protruding toward the display side exists at the lower edge of the screen 1 and intersects the laser sheet 350. For example, when a whiteboard is used as the screen, its pen tray may become such an obstacle 4. Because the obstacle 4 intersects the laser sheet 350, unwanted reflected light 40 (hereinafter, abnormal reflection) arises there, part of which reaches the sensor 150 and is detected. As a result, the projection-type image display device 100 judges this reflected light to be an operation by the user's finger 3 and performs an erroneous operation.
 To prevent this malfunction, a light-shielding plate 5 is installed so that the reflected light 40 is not generated. Seen from the laser sheet generator 300, the plate 5 is placed in front of the obstacle 4 (above it in the figure), preferably outside the display screen 20 on which the image is shown. The specific structure of the plate 5 is described later. With the plate 5 in place, the laser light never reaches the obstacle 4, so the unwanted abnormal reflection 40 no longer occurs.
 Next, the initial adjustment mode for putting the interactive function of the projection-type image display device 100 into operation is described. The laser light source 304 of the laser sheet generator 300 is switched ON, and the sensor 150 photographs the front of the screen 1. The captured image (hereinafter, the sensor image) is then displayed on the screen 1, and the adjustment work described below is carried out.
 FIG. 6A shows an example of the captured image (sensor image) displayed on the screen 1. With the projection-type image display device 100 and the laser sheet generator 300 installed in the positional relationship of FIG. 2, the sensor image appears on the display screen 20 of the screen 1 as shown in FIG. 6A: it contains an image 20' of the display screen 20 and an image 300' of the laser sheet generator 300. In this embodiment, because of the sensor's mounting position, the image 300' of the laser sheet generator appears below the display screen 20, and the image 20' of the display area appears inverted vertically and horizontally, in a trapezoidal shape.
 First, as an initial adjustment, an adjustment reflector (not shown) is placed at a predetermined position on the screen, and the position and direction of its reflected light are obtained from the sensor image, so that the user can be told whether the laser sheet 350 is formed parallel to the screen and at the predetermined distance from it. For example, as in FIG. 6A, the state in which the reflected light 50 from the adjustment reflector has been imaged may simply be displayed to the user to prompt adjustment; alternatively, instructions in text or figures for how the user should adjust, derived from the position of the reflected light 50 in the sensor image, may be shown on the display screen 20. If these means determine that the gap d between the laser sheet 350 and the screen is outside the predetermined range, or that the sheet is not parallel, the user is prompted to adjust the installation position and attitude of the laser sheet generator 300 with its adjustment mechanism, and this is repeated until the gap d is judged to be within the predetermined range and the sheet to be parallel.
 Next, prevention of abnormal reflection (unwanted light) on the screen is described. Any obstacle on the screen reflects (scatters) the laser light, which is then detected by the sensor 150 and causes malfunction. The procedure therefore continues with checking for abnormal reflection and installing a light-shielding plate.
 FIG. 6B shows an example in which a confirmation message about abnormal reflection is displayed on the screen. For example, the message 60 "Check for abnormal reflections" is displayed on the screen 1, together with the remedy "If there is an abnormal reflection, install a light-shielding plate between the reflecting object and the light-emitting unit." Following this, the user checks the sensor image for abnormal reflection. In the message, the "reflecting object" refers to the obstacle 4 and the "light-emitting unit" to the laser sheet generator 300.
 FIG. 6C shows an example of the sensor image when abnormal reflection is present. In this example, an abnormal reflection 40 appears in the upper-left part of the sensor image. Some reflecting object (obstacle) 4 that reflects the laser light exists at that position, and the user identifies what it is. Since the sensor image is an infrared image, the reflecting object 4 itself is not necessarily visible in it. Following the message 60, the user installs the light-shielding plate 5 between the reflecting object 4 and the light-emitting unit (laser sheet generator 300) (see FIG. 5).
 FIG. 6D shows an example of the sensor image after the light-shielding plate has been installed. Here the plate 5 is placed in front of the reflecting object (obstacle) 4, and its image is denoted 5'. In the sensor image the reflecting object is hidden behind the plate 5' and cannot be seen. With the plate 5 in place, the laser light no longer strikes the reflecting object 4, and the abnormal reflection 40 there is eliminated.
 Thus, because this embodiment displays the sensor image on the screen during initial adjustment, the user can easily locate abnormal reflections from it and, after installing the light-shielding plate, confirm from the sensor image that the abnormal reflection has indeed been eliminated.
 Next, the shape of the light-shielding plate 5 is described in detail.
 FIG. 7A shows the basic shapes of the light-shielding plate 5: (a) an absorptive type and (b) a reflective type.
 The absorptive type (a) absorbs the incident laser light 31 at the shielding surface 5a; for example, the surface 5a is coated with a light-absorbing or antireflective film to lower its reflectance. For the absorptive type, the shape of the plate 5 is arbitrary: the angle α between the shielding surface 5a and the mounting surface 5b on the screen 1 or wall 9 may take any value, including 90°.
 The reflective type (b) shapes the light-shielding plate 5 as a triangular prism and reflects the light at the shielding surface 5a. The material of the plate 5 is unrestricted, but considering the direction of the reflected laser light 32, the angle α between the mounting surface 5b on the screen 1 or wall 9 and the shielding (reflecting) surface 5a must be smaller than 90°. If α equals 90°, the laser light 32 reflected by the surface 5a returns parallel to the screen 1 or wall 9 and is likely to reach the sensor 150. If α exceeds 90°, the reflected light 32 strikes the screen 1 (within the display screen 20), where the sensor 150 may detect it and cause malfunction.
 The preferred angle of the shielding (reflecting) surface of a reflective light-shielding plate is described below.
 FIG. 7B illustrates the choice of shielding-surface angle for the reflective plate 5. In this example the plate 5 is a triangular prism, and the angle α between the mounting surface 5b on the screen 1 or wall 9 and the shielding surface 5a is smaller than 45°. Part (a) is a side view of the whole installation, and part (b) is an enlarged view of the plate 5.
 As a premise, projection-type display devices are mostly used indoors, and those with an interactive function are especially likely to be used in conference rooms and classrooms, where many surfaces perpendicular to the floor exist, such as walls and the back panels of tables and desks. Taking these into account, a more suitable value of the angle α is as follows.
 角度αを45°より小さくした場合には、遮光面5aで反射したレーザー光32は、水平方向よりも下方向に進む。進行方向にテーブルやデスクなどが存在しその垂直面10で再反射しても、遮光板5よりも下方位置に戻ることになる(符号33)。よって、戻り光がレーザーシート発生部300側に進んだりスクリーン1に照射されたりすることはなく、センサー150で誤検出する恐れが少ない。 When the angle α is smaller than 45°, the laser light 32 reflected by the light-shielding surface 5a travels below the horizontal. Even if a table or desk lies in its path and the light is re-reflected by the vertical surface 10, it returns to a position below the light-shielding plate 5 (reference numeral 33). The return light therefore neither travels back toward the laser sheet generating unit 300 nor strikes the screen 1, so there is little risk of erroneous detection by the sensor 150.
 図7Cは、反射型遮光板5の遮光面角度の選択について説明する図である。この例では、スクリーン1または壁9への設置面5bと遮光面5aとの角度αを45°とした場合のデメリットについて説明する。(a)は設置全体の側面図、(b)は遮光板5の拡大図である。 FIG. 7C is a diagram illustrating selection of the light shielding surface angle of the reflective light shielding plate 5. In this example, a demerit when the angle α between the installation surface 5b on the screen 1 or the wall 9 and the light shielding surface 5a is 45 ° will be described. (A) is a side view of the entire installation, and (b) is an enlarged view of the light shielding plate 5.
 角度αを45°とした場合、遮光面5aで反射したレーザー光32が、水平方向に進む。進行方向にテーブルやデスクが存在しその垂直面10で再反射すると、ほぼ同じ光路を戻り遮光板5に入射する(符号33)。さらに遮光面5aで反射してレーザーシート発生部300側に戻り(符号34)、センサー150で検知される恐れがある。よって、設置面5bと遮光面5aとの角度αについて45°は最も好ましくなく、避けるべきである。 When the angle α is 45°, the laser light 32 reflected by the light-shielding surface 5a travels horizontally. If a table or desk lies in its path and the light is re-reflected by the vertical surface 10, it returns along substantially the same optical path and enters the light-shielding plate 5 (reference numeral 33). It is then reflected again by the light-shielding surface 5a back toward the laser sheet generating unit 300 (reference numeral 34) and may be detected by the sensor 150. Therefore, 45° is the least preferable value for the angle α between the installation surface 5b and the light-shielding surface 5a and should be avoided.
 また、図示していないが、角度αを45°より大きくした場合には、遮光面5aで反射したレーザー光32は、水平方向よりも上方向に進む。進行方向にテーブルやデスクなどが存在しその垂直面10で再反射すると、遮光板5よりも上方位置に戻りスクリーン1に照射されることがある。よって、角度αを45°より大きくするのは好ましくない。 Although not shown, when the angle α is larger than 45°, the laser light 32 reflected by the light-shielding surface 5a travels above the horizontal. If a table or desk lies in its path and the light is re-reflected by the vertical surface 10, the light may return to a position above the light-shielding plate 5 and strike the screen 1. It is therefore not preferable to make the angle α larger than 45°.
 以上のとおり、一般的な使用環境を考慮した場合、角度αは45°より小さくすることが望ましい。 As described above, it is desirable that the angle α is smaller than 45 ° in consideration of a general use environment.
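The angle argument above can be checked with the law of reflection. The following Python sketch is illustrative only and not part of the embodiment: it models the screen or wall as the vertical plane x = 0, the laser sheet as traveling straight down along it, and the face 5a as making angle α with the installation surface 5b. The reflected direction works out to (sin 2α, −cos 2α), which points below the horizontal exactly when α < 45°.

```python
import math

def reflected_direction(alpha_deg):
    """Direction of the laser after reflecting off shield face 5a.

    The screen/wall is the vertical plane x = 0; the laser sheet travels
    straight down along it, d = (0, -1).  Face 5a makes angle alpha with
    the installation surface 5b (the wall), so its outward unit normal is
    n = (cos a, sin a).  Law of reflection: r = d - 2 (d . n) n.
    """
    a = math.radians(alpha_deg)
    d = (0.0, -1.0)
    n = (math.cos(a), math.sin(a))
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

for alpha in (30, 45, 60):
    rx, ry = reflected_direction(alpha)
    if ry < -1e-9:
        trend = "below horizontal (safe)"
    elif ry > 1e-9:
        trend = "above horizontal (may strike screen)"
    else:
        trend = "horizontal (worst case)"
    print(f"alpha = {alpha:2d} deg -> r = ({rx:+.3f}, {ry:+.3f})  {trend}")
```

Running this reproduces the three cases in the text: α = 30° sends the beam downward, α = 45° sends it horizontally (the case to avoid), and α = 60° sends it upward toward the screen.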
 図7Dは、反射型遮光板5の遮光面角度の選択について説明する図である。この例では、遮光板5の設置面を2つの面から選択する構成としている。断面の三角形は、内角がα(<45°)とβ(>45°)を持つ略直角三角形とし、その斜辺を遮光面(反射面)5aに、他の2辺を2つの設置面5b、5cとする。そして、設置面5b、5cを状況に応じて切り替えて用いる。 FIG. 7D is a diagram illustrating selection of the light-shielding surface angle of the reflective light-shielding plate 5. In this example, the installation surface of the light-shielding plate 5 can be selected from two faces. The triangular cross section is a substantially right triangle with interior angles α (<45°) and β (>45°); its hypotenuse serves as the light-shielding surface (reflective surface) 5a, and the other two sides serve as the two installation surfaces 5b and 5c. The installation surfaces 5b and 5c are then switched according to the situation.
 (a)は全体側面図で、遮光板の設置面を切り替えることで、反射光32を状態1と状態2に切り替える。状態1では反射光32は水平方向より下方向に、状態2では反射光32は水平方向より上方向に進む。
  (b)は状態1の設置拡大図で、設置面5bを用いることで遮光面5aとの角度をα(<45°)とし、反射光32を下方向に向けている。
  (c)は状態2の設置拡大図で、設置面5cを用いることで遮光面5aとの角をβ(>45°)とし、反射光32を上方向に向けている。
(A) is an overall side view: switching the installation surface of the light-shielding plate switches the reflected light 32 between state 1 and state 2. In state 1 the reflected light 32 travels below the horizontal; in state 2 it travels above the horizontal.
(B) is an enlarged view of the installation in state 1: using the installation surface 5b makes the angle with the light-shielding surface 5a equal to α (<45°), directing the reflected light 32 downward.
(C) is an enlarged view of the installation in state 2: using the installation surface 5c makes the angle with the light-shielding surface 5a equal to β (>45°), directing the reflected light 32 upward.
 通常は(b)の状態1で用いればよいが、反射光方向に何らかの反射物があり、その戻り光がセンサー150で検知されるような場合には、設置面を5bから5cに変更し、(c)の状態2に切り替える。これにより反射光の方向を変え、戻り光をなくせる可能性がある。このように、1つの遮光板5で2つの反射状態での設置を選択的に実現できるので実用的である。 Normally the plate is used in state 1 of (b), but if some reflecting object lies in the direction of the reflected light and its return light is detected by the sensor 150, the installation surface is changed from 5b to 5c to switch to state 2 of (c). This changes the direction of the reflected light and may eliminate the return light. A single light-shielding plate 5 can thus selectively provide installation in two reflection states, which is practical.
 なお、上記の変形として、遮光板5の3面のうち斜辺を設置面5aとし、他の2辺を2つの遮光面(反射面)5b、5cとして切り替えて用いても同様の効果がある。 As a modification of the above, the same effect can be obtained by using the hypotenuse of the three faces of the light-shielding plate 5 as the installation surface 5a and switching between the other two sides as two light-shielding surfaces (reflective surfaces) 5b and 5c.
 以上説明した遮光板5は、遮光ユニット、反射ユニット、またはレーザー光調整ユニットと称してもよい。 The light shielding plate 5 described above may be referred to as a light shielding unit, a reflection unit, or a laser light adjustment unit.
 実施例1によれば、初期調整時にセンサー画像をスクリーンに表示するようにしたので、ユーザはセンサー画像から、インタラクティブ機能の障害となる異常反射の有無を容易に確認することができる。また、遮光板を設置した後、異常反射が解消されたことをセンサー画像から確認することができるので、調整作業が容易で確実なものとなる。 According to the first embodiment, since the sensor image is displayed on the screen at the time of the initial adjustment, the user can easily confirm the presence or absence of abnormal reflection that hinders the interactive function from the sensor image. In addition, since it is possible to confirm from the sensor image that the abnormal reflection has been eliminated after the light shielding plate is installed, the adjustment work is easy and reliable.
 実施例2では、投射型映像表示装置100と情報処理装置400を接続して、情報処理装置400側でインタラクティブ機能を実現する場合について説明する。 Example 2 describes a case where the projection type video display apparatus 100 and the information processing apparatus 400 are connected to realize an interactive function on the information processing apparatus 400 side.
 図8は、実施例2における情報処理装置400の内部構成を示す図である。情報処理装置400は、パーソナルコンピュータの例を示すが、その他に、タブレット型コンピュータやスマートフォン等のモバイル端末などでもよい。 FIG. 8 is a diagram illustrating the internal configuration of the information processing apparatus 400 according to the second embodiment. Although the information processing apparatus 400 is shown here as a personal computer, it may instead be a mobile terminal such as a tablet computer or a smartphone.
 映像出力部401は、投射型映像表示装置100の映像入力部131に対し映像データを出力する。通信部402は、投射型映像表示装置100の通信部132との間でインタラクティブ機能に関する各種制御信号を入出力する。ストレージ403は、インタラクティブ機能用プログラムや映像データなどを記憶する。インタラクティブ機能用プログラムは、事前に通信部402を経由して外部機器または外部サーバーから取得してストレージ403に格納しておけばよい。表示部404は、各種映像や操作画面を表示する。操作入力部405は、ユーザからの操作信号を受け付けるキーボードやマウスなどである。また、これを操作することで、投射型映像表示装置100の動作を操作することができる。 The video output unit 401 outputs video data to the video input unit 131 of the projection video display apparatus 100. The communication unit 402 exchanges various control signals related to the interactive function with the communication unit 132 of the projection video display apparatus 100. The storage 403 stores an interactive function program, video data, and the like. The interactive function program may be acquired in advance from an external device or an external server via the communication unit 402 and stored in the storage 403. The display unit 404 displays various videos and operation screens. The operation input unit 405 is a keyboard, mouse, or the like that receives operation signals from the user; the projection video display apparatus 100 can also be operated through it.
 制御部406は、情報処理装置400内各部の動作を制御する。さらに、投射型映像表示装置100のセンサー150からの画像信号とインタラクティブ機能部410を用いて投射型映像表示装置100のインタラクティブ動作を制御する。不揮発性メモリ407は、インタラクティブ機能における各種操作用のデータ、表示アイコン、キャリブレーション用のデータを格納する。なおこれらのデータはストレージ403に記憶しても良い。 The control unit 406 controls the operation of each unit in the information processing apparatus 400. Further, the interactive operation of the projection display apparatus 100 is controlled using the image signal from the sensor 150 of the projection display apparatus 100 and the interactive function unit 410. The nonvolatile memory 407 stores data for various operations, display icons, and calibration data in the interactive function. These data may be stored in the storage 403.
 メモリ408は、ストレージ403に記憶しているインタラクティブ機能用プログラムを展開し、制御部406はメモリ408と協働して、インタラクティブ機能を実現する。インタラクティブ機能部410は、通信部402を介して投射型映像表示装置100から取得する情報に基づいてユーザの操作位置の座標を算出する座標算出部411と、キャリブレーションデータを作成するための処理を行うキャリブレーション部413を有する。 The interactive function program stored in the storage 403 is loaded into the memory 408, and the control unit 406 cooperates with the memory 408 to realize the interactive function. The interactive function unit 410 has a coordinate calculation unit 411, which calculates the coordinates of the user's operation position based on information acquired from the projection video display apparatus 100 via the communication unit 402, and a calibration unit 413, which performs processing for creating calibration data.
 メモリ408には、さらに、インタラクティブ機能部410の座標算出部411の算出した座標を操作入力として操作可能なアプリケーションプログラムであるアプリケーション部412を展開可能である。アプリケーション部412は、電子ホワイトボードの描画アプリケーションなど投射型映像表示装置100とともに実行することを前提としたアプリケーションだけでなく、情報処理装置400の操作入力部405(キーボードやマウスなど)の操作入力で操作する一般のパーソナルコンピュータ向けアプリケーションでも構わない。この場合、操作入力部405のマウス操作などで指定される操作座標を、座標算出部411が算出した座標に置き換えることにより、アプリケーションの操作を実現できる。このように操作されたアプリケーション部412の表示内容が、情報処理装置400の映像出力部401から投射型映像表示装置100に出力され、投射型映像表示装置100がこれを表示する。 The memory 408 can further load an application unit 412, an application program that can be operated using the coordinates calculated by the coordinate calculation unit 411 of the interactive function unit 410 as operation input. The application unit 412 is not limited to applications premised on execution together with the projection video display apparatus 100, such as an electronic whiteboard drawing application; it may also be a general personal computer application operated through the operation input unit 405 (keyboard, mouse, etc.) of the information processing apparatus 400. In that case, the application can be operated by replacing the operation coordinates designated by mouse operation on the operation input unit 405 with the coordinates calculated by the coordinate calculation unit 411. The display content of the application unit 412 operated in this way is output from the video output unit 401 of the information processing apparatus 400 to the projection video display apparatus 100, which displays it.
 このように本実施例では、投射型映像表示装置100のインタラクティブ機能部120の機能に代わって、情報処理装置400がインタラクティブ機能の一部またはすべてを実現することができる構成となっている。なお、情報処理装置400のインタラクティブ機能部410を用いる場合、投射型映像表示装置100が表示する映像は、情報処理装置400が出力する映像データ自体でよい。しかし、同時に他の映像出力装置(例えば、映像出力装置200)が接続されている場合に、投射型映像表示装置100は、情報処理装置400が出力する映像データと映像出力装置200が出力する映像データとを合成して表示してもよい。 In this embodiment, the information processing apparatus 400 can thus realize part or all of the interactive function in place of the interactive function unit 120 of the projection video display apparatus 100. When the interactive function unit 410 of the information processing apparatus 400 is used, the video displayed by the projection video display apparatus 100 may simply be the video data output by the information processing apparatus 400. However, when another video output device (for example, the video output device 200) is connected at the same time, the projection video display apparatus 100 may combine and display the video data output from the information processing apparatus 400 and the video data output from the video output device 200.
 図9は、インタラクティブ機能のモード選択画面の例を示す図である。投射型映像表示装置100が投射する表示画面20には、図のような「インタラクティブ機能モード選択メニュー」90を表示する。このメニューは、投射型映像表示装置100の操作信号入力部107を介したユーザの操作に基づいて、制御部110が不揮発性メモリ108に記憶されるメニュー画面画像等を用いて生成して表示する。このメニューでは、投射型映像表示装置側で処理するモード91、情報処理装置側で処理するモード92、及びインタラクティブ機能を使用しないモード93を表示し、ユーザはこれらから所望のモードを選択する。ユーザが情報処理装置側処理モード92を選択した場合は、情報処理装置400のメモリ408に展開されるインタラクティブ機能部410を用いてインタラクティブ機能を実行する。 FIG. 9 is a diagram showing an example of a mode selection screen for the interactive function. An "interactive function mode selection menu" 90, as shown in the figure, is displayed on the display screen 20 projected by the projection video display apparatus 100. This menu is generated and displayed by the control unit 110, using a menu screen image or the like stored in the nonvolatile memory 108, in response to a user operation via the operation signal input unit 107 of the projection video display apparatus 100. The menu presents a mode 91 for processing on the projection video display apparatus side, a mode 92 for processing on the information processing apparatus side, and a mode 93 in which the interactive function is not used, and the user selects the desired mode from these. When the user selects the information-processing-apparatus-side processing mode 92, the interactive function is executed using the interactive function unit 410 loaded in the memory 408 of the information processing apparatus 400.
 インタラクティブ機能部410がインタラクティブ機能部120よりも高性能な機能やインタラクティブ機能部120には無い機能を実現できる場合に、当該選択メニューによりモードの切り替えを可能とすることによって、情報処理装置400を準備してより高度なインタラクティブ機能を用いるか、情報処理装置400を準備せずに投射型映像表示装置でインタラクティブ機能を実行するかを選択することが可能となる。本実施例では、投射型映像表示装置100と情報処理装置400はそれぞれインタラクティブ機能部を有しているので、いずれかがインタラクティブ機能を実行できるが、その使用の準備の際に共に必要なデータ、例えばキャリブレーションデータについては、互いに共有することで効率化を図る。以下、本実施例のキャリブレーション動作について説明する。 When the interactive function unit 410 can realize higher-performance functions than the interactive function unit 120, or functions the interactive function unit 120 lacks, enabling mode switching through this selection menu lets the user choose between preparing the information processing apparatus 400 to use the more advanced interactive functions and executing the interactive function on the projection video display apparatus without preparing the information processing apparatus 400. In this embodiment, since the projection video display apparatus 100 and the information processing apparatus 400 each have an interactive function unit, either can execute the interactive function; data needed by both when preparing for its use, such as calibration data, is shared between them for efficiency. The calibration operation of this embodiment is described below.
 図10Aは、キャリブレーション動作を自動で実行する場合の説明図である。キャリブレーション動作とは、投射型映像表示装置100のキャリブレーション部123、または情報処理装置400のキャリブレーション部413が、センサー150の撮影画像の座標を表示画面20の座標へ変換するためのデータを取得する動作である。 FIG. 10A is an explanatory diagram of the case where the calibration operation is executed automatically. The calibration operation is an operation in which the calibration unit 123 of the projection video display apparatus 100 or the calibration unit 413 of the information processing apparatus 400 acquires data for converting the coordinates of the image captured by the sensor 150 into the coordinates of the display screen 20.
 その手順は、以下のとおりとなる。以下の手順において、情報処理装置400と投射型映像表示装置100の動作の連動は、情報処理装置400の通信部402と、投射型映像表示装置100の通信部132との間で制御信号を相互に入出力することにより実現できる。投射型映像表示装置100は表示画面20内の所定位置に、可視光のマーク70(キャリブレーション用画像)を表示する。マークの形状や大きさや色は予め定めておき、各位置ごとにマークの大きさや形状を変えても良い。マーク70を含むキャリブレーション用映像データは、投射型映像表示装置100の不揮発性メモリ108や情報処理装置400のストレージ403等に記録しておけばよい。 The procedure is as follows. In this procedure, the operations of the information processing apparatus 400 and the projection video display apparatus 100 are linked by exchanging control signals between the communication unit 402 of the information processing apparatus 400 and the communication unit 132 of the projection video display apparatus 100. The projection video display apparatus 100 displays visible-light marks 70 (a calibration image) at predetermined positions in the display screen 20. The shape, size, and color of the marks are determined in advance, and the size and shape may be varied for each position. The calibration video data including the marks 70 may be recorded in the nonvolatile memory 108 of the projection video display apparatus 100, the storage 403 of the information processing apparatus 400, or the like.
 投射型映像表示装置100のセンサー150は、表示画面20に表示されたキャリブレーション用画像、すなわち各マーク70を撮影する。ただしキャリブレーション用画像は可視光で生成されているので、センサー150は、上述のとおり、レーザーシート発生部300からのレーザー光(赤外光)だけでなく、上記の可視光成分も検出できるように構成しておく必要がある。 The sensor 150 of the projection video display apparatus 100 captures the calibration image displayed on the display screen 20, that is, each mark 70. However, since the calibration image is generated with visible light, the sensor 150 must be configured, as described above, to detect not only the laser light (infrared light) from the laser sheet generating unit 300 but also this visible-light component.
 表示したキャリブレーション用画像中の各マーク70の位置と、センサー150で撮影した画像内の各マーク70の位置との対応関係を求め、両者の座標変換を行うためのキャリブレーションデータを取得する。 The correspondence between the position of each mark 70 in the displayed calibration image and its position in the image captured by the sensor 150 is then determined, and calibration data for converting between the two coordinate systems is acquired.
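The correspondence-based conversion can be sketched as follows. This is an illustrative Python/NumPy snippet only, not the patent's implementation: all coordinates are assumed example values, and an affine model is fitted for brevity, whereas a real system might use a projective (homography) model.

```python
import numpy as np

# Assumed example coordinates: where each mark 70 was drawn in a 1280x720
# display screen 20, and where the sensor 150 observed it in its own image.
screen_pts = np.array([[100, 100], [1180, 100], [100, 620], [1180, 620]], float)
sensor_pts = np.array([[ 90, 105], [ 954, 105], [ 90, 495], [ 954, 495]], float)

# Fit an affine map  x' = a*x + b*y + c,  y' = d*x + e*y + f  from sensor
# coordinates to screen coordinates by least squares.
A = np.hstack([sensor_pts, np.ones((len(sensor_pts), 1))])
coef, *_ = np.linalg.lstsq(A, screen_pts, rcond=None)  # 3x2 coefficient matrix

def sensor_to_screen(pt):
    """Apply the fitted calibration to one sensor-image point."""
    return np.array([pt[0], pt[1], 1.0]) @ coef

# Each observed mark maps back to (approximately) its display position.
print(sensor_to_screen([90, 105]))
```

The 3×2 matrix `coef` plays the role of the calibration data: it is computed once from the mark correspondences and then reused to convert every subsequent sensor-detected position.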
 図10Bは、キャリブレーション動作を手動で実行する場合の説明図である。投射型映像表示装置100は表示画面20内の所定位置に、可視光のマーク80(キャリブレーション用画像)を表示する。このマーク80は、ユーザが発光ペン2または指3を用いてポインティングすべき位置を示している。このキャリブレーション用画像は、投射型映像表示装置100の不揮発性メモリ108や情報処理装置400のストレージ403等に記録しておく。キャリブレーション用画像を表示中に、投射型映像表示装置100はメッセージなどを用いて、ユーザに対して、発光ペン2または指3を用いて各位置のマーク80を1個ずつポインティングするように指示する。 FIG. 10B is an explanatory diagram of the case where the calibration operation is executed manually. The projection video display apparatus 100 displays visible-light marks 80 (a calibration image) at predetermined positions in the display screen 20. Each mark 80 indicates a position the user should point at using the light-emitting pen 2 or the finger 3. This calibration image is recorded in the nonvolatile memory 108 of the projection video display apparatus 100, the storage 403 of the information processing apparatus 400, or the like. While the calibration image is displayed, the projection video display apparatus 100 uses a message or the like to instruct the user to point at the marks 80 one at a time using the light-emitting pen 2 or the finger 3.
 ユーザは、マーク80の表示された位置をポインティングすると、発光ペン2が発光し、または指3でレーザー光が反射され、これをセンサー150で検知する。そして、表示したキャリブレーション用画像中の各マーク80の位置と、センサー150で検知した光検出位置(ポインティング位置)との対応関係を求め、両者の座標変換を行うためのキャリブレーションデータを取得する。 When the user points at the position where a mark 80 is displayed, the light-emitting pen 2 emits light or the laser light is reflected by the finger 3, and the sensor 150 detects this. The correspondence between the position of each mark 80 in the displayed calibration image and the light-detection position (pointing position) detected by the sensor 150 is then determined, and calibration data for converting between the two coordinate systems is acquired.
 このようにして取得したキャリブレーションデータは、投射型映像表示装置100と情報処理装置400の両方で共有する。例えば、投射型映像表示装置100のキャリブレーション部123でキャリブレーションデータ取得処理を実行した場合、取得したキャリブレーションデータは、例えば不揮発性メモリ108に記憶され、当然投射型映像表示装置100側でインタラクティブ機能を実行する場合に用いることができる。さらに、当該キャリブレーションデータをいずれかのタイミングで、投射型映像表示装置100から両者の通信部を介して、情報処理装置400に伝送し、不揮発性メモリ407またはストレージ403に記憶する。これにより、情報処理装置400のインタラクティブ機能部410でインタラクティブ機能を実行する場合にも、情報処理装置400では、受信したキャリブレーションデータを引き継いで用いることができる。もちろん、その逆として、情報処理装置400のキャリブレーション部413でキャリブレーションデータ取得処理を実行し、取得したキャリブレーションデータをその後、両者の通信部を介して投射型映像表示装置100に送信して、投射型映像表示装置100で受信したキャリブレーションデータを引き継いで用いることができる。 The calibration data acquired in this way is shared by both the projection video display apparatus 100 and the information processing apparatus 400. For example, when the calibration unit 123 of the projection video display apparatus 100 executes the calibration data acquisition process, the acquired calibration data is stored in, for example, the nonvolatile memory 108 and can naturally be used when the interactive function is executed on the projection video display apparatus 100 side. Furthermore, at some point the calibration data is transmitted from the projection video display apparatus 100 to the information processing apparatus 400 via their communication units and stored in the nonvolatile memory 407 or the storage 403. Thus, even when the interactive function unit 410 of the information processing apparatus 400 executes the interactive function, the information processing apparatus 400 can take over and use the received calibration data. Conversely, of course, the calibration unit 413 of the information processing apparatus 400 can execute the calibration data acquisition process and then transmit the acquired calibration data to the projection video display apparatus 100 via their communication units, and the projection video display apparatus 100 can take over and use it.
 なお、投射型映像表示装置100に入力される映像のアスペクト比が変更になった場合、投射型映像表示装置100からの出力映像アスペクト比が変更になった場合、または多画面表示モードなど投射型映像表示装置100の表示レイアウトが変更になった場合など、キャリブレーションデータがそのまま使えなくなる場合がある。この場合は、図9のメニュー画面で選択されている一方の装置のキャリブレーション部により再度のキャリブレーションデータ取得処理を実行するように投射型映像表示装置100の映像中にメッセージを表示して当該キャリブレーションデータ取得処理を実行させてもよい。その場合は、上述の初期のキャリブレーションデータ取得処理後と同様に、取得したキャリブレーションデータを他方の装置に送信し、他方の装置はこれを引き継いで使用すればよい。 When the aspect ratio of the video input to the projection video display apparatus 100 changes, when its output video aspect ratio changes, or when its display layout changes (for example, to a multi-screen display mode), the calibration data may no longer be usable as it is. In this case, a message may be displayed in the video of the projection video display apparatus 100 prompting execution of the calibration data acquisition process again by the calibration unit of the apparatus selected on the menu screen of FIG. 9. In that case, as after the initial calibration data acquisition process described above, the acquired calibration data is transmitted to the other apparatus, which takes it over and uses it.
 また、各種アスペクト比の変更や、表示レイアウトの変更はその前後の対応関係を幾何学的に計算できる。よって、ユーザにとって利便性の高い動作は以下の様になる。キャリブレーションデータ取得処理は、投射型映像表示装置100のキャリブレーション部123または情報処理装置400のキャリブレーション部413のいずれかにおいて1回行い、取得したキャリブレーションデータは、上述の方法によって、投射型映像表示装置100および情報処理装置400の両者で共有する。図9のメニューによって選択された装置側でインタラクティブ機能を実行している際に、各種アスペクト比の変更または表示レイアウトの変更があった場合に、当該各種アスペクト比の変更または表示レイアウトの変更に応じて、図9のメニューによって選択された装置側のキャリブレーション部において、元のキャリブレーションデータに対して幾何学変換処理を実行して変換後のキャリブレーションデータをその後のインタラクティブ機能の処理に用いればよい。 The correspondence before and after a change in aspect ratio or display layout can, moreover, be calculated geometrically. An operation that is more convenient for the user is therefore as follows. The calibration data acquisition process is performed once, in either the calibration unit 123 of the projection video display apparatus 100 or the calibration unit 413 of the information processing apparatus 400, and the acquired calibration data is shared between the two apparatuses by the method described above. If the aspect ratio or display layout changes while the interactive function is being executed on the apparatus selected in the menu of FIG. 9, the calibration unit of that apparatus applies a geometric transformation to the original calibration data in accordance with the change and uses the transformed calibration data for subsequent interactive function processing.
 このようにすれば、キャリブレーションデータ取得処理は一方の装置で1回のみ行えばよく、各種アスペクト比の変更または表示レイアウトの変更があった場合でも、または、図9のメニューによってインタラクティブ機能の処理主体が切り替えられたとしても、再度のキャリブレーションデータ取得処理を実行しなくとも当該システムによるインタラクティブ機能を継続して使用可能となるため、ユーザにとって非常に使い勝手が良い。なお、図9のメニューによってインタラクティブ機能の処理主体が情報処理装置400に設定されている場合に、上述の処理を実現するためには、投射型映像表示装置100の制御部110は、各種アスペクト比の変更または表示レイアウトの変更があった場合に、その変更内容が識別可能な制御情報を通信部132を介して情報処理装置400に送信する必要がある。 In this way, the calibration data acquisition process needs to be performed only once, on one apparatus. Even if the aspect ratio or display layout changes, or even if the processing subject of the interactive function is switched by the menu of FIG. 9, the system's interactive function can continue to be used without executing the calibration data acquisition process again, which is very convenient for the user. Note that when the menu of FIG. 9 sets the information processing apparatus 400 as the processing subject of the interactive function, realizing the above processing requires the control unit 110 of the projection video display apparatus 100, whenever the aspect ratio or display layout changes, to transmit control information identifying the change to the information processing apparatus 400 via the communication unit 132.
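The geometric conversion of an existing calibration can be sketched as follows. This illustrative Python/NumPy snippet is an assumption, not the patent's data format: the calibration is represented as a 3×2 row-vector affine matrix (sensor → screen), and a layout change is modelled as a scale plus offset of the screen coordinate system, composed with the stored calibration so that no re-measurement is needed.

```python
import numpy as np

def rescale_calibration(coef, old_size, new_size, offset=(0.0, 0.0)):
    """Update a 3x2 affine calibration ([x, y, 1] @ coef -> screen point)
    after a display change.

    The change is modelled as per-axis scaling from old_size to new_size
    plus a translation (e.g. a letterbox offset for a new aspect ratio).
    Composing it with the old calibration yields the new calibration.
    """
    sx = new_size[0] / old_size[0]
    sy = new_size[1] / old_size[1]
    new_coef = coef @ np.array([[sx, 0.0], [0.0, sy]])
    new_coef[2] += np.asarray(offset, float)  # row 2 holds the translation
    return new_coef

# Example: an identity calibration, rescaled from a 1280x720 layout to
# a 1920x1080 layout (both 16:9, uniform 1.5x scale).
ident = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
new_cal = rescale_calibration(ident, (1280, 720), (1920, 1080))
print(np.array([100.0, 50.0, 1.0]) @ new_cal)  # the point (100, 50) scales to (150, 75)
```

For a change that letterboxes the image rather than scaling it uniformly, the `offset` argument would carry the black-bar margin.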
 なお、図9のメニューによってインタラクティブ機能の処理主体が情報処理装置400に設定されている場合の座標算出処理の例について説明する。座標算出処理の例としては、以下のとおり少なくとも2通りの方式が考えられる。 Note that an example of coordinate calculation processing when the processing subject of the interactive function is set in the information processing apparatus 400 using the menu of FIG. 9 will be described. As an example of the coordinate calculation process, at least two methods can be considered as follows.
 第1の方式について説明する。まず、上述の通りのキャリブレーション処理により、情報処理装置400はキャリブレーションデータを取得し、ストレージ403または不揮発性メモリ407に記憶している。次に、投射型映像表示装置100のセンサー150が撮像したセンサー画像自体を通信部132を介して、情報処理装置400の通信部402に送信する。情報処理装置400の通信部402で受信したセンサー画像を、メモリ408に展開されたインタラクティブ機能部410の座標算出部411が解析し、発光ペンの発光またはレーザーシートの散乱光を認識し、当該発光または散乱光のセンサー150の撮影範囲での座標(ユーザが操作した位置)を算出する。 The first method is as follows. First, through the calibration process described above, the information processing apparatus 400 acquires calibration data and stores it in the storage 403 or the nonvolatile memory 407. Next, the sensor image itself captured by the sensor 150 of the projection video display apparatus 100 is transmitted via the communication unit 132 to the communication unit 402 of the information processing apparatus 400. The sensor image received by the communication unit 402 is analyzed by the coordinate calculation unit 411 of the interactive function unit 410 loaded in the memory 408, which recognizes the light emitted by the light-emitting pen or the scattered light of the laser sheet and calculates the coordinates of that light within the imaging range of the sensor 150 (the position operated by the user).
 さらに座標算出部411はストレージ403または不揮発性メモリ407に記憶されているキャリブレーションデータを用いて、センサー150の撮影範囲でのユーザ操作位置の座標を、投射型映像表示装置100の表示画面20内の座標に対応する座標に変換する。この際、ストレージ403または不揮発性メモリ407に記憶されているキャリブレーションデータは、キャリブレーション時の想定アスペクト比や、情報処理装置400の出力アスペクト比、投射型映像表示装置100の表示アスペクト比、投射型映像表示装置100の表示レイアウト等に応じて幾何学的補正を行った上で使用すればよい。このようにして、座標算出部411が算出した座標は、アプリケーション部412の操作に用いられる。 Further, using the calibration data stored in the storage 403 or the nonvolatile memory 407, the coordinate calculation unit 411 converts the coordinates of the user's operation position in the imaging range of the sensor 150 into the corresponding coordinates in the display screen 20 of the projection video display apparatus 100. Here, the stored calibration data may be used after geometric correction according to the aspect ratio assumed at calibration time, the output aspect ratio of the information processing apparatus 400, the display aspect ratio of the projection video display apparatus 100, its display layout, and so on. The coordinates thus calculated by the coordinate calculation unit 411 are used to operate the application unit 412.
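A minimal sketch of the first method's image analysis (an assumption for illustration, not the patent's algorithm): threshold the received sensor frame and take the centroid of the bright pixels as the operation position in sensor coordinates, which would then be converted with the calibration data.

```python
import numpy as np

def detect_operation_point(sensor_img, threshold=200):
    """Locate the pen-light / scattered-light spot in one sensor frame.

    Thresholds the grayscale frame and returns the centroid of the bright
    pixels as (x, y) in sensor coordinates, or None if nothing is bright.
    """
    ys, xs = np.nonzero(sensor_img >= threshold)
    if len(xs) == 0:
        return None  # no operation detected in this frame
    return (float(xs.mean()), float(ys.mean()))

# Synthetic 480x640 frame with a bright 3x3 spot centred at (x=320, y=240).
frame = np.zeros((480, 640), np.uint8)
frame[239:242, 319:322] = 255
print(detect_operation_point(frame))  # → (320.0, 240.0)
```

A production system would add noise filtering and handle multiple simultaneous spots (e.g. pen and finger), but the centroid-of-bright-pixels idea is the core of turning a sensor frame into an operation coordinate.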
 次に、第2の方式について説明する。まず、上述の通りのキャリブレーション処理により、情報処理装置400はキャリブレーションデータを取得し、ストレージ403または不揮発性メモリ407に記憶している。第2の方式では、投射型映像表示装置100のセンサー150が撮像したセンサー画像は、インタラクティブ機能部120の座標算出部121に送られる。座標算出部121は、センサー画像を解析して発光ペンの発光またはレーザーシートの散乱光の位置(ユーザ操作位置)について、センサー150の撮影範囲における座標を算出する。当該ユーザ操作位置のセンサー150の撮影範囲における座標情報を通信部132を介して、情報処理装置400の通信部402に送信する。 Next, the second method will be described. First, the information processing apparatus 400 acquires calibration data by the calibration process as described above, and stores it in the storage 403 or the nonvolatile memory 407. In the second method, the sensor image captured by the sensor 150 of the projection display apparatus 100 is sent to the coordinate calculation unit 121 of the interactive function unit 120. The coordinate calculation unit 121 analyzes the sensor image and calculates the coordinates in the imaging range of the sensor 150 for the position of light emitted from the light-emitting pen or the scattered light of the laser sheet (user operation position). The coordinate information in the imaging range of the sensor 150 at the user operation position is transmitted to the communication unit 402 of the information processing apparatus 400 via the communication unit 132.
 情報処理装置400では、座標算出部411がストレージ403または不揮発性メモリ407に記憶されているキャリブレーションデータを用いて、通信部402で受信した当該ユーザ操作位置のセンサー150の撮影範囲における座標情報を、投射型映像表示装置100の表示画面20内の座標に対応する座標に変換する。この変換処理は第1の方式と同様である。このようにして、座標算出部411が算出した座標は、アプリケーション部412の操作に用いられる。 In the information processing apparatus 400, the coordinate calculation unit 411 uses the calibration data stored in the storage 403 or the nonvolatile memory 407 to convert the coordinate information of the user's operation position in the imaging range of the sensor 150, received by the communication unit 402, into the corresponding coordinates in the display screen 20 of the projection video display apparatus 100. This conversion is the same as in the first method. The coordinates thus calculated by the coordinate calculation unit 411 are used to operate the application unit 412.
 以上説明した第2の座標算出方式によれば、投射型映像表示装置100から情報処理装置400へセンサー画像自体を送信する必要がないので、両者間の通信データを削減でき、帯域の大きなインタフェースでなくとも実現できるというメリットがある。 According to the second coordinate calculation method described above, the sensor image itself need not be transmitted from the projection video display apparatus 100 to the information processing apparatus 400, so the communication data between the two can be reduced and the method can be realized without a high-bandwidth interface.
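The bandwidth saving can be made concrete with rough, assumed numbers (the patent does not specify the sensor resolution or frame rate; a 640×480 8-bit frame at 60 fps and a pair of 4-byte coordinates are used here purely for illustration):

```python
# Rough per-second data volume for the two coordinate-calculation methods.
width, height, bytes_per_px, fps = 640, 480, 1, 60
image_bps = width * height * bytes_per_px * fps  # method 1: send each frame
coord_bps = 2 * 4 * fps                          # method 2: send (x, y) only
print(f"method 1: {image_bps / 1e6:.1f} MB/s, method 2: {coord_bps} B/s")
# → method 1: 18.4 MB/s, method 2: 480 B/s
```

Even with these modest assumed figures the ratio is on the order of tens of thousands to one, which is why method 2 works over low-bandwidth control links.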
 なお、本実施例では、投射型映像表示装置100から情報処理装置400の双方にキャリブレーション部を備える例を説明した。しかし、投射型映像表示装置100のみがキャリブレーション部を備えるように構成してもよい。この場合は、キャリブレーションデータ作成処理は常に投射型映像表示装置100のキャリブレーション部123により作成され、作成されたキャリブレーションデータは、通信部132を介して情報処理装置400に送信されて、投射型映像表示装置100と情報処理装置400の両者で共有され用いられることとなる。 In the present embodiment, the example in which the calibration unit is provided in both of the projection display apparatus 100 and the information processing apparatus 400 has been described. However, only the projection display apparatus 100 may be configured to include a calibration unit. In this case, the calibration data creation process is always created by the calibration unit 123 of the projection display apparatus 100, and the created calibration data is transmitted to the information processing apparatus 400 via the communication unit 132 and projected. The type video display device 100 and the information processing device 400 are shared and used.
 以上説明した実施例2によれば、投射型映像表示装置100だけでなく、情報処理装置400についてもインタラクティブ機能部の一部またはすべてを実行可能であるので、状況に応じて両者の機能を使い分けることができ、使い勝手に優れたシステムを構築できる。その際、キャリブレーションデータについては、両者で共有することができるので、効率が向上する。 According to the second embodiment described above, not only the projection video display apparatus 100 but also the information processing apparatus 400 can execute part or all of the interactive function unit, so the two can be used selectively according to the situation and a highly usable system can be constructed. Moreover, since the calibration data can be shared between them, efficiency is improved.
 なお、実施例2の当該効果は、発光ペンのみを用いるシステムでも効果がある。この場合は、レーザーシート発生部300は不要である。 Note that the effect of Example 2 is also effective in a system that uses only a light-emitting pen. In this case, the laser sheet generator 300 is not necessary.
 1: screen (projection surface), 2: light-emitting pen, 3: finger (operation object), 4: obstacle (reflecting object), 5: light-shielding plate, 5a: light-shielding surface (reflecting surface), 5b, 5c: installation surfaces, 20: display screen, 21: video area, 22: operation icon area, 30: reflected light, 40: abnormal reflection, 60: message, 70, 80: marks, 90: mode selection menu, 100: projection display apparatus, 101: projection optical system, 108: nonvolatile memory, 110: control unit, 120: interactive function unit, 121: coordinate calculation unit, 122: application unit, 123: calibration unit, 150: sensor, 200: video output device, 300: laser sheet generating unit, 304: laser light source, 305: laser sheet forming unit, 350: laser sheet, 400: information processing apparatus, 402: communication unit, 403: storage, 410: interactive function unit, 413: calibration unit.

Claims (9)

  1.  A video display system including a projection display apparatus that displays video and an information processing apparatus, wherein:
     the projection display apparatus comprises:
     a projection optical system that projects a display image onto a projection surface;
     a sensor capable of capturing light emitted or reflected by an operation object in contact with the projection surface;
     a first interactive function unit that obtains a position of the operation object based on an image captured by the sensor and controls display of the image according to that position; and
     a first communication unit capable of transmitting information to and receiving information from the information processing apparatus;
     the information processing apparatus comprises:
     a second communication unit capable of transmitting information to and receiving information from the projection display apparatus; and
     a second interactive function unit that calculates the position of the operation object from the image captured by the sensor, or from information generated based on that image, received from the projection display apparatus via the second communication unit, and that can control the display of video on the projection display apparatus according to that position; and
     the projection display apparatus can display a menu screen for selecting whether the interactive function for the display image displayed by the projection display apparatus is performed in a first operation mode, in which it is operated by the first interactive function unit of the projection display apparatus, or in a second operation mode, in which it is operated by the second interactive function unit of the information processing apparatus.
  2.  The video display system according to claim 1, wherein:
     the first interactive function unit of the projection display apparatus has a first calibration unit that generates calibration data used to convert coordinates of the position of the operation object in the image captured by the sensor into coordinates of the position in the display image; and
     the projection display apparatus transmits the calibration data generated by the first calibration unit to the information processing apparatus via the first communication unit.
  3.  The video display system according to claim 2, wherein:
     the second interactive function unit of the information processing apparatus has a second calibration unit that generates calibration data used to convert coordinates of the position of the operation object in the image captured by the sensor into coordinates of the position in the display image; and
     the information processing apparatus transmits the calibration data generated by the second calibration unit to the projection display apparatus via the second communication unit.
  4.  A projection display apparatus capable of transmitting information to and receiving information from an external information processing apparatus, comprising:
     a projection optical system that projects a display image onto a projection surface;
     a sensor capable of capturing light emitted or reflected by an operation object in contact with the projection surface;
     an interactive function unit that obtains a position of the operation object based on an image captured by the sensor and controls display of the image according to that position; and
     a communication unit capable of transmitting information to and receiving information from the information processing apparatus,
     wherein the projection display apparatus can display a menu screen for selecting whether the interactive function for the display image displayed by the projection display apparatus is performed by processing of the interactive function unit of the projection display apparatus or by processing of the information processing apparatus.
  5.  The projection display apparatus according to claim 4, wherein:
     the interactive function unit of the projection display apparatus has a calibration unit that generates calibration data used to convert coordinates of the position of the operation object in the image captured by the sensor into coordinates of the position in the display image; and
     the projection display apparatus transmits the calibration data generated by the calibration unit to the information processing apparatus via the communication unit.
  6.  The projection display apparatus according to claim 5, wherein the communication unit of the projection display apparatus receives information related to calibration transmitted from the information processing apparatus and uses that information for processing by the interactive function unit.
  7.  A projection display apparatus for use in a video display system that includes a projection display apparatus that projects and displays video, a light-emitting unit that emits laser light, and a laser light adjustment unit that blocks or reflects the laser light,
     the projection display apparatus comprising:
     a projection optical system that projects a display image onto a projection surface;
     a sensor capable of capturing the reflected light produced when an operation object in contact with the projection surface reflects the laser light emitted by the light-emitting unit; and
     an interactive function unit that obtains a position of the operation object based on an image captured by the sensor and controls display of the image according to that position,
     wherein, in an initial adjustment mode, in order to adjust the installation position of the laser light adjustment unit, a captured image obtained by imaging an area including the projection surface with the sensor is displayed on the projection surface by the projection optical system.
  8.  The projection display apparatus according to claim 7, wherein, in the initial adjustment mode, when the captured image obtained by imaging the area including the projection surface with the sensor is displayed on the projection surface by the projection optical system, a message instructing the user to install the laser light adjustment unit is also displayed if abnormal reflection is present.
  9.  A video display system comprising:
     a projection display apparatus that projects and displays video on a projection surface;
     a light-emitting unit that emits laser light; and
     a reflection unit that reflects the laser light,
     wherein the reflection unit comprises:
     an installation surface for installing the reflection unit on the projection-surface side; and
     a reflecting surface for reflecting the laser light in a direction different from its incident direction, and
     the reflection unit has a triangular prism shape in which the angle α formed by the installation surface and the reflecting surface is smaller than 45°.
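The geometric significance of the α < 45° limit in claim 9 can be sketched as follows. This reasoning is an interpretation and is not stated in the claim itself: assume the laser sheet travels parallel to the projection surface and strikes the reflecting surface, which is inclined at α to the installation surface (and hence to the incident ray). A mirror inclined at α to a ray deflects that ray by twice the inclination angle:

```latex
% Deflection of a ray by a mirror inclined at angle \alpha to the ray:
\theta_{\mathrm{deflection}} = 2\alpha
% Consequence of the claimed limit on the prism angle:
\alpha < 45^{\circ} \;\Longrightarrow\; \theta_{\mathrm{deflection}} < 90^{\circ}
```

On this reading, a unit with α = 45° would throw the laser sheet perpendicularly away from the projection surface, whereas the claimed α < 45° prism deflects it at a shallower elevation; the exact purpose of this choice would be determined by the description of the light-shielding plate 5 and its reflecting surface 5a in the embodiments.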
PCT/JP2014/072696 2014-08-29 2014-08-29 Video display system and projection-type video display device WO2016031038A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2014/072696 WO2016031038A1 (en) 2014-08-29 2014-08-29 Video display system and projection-type video display device

Publications (1)

Publication Number Publication Date
WO2016031038A1 true WO2016031038A1 (en) 2016-03-03

Family

ID=55398974

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/072696 WO2016031038A1 (en) 2014-08-29 2014-08-29 Video display system and projection-type video display device

Country Status (1)

Country Link
WO (1) WO2016031038A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004070912A (en) * 2002-06-10 2004-03-04 Konica Minolta Holdings Inc Portable terminal equipment
JP2011096834A (en) * 2009-10-29 2011-05-12 J&K Car Electronics Corp Optical type reflecting object detector and touch panel
JP2014138257A (en) * 2013-01-16 2014-07-28 Ricoh Co Ltd Image projection device, image projection system, control method and program

Similar Documents

Publication Publication Date Title
US10228611B2 (en) Projector, projection system, and control method of projector
US11606851B2 (en) Lighting apparatus
JP6068392B2 (en) Projection capturing system and projection capturing method
US9740338B2 (en) System and methods for providing a three-dimensional touch screen
JP2011216088A (en) Projection system with touch-sensitive projection image
JPH1124839A (en) Information input device
US20110148824A1 (en) Optical pen
JP2007141199A (en) Handheld computer cursor controlling device, computer device for controlling cursor using handheld computer cursor controlling device and method, and computer readable medium
JP6665194B2 (en) Lighting equipment
JP2017009829A (en) Image projection device, image projection system and video supply device
US10805586B2 (en) Projection display unit with detection function
JP2017182109A (en) Display system, information processing device, projector, and information processing method
JP2012234149A (en) Image projection device
JP4712754B2 (en) Information processing apparatus and information processing method
WO2016031038A1 (en) Video display system and projection-type video display device
US10410323B2 (en) Display apparatus, information processing apparatus, and information processing method for displaying a second image that includes options for manipulating a first image
JP4687820B2 (en) Information input device and information input method
JP3770308B2 (en) Light pointer
JP2017009664A (en) Image projection device, and interactive type input/output system
JP2012053603A (en) Information display system
KR20240048110A (en) Computer input system with virtual touch screen with improved recognition rate
JP2015052874A (en) Display device, and control method of the same
JP2020135518A (en) Projection device and control method for the same
JP2017167028A (en) Control system and control method
JP2015018269A (en) Image projection device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14900840

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 14900840

Country of ref document: EP

Kind code of ref document: A1