WO2015045125A1 - Video Projection Device (映像投射装置) - Google Patents
Video Projection Device
- Publication number
- WO2015045125A1 (PCT/JP2013/076386)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- video
- projection device
- light
- detection element
- video projection
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/48—Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus
- G03B17/54—Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus with projector
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/36—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
Definitions
- In Patent Document 1, even when the projection range of the projector is changed, the human detection means is controlled so that the detection range of the sensor for detecting a person matches the projection range, and a person is detected accordingly. However, because the detection range is wide and the sensitivity is low, it has been difficult to detect fine movements of a person (hereinafter referred to as gestures).
- Against this background, a function has been developed to detect a gesture of an operator of the projection device and to control the projection device itself, or the displayed image, in accordance with the gesture.
- For example, control is performed such that when a certain gesture is detected, the power of the projection apparatus is turned off, or the displayed image is scrolled or frame-advanced.
- the present invention employs, for example, the configurations described in the claims.
- the present application includes a plurality of components that solve the above-described problems.
- As one example, the present invention is a video projection device that projects light having video information to display video, comprising: a light source unit that generates light; a light control unit that generates an optical image based on the video information using the light generated by the light source unit; a projection unit having a projection optical unit that projects the light generated by the light control unit; a detection element including a sensor that detects a gesture of an operator of the video projection device; and a gesture detection unit having an operation signal generation unit that generates, based on the detected gesture, an operation signal for operating the video signal related to the video information.
- FIG. 1 is an overview of the projection apparatus of the first embodiment.
- the configuration of the present embodiment includes a projection device 1, a detection element 2, a screen 10, and a projection device arrangement table 11.
- the projection device 1 is arranged on the projection device arrangement table 11 and projects an image on the screen 10.
- the detection element 2 detects a gesture in the detection range 2a.
- the detection element 2 may have a light source for detecting a gesture, or may be a passive sensor without a light source.
- FIGS. 2A and 2B are first and second overview diagrams showing projection states of the projection apparatus 1.
- FIG. 2A is a diagram illustrating the case where an image is projected onto the screen 10 (hereinafter, wall-surface projection).
- FIG. 2B is a diagram illustrating the case where an image is projected onto the projection device placement table 11 (hereinafter, desktop projection).
- In FIG. 2B, since projection is performed on the desk, it is assumed that the operator performs the gesture near the screen so as not to block the image.
- If the detection range is set to include both operation ranges, the sensitivity decreases as the detection range becomes wider.
- Moreover, if detection covers a range that is unnecessary for one projection state, gestures of persons other than the operator may be erroneously detected. It follows that, to achieve both high sensitivity and coverage of the required operation range, the detection range must be switched in accordance with the projection state.
- the gesture detection unit 14 includes a detection signal calculation unit 3 and a detection range switching unit 5.
- the detection signal calculation unit 3 includes a signal detection unit 3a, a gesture determination unit 3b, and an operation signal generation unit 3c.
- the signal detection unit 3a detects a signal including the gesture information of the operator supplied from the detection element 2, and supplies the detected signal to the gesture determination unit 3b.
- the gesture determination unit 3b performs signal processing for determining various gesture movements.
- the operation signal generation unit 3c outputs an operation command signal corresponding to the output signal of the gesture determination unit 3b to an external device 6 such as a PC (Personal Computer) or a smartphone.
- the external device 6 controls the video signal supplied to the projection device 1 according to the operation signal of the operation signal generation unit 3c.
- The image projected on the screen 10 from the projection device 1 is controlled in accordance with the operation command signal generated from the operator's gesture. For example, depending on the direction in which the operator moves his or her hand, an operation such as scrolling or frame-by-frame display is performed.
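As a minimal illustrative sketch of this flow (the `operation_signal` function, the gesture vocabulary, and the command names are hypothetical, not from the patent), the mapping performed by the operation signal generation unit 3c might look like:

```python
# Hypothetical sketch of the operation signal generation step (3c): map a
# detected hand-movement direction to an operation command for the external
# device 6. The command names and gesture vocabulary are illustrative only.
def operation_signal(direction: str) -> str:
    commands = {
        "left": "SCROLL_BACK",      # e.g. previous image / scroll back
        "right": "SCROLL_FORWARD",  # e.g. next image / frame advance
        "up": "POWER_OFF",          # e.g. a gesture assigned to power control
    }
    return commands.get(direction, "NO_OP")  # unknown gestures are ignored

print(operation_signal("right"))  # SCROLL_FORWARD
```

The external device 6 would then translate such a command into an action on the video signal it supplies to the projection device 1.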
- the external device 6 may be any device as long as it supplies a video signal to the projection device 1.
- a card-like storage medium inserted into a card interface provided in the projection apparatus 1 may be used.
- the projection unit 4 includes a video control unit 4a, a light source unit 4b, a light control unit 4c, and a projection mirror 4e.
- the video control unit 4a outputs control signals to the light source unit 4b and the light control unit 4c according to the video signal supplied from the external device 6.
- The light source unit 4b includes a light source such as a halogen lamp, an LED, or a laser, and adjusts the amount of light according to the output signal of the video control unit 4a. When the light source unit 4b includes light sources of the three colors R, G, and B, the light amounts may be controlled independently according to the video signal.
- The light control unit 4c has optical system components such as a mirror, a lens, a prism, and an imager (for example, a display device such as a liquid crystal panel), and generates an optical image based on the video signal supplied from the external device 6, using the light emitted from the light source unit 4b.
- the projection lens 4d enlarges the image output from the light control unit 4c.
- The projection mirror 4e reflects the light emitted from the projection lens 4d and projects an image onto, for example, the aforementioned screen 10.
- An aspherical mirror is used as the projection mirror 4e, so that when projecting an image of the same size, the projection distance can be shorter than that of a general projection device.
- A projection unit 4 using the projection mirror 4e has been shown here, but other configurations may be used as long as they can likewise realize video projection.
- the projection lens 4d and the projection mirror 4e may be collectively referred to as a projection optical unit.
- Next, the configuration of the detection range switching unit 5 will be described, in particular the switching of the detection range and the setting of the detection sensitivity.
- By using the above-described sensor, the projection state can be detected not only for wall-surface projection and desktop projection but also when the projection device 1 is disposed obliquely.
- When the detection element 2 has a laser light source as the light source for gesture detection, the distance to the screen 10 can be measured by projecting the laser light toward the screen 10 and using the time difference between the projected light and the reflected light.
- a signal corresponding to the distance information detected by the detection element 2 is output from the signal detection unit 3a to the detection range switching signal generation unit 5a.
- As the detection element 2, for example, a photodiode that detects laser light or a pyroelectric sensor that detects infrared rays emitted by a human body is used. The sensor to be used, and whether or not the detection element 2 requires a light source, depend on which electromagnetic wave is used for detection.
- the detection element control unit 5b controls the detection range 2a of the detection element 2 in accordance with the signal supplied from the detection range switching signal generation unit 5a.
- FIG. 4 is a diagram showing a mechanism for adjusting the inclination of the detection element 2.
- the detection element 2 is disposed on a turntable 12 included in the projection apparatus 1.
- the turntable 12 controls the set angle of the detection element 2 according to the signal from the detection element control unit 5b. In this way, the position of the detection range 2a can be switched.
- a movable lens may be provided on the detection element 2.
- the movable lens adjusts the detection region 2a by changing the distance from the detection element 2 in accordance with the projection direction of the projection apparatus 1.
- When the detection element 2 is a pyroelectric sensor that detects a change in the intensity of electromagnetic waves such as infrared rays, the detection area can be enlarged or reduced by moving the position of the hole or the Fresnel lens above the pyroelectric sensor.
- The detection range 2θ and the detection center angle φ are determined by the gesture position Lg, the height hs of the detection element 2, and the operation region H.
- The operation region H varies depending on the projection state of the projection device 1. As described above, the operation region H is large during wall-surface projection and small during desktop projection.
- When a passive sensor (for example, a pyroelectric sensor) that has no light source for gesture detection is used as the detection element 2, it is effective to set the detection range so that it does not overlap the desktop or the wall surface, as described below, in order to realize highly accurate gesture detection.
- FIGS. 7A and 7B are first and second diagrams showing the detection method when a passive sensor is used.
- The detection amount of the pyroelectric sensor is determined by the proportion of the detection range occupied by the heat source and by its amount of heat. In other words, the larger the proportion of the detection range occupied by the gesture, the larger the detection amount.
- In FIG. 7A, the detection range overlaps the projection device arrangement table 11, leaving a region where a gesture cannot be performed; it is therefore difficult to obtain a large detection amount.
- Therefore, a dead zone hg is provided at the lower side of the detection range as shown in FIG. 7B, and the detection range is narrowed and optimized so as to exclude regions where gestures cannot be performed. A large detection amount can thereby be obtained.
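The benefit of narrowing the detection range as in FIG. 7B can be illustrated with a toy model. The proportional relation (detection amount grows with the fraction of the detection range occupied by the heat source) is from the description above; the function and numbers below are hypothetical:

```python
# Toy model of the pyroelectric detection amount: proportional to the fraction
# of the detection range occupied by the heat source (the hand) times its heat.
# The proportionality is stated in the description; these numbers are made up.
def detection_amount(gesture_area: float, detection_area: float, heat: float) -> float:
    return (gesture_area / detection_area) * heat

wide = detection_amount(0.01, 0.50, 1.0)    # wide range overlapping the table
narrow = detection_amount(0.01, 0.10, 1.0)  # range narrowed by the dead zone hg
print(narrow > wide)  # True: narrowing the range increases the detection amount
```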
- the sensitivity may be adjusted according to the user and environment. Considering the detection amount of the pyroelectric sensor, even if a gesture is performed under the same conditions, the detection amount differs because the temperature of the hand varies depending on the user. Even the same user has different detection amounts depending on the environmental temperature. Even when detection is performed using a laser, the detection amount varies depending on the reflectance of the user's hand. Therefore, the sensitivity may be improved by adjusting the detection region according to the user and the environment. When using a laser, the intensity of the laser may be increased, or the scanning range may be limited to partially improve the sensitivity.
- FIG. 8 is a diagram showing a case where the image of the projection apparatus 1 spans multiple screens. Three screens are shown in FIG. 8: one main screen and two sub-screens. For example, when an operation is performed by gesturing at an icon displayed on a sub-screen while looking at the main screen, the main screen need not be included in the detection range, so the detection range is limited to the two sub-screens. Since the detection range is narrowed, the detection amount of the detection element 2 increases and the detection sensitivity rises. The laser intensity and detection range may also differ between the sub-screens.
- FIG. 10 is an overview diagram showing a configuration for realizing line scanning using a laser light source.
- the line scan is realized by using a laser light source 7, a variable angle mirror 8, a light receiving element 9, and a cylindrical lens 13.
- the light emitted from the laser light source 7 is reflected at an arbitrary angle by the angle variable mirror 8.
- the light reflected by the variable angle mirror 8 enters the cylindrical lens 13 and becomes a line light source having a width in the Z direction.
- the variable angle mirror 8 uses a mirror that scans only in the X direction shown in FIGS. 10, 11A, and 11B.
- The detection element 2 using a laser according to the first embodiment uses a mirror that scans two-dimensionally; this increases cost but allows detection of information in three axial directions.
- The one-dimensionally scanning mirror of the present embodiment detects only in two axial directions, but at lower cost than the first embodiment.
- The second embodiment thus shows a method that, giving priority to cost, obtains a function similar to that of a two-dimensionally scanning mirror by combining the one-dimensionally scanning mirror 8 with the cylindrical lens 13.
- The detection sensitivity decreases with distance from the region directly above the detection element toward the end portion; that is, sufficient sensitivity may not be obtained when a gesture is made at the end of the screen 10. Therefore, the laser intensity should be adjusted according to the location so that the same sensitivity is obtained regardless of the position on the screen 10.
- For example, the light intensity generated by the laser light source 7 may be adjusted depending on whether the light is irradiated directly above the detection element 2 or toward the edge. Any other method may be used as long as the same function can be realized.
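Assuming a simple inverse-square sensitivity model (an assumption for illustration; the description only states that the intensity may be adjusted by location), a positional power compensation could be sketched as:

```python
import math

# Hypothetical positional compensation: assume detection sensitivity falls off
# as 1/r^2 with the path length r from the element to the scan position, and
# scale the emitted power so the screen edge sees the same effective
# sensitivity as the region directly above the element. The 1/r^2 model is an
# assumption; the patent only says the intensity may vary by location.
def compensated_power(base_power: float, h: float, x: float) -> float:
    r = math.hypot(h, x)               # path length to scan position x
    return base_power * (r / h) ** 2   # boost power with the squared distance

print(compensated_power(1.0, 0.1, 0.0))  # 1.0 directly above the element
```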
- The laser light source 7 and the light receiving element 9 use an infrared wavelength region so that the light emitted from the projection apparatus 1 does not affect gesture detection.
- By using the infrared wavelength region, highly sensitive detection can be performed even under external light.
Abstract
Description
The present application includes a plurality of components that solve the above problems. As one example, the present invention is a video projection device that projects light having video information to display video, characterized by comprising: a projection unit having a light source unit that generates light, a light control unit that generates an optical image based on the video information using the light generated by the light source unit, and a projection optical unit that projects the light generated by the light control unit; a detection element including a sensor that detects a gesture of an operator of the video projection device; and a gesture detection unit having an operation signal generation unit that generates, based on the gesture detected by the detection element, an operation signal for operating the video signal related to the video information.
2θ = arctan((H − hs)/Lg) + arctan(hs/Lg)   … (Equation 1)
φ = (1/2)(arctan((H − hs)/Lg) − arctan(hs/Lg))   … (Equation 2)
As the above equations show, the detection range 2θ and the detection center angle φ are determined by the gesture position Lg, the height hs of the detection element 2, and the operation region H. The operation region H varies depending on the projection state of the projection device 1. As described above, the operation region H is large during wall-surface projection and small during desktop projection. Assuming an actual use environment, the operation region H is expected to be from several cm to about 10 cm during desktop projection, and from several cm to several tens of cm during wall-surface projection. The gesture position Lg may be, for example, the distance to the edge of the screen farthest from the projection device 1. When a laser light source is used for the detection element 2, the operation region H may, for safety, be set in a region where the laser does not irradiate human eyes. During desktop projection, the detection range 2θ and the detection center angle φ may be set so that the eyes of a person sitting in front of the projection device 1 are not irradiated.
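Equations 1 and 2 can be evaluated directly from the geometry. The sketch below uses hypothetical values for H, hs, and Lg:

```python
import math

# Detection range 2θ (Equation 1) and detection center angle φ (Equation 2)
# from the operation region height H, sensor height hs, and gesture position Lg.
def detection_angles(H: float, hs: float, Lg: float) -> tuple[float, float]:
    up = math.atan((H - hs) / Lg)  # angle up to the top of the operation region
    down = math.atan(hs / Lg)      # angle down to the projection surface
    return up + down, 0.5 * (up - down)  # (2θ, φ) in radians

# Hypothetical values: H = 0.30 m, hs = 0.05 m, Lg = 0.50 m
two_theta, phi = detection_angles(0.30, 0.05, 0.50)
print(math.degrees(two_theta), math.degrees(phi))
```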
L [m] = 3.0 × 10^8 × t / 2   … (Equation 3)
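Equation 3 is the standard time-of-flight relation: the speed of light times the measured round-trip time t, halved because the light travels to the target and back. A minimal sketch:

```python
# Equation 3: one-way distance from a measured round-trip time t, using the
# speed of light (3.0 x 10^8 m/s); divided by two because the light travels
# to the target and back.
C = 3.0e8  # speed of light in m/s

def tof_distance_m(t: float) -> float:
    return C * t / 2

print(tof_distance_m(10e-9))  # 1.5 m for a 10 ns round trip
```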
Next, a line-scan configuration using a laser will be described.
Claims (15)
- 1. A video projection device that projects light having video information to display video, comprising: a projection unit having a light source unit that generates light, a light control unit that generates an optical image based on the video information using the light generated by the light source unit, and a projection optical unit that projects the light generated by the light control unit; a detection element including a sensor that detects a gesture of an operator of the video projection device; and a gesture detection unit having an operation signal generation unit that generates, based on the gesture detected by the detection element, an operation signal for operating a video signal related to the video information.
- 2. The video projection device according to claim 1, wherein the detection element has a detection element control unit that controls the range in which the detection element detects the gesture, and the detection element control unit controls that range in accordance with the direction in which the projection unit projects light and/or the installation state of the video projection device.
- 3. The video projection device according to claim 1, wherein the detection element detects the gesture three-dimensionally.
- 4. The video projection device according to claim 3, wherein the detection element has a light source that generates light for irradiating the operator of the video projection device, and a variable angle mirror whose set angle for reflecting the light generated by the light source toward the person is freely controllable.
- 5. The video projection device according to claim 4, wherein the light source of the detection element is a laser light source.
- 6. The video projection device according to claim 4, wherein the sensor of the detection element is a photodiode.
- 7. The video projection device according to claim 2, wherein the sensor of the detection element is a pyroelectric sensor that detects infrared rays generated by a person associated with the video projection device.
- 8. The video projection device according to claim 7, wherein the detection element has a Fresnel lens that condenses the infrared rays onto the pyroelectric sensor.
- 9. The video projection device according to claim 8, wherein the detection element control unit controls the distance between the Fresnel lens and the pyroelectric sensor in accordance with the direction in which the projection unit projects light and/or the installation state of the video projection device.
- 10. The video projection device according to claim 4, comprising a cylindrical lens that receives the light supplied from the variable angle mirror and irradiates the person while spreading the traveling direction of the light in the direction of a specific plane.
- 11. The video projection device according to claim 10, wherein the direction of the specific plane is the projection direction of the light projected by the projection unit.
- 12. The video projection device according to claim 11, wherein the curvature of the lens surface of the cylindrical lens increases from the central portion toward the end portions.
- 13. The video projection device according to claim 6, wherein the detection element measures distance by a TOF method based on the difference between the time at which light is emitted from the light source and the time at which the sensor detects the light.
- 14. The video projection device according to claim 4, wherein the projection unit projects light based on a plurality of videos onto different display areas, and the detection element detects the gesture in any of the plurality of display areas.
- 15. The video projection device according to claim 1, wherein the gesture detection unit has a detector for detecting the direction in which the projection unit projects light and/or the installation state of the video projection device.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2013/076386 WO2015045125A1 (ja) | 2013-09-27 | 2013-09-27 | 映像投射装置 |
EP13894422.8A EP3051345B1 (en) | 2013-09-27 | 2013-09-27 | Video projection device |
US14/892,700 US9942529B2 (en) | 2013-09-27 | 2013-09-27 | Image projection device |
JP2015538761A JP6134804B2 (ja) | 2013-09-27 | 2013-09-27 | 映像投射装置 |
CN201380077050.3A CN105247414B (zh) | 2013-09-27 | 2013-09-27 | 影像投射装置 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2013/076386 WO2015045125A1 (ja) | 2013-09-27 | 2013-09-27 | 映像投射装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015045125A1 true WO2015045125A1 (ja) | 2015-04-02 |
Family
ID=52742327
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/076386 WO2015045125A1 (ja) | 2013-09-27 | 2013-09-27 | 映像投射装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9942529B2 (ja) |
EP (1) | EP3051345B1 (ja) |
JP (1) | JP6134804B2 (ja) |
CN (1) | CN105247414B (ja) |
WO (1) | WO2015045125A1 (ja) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016199267A1 (ja) * | 2015-06-11 | 2016-12-15 | 日立マクセル株式会社 | 光測距装置及びその制御方法、及びそれを用いたジェスチャ検出装置 |
WO2017060943A1 (ja) * | 2015-10-05 | 2017-04-13 | 日立マクセル株式会社 | 光測距装置及び映像投写装置 |
KR20170131044A (ko) * | 2016-05-20 | 2017-11-29 | 이탁건 | 전자기기 및 그 동작 방법 |
WO2017212601A1 (ja) * | 2016-06-09 | 2017-12-14 | 日立マクセル株式会社 | 光測距装置、及びこれを備えた映像投写装置 |
CN112687213A (zh) * | 2020-12-28 | 2021-04-20 | 青岛海信激光显示股份有限公司 | 激光投影设备及其控制方法 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017146927A (ja) * | 2016-02-19 | 2017-08-24 | ソニーモバイルコミュニケーションズ株式会社 | 制御装置、制御方法及びプログラム |
EP3588247B1 (en) * | 2017-02-24 | 2023-07-12 | Sony Group Corporation | Information processing device, information processing method, and program |
JP6712609B2 (ja) * | 2018-02-28 | 2020-06-24 | コイト電工株式会社 | 非接触入力装置 |
CN108595001A (zh) * | 2018-04-20 | 2018-09-28 | 丝路视觉科技股份有限公司 | 投影控制方法、投影操控台及主机 |
CN109089091B (zh) * | 2018-08-01 | 2020-07-24 | 联想(北京)有限公司 | 一种投影设备及其控制方法 |
US10649539B1 (en) * | 2019-03-05 | 2020-05-12 | Motorola Mobility Llc | Hand temperature compensation |
CN110347007B (zh) * | 2019-05-05 | 2021-06-04 | 青岛小鸟看看科技有限公司 | 一种投影灯中激光器的校准方法和装置 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008146374A (ja) * | 2006-12-11 | 2008-06-26 | Seiko Epson Corp | プロジェクタ |
JP2008227883A (ja) * | 2007-03-13 | 2008-09-25 | Brother Ind Ltd | プロジェクタ |
JP2008287142A (ja) * | 2007-05-21 | 2008-11-27 | Brother Ind Ltd | 画像投影装置 |
JP2009258569A (ja) * | 2008-04-21 | 2009-11-05 | Ricoh Co Ltd | 電子機器 |
JP2010258623A (ja) * | 2009-04-22 | 2010-11-11 | Yamaha Corp | 操作検出装置 |
JP2011043834A (ja) | 2010-09-21 | 2011-03-03 | Sanyo Electric Co Ltd | 投写型映像表示装置 |
JP2011188008A (ja) * | 2010-03-04 | 2011-09-22 | Nec Corp | プロジェクタシステム |
JP2012032464A (ja) * | 2010-07-29 | 2012-02-16 | Funai Electric Co Ltd | プロジェクタ |
JP2012220419A (ja) * | 2011-04-12 | 2012-11-12 | Panasonic Corp | 検知装置 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10242161A1 (de) * | 2002-09-10 | 2004-03-11 | Philips Intellectual Property & Standards Gmbh | Drahtloses Projektionssystem |
US7325933B2 (en) * | 2004-08-09 | 2008-02-05 | Sanyo Electric Co., Ltd | Projection type video display apparatus |
US8018579B1 (en) * | 2005-10-21 | 2011-09-13 | Apple Inc. | Three-dimensional imaging and display system |
US9116037B2 (en) * | 2006-10-13 | 2015-08-25 | Fresnel Technologies, Inc. | Passive infrared detector |
WO2011013240A1 (ja) * | 2009-07-31 | 2011-02-03 | Necディスプレイソリューションズ株式会社 | 投射型表示装置及び光量調整方法 |
US8947349B1 (en) * | 2010-03-02 | 2015-02-03 | Rawles Llc | Projecting content onto a display medium |
US9134799B2 (en) * | 2010-07-16 | 2015-09-15 | Qualcomm Incorporated | Interacting with a projected user interface using orientation sensors |
US20120154595A1 (en) * | 2010-12-17 | 2012-06-21 | Sony Ericsson Mobile Communications Ab | Integrated Camera-Projection Device |
KR20130001762A (ko) * | 2011-06-28 | 2013-01-07 | 삼성전자주식회사 | 영상 생성 장치 및 방법 |
US9109886B1 (en) * | 2012-10-09 | 2015-08-18 | Amazon Technologies, Inc. | Time-of-flight of light calibration |
- 2013-09-27 JP JP2015538761A patent/JP6134804B2/ja active Active
- 2013-09-27 US US14/892,700 patent/US9942529B2/en active Active
- 2013-09-27 CN CN201380077050.3A patent/CN105247414B/zh active Active
- 2013-09-27 EP EP13894422.8A patent/EP3051345B1/en active Active
- 2013-09-27 WO PCT/JP2013/076386 patent/WO2015045125A1/ja active Application Filing
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008146374A (ja) * | 2006-12-11 | 2008-06-26 | Seiko Epson Corp | プロジェクタ |
JP2008227883A (ja) * | 2007-03-13 | 2008-09-25 | Brother Ind Ltd | プロジェクタ |
JP2008287142A (ja) * | 2007-05-21 | 2008-11-27 | Brother Ind Ltd | 画像投影装置 |
JP2009258569A (ja) * | 2008-04-21 | 2009-11-05 | Ricoh Co Ltd | 電子機器 |
JP2010258623A (ja) * | 2009-04-22 | 2010-11-11 | Yamaha Corp | 操作検出装置 |
JP2011188008A (ja) * | 2010-03-04 | 2011-09-22 | Nec Corp | プロジェクタシステム |
JP2012032464A (ja) * | 2010-07-29 | 2012-02-16 | Funai Electric Co Ltd | プロジェクタ |
JP2011043834A (ja) | 2010-09-21 | 2011-03-03 | Sanyo Electric Co Ltd | 投写型映像表示装置 |
JP2012220419A (ja) * | 2011-04-12 | 2012-11-12 | Panasonic Corp | 検知装置 |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016199267A1 (ja) * | 2015-06-11 | 2016-12-15 | 日立マクセル株式会社 | 光測距装置及びその制御方法、及びそれを用いたジェスチャ検出装置 |
WO2017060943A1 (ja) * | 2015-10-05 | 2017-04-13 | 日立マクセル株式会社 | 光測距装置及び映像投写装置 |
KR20170131044A (ko) * | 2016-05-20 | 2017-11-29 | 이탁건 | 전자기기 및 그 동작 방법 |
KR101976605B1 (ko) * | 2016-05-20 | 2019-05-09 | 이탁건 | 전자기기 및 그 동작 방법 |
JP2019522860A (ja) * | 2016-05-20 | 2019-08-15 | コアダー カンパニー リミテッド | 電子機器及びその動作方法 |
US11169640B2 (en) | 2016-05-20 | 2021-11-09 | Coredar Co., Ltd. | Electronic device and operating method therefor |
WO2017212601A1 (ja) * | 2016-06-09 | 2017-12-14 | 日立マクセル株式会社 | 光測距装置、及びこれを備えた映像投写装置 |
CN112687213A (zh) * | 2020-12-28 | 2021-04-20 | 青岛海信激光显示股份有限公司 | 激光投影设备及其控制方法 |
CN112687213B (zh) * | 2020-12-28 | 2022-07-26 | 青岛海信激光显示股份有限公司 | 激光投影设备及其控制方法 |
Also Published As
Publication number | Publication date |
---|---|
CN105247414B (zh) | 2017-07-18 |
JP6134804B2 (ja) | 2017-05-24 |
EP3051345B1 (en) | 2019-12-18 |
CN105247414A (zh) | 2016-01-13 |
US20160105653A1 (en) | 2016-04-14 |
EP3051345A1 (en) | 2016-08-03 |
US9942529B2 (en) | 2018-04-10 |
EP3051345A4 (en) | 2017-08-09 |
JPWO2015045125A1 (ja) | 2017-03-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6134804B2 (ja) | 映像投射装置 | |
JP6222830B2 (ja) | 画像投射装置 | |
US8690337B2 (en) | Device and method for displaying an image on a VUI screen and also a main projection screen | |
US10194125B2 (en) | Projection apparatus | |
US20160334939A1 (en) | Interactive system | |
JP5974189B2 (ja) | 投写型映像表示装置及び投写型映像表示方法 | |
KR20080098374A (ko) | 인터랙티브 디스플레이 조명 및 개체 검출 방법과 인터랙티브 디스플레이 시스템 | |
JP2016520891A (ja) | 表示システムおよび方法 | |
JP2010244484A (ja) | 画像表示装置、画像表示方法および画像表示プログラム | |
KR20150034016A (ko) | 촉감 피드백을 제공하는 곡면 터치 디스플레이 장치 및 그 방법 | |
JPWO2011111201A1 (ja) | 画像位置調整装置 | |
US20120218225A1 (en) | Optical scanning type touch apparatus and operation method thereof | |
JP2016006447A (ja) | 画像表示装置 | |
WO2017141956A1 (ja) | 空間表示装置 | |
US20130003028A1 (en) | Floating virtual real image display apparatus | |
JP6102751B2 (ja) | インターフェース装置およびインターフェース装置の駆動方法 | |
JP2018164251A (ja) | 画像表示装置およびその制御方法 | |
WO2017212601A1 (ja) | 光測距装置、及びこれを備えた映像投写装置 | |
US20120327130A1 (en) | Floating virtual plasma display apparatus | |
JP2014170149A (ja) | プロジェクタ | |
JP6106565B2 (ja) | 映像投射装置 | |
JP2014170136A (ja) | プロジェクタおよびプロジェクタ機能を有する電子機器 | |
KR20140100105A (ko) | 프로젝터 장치 및 그 제어 방법 | |
KR20210070799A (ko) | 디스플레이 장치 | |
WO2016056102A1 (ja) | 映像投写装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13894422; Country of ref document: EP; Kind code of ref document: A1 |
| | ENP | Entry into the national phase | Ref document number: 2015538761; Country of ref document: JP; Kind code of ref document: A |
| | WWE | Wipo information: entry into national phase | Ref document number: 14892700; Country of ref document: US |
| | REEP | Request for entry into the european phase | Ref document number: 2013894422; Country of ref document: EP |
| | WWE | Wipo information: entry into national phase | Ref document number: 2013894422; Country of ref document: EP |
| | NENP | Non-entry into the national phase | Ref country code: DE |