WO2009031457A1 - Image projection apparatus and control method for same - Google Patents

Image projection apparatus and control method for same

Info

Publication number
WO2009031457A1
WO2009031457A1 (PCT/JP2008/065469)
Authority
WO
WIPO (PCT)
Prior art keywords
image
region
unit
projection apparatus
detection unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2008/065469
Other languages
English (en)
French (fr)
Inventor
Yoshiki Ishii
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to CN2008800184686A priority Critical patent/CN101681217B/zh
Priority to US12/447,920 priority patent/US8118433B2/en
Priority to EP08792795.0A priority patent/EP2191350B1/en
Publication of WO2009031457A1 publication Critical patent/WO2009031457A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191Testing thereof
    • H04N9/3194Testing thereof including sensor feedback
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3197Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM] using light modulating optical valves

Definitions

  • FIGS. 5A to 5D are diagrams schematically illustrating in-use states of the image projection apparatus according to the first embodiment of the present invention.
  • The projected image eliminating unit 104 receives projector characteristic information from the projection unit 102, and also controls the projection unit 102 and the camera unit 103 depending on the result of comparing the projected image with the sensed image, again in order to improve the accuracy with which the projected image is eliminated (a rough sketch of this elimination step follows the Definitions list).
  • As the projector characteristic information, for example, color temperature information or the like can be used to exercise white balance control over the camera. These types of control will be described.
  • FIG. 2 is a block diagram illustrating a configuration example of the projected image eliminating unit 104 in the image projection apparatus 100 in the present embodiment.
  • An input image being projected by the projection unit 102 (an output from the GUI overlay unit 113) is input from a terminal 201, whereas an image being sensed by the camera unit 103 (a camera sensed image) is input from a terminal 202.
  • An input image correction unit 205 corrects input image signals, based on:
  • a sensed image correction unit 206 corrects sensed image signals, based on:
  • the sensed image with the projected image eliminated is input from the projected image eliminating unit 104 to a terminal 401.
  • the sensed image with the projected image eliminated is written in a memory 403 via a switch 407 in a closed state.
  • the memory 403 is configured to be able to store sensed images, and configured to hold previous sensed images supplied to the difference circuit 402 until a significant region determination unit 405 completes determination.
  • In a difference circuit 402, a difference image is obtained between the input sensed image with the projected image eliminated and a previous such image that has been delayed by a predetermined period of time in the memory 403. The difference is then supplied to a threshold value processing unit 404 (a rough sketch of this moving object region detection follows the Definitions list).
  • FIG. 9 shows an example of the effective region of the moving object region detection unit 105.
  • A region A 901 is adjacent to the periphery of the image, and is a region in which camera shake is likely to cause false detection of moving objects, particularly in a case in which the image projection apparatus 100 is held by hand.
  • For this reason, the threshold value for determining a closed region as a significant region can be made higher in the region A 901 than the threshold value applied to a region B 902 other than the peripheral portion of the image.
  • the closed region may be detected from only the region B 902.
  • The GUI overlay unit 113 superimposes an operation GUI image prepared in advance onto the image to be projected by the projection unit 102, in accordance with the display position and GUI image given by the overlay GUI control unit 111, and supplies the resulting composite image to the projection unit 102. As a result, the operation GUI image is displayed superimposed on the projected image.
  • the operation GUI can be displayed when a region of a user's hand is detected by the moving object region detection unit 105 as a significant region and the position of a fingertip is detected by the hot spot position detection unit 106 as a hot spot position.
  • the hot spot movement detection unit 107 compares the hot spot position detected by the hot spot position detection unit 106 with previous position history stored in the hot spot position memory 108 to detect movements of the hot spot.
  • The hot spot movement detection unit 107 outputs information relating to the detected movements of the hot spot (for example, information indicating the type of movement, such as circular motion or linear motion) to the command decode unit 109 (a rough classification sketch follows the Definitions list).
  • The system control unit 110 controls components not shown in the figure and carries out processing in accordance with the device control command. For example, if the device control command is a command for switching the projected image, the image data to be displayed next is read from a storage device (not shown) and supplied via the terminal 101 to the GUI overlay unit 113. As examples of operations determined by the command decode unit 109, the following are conceivable.
  • an image projection apparatus 1000 in the present embodiment is characterized in that a particular shape region detection unit 1014 and a particular shape memory 1015 are provided instead of the moving object region detection unit 105.
  • FIG. 11 is a block diagram illustrating a configuration example of the particular shape region detection unit 1014 in the image projection apparatus 1000 in the present embodiment.
  • the sensed image with the projected image eliminated is input to a terminal 1101 from the projected image elimination unit 104.
  • In a spatial filter unit 1102, predetermined smoothing is carried out to eliminate random noise components.
  • A shape detection unit 1103 detects a particular shape region in the sensed image with the projected image eliminated, based on the shape feature quantity information obtained from the particular shape memory 1015 via a terminal 1104, and outputs information specifying the detected particular shape region from a terminal 1105 (a rough sketch of this shape-based detection follows the Definitions list).
  • Because the particular shape region is detected directly, it is unlikely that a pattern or the like existing on the projection surface will be falsely detected. It is therefore not necessary to obtain the difference between two sensed images with projected images eliminated, as in the first embodiment.
  • An input image 601, that is, a projected image, is a face mark, and there is a star pattern on the projection surface serving as a background. Therefore, the face mark and the star pattern are both contained in a sensed image 602.
  • the input image 601 contains an image of the operation GUI.
  • the example of FIG. 12 differs from the example of FIG.
  • a hot spot movement detection unit 107 detects movements of the hot spot, from temporal variations of the hot spot position 607 detected in S13.
  • In a case in which the operation GUI 609 is displayed, a command decode unit 109 specifies the operation instructed by the user from the display position of the operation GUI 609, the movement of the hot spot detected in S14, and the position of the hot spot, and outputs a corresponding device operation command.
  • An image projection apparatus 1300 in the present embodiment includes a moving object region detection unit 105 as well as a particular shape region detection unit 1014 and a particular shape memory 1015, and is characterized in that moving object region detection as in the first embodiment and particular shape region detection as in the second embodiment are used adaptively and selectively under the control of a detection control unit 1317.
  • the moving object region detection unit 105 detects a significant region regarded as a moving object, from a sensed image with a projected image eliminated, which is output from a projected image elimination unit 104.
  • Although the configuration of the moving object region detection unit 105 is as described in the first embodiment with reference to FIG. 3, when the significant region is detected in the present embodiment, the shape feature quantity, as information for specifying the significant region, is stored in the particular shape memory 1015 in accordance with control exercised by the detection control unit 1317.
  • The detection control unit 1317 switches a switch 1316 in accordance with the detection results from the moving object region detection unit 105 and the particular shape region detection unit 1014, and selects the region information to be supplied to a hot spot position detection unit 106. More specifically, the detection control unit 1317 switches the switch 1316 so that the output from the moving object region detection unit 105 is selected when the region of the operation object is not detected, as shown in FIG. 5B, and so that the output from the particular shape region detection unit 1014 is selected when the region of the operation object is detected, as shown in FIG. 5C. In the latter case, the detection control unit 1317 also instructs the moving object region detection unit 105 to write the shape feature quantity of the detected moving object region (significant region) into the particular shape memory 1015 (a rough sketch of this switching logic follows the Definitions list).
  • the command decode unit 109 may determine a user operation without the use of position information on the operation GUI.
  • the particular movement itself of the hot spot may be determined as a command.
  • the circular motion of the hot spot can be determined as a display command for the operation GUI.
  • Standing still can also be regarded as "a particular movement of the hot spot" here, that is, determined as a user operation (a rough command-decoding sketch follows the Definitions list).
  • the user can gesture to operate the image projection apparatus.
  • The image of the operation GUI does not interfere with viewing the input image.
  • The user may gesture at any location on the projected image rather than in a particular region of the operation GUI, which has the advantage of a high degree of freedom in operation.
  • The determination of a user operation with the use of the operation GUI and the determination of a user operation based only on position information on the hot spot can be executed in combination.
  • For example, the determination of a user operation based only on position information on the hot spot can be executed for the operation for displaying the operation GUI.
  • the determination of a user operation based on both the display position of the operation GUI and position information on the hot spot can be carried out after displaying the operation GUI.
  • no camera unit has to be provided as long as a sensed image of a projection surface containing a projected image can be obtained.
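
The excerpts above describe projected image elimination (units 104, 205 and 206) as aligning the image being projected with the camera-sensed image and removing it, so that mostly the foreground (a hand or other operation object) remains. The following is a minimal sketch of that idea, assuming a precomputed projector-to-camera homography and a simple gain/offset photometric model; the function name and the correction model are this sketch's assumptions, not something the patent prescribes.

```python
import cv2
import numpy as np

def eliminate_projected_image(projected, sensed, homography, gain=1.0, offset=0.0):
    """Subtract a prediction of the projected image from the camera image.

    projected   : BGR frame currently sent to the projection unit 102
    sensed      : BGR frame captured by the camera unit 103
    homography  : 3x3 projector-to-camera mapping (geometric correction)
    gain/offset : crude photometric correction, standing in for the
                  input/sensed image correction units 205/206
    """
    h, w = sensed.shape[:2]
    # Geometric correction: warp projector pixels into camera coordinates.
    warped = cv2.warpPerspective(projected, homography, (w, h))
    # Photometric correction driven by projector characteristic information
    # (e.g. colour temperature) would go here; a linear model is a placeholder.
    predicted = np.clip(warped.astype(np.float32) * gain + offset, 0, 255)
    # What remains after the subtraction is dominated by objects in front of
    # the projection surface (the "sensed image with the projected image
    # eliminated" handed to terminal 401).
    residual = cv2.absdiff(sensed.astype(np.float32), predicted)
    return residual.astype(np.uint8)
```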
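
The moving object region detection built from the difference circuit 402, memory 403, threshold value processing unit 404 and significant region determination unit 405 can be sketched as below, including the idea of treating the peripheral region A 901 more strictly than the central region B 902 to suppress false detections from camera shake. The function and parameter names, the border width, and the concrete threshold and area values are illustrative assumptions.

```python
import cv2
import numpy as np

def detect_moving_regions(curr, prev, thr_region_b=30, thr_region_a=60,
                          border=32, min_area=500):
    """Find 'significant regions' (moving objects) between two successive
    sensed images that already have the projected image eliminated."""
    diff = cv2.absdiff(curr, prev)                     # difference circuit 402
    if diff.ndim == 3:
        diff = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    h, w = diff.shape
    # Threshold map: stricter near the image periphery (region A 901), where
    # camera shake of a hand-held apparatus easily causes false detections.
    thr_map = np.full((h, w), thr_region_b, dtype=np.uint8)
    thr_map[:border, :] = thr_region_a
    thr_map[-border:, :] = thr_region_a
    thr_map[:, :border] = thr_region_a
    thr_map[:, -border:] = thr_region_a
    mask = (diff > thr_map).astype(np.uint8) * 255     # threshold unit 404
    # Significant region determination 405: keep only closed regions whose
    # area is large enough to matter.
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    regions = [tuple(stats[i][:4]) for i in range(1, n)
               if stats[i][cv2.CC_STAT_AREA] >= min_area]
    return mask, regions                               # regions as (x, y, w, h)
```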
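
The hot spot movement detection unit 107 compares the current hot spot position with the history kept in the hot spot position memory 108 and reports the type of movement (circular, linear, and so on) to the command decode unit 109. The sketch below classifies a short trajectory with simple geometric tests; the thresholds and the specific criteria are invented for illustration and are not taken from the patent.

```python
import numpy as np

def classify_hot_spot_motion(history, min_points=10):
    """Classify a recent trajectory of hot spot positions as 'still',
    'circular', 'linear' or 'other'.  history is a sequence of (x, y)."""
    pts = np.asarray(history, dtype=np.float64)
    if len(pts) < min_points:
        return "unknown"
    centered = pts - pts.mean(axis=0)
    radius = np.linalg.norm(centered, axis=1)
    if radius.max() < 5.0:                      # barely moved: standing still
        return "still"
    # Circular motion: roughly constant distance from the centroid and a
    # large swept angle around it.
    angles = np.unwrap(np.arctan2(centered[:, 1], centered[:, 0]))
    if radius.std() / radius.mean() < 0.25 and abs(angles[-1] - angles[0]) > 4.0:
        return "circular"
    # Linear motion: the points are well explained by one principal direction.
    _, s, _ = np.linalg.svd(centered, full_matrices=False)
    if s[1] / s[0] < 0.2:
        return "linear"
    return "other"
```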
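
The particular shape region detection unit 1014 of the second embodiment smooths the sensed image with the projected image eliminated (spatial filter 1102) and then looks for a region matching the shape feature quantity held in the particular shape memory 1015 (shape detection unit 1103). The sketch below uses contours and Hu-moment matching as the shape feature; that choice, the thresholds, and the names are assumptions of this sketch, and the OpenCV 4 findContours signature is assumed.

```python
import cv2

def detect_particular_shape(eliminated, template_contour,
                            blur_ksize=5, match_thresh=0.2, min_area=500):
    """Return the contour in `eliminated` that best matches the stored shape,
    or None if nothing matches well enough."""
    gray = eliminated if eliminated.ndim == 2 else cv2.cvtColor(
        eliminated, cv2.COLOR_BGR2GRAY)
    # Spatial filter 1102: predetermined smoothing against random noise.
    gray = cv2.GaussianBlur(gray, (blur_ksize, blur_ksize), 0)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best, best_score = None, match_thresh
    for c in contours:
        if cv2.contourArea(c) < min_area:
            continue
        # Shape detection 1103: compare against the shape feature quantity
        # from the particular shape memory 1015 (here, Hu-moment distance).
        score = cv2.matchShapes(c, template_contour, cv2.CONTOURS_MATCH_I1, 0.0)
        if score < best_score:
            best, best_score = c, score
    return best
```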
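
In the third embodiment, the detection control unit 1317 starts with moving object region detection, stores the shape feature of the first significant region it finds in the particular shape memory 1015, and from then on drives the particular shape region detection unit 1014 (the role of switch 1316), so the operation object keeps being tracked even when it stops moving. The class below is a minimal state-machine sketch with invented method names; the two detector callables stand in for the sketches above, e.g. DetectionController(detect_moving_regions, detect_particular_shape).

```python
import cv2

class DetectionController:
    """Toy stand-in for the detection control unit 1317 and switch 1316."""

    def __init__(self, detect_moving, detect_shape):
        self.detect_moving = detect_moving        # e.g. detect_moving_regions
        self.detect_shape = detect_shape          # e.g. detect_particular_shape
        self.particular_shape_memory = None       # plays the role of memory 1015

    def process(self, curr, prev):
        """Return region information to hand to the hot spot position
        detection unit 106 for the current frame."""
        if self.particular_shape_memory is None:
            # Operation object not yet found: use moving object detection.
            mask, regions = self.detect_moving(curr, prev)
            if regions:
                # Remember the shape feature of the detected significant
                # region; shape detection takes over on later frames.
                self.particular_shape_memory = self._largest_contour(mask)
            return regions
        # Operation object known: use particular shape region detection,
        # which works even while the hand is standing still.
        region = self.detect_shape(curr, self.particular_shape_memory)
        if region is None:
            self.particular_shape_memory = None   # lost it; fall back next frame
            return []
        return [region]

    @staticmethod
    def _largest_contour(mask):
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        return max(contours, key=cv2.contourArea) if contours else None
```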
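
Finally, the command decode unit 109 is described as combining two strategies: a particular movement of the hot spot by itself (for example, circular motion anywhere on the projected image) can be decoded as the command that displays the operation GUI, and once the GUI is displayed the hot spot position is interpreted against the known display position of the GUI. The command strings, the dwell-to-select rule, and the button representation below are all assumptions made for illustration.

```python
def decode_command(motion, hot_spot, gui_buttons, gui_visible):
    """Map a hot spot observation to a device control command, or None.

    motion      : label from classify_hot_spot_motion ('circular', 'still', ...)
    hot_spot    : current (x, y) hot spot position
    gui_buttons : {command_name: (x, y, width, height)} of displayed GUI items
    gui_visible : whether the operation GUI is currently overlaid
    """
    if not gui_visible:
        # A particular movement on its own acts as a command: drawing a circle
        # anywhere on the projected image brings up the operation GUI.
        return "SHOW_OPERATION_GUI" if motion == "circular" else None
    # With the GUI displayed, combine the hot spot position with the display
    # position of each GUI item; standing still on an item selects it.
    x, y = hot_spot
    for name, (bx, by, bw, bh) in gui_buttons.items():
        if bx <= x < bx + bw and by <= y < by + bh and motion == "still":
            return name          # e.g. "NEXT_IMAGE" switches the projected image
    return None
```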

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)
  • Projection Apparatus (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Position Input By Displaying (AREA)
  • Transforming Electric Information Into Light Information (AREA)
PCT/JP2008/065469 2007-09-04 2008-08-22 Image projection apparatus and control method for same Ceased WO2009031457A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN2008800184686A CN101681217B (zh) 2007-09-04 2008-08-22 图像投影设备及其控制方法
US12/447,920 US8118433B2 (en) 2007-09-04 2008-08-22 Image projection apparatus and control method for same
EP08792795.0A EP2191350B1 (en) 2007-09-04 2008-08-22 Image projection apparatus and control method for same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007229456A JP4829855B2 (ja) 2007-09-04 2007-09-04 画像投影装置及びその制御方法
JP2007-229456 2007-09-04

Publications (1)

Publication Number Publication Date
WO2009031457A1 true WO2009031457A1 (en) 2009-03-12

Family

ID=40428778

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2008/065469 Ceased WO2009031457A1 (en) 2007-09-04 2008-08-22 Image projection apparatus and control method for same

Country Status (6)

Country Link
US (1) US8118433B2 (en)
EP (1) EP2191350B1 (en)
JP (1) JP4829855B2 (en)
KR (1) KR20090087960A (en)
CN (1) CN101681217B (en)
WO (1) WO2009031457A1 (en)

Families Citing this family (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6330926B1 (en) 1999-09-15 2001-12-18 Hill-Rom Services, Inc. Stretcher having a motorized wheel
JP5194723B2 (ja) * 2007-11-05 2013-05-08 カシオ計算機株式会社 投影装置、投影方法及びプログラム
WO2009150786A1 (ja) * 2008-06-09 2009-12-17 株式会社村田製作所 弾性表面波装置及びその製造方法
JP5200800B2 (ja) * 2008-09-16 2013-06-05 富士ゼロックス株式会社 撮影装置及び撮影システム
US20100218100A1 (en) * 2009-02-25 2010-08-26 HNTB Holdings, Ltd. Presentation system
JP2011151764A (ja) * 2009-06-09 2011-08-04 Ricoh Co Ltd 描画画像共有装置
JP5412227B2 (ja) * 2009-10-05 2014-02-12 日立コンシューマエレクトロニクス株式会社 映像表示装置、および、その表示制御方法
TW201122706A (en) * 2009-12-31 2011-07-01 Hon Hai Prec Ind Co Ltd Front projection system and method
JP5560722B2 (ja) * 2010-01-12 2014-07-30 セイコーエプソン株式会社 画像処理装置、画像表示システム、および画像処理方法
JP5560721B2 (ja) * 2010-01-12 2014-07-30 セイコーエプソン株式会社 画像処理装置、画像表示システム、及び画像処理方法
JP5740822B2 (ja) * 2010-03-04 2015-07-01 ソニー株式会社 情報処理装置、情報処理方法およびプログラム
CN101907952A (zh) * 2010-03-25 2010-12-08 上海电子艺术发展有限公司 桌面互动点餐系统及其使用方法
CN102223508A (zh) * 2010-04-14 2011-10-19 鸿富锦精密工业(深圳)有限公司 前投影控制系统及方法
JP5656002B2 (ja) * 2010-05-12 2015-01-21 セイコーエプソン株式会社 プロジェクターおよび制御方法
CN102314259B (zh) * 2010-07-06 2015-01-28 株式会社理光 一种在显示区域内检测物体的方法和设备
JP5304848B2 (ja) * 2010-10-14 2013-10-02 株式会社ニコン プロジェクタ
JP5633320B2 (ja) * 2010-11-05 2014-12-03 株式会社リコー 描画画像共有装置
JP5834690B2 (ja) * 2011-09-22 2015-12-24 カシオ計算機株式会社 投影装置、投影制御方法及びプログラム
US8896688B2 (en) * 2011-11-04 2014-11-25 Hewlett-Packard Development Company, L.P. Determining position in a projection capture system
DE102012206851A1 (de) * 2012-04-25 2013-10-31 Robert Bosch Gmbh Verfahren und Vorrichtung zum Ermitteln einer im Lichtkegel eines projizierten Bildes ausgeführten Geste
JP6135239B2 (ja) * 2012-05-18 2017-05-31 株式会社リコー 画像処理装置、画像処理プログラム、画像処理方法
JP2014092715A (ja) * 2012-11-05 2014-05-19 Toshiba Corp 電子機器、情報処理方法及びプログラム
JP6106565B2 (ja) * 2013-09-25 2017-04-05 日立マクセル株式会社 映像投射装置
US20150102993A1 (en) * 2013-10-10 2015-04-16 Omnivision Technologies, Inc Projector-camera system with an interactive screen
JP2015094768A (ja) * 2013-11-08 2015-05-18 セイコーエプソン株式会社 表示装置、表示システムおよび制御方法
JP6330292B2 (ja) 2013-11-20 2018-05-30 セイコーエプソン株式会社 プロジェクター、及び、プロジェクターの制御方法
JP6343910B2 (ja) 2013-11-20 2018-06-20 セイコーエプソン株式会社 プロジェクター、及び、プロジェクターの制御方法
JP6307852B2 (ja) * 2013-11-26 2018-04-11 セイコーエプソン株式会社 画像表示装置、及び、画像表示装置の制御方法
CN104683720B (zh) * 2013-11-28 2019-12-03 联想(北京)有限公司 一种电子设备及控制方法
JP6127958B2 (ja) * 2013-12-19 2017-05-17 ソニー株式会社 情報処理装置、情報処理方法、並びにプログラム
JP6267520B2 (ja) * 2014-01-21 2018-01-24 キヤノン株式会社 画像処理装置およびその制御方法、画像処理システム
JP6425312B2 (ja) * 2014-04-22 2018-11-21 日本電信電話株式会社 動的錯覚呈示装置、その方法、プログラム
JP6292007B2 (ja) * 2014-04-28 2018-03-14 富士通株式会社 画像検索装置、画像検索方法及びプログラム
CN104052977B (zh) * 2014-06-12 2016-05-25 海信集团有限公司 一种交互式图像投影方法和装置
JP6385729B2 (ja) 2014-06-13 2018-09-05 株式会社東芝 画像処理装置および画像投影装置
JP6482196B2 (ja) * 2014-07-09 2019-03-13 キヤノン株式会社 画像処理装置、その制御方法、プログラム、及び記憶媒体
CN105991950A (zh) * 2015-03-06 2016-10-05 江苏宜清光电科技有限公司 一种可自定义显示宽高比的投影机
JP6032319B2 (ja) * 2015-04-17 2016-11-24 ソニー株式会社 情報処理装置、情報処理方法およびプログラム
CN105892663B (zh) * 2016-03-31 2021-02-19 联想(北京)有限公司 一种信息处理方法及电子设备
JP6816402B2 (ja) * 2016-08-12 2021-01-20 セイコーエプソン株式会社 表示装置および表示装置の制御方法
CN108540782B (zh) * 2017-03-03 2020-09-22 株式会社理光 获取差分图像的方法和装置
JP6275312B1 (ja) * 2017-06-02 2018-02-07 キヤノン株式会社 投写装置およびその制御方法、プログラム
JPWO2019176594A1 (ja) * 2018-03-16 2021-02-04 富士フイルム株式会社 投影制御装置、投影装置、投影制御方法、及び投影制御プログラム
JP7163947B2 (ja) * 2020-10-22 2022-11-01 セイコーエプソン株式会社 投写領域の設定支援方法、設定支援システム、及びプログラム
JP2023007083A (ja) 2021-07-01 2023-01-18 セイコーエプソン株式会社 画像表示システム

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0793089A (ja) * 1993-09-22 1995-04-07 Hitachi Ltd 画像編集装置
US5528263A (en) * 1994-06-15 1996-06-18 Daniel M. Platzker Interactive projected video image display system
JP3270643B2 (ja) * 1994-12-22 2002-04-02 キヤノン株式会社 指示位置検出方法及び装置
US6346933B1 (en) * 1999-09-21 2002-02-12 Seiko Epson Corporation Interactive display presentation system
GB2374663A (en) 2001-04-18 2002-10-23 Nokia Corp Presentation of images
US6802611B2 (en) * 2002-10-22 2004-10-12 International Business Machines Corporation System and method for presenting, capturing, and modifying images on a presentation board
US6979087B2 (en) * 2002-10-31 2005-12-27 Hewlett-Packard Development Company, L.P. Display system with interpretable pattern detection
JP2004199299A (ja) * 2002-12-18 2004-07-15 Casio Comput Co Ltd 手書き情報記録方法、投影記録装置
US6840627B2 (en) * 2003-01-21 2005-01-11 Hewlett-Packard Development Company, L.P. Interactive display device
CN1918532A (zh) * 2003-12-09 2007-02-21 雷阿卡特瑞克斯系统公司 自容交互式视频显示系统
JP4533641B2 (ja) 2004-02-20 2010-09-01 オリンパス株式会社 携帯型プロジェクタ
JP4345755B2 (ja) * 2006-02-16 2009-10-14 セイコーエプソン株式会社 入力位置設定方法、入力位置設定装置、入力位置設定プログラムおよび情報入力システム
US8356254B2 (en) * 2006-10-25 2013-01-15 International Business Machines Corporation System and method for interacting with a display
US7862179B2 (en) * 2007-11-07 2011-01-04 Omnivision Technologies, Inc. Dual-mode projection apparatus and method for locating a light spot in a projected image

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010028341A1 (en) 2000-02-08 2001-10-11 Takayuki Kitazawa System and method of pointed position detection, presentation system, and program
JP2005141151A (ja) * 2003-11-10 2005-06-02 Seiko Epson Corp プロジェクタおよびプロジェクタの機能設定方法
JP2005267034A (ja) * 2004-03-17 2005-09-29 Brother Ind Ltd 画像入力装置
WO2006104132A1 (ja) * 2005-03-28 2006-10-05 Matsushita Electric Industrial Co., Ltd. ユーザインタフェイスシステム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2191350A4 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102118594B (zh) * 2009-12-31 2014-01-22 鸿富锦精密工业(深圳)有限公司 前投影系统及方法
CN102193730A (zh) * 2010-03-10 2011-09-21 索尼公司 图像处理设备,图像处理方法和程序
CN102193730B (zh) * 2010-03-10 2016-07-06 索尼公司 图像处理设备,图像处理方法和程序
RU2598598C2 (ru) * 2010-10-04 2016-09-27 Сони Корпорейшн Устройство обработки информации, система обработки информации и способ обработки информации
US9860484B2 (en) 2010-10-04 2018-01-02 Saturn Licensing Llc Information processing apparatus, information processing system and information processing method

Also Published As

Publication number Publication date
CN101681217A (zh) 2010-03-24
CN101681217B (zh) 2012-06-27
EP2191350A1 (en) 2010-06-02
JP2009064110A (ja) 2009-03-26
EP2191350B1 (en) 2014-05-14
US20100157254A1 (en) 2010-06-24
US8118433B2 (en) 2012-02-21
JP4829855B2 (ja) 2011-12-07
EP2191350A4 (en) 2010-12-08
KR20090087960A (ko) 2009-08-18

Similar Documents

Publication Publication Date Title
US8118433B2 (en) Image projection apparatus and control method for same
JP4991458B2 (ja) 画像表示装置及びその制御方法
US8249305B2 (en) Information processing apparatus, information processing method, program, and recording medium
US11272093B2 (en) Image capture control apparatus, display control apparatus, and control method therefor to track a target and to determine an autofocus position
US10649313B2 (en) Electronic apparatus and method for controlling same
US11388331B2 (en) Image capture apparatus and control method thereof
JP4245185B2 (ja) 撮像装置
US20130021491A1 (en) Camera Device Systems and Methods
JP7467114B2 (ja) 撮像装置およびその制御方法
JP6757268B2 (ja) 撮像装置及びその制御方法
KR20090031725A (ko) 디지털 이미지들 획득 관련 방법
JP2003283964A (ja) 映像表示装置
CN101562703A (zh) 用于在成像设备内执行基于触摸的调整的方法和装置
US20170104922A1 (en) Electronic apparatus and control method thereof
US20230042807A1 (en) Electronic device
US10771678B2 (en) Autofocus control apparatus and method for selecting a target of a detected object
US10958825B2 (en) Electronic apparatus and method for controlling the same
US10652442B2 (en) Image capturing apparatus with operation members provided on different sides, control method of the same, and storage medium
JP2010034820A (ja) プロジェクタ、プロジェクタの制御方法、及び、制御プログラム
JP5914824B2 (ja) 撮像装置
US11831976B2 (en) Display apparatus
JP7423325B2 (ja) 表示制御装置およびその制御方法
US20190007622A1 (en) Display control apparatus, control method, and program
CN110418122A (zh) 一种触控投影仪的控制方法、装置、设备及存储介质
CN116916144A (zh) 拍摄控制方法及视频控制方法

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200880018468.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 08792795

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 12447920

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 1020097014401

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 2008792795

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE