WO2010110391A1 - Portable Electronic Device - Google Patents
Portable Electronic Device
- Publication number
- WO2010110391A1 (application PCT/JP2010/055278)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- unit
- electronic device
- portable electronic
- control unit
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B29/00—Combinations of cameras, projectors or photographic printing apparatus with non-photographic non-optical apparatus, e.g. clocks or weapons; Cameras having the shape of other objects
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/142—Adjusting of projection optics
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/147—Optical correction of image distortions, e.g. keystone
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M1/00—Substation equipment, e.g. for use by subscribers
- H04M1/02—Constructional features of telephone sets
- H04M1/0202—Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
- H04M1/026—Details of the structure or mounting of specific components
- H04M1/0272—Details of the structure or mounting of specific components for a projector or beamer module assembly
Definitions
- The present invention relates to a portable electronic device.
- Conventional projectors are mainly so-called stationary devices that are supplied with power from a commercial power source and used in a fixed state. Such a stationary projector projects an image onto a fixed wall or screen.
- Patent Document 1 describes a projector-equipped mobile terminal that includes an upper cabinet, a lower cabinet, a hinge portion that rotatably connects the two cabinets, and a projector having a lens and a light source.
- A portable projector such as the one described in Patent Document 1 can be carried, and has the advantage that the position at which an image is projected can easily be adjusted by hand.
- A portable projector also has the advantage of being able to project an image at an arbitrary position. However, because the positional relationship between the projector and the irradiation surface onto which the image is projected is arbitrary, the size of the image projected on the irradiation surface changes depending on the conditions during use.
- The device preferably further includes an imaging unit that captures an image of the irradiation surface, and an image analysis unit that analyzes the captured image and extracts the target object; the control unit preferably determines the arrangement of the dividing lines based on the configuration of the target object detected from the analysis result.
- The control unit preferably adjusts the size of the image, based on the detected size of the irradiation surface and the size of the target object, so that the target object is projected at life size on the irradiation surface.
- The distance measuring sensor preferably includes a light emitting element and a light receiving element, and detects the distance between the light emission position and the irradiation surface by receiving, with the light receiving element, light emitted from the light emitting element and reflected by the irradiation surface.
- The portable electronic device detects the distance to the irradiation surface irradiated with light by means of the distance detection unit and adjusts the projected image based on the detected distance, and therefore has the effect of being able to project a more convenient and useful image.
- FIG. 1 is a perspective view illustrating a schematic configuration of an embodiment of a portable electronic device.
- FIG. 2 is a block diagram showing a schematic configuration of functions of the portable electronic device shown in FIG.
- FIG. 3 is an explanatory diagram showing a state in which an image is displayed on the portable electronic device shown in FIG.
- FIG. 4 is a flowchart illustrating an example of the operation of the mobile electronic device.
- FIG. 5 is a flowchart showing another example of the operation of the portable electronic device.
- FIG. 6 is a flowchart illustrating another example of the operation of the portable electronic device.
- FIG. 7 is an explanatory diagram illustrating an example of an image projected by the mobile electronic device.
- FIG. 8 is a flowchart illustrating another example of the operation of the portable electronic device.
- The operation unit 28 includes, for example, the operation keys 13, to which various functions such as a power key, a call key, numeric keys, character keys, direction keys, and a determination key are assigned, and the dedicated key 14.
- When a key is operated by the user, a signal corresponding to the operation is generated.
- The generated signal is input to the control unit 22 as a user instruction.
- The distance measuring sensor 40 is a measuring instrument that measures the distance to the surface toward which the projector 34 emits light, that is, the surface that the image light emitted from the projector 34 reaches and on which the image is displayed (hereinafter, the "irradiation surface").
- The distance measuring sensor 40 includes a transmission unit 40a, disposed on the upper surface of the housing 11, that transmits a measurement wave such as infrared light, an ultrasonic wave, or a laser beam, and a reception unit 40b, likewise disposed on the upper surface of the housing 11, that receives the measurement wave; the measurement wave emitted from the transmission unit 40a and reflected by the target object is received by the reception unit 40b.
- The distance measuring sensor 40 calculates the distance to the irradiation surface based on, for example, the intensity of the measurement wave received by the reception unit 40b, its incident angle, and the time from transmission by the transmission unit 40a to reception by the reception unit 40b.
- The portable electronic device 10 is basically configured as described above.
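The time-based distance estimate described for the distance measuring sensor 40 can be sketched as follows. This is an illustrative sketch only: the function name, the fixed wave speed, and the example delay are assumptions for a light-based measurement wave, not values taken from the patent.

```python
# Hypothetical sketch of the transmit-to-receive timing idea used by the
# distance measuring sensor 40: the wave travels to the irradiation
# surface and back, so the one-way distance is half of speed * time.

SPEED_OF_LIGHT = 299_792_458.0  # m/s, for an infrared or laser measurement wave


def distance_from_round_trip(elapsed_s: float,
                             wave_speed: float = SPEED_OF_LIGHT) -> float:
    """Distance to the irradiation surface from the round-trip delay."""
    if elapsed_s < 0:
        raise ValueError("elapsed time must be non-negative")
    return wave_speed * elapsed_s / 2.0


# Example: an echo received 10 ns after transmission corresponds to a
# surface roughly 1.5 m away.
d = distance_from_round_trip(10e-9)
```

For an ultrasonic measurement wave the same function would be called with the speed of sound instead of the speed of light.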
- After calculating the size of the irradiation surface in step S16, the control unit 22 determines in step S18 whether image ≤ irradiation surface. That is, the control unit 22 compares the size that the projected image would occupy on the irradiation surface with the size of the irradiation surface, and determines whether the image fits within the irradiation surface.
- When the control unit 22 determines in step S18 that image ≤ irradiation surface (Yes), that is, that the image size is equal to or smaller than the irradiation surface size, the control unit 22 proceeds to step S20.
- In step S20, the entire image is projected by the projector 34. Specifically, based on the size of the image and the size of the irradiation surface, the image to be projected is enlarged or reduced so that the image projected on the irradiation surface appears at actual size, and is then projected by the projector 34.
- When the size of the image equals the size of the irradiation surface, the image is projected as is, without enlargement or reduction.
- After projecting the image, the control unit 22 ends the process.
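The size comparison and scaling of steps S16 through S20 can be sketched as below. The throw angle, the function names, and the simple pinhole-style geometry are illustrative assumptions; the patent does not specify how the irradiation surface size is derived from the distance.

```python
import math

# Illustrative sketch of steps S16-S20: estimate the irradiation-area
# width from the measured distance, check whether the actual-size image
# fits (image <= irradiation surface), and compute the scaling needed so
# the projected subject appears at its stored actual size.


def projection_width(distance_m: float, throw_angle_deg: float) -> float:
    """Step S16 analogue: width of the projection area at this distance."""
    return 2.0 * distance_m * math.tan(math.radians(throw_angle_deg) / 2.0)


def fits_at_actual_size(image_width_m: float, distance_m: float,
                        throw_angle_deg: float = 30.0) -> bool:
    """Step S18 analogue: does the actual-size image fit the surface?"""
    return image_width_m <= projection_width(distance_m, throw_angle_deg)


def zoom_for_actual_size(image_width_m: float, distance_m: float,
                         throw_angle_deg: float = 30.0) -> float:
    """Step S20 analogue: fraction of the projection width the image must
    occupy so the subject appears at actual size on the surface."""
    return image_width_m / projection_width(distance_m, throw_angle_deg)
```

Moving the device farther away increases `projection_width`, which is why step S22 below asks the user to back away when the image does not fit.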
- In step S22, the control unit 22 displays an instruction to move the portable electronic device 10. Specifically, because the distance to the irradiation surface must be increased in order to project the entire image at actual size, a message indicating that the portable electronic device 10 needs to be moved away from the irradiation surface is shown on the display 12 of the display unit 32.
- When the control unit 22 determines in step S24 that the terminal has not moved (No), it projects a part of the image in step S26. Specifically, the control unit 22 calculates the size of an image that can be projected on the irradiation surface based on the size of the irradiation surface, selects the portion to be projected from the image based on the calculated size (that is, cuts out a certain region), and projects the selected portion at actual size. The method of selecting the portion to be projected is not particularly limited; the operator may select it, or a region of the irradiation-surface size centered on the center of the image may be selected automatically. When the control unit 22 has projected the part of the image in step S26, the process ends.
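The automatic center cut-out mentioned for step S26 can be sketched as a simple crop. The list-of-rows image model and the function name are assumptions for illustration; a real implementation would operate on the device's image buffer.

```python
# Sketch of step S26: when the full image cannot fit at actual size,
# cut out a region of the irradiation-surface size centered on the image.
# The image is modeled as a list of pixel rows purely for illustration.


def crop_center(image, crop_w: int, crop_h: int):
    """Return the centered crop_w x crop_h region of `image`."""
    h, w = len(image), len(image[0])
    crop_w, crop_h = min(crop_w, w), min(crop_h, h)  # clamp to image bounds
    top = (h - crop_h) // 2
    left = (w - crop_w) // 2
    return [row[left:left + crop_w] for row in image[top:top + crop_h]]


# Example: crop the 2x2 center out of a 4x4 image of pixel indices.
img = [[r * 4 + c for c in range(4)] for r in range(4)]
center = crop_center(img, 2, 2)
```

The cropped region is then projected at actual size exactly as the full image would have been in step S20.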
- The portable electronic device 10 detects the distance to the irradiation surface with the distance measuring sensor 40 and, based on the detection result, can project the image, that is, the subject included in the image, onto the irradiation surface at actual size (the size stored in the image data). The portable electronic device 10 can thus project the subject at a constant size regardless of the distance to the irradiation surface. The operator can therefore accurately recognize the size of the subject and, even for a subject (object) whose actual size is difficult to imagine because of the display limits of the display 12, can easily estimate the actual size.
- For example, when the subject is clothes, the actual size of the clothes can easily be grasped by projecting the image at actual size.
- When the subject is furniture or the like, projecting the image at the position where the furniture is to be placed makes it easy to judge whether the furniture fits in the space and how the layout will look.
- The scale image may be displayed on the entire irradiation surface or on only a part of it.
- The position at which the scale is displayed within the screen is not particularly limited; the scale may be displayed only in the horizontal direction, only in the vertical direction, in both directions, or in an oblique direction, or it may be displayed as a grid like graph paper.
- The method of creating the scale to be displayed is also not limited. For example, a rule image of the maximum possible length may be stored, the region to be used may be determined within the rule image based on the length of the irradiation area, and the rule image of the determined region may be projected as the image.
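The stored-rule-image idea above can be sketched as follows. Representing the rule image as a list of one-centimeter tick labels, and the 5 m maximum, are assumptions made purely to illustrate the "store the longest ruler, use only the needed part" approach.

```python
# Sketch of the scale-image idea: store one rule image covering the
# maximum possible length, then select only the region that matches the
# measured length of the irradiation area. The tick-list representation
# and the maximum length are illustrative assumptions.

MAX_LENGTH_CM = 500
RULE_IMAGE = [f"{cm} cm" for cm in range(MAX_LENGTH_CM + 1)]  # pre-rendered ticks


def scale_for_surface(irradiation_length_cm: float):
    """Pick the portion of the stored rule image spanning the surface."""
    usable = min(int(irradiation_length_cm), MAX_LENGTH_CM)
    return RULE_IMAGE[:usable + 1]


ticks = scale_for_surface(120.0)  # a 1.2 m surface uses the 0-120 cm region
```

Because the projected region is shown at actual size, each selected tick lands one real centimeter apart on the irradiation surface.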
- FIG. 8 is a flowchart illustrating another example of the operation of the mobile electronic device.
- FIG. 9 is an explanatory diagram illustrating an irradiation surface onto which an image is projected by the mobile electronic device. The flowchart shown in FIG. 8 is an example of an operation of detecting an obstacle from the shape of an object in an image (an object placed on the irradiation surface) and projecting dividing lines so as to avoid the obstacle.
- In step S100, the control unit 22 of the portable electronic device 10 detects the division number n of the circle input by the operator.
- In step S102, the control unit 22 displays the circular image to be projected on the display unit 32. That is, the control unit 22 creates an image in which a circle is divided based on the division number input in step S100, and causes the display unit 32 to display the created image.
- The image created in step S102 is displayed together with the captured image and can be moved relative to it by the user's operation; the target can be specified by overlapping the created circular image with the position of the target in the captured image.
- The control unit 22 analyzes the image of the object and identifies the positions of obstacles on the surface of the object (that is, regions where the placement of dividing lines must be avoided). For example, when the object is a cake, the obstacles are fruit (strawberries), chocolate plates, decorations (craftwork), and the like placed on the surface of the cake.
- The user can set the criteria for determining what is treated as an obstacle. For example, strawberries may be excluded and only chocolate plates treated as obstacles, or both strawberries and chocolate plates may be treated as obstacles.
- In step S106, the control unit 22 calculates the angle of one section and sets the control value m. Here, n is the division number detected in step S100.
- The control value m is a value used when determining the positions of the dividing lines; the number of dividing lines n is set as its initial value.
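The equal-division and obstacle-avoidance idea of steps S100 through S106 can be sketched as below. The brute-force rotation search, the angular-zone model of obstacles, and all names are illustrative assumptions; the patent's control value m and exact placement procedure are not reproduced here.

```python
# Sketch of steps S100-S106: divide a circle into n equal sectors
# (one sector spans 360/n degrees) and rotate the whole set of dividing
# lines until none falls in a forbidden angular zone, e.g. a strawberry
# or chocolate plate on a cake. The rotation search is an assumption.


def dividing_angles(n: int, offset_deg: float = 0.0):
    """Angles (degrees) of the n dividing lines for n equal sectors."""
    sector = 360.0 / n
    return [(offset_deg + i * sector) % 360.0 for i in range(n)]


def avoid_obstacles(n: int, forbidden, step_deg: float = 1.0):
    """Find an offset whose dividing lines all miss the forbidden
    (start_deg, end_deg) zones; returns None if no offset works."""
    def blocked(angle):
        return any(lo <= angle <= hi for lo, hi in forbidden)

    offset = 0.0
    while offset < 360.0 / n:  # rotating by a full sector repeats the pattern
        if not any(blocked(a) for a in dividing_angles(n, offset)):
            return offset
        offset += step_deg
    return None


# Example: 6 equal sectors with an obstacle between 55 and 65 degrees.
offset = avoid_obstacles(6, [(55.0, 65.0)])
lines = dividing_angles(6, offset)
```

With n = 6 and the single zone above, the first line set (0°, 60°, ...) is blocked, and the search settles on a rotation that clears the obstacle while keeping all sectors equal.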
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Optics & Photonics (AREA)
- Projection Apparatus (AREA)
- Telephone Function (AREA)
- Studio Devices (AREA)
- Controls And Circuits For Display Device (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
- Transforming Electric Information Into Light Information (AREA)
Abstract
Description
11 Housing
12 Display
13 Operation keys
14 Dedicated key
15 Microphone
16 Receiver
22 Control unit
24 Storage unit
26 Transmission/reception unit
26a Antenna
28 Operation unit
30 Audio processing unit
32 Display unit
34 Projector
34a Light emission unit
36 Acceleration sensor
38 Camera
38a Imaging unit
40 Distance measuring sensor
40a Transmission unit
40b Reception unit
Claims (17)
- 1. A portable electronic device comprising: an operation unit; an image projection unit that projects an image toward a target object; and a control unit that controls the operation of each unit, wherein, when information on a number of divisions of the target object is input from the operation unit, the control unit causes the image projection unit to project an image for dividing the target object into the number of divisions.
- 2. The portable electronic device according to claim 1, further comprising a distance detection unit that detects the distance from the image projection unit to an irradiation surface irradiated with the light emitted by the image projection unit, wherein the control unit performs image processing of the image projected by the image projection unit and controls the operation of the image projection unit, and, when size information of the image to be projected and the division-number information are input from the operation unit, generates the image for dividing, detects the size of the irradiation surface from the distance detected by the distance detection unit, determines the size of the image to be projected from the image projection unit based on the detected size of the irradiation surface, and causes the image divided into the number of divisions to be projected on the irradiation surface.
- 3. The portable electronic device according to claim 2, further comprising an imaging unit that captures an image of the irradiation surface and an image analysis unit that analyzes the image captured by the imaging unit and extracts the target object, wherein the control unit determines the arrangement of dividing lines based on the configuration of the target object detected based on the detection result of the image analysis unit.
- 4. The portable electronic device according to claim 3, wherein the control unit determines the shape of the dividing lines based on the configuration of the target object.
- 5. The portable electronic device according to claim 3 or 4, wherein the control unit determines, from the configuration of the target object, a zone in which placement of the dividing lines is prohibited, and places the dividing lines at positions that do not overlap the prohibited zone.
- 6. The portable electronic device according to any one of claims 2 to 5, wherein the control unit calculates the size of the target object extracted by the image analysis unit based on the distance between the irradiation surface and the light emission position detected by the distance detection unit, and uses the calculated size of the target object as the size information of the image to be projected.
- 7. The portable electronic device according to any one of claims 1 to 5, further comprising an angle detection unit that detects the angle formed between the light irradiation direction of the image projection unit and the irradiation surface, wherein the control unit corrects, based on the angle detection unit, the image to be projected or the image projection region of the image projection unit so that the image projected on the target object has the same shape as the image data.
- 8. The portable electronic device according to claim 1, wherein the control unit generates the image on the assumption that the light irradiation direction of the image projection unit is orthogonal to the irradiation surface.
- 9. The portable electronic device according to claim 2, wherein the control unit adjusts the size of the image, based on the detected size of the irradiation surface and the size of the target object, so that the target object is projected at life size on the irradiation surface.
- 10. The portable electronic device according to claim 9, wherein the image for dividing includes a ruler, a protractor, or scale marks thereof.
- 11. The portable electronic device according to claim 2, wherein the distance detection unit is a distance measuring sensor.
- 12. The portable electronic device according to claim 11, wherein the distance measuring sensor includes a light emitting element and a light receiving element, and detects the distance between the light emission position and the irradiation surface by receiving, with the light receiving element, light emitted from the light emitting element and reflected by the irradiation surface.
- 13. The portable electronic device according to claim 11, wherein the distance measuring sensor includes a light receiving element, and detects the distance between the light emission position and the irradiation surface by receiving, with the light receiving element, light emitted from the image projection unit and reflected by the irradiation surface.
- 14. The portable electronic device according to claim 12 or 13, wherein the distance measuring sensor detects the distance between the light emission position and the irradiation surface by receiving infrared light.
- 15. The portable electronic device according to claim 2, wherein the distance detection unit detects the distance using an autofocus function of an imaging mechanism.
- 16. A portable electronic device comprising: an operation unit; a display unit; an imaging unit; and a control unit that controls each unit, wherein, when information on a number of divisions for dividing a target object imaged by the imaging unit is input from the operation unit, the control unit controls the display unit so that an image for dividing the target object into the number of divisions is displayed superimposed on the image of the target object.
- 17. The portable electronic device according to claim 16, further comprising an image analysis unit that analyzes the image captured by the imaging unit and extracts the target object, wherein, when the division-number information is input to the operation unit, the control unit determines the positions of dividing lines, generates an image divided into the number of divisions, and causes the display unit to display the image divided into the number of divisions superimposed on the image of the target object detected based on the detection result of the image analysis unit.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/260,536 US20120019441A1 (en) | 2009-03-26 | 2010-03-25 | Mobile electronic device |
JP2011506124A JP5232911B2 (ja) | 2009-03-26 | 2010-03-25 | Portable electronic device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-077784 | 2009-03-26 | ||
JP2009077784 | 2009-03-26 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010110391A1 true WO2010110391A1 (ja) | 2010-09-30 |
Family
ID=42781081
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/055278 WO2010110391A1 (ja) | Portable electronic device |
Country Status (3)
Country | Link |
---|---|
US (1) | US20120019441A1 (ja) |
JP (2) | JP5232911B2 (ja) |
WO (1) | WO2010110391A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2434771A1 (en) * | 2010-09-27 | 2012-03-28 | Sony Corporation | Projection device, projection control method and program |
JP2012138673A (ja) * | 2010-12-24 | 2012-07-19 | Kyocera Corp | Portable electronic device |
JP2015158891A (ja) * | 2014-01-21 | 2015-09-03 | セイコーエプソン株式会社 | Position detection device and adjustment method |
JP2018501930A (ja) * | 2014-12-12 | 2018-01-25 | Antonio MELE | Battery-powered laser device that indicates cutting paths for obtaining equal portions of a cake, pizza, or similar food |
JP2018181139A (ja) * | 2017-04-19 | 2018-11-15 | 東芝情報システム株式会社 | Cutting assistance device and program for a cutting assistance device |
JP2019213168A (ja) * | 2018-06-08 | 2019-12-12 | パナソニックIpマネジメント株式会社 | Projection device |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20150041453A (ko) * | 2013-10-08 | 2015-04-16 | 엘지전자 주식회사 | Glasses-type image display device and control method thereof |
JP6304618B2 (ja) * | 2013-11-05 | 2018-04-04 | パナソニックIpマネジメント株式会社 | Lighting device |
US9710160B2 (en) * | 2014-10-21 | 2017-07-18 | International Business Machines Corporation | Boundless projected interactive virtual desktop |
CN104754265A (zh) * | 2015-03-16 | 2015-07-01 | 联想(北京)有限公司 | Data processing method and electronic device |
CN106990650B (zh) * | 2016-01-20 | 2020-05-22 | 中兴通讯股份有限公司 | Method and device for adjusting a projected picture |
US10499026B1 (en) * | 2016-06-27 | 2019-12-03 | Amazon Technologies, Inc. | Automation correction of projection distortion |
CN116863107A (zh) * | 2017-03-06 | 2023-10-10 | 连株式会社 | Augmented reality providing method and apparatus, and non-transitory computer-readable medium |
US10576893B1 (en) * | 2018-10-08 | 2020-03-03 | Ford Global Technologies, Llc | Vehicle light assembly |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005275327A (ja) * | 2003-09-19 | 2005-10-06 | Fuji Electric Systems Co Ltd | Projection display device |
JP2007096542A (ja) * | 2005-09-27 | 2007-04-12 | Sharp Corp | Portable terminal with projector function |
JP2007205915A (ja) * | 2006-02-02 | 2007-08-16 | Seiko Epson Corp | Projection device, program, and information storage medium |
JP2008033049A (ja) * | 2006-07-28 | 2008-02-14 | Ricoh Co Ltd | Object indicating device |
JP2010112875A (ja) * | 2008-11-07 | 2010-05-20 | Sharp Corp | Projection device, projection device control method, and projection device control program |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7075556B1 (en) * | 1999-10-21 | 2006-07-11 | Sportvision, Inc. | Telestrator system |
US7978184B2 (en) * | 2002-11-08 | 2011-07-12 | American Greetings Corporation | Interactive window display |
JP2005064985A (ja) * | 2003-08-15 | 2005-03-10 | Fujinon Corp | Document presentation device |
JP2006311063A (ja) * | 2005-04-27 | 2006-11-09 | Fujinon Corp | Document presentation device and operation method thereof |
US7984995B2 (en) * | 2006-05-24 | 2011-07-26 | Smart Technologies Ulc | Method and apparatus for inhibiting a subject's eyes from being exposed to projected light |
US8103150B2 (en) * | 2007-06-07 | 2012-01-24 | Cyberlink Corp. | System and method for video editing based on semantic data |
US8290208B2 (en) * | 2009-01-12 | 2012-10-16 | Eastman Kodak Company | Enhanced safety during laser projection |
-
2010
- 2010-03-25 JP JP2011506124A patent/JP5232911B2/ja not_active Expired - Fee Related
- 2010-03-25 US US13/260,536 patent/US20120019441A1/en not_active Abandoned
- 2010-03-25 WO PCT/JP2010/055278 patent/WO2010110391A1/ja active Application Filing
-
2013
- 2013-02-04 JP JP2013019461A patent/JP5563678B2/ja not_active Expired - Fee Related
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2434771A1 (en) * | 2010-09-27 | 2012-03-28 | Sony Corporation | Projection device, projection control method and program |
US10205993B2 (en) | 2010-09-27 | 2019-02-12 | Sony Corporation | Controlling projection of a screen |
JP2012138673A (ja) * | 2010-12-24 | 2012-07-19 | Kyocera Corp | Portable electronic device |
JP2015158891A (ja) * | 2014-01-21 | 2015-09-03 | セイコーエプソン株式会社 | Position detection device and adjustment method |
JP2018501930A (ja) * | 2014-12-12 | 2018-01-25 | Antonio MELE | Battery-powered laser device that indicates cutting paths for obtaining equal portions of a cake, pizza, or similar food |
JP2018181139A (ja) * | 2017-04-19 | 2018-11-15 | 東芝情報システム株式会社 | Cutting assistance device and program for a cutting assistance device |
JP2019213168A (ja) * | 2018-06-08 | 2019-12-12 | パナソニックIpマネジメント株式会社 | Projection device |
Also Published As
Publication number | Publication date |
---|---|
US20120019441A1 (en) | 2012-01-26 |
JPWO2010110391A1 (ja) | 2012-10-04 |
JP5563678B2 (ja) | 2014-07-30 |
JP2013102536A (ja) | 2013-05-23 |
JP5232911B2 (ja) | 2013-07-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5563678B2 (ja) | Portable electronic device | |
JP5259010B2 (ja) | Portable electronic device and projection system | |
JP5420365B2 (ja) | Projection device | |
JP4867041B2 (ja) | Projection device, projection device control method, and projection device control program | |
JP5409263B2 (ja) | Portable electronic device and mobile phone | |
US20160103209A1 (en) | Imaging device and three-dimensional-measurement device | |
US20120069308A1 (en) | Mobile electronic device | |
US9097966B2 (en) | Mobile electronic device for projecting an image | |
WO2012026406A1 (ja) | Portable electronic device and method of using a portable electronic device | |
EP3370403B1 (en) | Reading device and mobile terminal | |
JP5615651B2 (ja) | Electronic device and projection system | |
JP2011106931A (ja) | Three-dimensional shape measurement system and mobile phone | |
US20160343306A1 (en) | Image display device and image display method | |
JP2006323762A (ja) | Information processing device, imaging method, and program | |
JP5595834B2 (ja) | Portable electronic device and method of using a portable electronic device | |
JP4815881B2 (ja) | Projection device, distance measurement method using a phase difference sensor, and program | |
KR101948674B1 (ko) | Portable scanner and scanning method thereof | |
US20220397481A1 (en) | Data processing apparatus, data processing method, and data processing program | |
JP5650468B2 (ja) | Portable electronic device and method of using a portable electronic device | |
JP5646918B2 (ja) | Portable electronic device and method of using a portable electronic device | |
JP2021165895A (ja) | Optical reading system | |
JP2013041012A (ja) | Imaging device and imaging method | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10756187 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2011506124 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13260536 Country of ref document: US |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 10756187 Country of ref document: EP Kind code of ref document: A1 |