WO2007141860A1 - Guidance device and method - Google Patents
Guidance device and method
- Publication number
- WO2007141860A1 PCT/JP2006/311531
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual image
- display
- guidance
- posture
- shape
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims description 28
- 238000005259 measurement Methods 0.000 claims abstract description 5
- 230000036544 posture Effects 0.000 claims description 48
- 238000012937 correction Methods 0.000 claims description 25
- 238000001514 detection method Methods 0.000 claims description 11
- 238000010586 diagram Methods 0.000 description 30
- 238000012545 processing Methods 0.000 description 14
- 238000003384 imaging method Methods 0.000 description 7
- 239000011521 glass Substances 0.000 description 5
- 210000003811 finger Anatomy 0.000 description 4
- 230000010287 polarization Effects 0.000 description 4
- 238000007689 inspection Methods 0.000 description 3
- 238000005516 engineering process Methods 0.000 description 2
- 230000003213 activating effect Effects 0.000 description 1
- 238000013459 approach Methods 0.000 description 1
- 210000004204 blood vessel Anatomy 0.000 description 1
- 239000003086 colorant Substances 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 238000003825 pressing Methods 0.000 description 1
- 238000012913 prioritisation Methods 0.000 description 1
- 238000001028 reflection method Methods 0.000 description 1
- 210000003813 thumb Anatomy 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/107—Static hand or arm
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/60—Static or dynamic means for assisting the user to position a body part for biometric acquisition
- G06V40/67—Static or dynamic means for assisting the user to position a body part for biometric acquisition by interactive indications to the user
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/14—Vascular patterns
Definitions
- the present invention relates to an apparatus such as an inspection apparatus or an authentication apparatus that images a subject without contact.
- the present invention relates to a guidance device and method for guiding a subject to an appropriate position / posture with respect to an imaging device.
- Patent Document 1 discloses a technique in which, in a recognition device that performs non-contact blood vessel imaging, a handprint serving as a model for placing the subject is pasted on the surface of the imaging device, the position of the hand is detected, and the user is notified by voice, bell, or lamp so as to bring the hand closer.
- Patent Document 2 discloses a technique of displaying a stereoscopic image of a finger with a stereoscope, detecting the overlapping state of the stereoscopic display and the finger, and capturing an image when they match.
- Patent Document 1 International Publication No. WO 2004/021884 pamphlet
- Patent Document 2 JP-A-5-108808
- an object of the present invention is to provide an efficient guidance device and method that can be easily positioned.
- the guidance device of the present invention is a device that performs measurement while an object is held in a non-contact manner, and comprises virtual image display means for displaying a virtual image at the position where the object is finally to be arranged;
- example display generation means for displaying, on the virtual image display means, a virtual image serving as an example based on the shape of the object;
- form detection means for detecting the position and posture of the object; and
- guidance display generation means for calculating, as a correction amount, the difference between the position/posture of the object and the position/posture of the virtual image, and causing the virtual image display means to display a guidance display based on that correction amount.
- the virtual image has the same size and shape as the object.
- the size and shape of the virtual image are determined by data recorded on a storage medium.
- the guidance display is an arrow display indicating a movement of a position from the current state of the object to the virtual image, a rotation of a posture, or a deformation of the shape.
- the guidance device of the present invention it is desirable that the guidance is an animation display of a movement of a position from the current state of the object to the virtual image, a rotation of a posture, or a deformation of the shape.
- the guidance device of the present invention separately displays the movement of the position, the rotation of the posture, or the deformation of the shape.
- the guidance device of the present invention displays auxiliary lines indicating the position and attitude of each of the object and the virtual image.
- auxiliary surfaces indicating their respective positions at the periphery of the object and the virtual image.
- FIG. 1 is a diagram showing a configuration of a guidance device according to an embodiment of the present invention.
- FIG. 2 is a diagram showing a processing flow of a processing unit of the guidance device according to the embodiment of the present invention.
- FIG. 3 is a diagram showing a configuration of a display device of the guidance device according to the embodiment of the present invention.
- FIG. 4 is a diagram showing the principle of virtual image display by a concave mirror.
- FIG. 5 is a diagram showing a display device using a stereoscope.
- FIG. 6 is a diagram showing a display device using polarized glasses.
- FIG. 7 is a diagram for explaining the degree of freedom of a subject.
- FIG. 8 is a diagram for explaining posture change of a subject.
- FIG. 9 is a diagram showing a change in the shape of a subject.
- FIG. 10 is a diagram showing a guidance display according to an embodiment of the present invention when the position is shifted.
- FIG. 11 is a diagram showing a guidance display according to an embodiment of the present invention when the posture is deviated.
- FIG. 12 is a diagram showing a guidance display according to an embodiment of the present invention when there is a deformation.
- FIG. 13 is a flowchart of guidance processing of the guidance device according to the embodiment of the present invention.
- FIG. 14 is a flowchart when a modified animation is used for guidance display.
- FIG. 15 is a diagram for explaining a detection method according to an embodiment of the present invention when a model is hidden.
- FIG. 16 is a diagram showing auxiliary display using auxiliary lines according to the embodiment of the present invention.
- FIG. 17 is a diagram showing auxiliary display by the auxiliary surface according to the embodiment of the present invention.
- FIG. 1 is a diagram showing a configuration of a guidance device according to an embodiment of the present invention.
- the guidance device of the present invention comprises a display device 101 for displaying a virtual image 106 serving as a model and a guidance display 107 indicating the amount to be corrected, a position/posture/shape detection device (form detection device) 102 for detecting the form of the subject 108 such as its position, posture, and shape, a storage medium 103 storing the shape and size of the subject 108, a processing unit 104 for processing each signal, and a photographing camera 105 for photographing the subject 108.
- the storage medium 103 is a portable storage medium such as an IC card or a magnetic card, and records the size and shape of a subject such as the user's hand. Since the size and shape of a subject such as a hand vary from user to user, the size and shape data recorded in the storage medium 103 should indicate the same shape and size as the user's own subject. In use, the user sets the storage medium 103 in the reading unit of the guidance device. Alternatively, the storage medium 103 may be a storage device built into the guidance device or a storage device connected via a network.
- the guidance device of the present invention displays the virtual image 106 of the model so that it can be seen in front of the photographic camera based on the shape and size of the subject 108 and the appropriate position with respect to the photographic camera 105.
- when the user brings the subject 108 closer, the subject 108 is photographed by the photographing camera 105, and the position, posture, and shape of the subject 108 are detected by the position/posture/shape detection device 102.
- the position, posture, and shape of the subject 108 are compared with the model information recorded in the storage medium 103, the amount to be corrected is calculated by the processing unit 104, and a display indicating that amount is shown near the model virtual image 106. The user moves the subject to the appropriate position while simultaneously observing the model virtual image 106, the guidance display 107, and the subject 108.
- FIG. 2 is a diagram showing a processing flow of the processing unit of the guidance device according to the embodiment of the present invention.
- in step S201, the model data generation unit in the processing unit of the guidance device calculates correct model data from the shape and size information of the subject read from the storage medium 103 and from the appropriate position information relative to the photographing camera 105 held inside the guidance device (not shown), and sends the model data to the model display generation unit and the correction amount calculation unit in the processing unit of the guidance device.
- in step S202, a model display signal is generated by the model display generation unit in the processing unit of the guidance device based on the calculated model data, and is output to the display device 101.
- in step S203, the position/posture/shape of the subject detected by the position/posture/shape detection device 102 is compared with the model data by the correction amount calculation unit in the processing unit of the guidance device, and the difference is calculated as a correction amount and output to the guidance display generation unit in the processing unit of the guidance device.
- in step S204, a guidance display signal is generated by the guidance display generation unit based on the calculated correction amount and is output to the display device 101, so that the guidance display 107 is shown.
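The flow of steps S201–S204 can be sketched as follows. This is an illustrative sketch only: the `Pose` type, its field names, and the millimeter/degree units are assumptions, not part of the patent.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    # position in millimeters and rotation in degrees (units assumed for illustration)
    x: float; y: float; z: float
    rx: float; ry: float; rz: float

def correction_amount(detected: Pose, model: Pose) -> Pose:
    """Step S203: the difference between the detected pose of the subject
    and the model pose is the amount the user must correct."""
    return Pose(model.x - detected.x, model.y - detected.y, model.z - detected.z,
                model.rx - detected.rx, model.ry - detected.ry, model.rz - detected.rz)
```

The guidance display generation unit (step S204) would then render this difference as arrows or an animation near the model virtual image.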
- FIG. 3 is a diagram showing the configuration of the display device of the guidance device according to the embodiment of the present invention.
- the display device 101 displays a virtual image using the concave mirror 301.
- FIG. 4 is a diagram showing the principle of virtual image display using a concave mirror.
- as shown in FIG. 4, when two concave mirrors 401 and 402 are placed facing each other so that the focal point of one lies at the center of the other, the image of an object placed at the center of one concave mirror 402 appears at the center of the other concave mirror 401. Since a concave mirror functions even if only a part of it is cut out, only a part of the concave mirror 301 may be used as shown in FIG. 3. In FIG. 3, when a display panel 302 such as a liquid crystal display is placed at the center of the concave mirror 301 and displays the image for the model virtual image and the guidance display generated by the processing unit of FIG. 2, the virtual images 106 and 107 appear in front of the lens of the photographing camera 105 due to reflection by the concave mirror 301.
- the means for realizing the virtual image display is not limited to the above, and other means such as the following may be used.
- FIG. 5 is a diagram showing a display device using a stereoscope.
- the display device using a stereoscope presents images with parallax separately to the left and right eyes. It has display panels 501 and 502 inside the scope, which display images with parallax; the left and right eyes see these images reflected in the half mirrors 503 and 504, so that a virtual image appears in front. Because the images are projected onto half mirrors, the outside real world can be observed at the same time.
- FIG. 6 is a diagram showing a display device using polarized glasses.
- a display device using polarized glasses 601 displays a parallax image on a display panel 602 that can display two types of polarization states, or projects images with different polarization states on two projectors.
- a three-dimensional image is perceived by observing through glasses fitted with polarizing filters whose polarization directions differ between the left and right eyes.
- there are also methods in which a parallax image is displayed with a time difference and left and right shutters are opened alternately in synchronization, and methods in which parallax images are displayed in different colors such as red and blue and viewed through color filters.
- a virtual image may be realized by providing a prism on the surface of the display panel and showing separate images on the left and right eyes.
- FIG. 7 is a diagram for explaining a change in the position of the subject.
- the subject has the freedom to translate in the direction of the orthogonal x, y, and z axes.
- the position detection means detecting how much the subject has moved from the correct position.
- movement in the x and y directions can be easily detected from the image obtained by the photographing camera 105.
- although it is difficult to detect the position in the z direction with high accuracy from the image, it can be easily detected by, for example, a distance sensor using an infrared reflection method.
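As a sketch of how the x/y offset might be obtained from the camera image (the z offset coming from a distance sensor instead), the centroid of a binary silhouette of the subject can be compared against the target point. The function name, the binary-mask input, and the millimeters-per-pixel scale are illustrative assumptions.

```python
import numpy as np

def xy_offset(mask: np.ndarray, target_px: tuple, mm_per_px: float) -> tuple:
    """Offset of the subject from the target point in the image plane.
    mask: binary silhouette of the subject from the photographing camera."""
    ys, xs = np.nonzero(mask)          # pixel coordinates belonging to the subject
    cx, cy = xs.mean(), ys.mean()      # silhouette centroid
    return ((cx - target_px[0]) * mm_per_px,
            (cy - target_px[1]) * mm_per_px)
```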
- FIG. 8 is a diagram for explaining the posture change of the subject.
- the subject has the freedom to rotate around a certain position.
- posture detection means detecting how much the subject has rotated from the correct state (in which direction and by how much).
- the posture change around the z axis can be easily detected from the image captured by the photographing camera 105.
- a technique for detecting rotation about the x and y axes from the image is disclosed in the pamphlet of International Publication No. WO 2004/021884, "Individual Recognition Device," previously proposed by the applicant.
- a technique for detecting the posture with an imaging apparatus having a plurality of distance sensors, as described in the pamphlet of International Publication No. WO 2004/084140, "Imaging Apparatus," is also disclosed.
- FIG. 9 is a diagram showing a change in the shape of the subject.
- FIG. 9 shows, as an example, a change in how far the fingers are opened. In the right figure of FIG. 9 the circled fingers are open, and in the left figure they are closed. The change in shape can be detected by comparing the shape obtained from the image of the photographing camera 105 with the shape of the model image.
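One simple way to quantify that comparison, offered only as a sketch since the patent does not specify the measure, is the area of disagreement between the subject silhouette and the model silhouette:

```python
import numpy as np

def deformation_amount(subject_mask: np.ndarray, model_mask: np.ndarray) -> float:
    """Fraction of the model area where the subject and model
    silhouettes disagree (0.0 means the shapes match exactly)."""
    disagreement = np.logical_xor(subject_mask, model_mask).sum()
    return disagreement / model_mask.sum()
```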
- FIG. 10 is a diagram showing the guidance display according to the embodiment of the present invention when the position is shifted.
- the model virtual image 106 is displayed at a position where the subject 108 is to be placed. Therefore, when the position of the subject 108 is inappropriate, the model virtual image 106 and the subject 108 are visually recognized separately.
- possible methods include a display in which the model virtual image 106 moves from the current position of the subject 108 to the correct position, and an arrow display pointing from the current position toward the correct position with a length corresponding to the distance to be moved.
- FIG. 11 is a diagram showing a guidance display according to the embodiment of the present invention when the posture is deviated.
- when the posture is deviated, the model virtual image 106 and the subject 108 are likewise visually recognized separately, as in the case of positional deviation.
- FIG. 12 is a diagram showing guidance display according to the embodiment of the present invention when there is a deformation.
- Fig. 12 shows an example in which the thumb is bent. In the undeformed parts the model virtual image 106 is hidden by the subject 108, but in the deformed part both the model virtual image 106 and the subject 108 are visible. Here, a graphic showing how the deformed part should be moved is superimposed on the model virtual image 106. For example, as shown in FIG. 12, there is a method of displaying an animation showing the movement from the current state to the correct state, or of showing an arrow display. [0032] When guidance is displayed in this way, presenting all the correction elements of movement, posture, and deformation at once risks confusing the user. Therefore, they are prioritized and guided one at a time. As a method of prioritization,
- a certain reference is set for each of position, posture, and deformation, and the correction rate is defined as the ratio of the correction amount to that reference.
- the change in position is the amount of positional change, with 5 cm taken as 100%
- the change in posture is the amount of angular change, with 30° taken as 100%
- the deformation is the amount of correction relative to the area of the model projected onto the z axis
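With the references given above (5 cm and 30° as 100%), the three correction rates could be computed as follows. This is a sketch; the area reference for deformation is left as a parameter, since the patent only says the correction amount is taken relative to the model's projected area.

```python
POSITION_REF_CM = 5.0    # 5 cm of positional change counts as 100 %
POSTURE_REF_DEG = 30.0   # 30 degrees of angular change counts as 100 %

def correction_rates(pos_change_cm: float, angle_change_deg: float,
                     area_change: float, model_area: float) -> dict:
    """Correction rate (in %) for each of position, posture, and deformation."""
    return {
        "position": pos_change_cm / POSITION_REF_CM * 100.0,
        "posture": angle_change_deg / POSTURE_REF_DEG * 100.0,
        "deformation": area_change / model_area * 100.0,
    }
```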
- the same voice, bell, or lamp as in the prior art may be used in an auxiliary manner from the start to the end of the above guidance.
- FIG. 13 is a flowchart of the guidance process of the guidance device according to the embodiment of the present invention.
- FIG. 13 shows a case where guidance is performed in descending order of correction rate.
- step S1301 a recording medium such as an IC card is inserted into the reading unit.
- step S1302 information on the shape and size of the subject is read from the storage medium, and a model image is generated.
- step S1303 the position of the subject is detected.
- step S1304 it is determined whether the subject is within the guidance range. If it is within the guidance range, the process proceeds to step S1305. If it is not within the guidance range, the process returns to step S1303.
- the guidance range is set to a range where the display device can display a virtual image, for example.
- step S1305 the model generated in step S1302 is displayed.
- step S1306 the position, posture, and shape of the subject are detected.
- step S1307 the correction rate for each of the position, orientation, and shape is calculated.
- step S1308 it is determined whether all correction rates are within the standard. If all are within the standard, the process proceeds to step S1309; if not, the process proceeds to step S1311.
- step S1309 the subject is photographed.
- step S1310 the user is notified that the process has ended, and the model display is cleared.
- step S1311 the correction rates of the position, orientation, and shape are compared, and the maximum one is selected.
- step S1312 guidance display is performed. Thereafter, the process returns to step S1306.
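The loop of steps S1306–S1312 — detect, compute the correction rates, and guide the element with the largest rate until all rates are within the standard — might be sketched like this. The callback signatures and the use of 100% as the "within the standard" threshold are assumptions for illustration.

```python
def guidance_loop(detect, rates_of, show_guidance, photograph, standard=100.0):
    """Steps S1306-S1312: guide one element at a time, largest correction
    rate first, until every rate is within the standard, then photograph."""
    while True:
        form = detect()                                  # S1306: position/posture/shape
        rates = rates_of(form)                           # S1307: correction rates
        if all(r <= standard for r in rates.values()):   # S1308: all within standard?
            return photograph()                          # S1309: photograph the subject
        worst = max(rates, key=rates.get)                # S1311: pick the largest rate
        show_guidance(worst)                             # S1312: guidance display
```

A simulated run converges once every simulated rate drops to or below the standard.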
- FIG. 14 is a flowchart in the case where the correction animation is used for the guidance display.
- step S1401 the variable i is set to 1.
- step S1402 an image in which the subject is corrected by {(total correction amount / n) × i} with respect to the model is generated.
- n may be determined in advance or may be determined according to the total correction amount.
- step S1403 if a corrected image is already displayed, it is erased, and the display is updated by showing the image generated in step S1402.
- step S1404 i is incremented.
- for example, suppose n = 10. If the subject position is 5 cm away from the model virtual image, the total correction amount is 5 cm. Dividing 5 cm by 10 gives 0.5 cm, so a virtual image 0.5 cm closer to the model than the subject is displayed first, then a virtual image another 0.5 cm (i.e. 1 cm) closer, and so on. An animation showing the correction is thus constructed by sequentially displaying virtual images that approach the model in steps of 0.5 cm.
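The frame sequence of that example (n = 10, total correction 5 cm) can be generated by the following sketch; the function name and the list-of-offsets representation are assumptions.

```python
def animation_offsets(total_correction: float, n: int = 10) -> list:
    """Offset of animation frame i from the subject toward the model:
    (total correction / n) * i, for i = 1 .. n (steps S1401-S1404)."""
    step = total_correction / n
    return [step * i for i in range(1, n + 1)]
```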
- FIG. 15 is a diagram for explaining a detection method according to an embodiment of the present invention when the model is hidden.
- when, seen from the user's viewpoint, the subject 108 overlaps the model virtual image 106, the model virtual image 106 may become invisible while the correction is still incomplete.
- it is desirable to perform auxiliary display such as auxiliary lines and auxiliary surfaces.
- the guidance display is performed only when the subject is within the detectable range; as shown in FIG. 15, the region where, seen along the line of sight, the detectable range overlaps the model virtual image 106 is indicated by hatching, and the auxiliary display may be performed when the subject 108 enters this region.
- although the area requiring auxiliary display needs some margin because users differ in height, the user operates the device from a fixed direction, so the user's viewpoint position can be limited to some extent; the area requiring auxiliary display is therefore measured and determined in advance.
- FIG. 16 is a diagram showing auxiliary display using auxiliary lines according to the embodiment of the present invention.
- FIG. 16 shows an example in which auxiliary lines indicating the positions and postures are displayed so as to surround the subject and the virtual image of the model.
- the auxiliary line is centered on the subject 108 and the virtual image 106 of the model.
- the guidance display such as an arrow is displayed not on the subject 108 itself but on the auxiliary line.
- FIG. 17 is a diagram showing auxiliary display by the auxiliary surface according to the embodiment of the present invention.
- Fig. 17 shows an example of displaying auxiliary surfaces indicating the respective positions and orientations around the subject and the model virtual image. Guidance is performed by superimposing an animation display in which the auxiliary surface indicating the subject 108 moves to the position of the model virtual image, together with an arrow display.
- the auxiliary line and the auxiliary surface may be displayed at all times, or only when the subject is at a position where the model virtual image may be hidden; displaying them only when necessary is appropriate.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
- Image Input (AREA)
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008520096A JP4712874B2 (ja) | 2006-06-08 | 2006-06-08 | 誘導装置および方法 |
PCT/JP2006/311531 WO2007141860A1 (ja) | 2006-06-08 | 2006-06-08 | 誘導装置および方法 |
KR1020087029771A KR100999989B1 (ko) | 2006-06-08 | 2006-06-08 | 유도 장치 및 방법 |
EP06757178A EP2031556A4 (en) | 2006-06-08 | 2006-06-08 | GUIDANCE APPARATUS AND METHOD |
CNA2006800548823A CN101460972A (zh) | 2006-06-08 | 2006-06-08 | 引导装置及方法 |
US12/329,897 US20090091531A1 (en) | 2006-06-08 | 2008-12-08 | Guidance device and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2006/311531 WO2007141860A1 (ja) | 2006-06-08 | 2006-06-08 | 誘導装置および方法 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/329,897 Continuation US20090091531A1 (en) | 2006-06-08 | 2008-12-08 | Guidance device and method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007141860A1 true WO2007141860A1 (ja) | 2007-12-13 |
Family
ID=38801132
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/311531 WO2007141860A1 (ja) | 2006-06-08 | 2006-06-08 | 誘導装置および方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20090091531A1 (ja) |
EP (1) | EP2031556A4 (ja) |
JP (1) | JP4712874B2 (ja) |
KR (1) | KR100999989B1 (ja) |
CN (1) | CN101460972A (ja) |
WO (1) | WO2007141860A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011013710A (ja) * | 2009-06-30 | 2011-01-20 | Nec Corp | 生体パターン撮像装置 |
WO2012014304A1 (ja) * | 2010-07-29 | 2012-02-02 | 富士通株式会社 | 生体認証装置および生体認証プログラム |
WO2013005306A1 (ja) * | 2011-07-05 | 2013-01-10 | 富士通株式会社 | 認証装置、電子装置、方法及びプログラム |
WO2013005305A1 (ja) * | 2011-07-05 | 2013-01-10 | 富士通株式会社 | 認証装置、電子装置、方法及びプログラム |
JP2013047918A (ja) * | 2011-08-29 | 2013-03-07 | Fujitsu Ltd | 電子装置、生体画像認証装置、生体画像認証プログラムおよび生体画像認証方法 |
JP2014115939A (ja) * | 2012-12-12 | 2014-06-26 | Sony Corp | 情報処理装置、プログラム及び情報処理方法 |
WO2017082100A1 (ja) * | 2015-11-10 | 2017-05-18 | 株式会社日立製作所 | 生体情報を用いた認証装置及び認証方法 |
JP2020077209A (ja) * | 2018-11-07 | 2020-05-21 | 日立オムロンターミナルソリューションズ株式会社 | 画像読取り装置及び方法 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9113074B2 (en) * | 2010-12-22 | 2015-08-18 | Olympus Corporation | Imaging apparatus, imaging method, and computer readable storage medium for applying special effects processing to an automatically set region of a stereoscopic image |
US10140537B2 (en) * | 2012-10-26 | 2018-11-27 | Daon Holdings Limited | Methods and systems for capturing biometric data |
CN105791660A (zh) * | 2014-12-22 | 2016-07-20 | 中兴通讯股份有限公司 | 一种纠正被摄物体拍摄倾斜的方法、装置及移动终端 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0421884A (ja) | 1990-05-17 | 1992-01-24 | Canon Inc | 画像形成装置 |
JPH05108808A (ja) | 1991-10-14 | 1993-04-30 | Sharp Corp | 指紋入力装置 |
JPH1091784A (ja) * | 1996-09-13 | 1998-04-10 | Toshiba Corp | 個人認証装置 |
JP2003108983A (ja) * | 2001-09-28 | 2003-04-11 | Matsushita Electric Ind Co Ltd | 目画像撮像装置及び虹彩認証装置並びに虹彩認証機能付き携帯端末装置 |
WO2004021884A1 (ja) | 2002-09-03 | 2004-03-18 | Fujitsu Limited | 個人認識装置 |
WO2004084140A1 (ja) | 2003-03-18 | 2004-09-30 | Fujitsu Limited | 撮影装置 |
JP2006026427A (ja) * | 2005-08-25 | 2006-02-02 | Hitachi Ltd | 個人認証装置 |
Family Cites Families (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4889425A (en) * | 1987-11-02 | 1989-12-26 | The Boeing Company | Laser alignment system |
US5054090A (en) * | 1990-07-20 | 1991-10-01 | Knight Arnold W | Fingerprint correlation system with parallel FIFO processor |
US5574511A (en) * | 1995-10-18 | 1996-11-12 | Polaroid Corporation | Background replacement for an image |
US5828773A (en) * | 1996-01-26 | 1998-10-27 | Harris Corporation | Fingerprint sensing method with finger position indication |
US6862098B1 (en) * | 1999-02-26 | 2005-03-01 | Anritsu Corporation | Apparatus and method for measuring displacement |
JP2001266133A (ja) * | 2000-03-16 | 2001-09-28 | Yamatake Corp | 指紋照合装置 |
JP3825222B2 (ja) * | 2000-03-24 | 2006-09-27 | 松下電器産業株式会社 | 本人認証装置および本人認証システムならびに電子決済システム |
US7379077B2 (en) * | 2001-08-23 | 2008-05-27 | Siemens Corporate Research, Inc. | Augmented and virtual reality guided instrument positioning using along-the-line-of-sight alignment |
US8190239B2 (en) * | 2002-09-03 | 2012-05-29 | Fujitsu Limited | Individual identification device |
JP3860552B2 (ja) * | 2003-03-25 | 2006-12-20 | 富士通株式会社 | 撮影装置 |
US7369759B2 (en) * | 2003-03-27 | 2008-05-06 | Matsushita Electric Industrial Co., Ltd. | Eye image pickup apparatus, iris authentication apparatus and portable terminal device having iris authentication function |
JP4207717B2 (ja) * | 2003-08-26 | 2009-01-14 | 株式会社日立製作所 | 個人認証装置 |
WO2005106774A2 (en) * | 2004-04-23 | 2005-11-10 | Validity Sensors, Inc. | Methods and apparatus for acquiring a swiped fingerprint image |
JP4515850B2 (ja) * | 2004-07-30 | 2010-08-04 | 富士通株式会社 | 生体認証装置の誘導画面制御方法、生体認証装置及びそのプログラム |
JP2007099261A (ja) * | 2005-09-12 | 2007-04-19 | Aisin Aw Co Ltd | 駐車支援方法及び駐車支援装置 |
JP4704185B2 (ja) * | 2005-10-27 | 2011-06-15 | 富士通株式会社 | 生体認証システム及び生体認証方法 |
-
2006
- 2006-06-08 KR KR1020087029771A patent/KR100999989B1/ko not_active IP Right Cessation
- 2006-06-08 CN CNA2006800548823A patent/CN101460972A/zh active Pending
- 2006-06-08 JP JP2008520096A patent/JP4712874B2/ja not_active Expired - Fee Related
- 2006-06-08 WO PCT/JP2006/311531 patent/WO2007141860A1/ja active Application Filing
- 2006-06-08 EP EP06757178A patent/EP2031556A4/en not_active Withdrawn
-
2008
- 2008-12-08 US US12/329,897 patent/US20090091531A1/en not_active Abandoned
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0421884A (ja) | 1990-05-17 | 1992-01-24 | Canon Inc | 画像形成装置 |
JPH05108808A (ja) | 1991-10-14 | 1993-04-30 | Sharp Corp | 指紋入力装置 |
JPH1091784A (ja) * | 1996-09-13 | 1998-04-10 | Toshiba Corp | 個人認証装置 |
JP2003108983A (ja) * | 2001-09-28 | 2003-04-11 | Matsushita Electric Ind Co Ltd | 目画像撮像装置及び虹彩認証装置並びに虹彩認証機能付き携帯端末装置 |
WO2004021884A1 (ja) | 2002-09-03 | 2004-03-18 | Fujitsu Limited | 個人認識装置 |
WO2004084140A1 (ja) | 2003-03-18 | 2004-09-30 | Fujitsu Limited | 撮影装置 |
JP2006026427A (ja) * | 2005-08-25 | 2006-02-02 | Hitachi Ltd | 個人認証装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2031556A4 |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011013710A (ja) * | 2009-06-30 | 2011-01-20 | Nec Corp | 生体パターン撮像装置 |
JP5747916B2 (ja) * | 2010-07-29 | 2015-07-15 | 富士通株式会社 | 生体認証装置および生体認証プログラム |
WO2012014304A1 (ja) * | 2010-07-29 | 2012-02-02 | 富士通株式会社 | 生体認証装置および生体認証プログラム |
US9122900B2 (en) | 2010-07-29 | 2015-09-01 | Fujitsu Limited | Biometric authentication device and computer readable, non-transitory medium |
WO2013005305A1 (ja) * | 2011-07-05 | 2013-01-10 | 富士通株式会社 | 認証装置、電子装置、方法及びプログラム |
WO2013005306A1 (ja) * | 2011-07-05 | 2013-01-10 | 富士通株式会社 | 認証装置、電子装置、方法及びプログラム |
JP2013047918A (ja) * | 2011-08-29 | 2013-03-07 | Fujitsu Ltd | 電子装置、生体画像認証装置、生体画像認証プログラムおよび生体画像認証方法 |
JP2014115939A (ja) * | 2012-12-12 | 2014-06-26 | Sony Corp | 情報処理装置、プログラム及び情報処理方法 |
US9465971B2 (en) | 2012-12-12 | 2016-10-11 | Sony Corporation | Information processing device, program, and information processing method |
WO2017082100A1 (ja) * | 2015-11-10 | 2017-05-18 | 株式会社日立製作所 | 生体情報を用いた認証装置及び認証方法 |
JP2017091186A (ja) * | 2015-11-10 | 2017-05-25 | 株式会社日立製作所 | 生体情報を用いた認証装置及び認証方法 |
JP2020077209A (ja) * | 2018-11-07 | 2020-05-21 | 日立オムロンターミナルソリューションズ株式会社 | 画像読取り装置及び方法 |
JP7164405B2 (ja) | 2018-11-07 | 2022-11-01 | 日立チャネルソリューションズ株式会社 | 画像読取り装置及び方法 |
Also Published As
Publication number | Publication date |
---|---|
KR100999989B1 (ko) | 2010-12-10 |
CN101460972A (zh) | 2009-06-17 |
EP2031556A1 (en) | 2009-03-04 |
EP2031556A4 (en) | 2013-01-02 |
US20090091531A1 (en) | 2009-04-09 |
JPWO2007141860A1 (ja) | 2009-10-15 |
KR20090010099A (ko) | 2009-01-28 |
JP4712874B2 (ja) | 2011-06-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4712874B2 (ja) | 誘導装置および方法 | |
JP6678122B2 (ja) | 画像処理装置、画像処理方法及び画像処理プログラム | |
US9595127B2 (en) | Three-dimensional collaboration | |
JP6074494B2 (ja) | 形状認識装置、形状認識プログラム、および形状認識方法 | |
JP5646263B2 (ja) | 画像処理プログラム、画像処理装置、画像処理システム、および、画像処理方法 | |
US20120162384A1 (en) | Three-Dimensional Collaboration | |
CN107209564A (zh) | 将现实世界比例应用于虚拟内容 | |
KR20150093831A (ko) | 혼합 현실 환경에 대한 직접 상호작용 시스템 | |
WO2017126172A1 (ja) | 情報処理装置、情報処理方法、及び記録媒体 | |
KR20160005762A (ko) | 홍채 영상 장치의 차선으로 최적화된 배향을 보상하는 방법 및 장치 | |
CN109478227A (zh) | 计算设备上的虹膜或其他身体部位识别 | |
EP2981074B1 (en) | Display device, display method, and display program | |
JP5602702B2 (ja) | 画像処理プログラム、画像処理装置、画像処理システム、および、画像処理方法 | |
CN110278366A (zh) | 一种全景图像虚化方法、终端及计算机可读存储介质 | |
CN111491159A (zh) | 一种增强现实的显示系统及方法 | |
CN109084679A (zh) | 一种基于空间光调制器的3d测量及获取装置 | |
EP4394706A1 (en) | Spatial positioning method and apparatus | |
JP2011113196A (ja) | 顔方向特定装置及び撮像装置 | |
JPH10188029A (ja) | 仮想空間生成装置 | |
WO2013005306A1 (ja) | 認証装置、電子装置、方法及びプログラム | |
KR20150137908A (ko) | 홀로그래피 터치 방법 및 프로젝터 터치 방법 | |
KR101591038B1 (ko) | 홀로그래피 터치 방법 및 프로젝터 터치 방법 | |
JP2014204289A (ja) | 画像表示装置、画像表示システム、および画像表示方法 | |
JP2011059854A (ja) | 画像表示装置及び画像表示方法 | |
KR20160017020A (ko) | 홀로그래피 터치 방법 및 프로젝터 터치 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200680054882.3 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 06757178 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2008520096 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2006757178 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020087029771 Country of ref document: KR |