WO2004100069A1 - Image processing apparatus and image processing method - Google Patents
Image processing apparatus and image processing method
- Publication number
- WO2004100069A1 (PCT/JP2004/006328)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- image data
- pixel data
- pixel
- image
Classifications
- G06T 5/40 — Image enhancement or restoration using histogram techniques
- G06T 5/94 — Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
- G06T 7/11 — Region-based segmentation
- G06V 40/107 — Static hand or arm
- G06T 2207/30101 — Blood vessel; Artery; Vein; Vascular
- G06V 40/14 — Vascular patterns
Definitions
- the present invention relates to, for example, an image processing apparatus and an image processing method for processing image data obtained by imaging a subject.
- Conventionally, an identification device is known that performs personal identification processing using image data obtained by imaging a living body (subject) (see, for example, Japanese Patent Application Laid-Open No. H10-127609).
- In this device, light transmitted through the subject's hand is imaged, and binarized image data is generated from the image data based on a predetermined threshold for the pixel values in order to perform the identification processing.
- The identification device then performs the identification processing based on a pattern indicating the arrangement of blood vessels in the binarized image data.
- However, the distribution of pixel values in the captured image data differs for each subject.
- For example, image data of a subject with a large amount of fat components has a wider pixel value distribution and a relatively higher average pixel value than image data of a subject with a small amount of fat components.
- Because the above-described conventional identification device performs the binarization based on a fixed, predetermined threshold, it can generate appropriate binarized image data for image data of a subject with little fat, but for image data of a subject with much fat it may generate binarized image data with biased pixel values, so that the binarization is not performed properly. An improvement is therefore desired.
- Further, the image data obtained by imaging the subject includes minute regions corresponding to noise components, and these noise components strongly affect the accuracy of the identification processing. There is therefore a demand for removing regions of a predetermined size corresponding to noise components from the image data.
- In addition, the line-shaped pattern in the image data is important, but the line-shaped pattern may be broken up by noise or the like so that it is not clearly visible. There is therefore a demand for obtaining image data containing a clear linear pattern by connecting pixel data that lie close to each other, taking noise and the like into account.
Disclosure of the invention
- According to a first aspect of the present invention, an image processing apparatus comprises: distribution data generating means for generating, for a plurality of pieces of pixel data that constitute first image data obtained by imaging a subject and that indicate pixel values within a first range defined in advance, distribution data indicating the distribution of the pixel data; specifying means for specifying, based on the distribution data generated by the distribution data generating means, a second range within the first range to be binarized; mapping means; and binarizing means.
- The distribution data generating means generates, for the plurality of pieces of pixel data that constitute the first image data obtained by imaging the subject and that indicate pixel values within the first range defined in advance, distribution data indicating the distribution of the pixel data.
- the specifying means specifies a second range to be binarized in the first range based on the distribution data generated by the distribution data generating means.
- The mapping means maps the pixel data in the second range specified by the specifying means, among the plurality of pieces of pixel data, to the first range, and generates second image data composed of the mapped pixel data.
- the binarizing means binarizes the second image data generated by the mapping means based on a threshold defined in the first range to generate third image data.
- According to a second aspect of the present invention, an image processing apparatus comprises: first processing means for setting, for each of a plurality of pieces of pixel data that constitute first image data obtained by imaging a subject and that indicate pixel values, the minimum pixel data among the pixel data in a first area around that pixel data as the pixel data; and second processing means for generating second image data by setting the maximum pixel data among the pixel data in a second area, larger than the first area, around that pixel data as the pixel data.
- According to a third aspect of the present invention, an image processing method comprises: a first step of generating, for a plurality of pieces of pixel data that constitute first image data obtained by imaging a subject and that indicate pixel values within a first range, distribution data indicating the distribution of the pixel data; a second step of specifying, based on the distribution data generated in the first step, a second range within the first range to be binarized; a third step of mapping the pixel data in the second range specified in the second step, among the plurality of pieces of pixel data, to the first range to generate second image data composed of the mapped pixel data; and a fourth step of binarizing the second image data generated in the third step based on a threshold defined in the first range to generate third image data.
- According to a fourth aspect of the present invention, an image processing method comprises: a first step of setting, for each of a plurality of pieces of pixel data that constitute first image data obtained by imaging a subject and that indicate pixel values, the minimum pixel data among the pixel data in a first area around that pixel data as the pixel data; and a second step of generating second image data by setting the maximum pixel data among the pixel data in a second area, larger than the first area, around that pixel data as the pixel data.
- According to the present invention, it is possible to provide an image processing apparatus and an image processing method that can perform binarization processing appropriately even when the pixel value distribution data differs for each subject.
- It is also possible to provide an image processing apparatus and an image processing method capable of removing regions smaller than a predetermined size from image data obtained by imaging a subject and of connecting pixel data that lie close to each other.
- FIG. 1 is an overall conceptual diagram showing a first embodiment of a data processing device according to the present invention.
- FIG. 2 is a hardware block diagram of the data processing device shown in FIG.
- FIG. 3 is a functional block diagram of the data processing device shown in FIG.
- FIGS. 4A to 4E are diagrams for explaining the operation of the data processing device shown in FIG. 1.
- FIG. 4A is a diagram showing an example of the image data S11.
- FIG. 4B is a diagram showing an example of the image data S1801.
- FIG. 4C is a diagram showing an example of the distribution data d1.
- FIG. 4D is an enlarged view of the distribution data.
- FIG. 4E is a diagram showing an example of the image data S1804.
- FIGS. 5A and 5B are diagrams for explaining the operation of the specifying unit shown in FIG.
- FIG. 5A is a diagram illustrating an example of the distribution data d1.
- FIG. 5B is a diagram showing an example of the distribution data d1′.
- FIG. 6 is a flowchart for explaining the operation related to the mapping process of the data processing device shown in FIG.
- FIG. 7 is a functional block diagram related to a filtering process of the data processing device shown in FIG.
- FIG. 8 is a diagram for explaining a Gaussian filter.
- FIGS. 9A to 9F are diagrams for explaining a Gaussian Laplacian filter.
- FIG. 9A is a diagram showing an example of step-shaped pixel values.
- FIG. 9B is a diagram illustrating an example of pixel values.
- FIG. 9C is a diagram showing the pixel values subjected to first-order differentiation processing.
- FIG. 9D is a diagram illustrating an example of pixel values.
- FIG. 9E is a diagram showing an example of pixel values subjected to first-order differentiation processing.
- FIG. 9F is a diagram illustrating an example of pixel values subjected to second-order differentiation processing.
- FIGS. 10A to 10C are diagrams for explaining the noise removal processing of the data processing device shown in FIG.
- FIG. 10A is a diagram showing an example of the image data S1804.
- FIG. 10B is a diagram showing an example of the image data S1805.
- FIG. 10C is a diagram showing an example of the image data S1806.
- FIG. 11 is a flowchart for explaining the operation of the data processing device shown in FIG.
- FIGS. 12A to 12D are conceptual diagrams for explaining the operation of the data processing device shown in FIG.
- FIG. 12A is a diagram showing an example of image data including a noise component.
- FIG. 12B is a diagram illustrating an example of image data that has been subjected to noise removal processing.
- FIG. 12C is a diagram showing an example of image data.
- FIG. 12D is a diagram showing an example of the image data subjected to the connection processing.
- FIGS. 13A to 13F are diagrams for explaining the degeneration processing and the expansion processing of the data processing device shown in FIG.
- FIG. 13A is a diagram showing an example of pixel data.
- FIG. 13B is an example of pixel data when a degeneration process is performed based on pixels in a cross-shaped element.
- FIG. 13C is an example of the pixel data when dilation processing is performed based on the pixels in the cross-shaped element.
- FIG. 13D is a diagram showing an example of pixel data.
- FIG. 13E is an example of pixel data when the degeneration processing is performed based on the pixels in the 3 ⁇ 3 element.
- FIG. 13F is an example of pixel data when the dilation processing is performed based on the pixels in the 3 ⁇ 3 element.
- FIGS. 14A to 14C are diagrams for explaining the operation of the data processing device shown in FIG. 1.
- FIG. 14A is a diagram showing an example of the image data S1807.
- FIG. 14B is a diagram showing an example of the image data S1808.
- FIG. 14C is a diagram showing an example of the image data S1810.
- FIG. 15 is a flowchart for explaining the operation of the data processing device shown in FIG.
- FIGS. 16A to 16F are diagrams for explaining the operation of the first low-pass filter processing of the data processing device shown in FIG.
- FIG. 16A is a diagram illustrating an example of a reference region in a two-dimensional Fourier space.
- FIG. 16B is a diagram showing an example of an area obtained by enlarging the reference area by a predetermined factor.
- FIG. 16C is a diagram illustrating an example of a low-pass filter.
- FIG. 16D is a diagram showing an example of image data.
- FIG. 16E is a diagram showing an example of image data subjected to the low-pass filter processing.
- FIG. 16F is a diagram showing an example of image data subjected to the binarization processing.
- FIGS. 17A to 17E are diagrams for explaining the operation of the second low-pass filter processing of the low-pass filter unit.
- FIG. 17A is a diagram illustrating an example of a reference region on a two-dimensional Fourier space.
- FIG. 17B is a diagram illustrating an example of a low-pass filter.
- FIG. 17C is a diagram showing an example of image data.
- FIG. 17D is a diagram illustrating an example of image data that has been subjected to the low-pass filter processing.
- FIG. 17E is a diagram showing an example of image data that has been subjected to the binary image processing.
- FIGS. 18A to 18E are diagrams for explaining the operation of the third low-pass filter processing of the low-pass filter unit.
- FIG. 18A is a diagram showing an example of a reference region on a two-dimensional Fourier space.
- FIG. 18B is a diagram illustrating an example of a low-pass filter.
- FIG. 18C is a diagram showing an example of image data.
- FIG. 18D is a diagram illustrating an example of image data that has been subjected to the low-pass filter processing.
- FIG. 18E is a diagram showing an example of image data that has been subjected to the binary image processing.
- FIGS. 19A to 19F are diagrams for explaining the operation of the low-pass filter unit of the data processing device shown in FIG. 1.
- FIG. 19A is a diagram showing an example of the image data S1810.
- FIG. 19B is a diagram showing an example of the image data S18102.
- FIG. 19C is a diagram showing an example of the image data S18103.
- FIG. 19D is a diagram showing an example of the image data S18102.
- FIG. 19E is a diagram showing an example of the image data S18104.
- FIG. 19F is a diagram showing an example of the image data S18105.
- FIGS. 20A to 20C are diagrams for explaining the operation of the low-pass filter unit of the data processing device shown in FIG. 1.
- FIG. 20A is a diagram showing an example of the image data S18104.
- FIG. 20B is a diagram showing an example of the image data S18106.
- FIG. 20C is a diagram showing an example of the image data S1811.
- FIG. 21 is a flowchart for explaining the operation of the low-pass filter unit of the data processing device shown in FIG.
- FIGS. 22A to 22C are diagrams for explaining operations of the mask unit and the skeleton unit of the data processing device shown in FIG.
- FIG. 22A is a diagram illustrating an example of a mask pattern.
- FIG. 22B is a diagram showing an example of the image data S1812.
- FIG. 22C is a diagram illustrating an example of the image data S1813.
- FIG. 23 is a flowchart for explaining the overall operation of the data processing device shown in FIG.
- FIG. 24 is a diagram for explaining a second embodiment of the remote control device using the data processing device according to the present invention.
- FIG. 25 is a flowchart for explaining the operation of the remote controller 1a shown in FIG. 24.
- FIG. 26 is a diagram for explaining a third embodiment, a data processing system using the data processing device according to the present invention.
- FIG. 27 is a flowchart for explaining the operation of the data processing system shown in FIG. 26.
- FIG. 28 is a diagram for explaining a fourth embodiment of a portable communication device using the data processing device according to the present invention.
- FIG. 29 is a flowchart for explaining the operation of the data processing device shown in FIG. 28.
- FIG. 30 is a diagram for explaining a fifth embodiment of the data processing device according to the present invention.
- FIG. 31 is a flowchart for explaining the operation of the telephone shown in FIG. 30.
- FIG. 32 is a diagram for explaining a sixth embodiment of the data processing device according to the present invention.
- FIG. 33 is a diagram for explaining a seventh embodiment of the data processing device according to the present invention.
- An image processing apparatus according to the present embodiment generates, for a plurality of pieces of pixel data that constitute image data obtained by imaging a subject and that indicate pixel values within a predetermined first range, distribution data indicating the distribution of the pixel data; specifies a second range to be binarized; maps the pixel data in the second range to the first range to generate image data composed of the mapped pixel data; and binarizes that image data based on a threshold defined in the first range to generate binarized image data.
- The image processing apparatus also removes regions smaller than a predetermined size from the image data obtained by imaging the subject, and connects pixel data that lie within a certain distance of each other.
- Specifically, the image processing apparatus sets, for each of a plurality of pieces of pixel data that constitute first image data obtained by imaging the subject and that indicate pixel values, the minimum pixel data among the pixel data in a first area around that pixel data as the pixel data, and further generates second image data by setting, for each piece of pixel data, the maximum pixel data among the pixel data in a second area, larger than the first area, around that pixel data as the pixel data.
- As an example of such an image processing apparatus, a data processing device will be described that images a portion of a living body (subject h) in which blood vessels are formed to generate image data, image-processes the image data to extract blood vessel information, and performs an authentication process based on the extracted blood vessel information.
- FIG. 1 is an overall conceptual diagram showing a first embodiment of a data processing device according to the present invention.
- the data processing device 1 includes an imaging system 101, an extraction unit 102, and an authentication unit 103.
- the data processing device 1 corresponds to an example of an image processing device according to the present invention.
- the imaging system 101 captures the subject h to generate image data, and outputs the image data to the extraction unit 102 as a signal S11.
- The imaging system 101 has an irradiation unit 1011 and an optical lens 1012.
- The irradiation unit 1011 is configured by, for example, a halogen lamp or the like, and irradiates a part of the subject h with electromagnetic waves, for example near-infrared rays, in response to a control signal.
- Near-infrared light in the red-to-infrared region, with wavelengths of roughly 600 nm to 1300 nm, has a relatively high transmittance through body tissue compared with electromagnetic waves in other wavelength regions.
- However, such electromagnetic waves are absorbed by hemoglobin in the blood, so in image data obtained by irradiating a hand with near-infrared light and imaging the transmitted light, the regions corresponding to thick blood vessels near the palm-side surface appear darker than the regions other than the blood vessels.
- The vein pattern of the blood vessels develops during growth and differs greatly from individual to individual.
- In the present embodiment, image data obtained by imaging the blood vessels is used in the authentication process as identification information unique to the individual.
- The optical lens 1012 forms an image of the light transmitted through the subject h on the imaging unit 11.
- the imaging unit 11 generates image data S11 based on the transmitted light formed by the optical lens 1012.
- The imaging unit 11 includes a charge-coupled device (CCD) image sensor or a complementary metal-oxide-semiconductor (CMOS) image sensor, and outputs the generated image data S11 to the extraction unit 102.
- the image data S11 may be an RGB (red-green-blue) signal, or may be image data of other colors or gray scales.
- The extraction unit 102 performs image processing based on the image data S11, extracts image data used for authentication, for example skeleton image data, and outputs it to the authentication unit 103 as a signal S102.
- the authentication unit 103 performs a matching process with registered image data stored in advance based on the signal S102 from the extraction unit 102, and performs an authentication process.
- FIG. 2 is a hardware block diagram of the data processing device shown in FIG.
- The data processing device 1 includes an imaging unit 11, an input unit 12, an output unit 13, a communication interface (I/F) 14, a RAM (Random Access Memory) 15, a ROM (Read Only Memory) 16, a storage unit 17, and a CPU 18.
- The imaging unit 11, the input unit 12, the output unit 13, the communication interface (I/F) 14, the RAM 15, the ROM 16, the storage unit 17, and the CPU (Central Processing Unit) 18 are connected by a bus BS.
- the imaging unit 11 generates image data of the subject h under the control of the CPU 18 and outputs the data as a signal S11.
- the input unit 12 outputs a signal corresponding to, for example, a user operation to the CPU 18.
- the input unit 12 includes a keyboard, a mouse, a touch panel, and the like.
- the output unit 13 performs output according to predetermined data under the control of the CPU 18.
- the output unit 13 is configured by a display device such as a display.
- the communication interface (I / F) 14 performs data communication with another data processing device under the control of the CPU 18 via, for example, a communication network (not shown).
- The RAM 15 is used, for example, as a work space of the CPU 18.
- the ROM 16 stores data such as initial values and initial parameters, and the data is used by the CPU 18.
- In the storage unit 17, predetermined data is written and read by the CPU 18.
- the storage unit 17 is configured by a storage device such as an HDD (Hard disk drive).
- The storage unit 17 stores, for example, a program PRG, image data DP, and the like, as shown in FIG. 2.
- The program PRG includes the functions according to the present embodiment, for example the functions of the extraction unit 102 and the authentication unit 103, and these functions are realized when the program is executed by the CPU 18.
- the image data DP is image data such as registered image data used for an authentication process, for example.
- FIG. 3 is a functional block diagram of the data processing device shown in FIG.
- The CPU 18 executes the program PRG and thereby realizes, as shown in FIG. 3, the functions of a grayscale conversion unit 1801, a distribution data generation unit 1802, a specifying unit 1803, a mapping unit 1804, a Gaussian filter 1805, a Gaussian Laplacian filter 1806, a first degeneration processing unit 1807, a first dilation processing unit 1808, a second dilation processing unit 1809, a second degeneration processing unit 1810, a low-pass filter unit 1811, a mask unit 1812, and a skeleton unit 1813.
- the present invention is not limited to this mode.
- the functions of the components shown in FIG. 3 may be realized by hardware.
- The distribution data generation unit 1802 corresponds to an example of the distribution data generating means according to the present invention.
- The specifying unit 1803 corresponds to an example of the specifying means according to the present invention.
- The mapping unit 1804 corresponds to an example of the mapping means according to the present invention.
- The low-pass filter unit 1811 corresponds to an example of the filter processing means according to the present invention.
- The Gaussian filter 1805, the Gaussian Laplacian filter 1806, the first degeneration processing unit 1807, the first dilation processing unit 1808, the second dilation processing unit 1809, the second degeneration processing unit 1810, the low-pass filter unit 1811, the mask unit 1812, and the skeleton unit 1813 correspond to an example of the binarizing means according to the present invention.
- The first degeneration processing unit 1807 corresponds to an example of the first processing means according to the present invention.
- The first dilation processing unit 1808 corresponds to an example of the fourth processing means according to the present invention.
- The second dilation processing unit 1809 corresponds to an example of the second processing means according to the present invention.
- The second degeneration processing unit 1810 corresponds to an example of the third processing means according to the present invention.
- The grayscale conversion unit 1801 converts the RGB signal S11 from the imaging unit 11 into grayscale and outputs it to the distribution data generation unit 1802 as a signal S1801.
- Specifically, the grayscale conversion unit 1801 converts the RGB signal into a predetermined number of gradations from white to black, for example 256 gradations.
- In the present embodiment, the imaging unit 11 generates the RGB signal S11, and the grayscale conversion unit 1801 converts the signal S11 into grayscale.
- The present invention is not limited to this configuration. For example, when the imaging unit 11 generates grayscale image data S11 directly, the grayscale conversion unit 1801 may be omitted.
- FIGS. 4A to 4E are diagrams for explaining the operation of the data processing device shown in FIG. 1.
- the imaging unit 11 captures, for example, a finger of a living body of the subject h and outputs RGB image data S11 as shown in FIG. 4A.
- The grayscale conversion unit 1801 generates, based on the image data S11, grayscale image data S1801 as shown for example in FIG. 4B, and outputs it to the distribution data generation unit 1802.
- The distribution data generation unit 1802 generates, based on the signal S1801 from the grayscale conversion unit 1801, distribution data d1 indicating the distribution of the plurality of pieces of pixel data that indicate pixel values within the first range defined in the image data, and outputs it to the specifying unit 1803 as a signal S1802.
- Specifically, taking the horizontal axis c as the gradation value (also called the pixel value) and the vertical axis f as the number of pieces of pixel data (frequency), the distribution data generation unit 1802 generates, as shown in FIG. 4C, a histogram as the distribution data d1 for the pixel data indicating pixel values within the 256-gradation range that is the first range r1. In FIG. 4C, small pixel values correspond to black and large pixel values correspond to white.
- That is, the distribution data generation unit 1802 generates distribution data d1 indicating, for each pixel value in the first range r1, the number of pieces of pixel data having that pixel value.
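For illustration, the generation of the distribution data d1 amounts to a 256-bin histogram over the first range r1. A minimal sketch in Python/NumPy (the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def distribution_data(gray: np.ndarray) -> np.ndarray:
    """For each pixel value 0..255 in the first range r1, count the
    number of pixel data having that value (the histogram d1 of FIG. 4C)."""
    d1, _ = np.histogram(gray, bins=256, range=(0, 256))
    return d1
```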
- Based on the signal S1802, the specifying unit 1803 specifies, as the second range r2 to be binarized, the range at or below the largest pixel value among the pixel values having a predetermined number of pieces of pixel data in the first range r1, and outputs it as a signal S1803.
- Specifically, among the pixel values r11, r12, r13, and r14 at which the frequency of the distribution data d1 within the first range r1 crosses a predetermined threshold V_th, the specifying unit 1803 specifies the range at or below the maximum pixel value r11 as the second range r2.
- In the present embodiment, the specifying unit 1803 specifies, for example, the range of pixel values from 0 to 110 as the second range r2.
- As described above, the distribution of pixel values differs for each subject h. For example, for a subject h with a large amount of fat components, the average pixel value is relatively high, and the distribution takes the form of the data d1′ shown in FIG. 5B.
- In that case, the specifying unit 1803 specifies, among the pixel values r11′, r12′, r13′, and r14′ at which the distribution crosses the predetermined threshold V_th in the first range r1, the range at or below the maximum pixel value r11′ as a second range r2′.
- Based on the signal S1803, the mapping unit 1804 maps the pixel data in the second range r2 specified by the specifying unit 1803, among the plurality of pieces of pixel data, to the first range r1, generates second image data composed of the mapped pixel data, and outputs it as a signal S1804.
- Specifically, as shown in FIGS. 4C and 4D, the mapping unit 1804 maps the pixel data in the second range r2, the pixel value range from 0 to 110, by enlarging it to the first range r1, the pixel value range from 0 to 256, and thereby generates the second image data S1804 with enhanced contrast, as shown in FIG. 4E.
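A minimal sketch of the specification and mapping steps, under the assumption that r11 is the largest pixel value whose frequency reaches the threshold V_th; stretching [0, r11] onto [0, 255] is one straightforward reading of the mapping described above (not the patent's own code):

```python
import numpy as np

def specify_and_map(gray: np.ndarray, v_th: int) -> np.ndarray:
    """Specify the second range r2 = [0, r11] from the histogram d1 and
    map (stretch) it onto the first range r1 = [0, 255]."""
    d1, _ = np.histogram(gray, bins=256, range=(0, 256))
    crossings = np.nonzero(d1 >= v_th)[0]      # pixel values reaching V_th (cf. r11..r14)
    r11 = max(int(crossings.max()), 1) if crossings.size else 255
    stretched = np.clip(gray.astype(np.float64), 0, r11) * (255.0 / r11)
    return stretched.astype(np.uint8)          # second image data (signal S1804)
```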
- FIG. 6 is a flowchart for explaining an operation related to the mapping process of the data processing device shown in FIG.
- The operations of the distribution data generation unit 1802, the specifying unit 1803, and the mapping unit 1804 will be described with reference to FIGS. 4A to 4E, 5A, 5B, and 6.
- the imaging unit 11 images the subject h and outputs image data S 11 to the grayscale conversion unit 1801.
- The image data S11 is converted by the grayscale conversion unit 1801 into a grayscale of 256 gradations and input to the distribution data generation unit 1802 as a signal S1801.
- In step ST1, based on the signal S1801, the distribution data generation unit 1802 generates, for example as shown in FIG. 4C, for the plurality of pieces of pixel data that constitute the image data and indicate pixel values within the first range r1, distribution data d1 indicating the number of pieces of pixel data having each pixel value, and outputs it to the specifying unit 1803 as a signal S1802.
- In step ST2, based on the signal S1802, the specifying unit 1803 specifies, as the second range r2 to be binarized, the range at or below the maximum pixel value r11 among the pixel values at which the frequency crosses the predetermined threshold V_th in the first range r1, and outputs it to the mapping unit 1804 as a signal S1803.
- In step ST3, based on the signal S1803, the mapping unit 1804 maps the pixel data in the second range r2 specified by the specifying unit 1803, among the plurality of pieces of pixel data, to the first range r1, generates second image data composed of the mapped pixel data, and outputs it as a signal S1804.
- In step ST4, the components 1805 to 1813, described later, binarize the second image data S1804 generated by the mapping unit 1804 based on a threshold defined in the first range r1, for example the 100th gradation, to generate third image data.
- As described above, the distribution data generation unit 1802 generates the distribution data, the specifying unit 1803 specifies the second range, the mapping unit 1804 maps the pixel data in the second range to the first range, and the components 1805 to 1813, described later, binarize the result based on the threshold defined in the first range r1. Therefore, even when the pixel value distribution data d1 differs for each subject h, the binarization processing can be performed appropriately. Furthermore, since the pixel data in the specified second range is mapped to the first range, the contrast is increased and appropriate binarization processing can be performed.
- the data processing device 1 performs an edge enhancement process after performing a noise removal process on the image data generated in the above-described process. For example, the data processing device 1 performs any one of a plurality of different noise removal processes based on the signal S1804, and performs an edge enhancement process after the noise removal process.
- FIG. 7 is a functional block diagram related to a filtering process of the data processing device shown in FIG.
- the CPU 18 realizes the functions of the selection unit 1814 and the plurality of noise removal filters 1815 shown in FIG. 7 by executing, for example, a program PRG.
- the noise removal filter 1815 corresponds to an example of the noise removal unit according to the present invention.
- The selection unit 1814 outputs to the noise removal filter 1815 a signal S1814 that selects one of the plurality of different noise removal processes of the noise removal filter 1815.
- the selection unit 1814 detects the noise distribution characteristic of the signal S 1804 and outputs a signal S 1814 for selecting a noise removal filter suitable for the noise characteristic based on the detection result.
- the selection unit 1814 may output a signal S 1814 for selecting a noise removal filter based on a signal from the input unit 12 according to a user operation.
- The noise removal filter 1815 comprises filters for a plurality of noise removal processes, such as a Gaussian filter 1815-1, a median filter 1815-2, a maximum value filter 1815-3, a minimum value filter 1815-4, a two-dimensional adaptive noise removal filter 1815-5, a neighborhood filter 1815-6, an averaging filter 1815-7, a Gaussian low-pass filter 1815-8, a two-dimensional Laplacian approximation filter 1815-9, and a Gaussian Laplacian filter 1815-10. Any one (at least one) noise removal filter is selected according to the signal S1814 from the selection unit 1814, the signal S1804 is subjected to noise removal processing by the selected noise removal filter, and image data S1806 is generated.
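One plausible realization of such a selectable bank of noise removal filters, sketched with SciPy; the mapping of the unit numbers onto these particular library calls is an assumption, and the neighborhood filter 1815-6 and Gaussian low-pass filter 1815-8 are omitted for brevity:

```python
import numpy as np
from scipy import ndimage, signal

# Illustrative filter bank; parameter values (size, sigma) are example choices.
FILTERS = {
    "gaussian":      lambda im: ndimage.gaussian_filter(im, sigma=1.5),    # 1815-1
    "median":        lambda im: ndimage.median_filter(im, size=3),         # 1815-2
    "maximum":       lambda im: ndimage.maximum_filter(im, size=3),        # 1815-3
    "minimum":       lambda im: ndimage.minimum_filter(im, size=3),        # 1815-4
    "wiener2d":      lambda im: signal.wiener(im, mysize=3),               # 1815-5
    "averaging":     lambda im: ndimage.uniform_filter(im, size=3),        # 1815-7
    "laplacian":     lambda im: ndimage.laplace(im),                       # 1815-9
    "gauss_laplace": lambda im: ndimage.gaussian_laplace(im, sigma=2.0),   # 1815-10
}

def denoise(s1804: np.ndarray, name: str) -> np.ndarray:
    """Apply the noise removal filter selected by the selection unit 1814."""
    return FILTERS[name](s1804.astype(np.float64))
```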
- In the filtering, image data u(n1, n2), with the grid points (n1, n2) of a two-dimensional plane as variables, is filtered by a filter h(n1, n2), and image data v(n1, n2) is generated as in equation (1), where '*' denotes two-dimensional convolution:

  $$v(n_1, n_2) = h(n_1, n_2) * u(n_1, n_2) \quad (1)$$

- The Gaussian filter 1815-1 performs the noise removal processing by convolving the Gaussian function hg(n1, n2) of equation (2), using for example the standard deviation σ; that is, as in equation (3):

  $$h_g(n_1, n_2) = \frac{1}{2\pi\sigma^2}\exp\!\left(-\frac{n_1^2 + n_2^2}{2\sigma^2}\right) \quad (2)$$

  $$v(n_1, n_2) = h_g(n_1, n_2) * u(n_1, n_2) \quad (3)$$
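For illustration, equations (1) to (3) can be realized directly as follows (a minimal sketch; the kernel radius and σ are arbitrary example values, and the function names are not from the patent):

```python
import numpy as np
from scipy.signal import convolve2d

def gaussian_kernel(sigma: float, radius: int) -> np.ndarray:
    """Sampled Gaussian h_g(n1, n2) of equation (2), normalized to unit sum."""
    n1, n2 = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    hg = np.exp(-(n1 ** 2 + n2 ** 2) / (2.0 * sigma ** 2))
    return hg / hg.sum()

def gaussian_smooth(u: np.ndarray, sigma: float = 1.5) -> np.ndarray:
    """v(n1, n2) = h_g(n1, n2) * u(n1, n2): the convolution of equations (1)/(3)."""
    return convolve2d(u, gaussian_kernel(sigma, radius=3),
                      mode="same", boundary="symm")
```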
- FIG. 8 is a diagram for explaining a Gaussian filter.
- The Gaussian filter 1815-1 is a smoothing filter; for example, as shown in FIG. 8, it performs a weighted arithmetic operation according to a two-dimensional Gaussian distribution centered on the pixel data of interest, thereby performing smoothing processing.
- The median filter 1815-2 arranges, for example, the pixel data of an n × n local region around the pixel data of interest (0, 0) shown in FIG. 8 in order of value, and sets the pixel value of the pixel data in the middle of the order as the pixel value of the pixel data of interest.
- The maximum value filter 1815-3 sets, for example, the maximum pixel value among the pixel data of the n × n local region around the pixel of interest as the pixel value of the pixel data of interest.
- The minimum value filter 1815-4 sets the minimum pixel value among the pixel data of the n × n local region around the pixel of interest as the pixel value of the pixel data of interest.
- The two-dimensional adaptive noise removal filter 1815-5 is, for example, a so-called Wiener filter, and performs filtering that improves the image by minimizing the mean square error with respect to the image data.
- The neighborhood filter 1815-6 is a filtering process that calculates an output pixel based on the pixel values of, for example, n × n pixels in the image data.
- Specifically, the neighborhood filter 1815-6 performs filtering based on values such as the maximum value, the minimum value, and the standard deviation of the neighboring values, according to the data.
- The averaging filter 1815-7 performs a filtering process that calculates the average of the pixel values of, for example, n × n pixels in the image data and sets it as the output pixel value.
- The Gaussian low-pass filter 1815-8 performs noise removal and smoothing processing. Specifically, it smooths the image data based on Gaussian weighting.
- The two-dimensional Laplacian approximation filter 1815-9 performs second-order differentiation processing based on the image data, and is used for edge detection and the like.
- The Laplacian, described in detail below, can be expressed in a two-dimensional Euclidean coordinate system as in equation (4):

  $$\nabla^2 f = \frac{\partial^2 f}{\partial x^2} + \frac{\partial^2 f}{\partial y^2} \quad (4)$$

- The Laplacian can also be displayed as a 3 × 3 matrix, as in equation (5), using for example a predetermined number α; one common form, with the pixel of interest at the center of the matrix, is

  $$\frac{1}{\alpha + 1}\begin{pmatrix} \alpha & 1-\alpha & \alpha \\ 1-\alpha & -4 & 1-\alpha \\ \alpha & 1-\alpha & \alpha \end{pmatrix} \quad (5)$$
- The Gaussian Laplacian filter 1815-10 convolves the Gaussian function hg(n1, n2) of equation (6) (the same Gaussian as in equation (2), using for example the standard deviation σ) with the Laplacian. Specifically, as in equations (7) and (1), the noise removal processing is performed using a Gaussian Laplacian filter h(n1, n2); up to a normalization constant, the standard form is

  $$h(n_1, n_2) \propto \frac{n_1^2 + n_2^2 - 2\sigma^2}{\sigma^4}\exp\!\left(-\frac{n_1^2 + n_2^2}{2\sigma^2}\right) \quad (7)$$

- The Gaussian Laplacian filter can also be expressed as a matrix, as in equation (8), using for example a predetermined value σ, with the pixel of interest at the center of the matrix.
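As a concrete illustration of equations (6) to (8), the sampled Laplacian-of-Gaussian kernel and a library equivalent can be sketched as follows (normalization constants are omitted, as above; this is not the patent's own code):

```python
import numpy as np
from scipy import ndimage

def log_kernel(sigma: float, radius: int) -> np.ndarray:
    """Sampled Gaussian Laplacian h(n1, n2) of equation (7),
    up to a normalization constant."""
    n1, n2 = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    r2 = n1 ** 2 + n2 ** 2
    return (r2 - 2.0 * sigma ** 2) / sigma ** 4 * np.exp(-r2 / (2.0 * sigma ** 2))

def edge_response(im: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Second-derivative (LoG) response of the image, cf. FIG. 9F."""
    return ndimage.gaussian_laplace(im.astype(np.float64), sigma)
```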
- FIGS. 9A to 9F are diagrams for explaining a Gaussian Laplacian filter.
- For simplicity, the image data is described here as one-dimensional.
- Edges can be detected by performing spatial differentiation; spatial differentiation includes first-order and second-order differentiation.
- In FIGS. 9A to 9F, the vertical axis is the pixel value and the horizontal axis is the x axis.
- As shown in FIG. 9A, for example, a first pixel value f1 and a second pixel value f2 change continuously over a predetermined width L.
- In first-order differentiation processing, for example when the image data has the pixel values f(x) shown in FIG. 9B, the first derivative f′(x) shown in FIG. 9C is obtained, and an edge is identified by detecting a sharp change in f′(x) of the image after the first-order differentiation processing.
- The edge detection processing may also be performed by second-order differentiation processing (Laplacian).
- For example, when the image data has the pixel values f(x) shown in FIG. 9D, the first derivative f′(x) shown in FIG. 9E and the second derivative f″(x) shown in FIG. 9F are obtained.
- The sign of the second derivative f″(x) changes at the point of steepest slope within the edge. Therefore, the point at which the second derivative crosses the x axis (called the zero-crossing point) P_cr indicates the position of the edge.
- Actual image data is two-dimensional, and in actual edge detection the position of the zero-crossing point P_cr in the image data subjected to second-order differentiation processing is specified as an edge.
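The zero-crossing detection can be sketched as follows; comparing the sign of each pixel's LoG response with that of its right and lower neighbours is one common way to locate the points P_cr (an illustration, not the patent's implementation):

```python
import numpy as np
from scipy import ndimage

def zero_crossing_edges(im: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Mark the zero-crossing points P_cr of the second derivative as edges."""
    log = ndimage.gaussian_laplace(im.astype(np.float64), sigma)
    sign = log > 0
    zc = np.zeros(im.shape, dtype=bool)
    zc[:-1, :] |= sign[:-1, :] != sign[1:, :]   # sign change against lower neighbour
    zc[:, :-1] |= sign[:, :-1] != sign[:, 1:]   # sign change against right neighbour
    return zc
```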
- In the present embodiment, it is assumed that the selection unit 1814 selects the Gaussian filter 1815-1 and the Gaussian Laplacian filter 1815-10 as the noise removal processing.
- The Gaussian filter 1805 corresponds to the Gaussian filter 1815-1, and the Gaussian Laplacian filter 1806 corresponds to the Gaussian Laplacian filter 1815-10.
- FIGS. 10A to 10C are diagrams for explaining the noise removal processing of the data processing device shown in FIG. 1.
- FIG. 11 is a flowchart for explaining the operation of the data processing device shown in FIG. 1. The operation of the data processing device, in particular the operation related to the noise removal processing, will be described with reference to FIGS. 10A to 10C and 11.
- The selection unit 1814 detects the noise distribution characteristic of the signal S1804 and, based on the detection result, outputs to the noise removal filter 1815 the signal S1814 for selecting a noise removal filter suited to the noise characteristic.
- In the present embodiment, the selection unit 1814 outputs to the noise removal filter 1815 the signal S1814 for selecting the Gaussian filter 1815-1 and the Gaussian Laplacian filter 1815-10 as the noise removal processing.
- The noise removal filter 1815 selects one (at least one) noise removal filter based on the signal S1814, performs noise removal processing on the signal S1804 using the selected filter, and generates image data S1806.
- Here, the noise removal filter 1815 selects the Gaussian filter 1815-1 and the Gaussian Laplacian filter 1815-10.
- In the following, the Gaussian filter 1815-1 and the Gaussian Laplacian filter 1815-10 are described as the Gaussian filter 1805 and the Gaussian Laplacian filter 1806, respectively.
- The Gaussian filter 1805 performs the noise removal processing of equations (1) and (3) based on the signal S1804 shown for example in FIG. 10A, generates the image data S1805 shown for example in FIG. 10B, and outputs it to the Gaussian Laplacian filter 1806.
- The Gaussian Laplacian filter 1806 performs edge enhancement processing based on the signal S1805 shown for example in FIG. 10B, and generates and outputs the image data S1806 shown for example in FIG. 10C.
- This image data S1806 is binarized image data.
- In the present embodiment, the binarization processing is performed based on the threshold defined in the first range r1 shown in FIG. 4C.
- As described above, the filter selected by the selection unit 1814 performs noise removal processing based on the signal S1804, after which edge enhancement processing is performed by the Gaussian Laplacian filter 1806 and the result is binarized.
- It is thus possible to remove noise caused by irregular reflection from the living body of the subject h and by devices such as the imaging unit 11, and to generate appropriately binarized image data based on a threshold within the predetermined first range r1.
- Since the selection unit 1814 selects a filter according to the noise characteristic, noise can be removed with high accuracy.
- In addition, noise can be removed with high accuracy from image data generated by imaging the light transmitted through a site including blood vessels of the subject h, and a pattern representing the blood vessels can be appropriately binarized to generate a viewable image.
- FIGS. 12A to 12D are conceptual diagrams for explaining the operation of the data processing device shown in FIG. 1.
- Based on the binarized image data S1806 generated by the above-described processing, the data processing device 1 removes, as shown in FIGS. 12A and 12B, pixels of noise components smaller than a predetermined size ar_th1. Further, based on the binarized image data S1806 shown for example in FIG. 12C, the data processing device 1 performs connection processing of pixel data g21 and g22 of the same pixel value lying within a predetermined distance ar_th2, and generates image data having, for example, the linear pattern g2 shown in FIG. 12D. In the present embodiment, this linear pattern corresponds to an example of a pattern indicating blood vessels.
- Specifically, for each of the plurality of pieces of pixel data that constitute the image data and indicate pixel values, the data processing device 1 performs degeneration (erosion) processing that sets the minimum pixel data among the pixel data in a first area around that pixel data as the pixel data, and then, for each piece of pixel data, dilation processing that sets the maximum pixel data among the pixel data in a second area, larger than the first area, around that pixel data as the pixel data, thereby generating image data including the linear pattern.
- FIGS. 13A to 13F are diagrams for explaining the degeneration processing and the expansion processing of the data processing device shown in FIG.
- Based on the image data S1806, the first degeneration (erosion) processing unit 1807 generates image data S1807 by setting, for each of the plurality of pieces of pixel data that constitute the image data S1806 and indicate pixel values, the minimum pixel data among the pixel data in a first area around that pixel data as the pixel data.
- Specifically, as shown in FIGS. 13A and 13B, the first degeneration processing unit 1807 sets the smallest pixel data among the pixel data within the cross-shaped element EL1 centered on the pixel data of interest g_att, as the first area, as the pixel value of the pixel of interest g_att.
- In the example of FIG. 13B, the minimum value 0 is set as the pixel data of interest g_att.
- Based on the image data S1807, the first dilation processing unit 1808 generates image data S1808 by setting, for each of the plurality of pieces of pixel data that constitute the image data S1807 and indicate pixel values, the maximum pixel data among the pixel data in the first area around that pixel data as the pixel data, and outputs it to the second dilation processing unit 1809.
- Specifically, as shown in FIG. 13C, the first dilation processing unit 1808 sets the largest pixel data among the pixel data within the cross-shaped element EL1 centered on the pixel data of interest g_att, as the first area, as the pixel value of the pixel of interest g_att.
- In the example of FIG. 13C, the maximum value 1 is set as the pixel data of interest g_att.
- Based on the image data S1808, the second dilation processing unit 1809 generates image data S1809 by setting, for each of the plurality of pieces of pixel data constituting the image data S1808, the maximum pixel data among the pixel data in a second area, larger than the first area, around that pixel data as the pixel data, and outputs it to the second degeneration processing unit 1810.
- Specifically, the second dilation processing unit 1809 sets, as the second area larger than the first area, the 3 × 3 rectangular element EL2 centered on the pixel data of interest g_att, and sets the maximum pixel data among the pixel data within it as the pixel value of the pixel of interest g_att.
- The maximum value 1 is set as the pixel data of interest g_att.
- Based on the image data S1809, the second degeneration processing unit 1810 generates image data S1810 by setting, for each of the plurality of pieces of pixel data that constitute the image data S1809 and indicate pixel values, the minimum pixel data among the pixel data in the second area, larger than the first area, around that pixel data as the pixel data.
- Specifically, the second degeneration processing unit 1810 sets, as the second area larger than the first area, the 3 × 3 rectangular element EL2 centered on the pixel data of interest g_att, and sets the minimum pixel data among the pixel data within it as the pixel value of the pixel of interest g_att.
- FIGS. 10C and 14A to 14C are diagrams for explaining the operation of the data processing device shown in FIG. 1.
- FIG. 15 is a flowchart for explaining the operation of the data processing device shown in FIG. 1. The operation of the data processing device, in particular the degeneration processing and the dilation processing, will be described with reference to FIGS. 10C, 14A to 14C, and 15.
- In step ST21, based on the image data S1806 shown in FIG. 10C, the first degeneration processing unit 1807 sets, as the first area shown for example in FIG. 13A, the smallest pixel data among the pixel data within the cross-shaped element EL1 as the pixel value of the pixel of interest g_att, and generates the image data S1807 shown in FIG. 14A.
- By this processing, the first degeneration processing unit 1807 generates image data S1807 from which regions smaller than a predetermined size have been removed.
- In step ST22, based on the image data S1807 shown in FIG. 14A, the first dilation processing unit 1808 sets, for example as shown in FIG. 13C, the maximum pixel data among the pixel data within the cross-shaped element EL1 centered on the pixel data of interest as the pixel value of the pixel of interest g_att, and generates the image data S1808 shown in FIG. 14B.
- In step ST23, based on the image data S1808 shown in FIG. 14B, the second dilation processing unit 1809 sets, as the second area larger than the first area as shown for example in FIG. 13D, the largest pixel data among the pixel data within the 3 × 3 rectangular element EL2 around the pixel data of interest g_att as the pixel value of the pixel of interest g_att, and generates image data S1809.
- By this processing, the first dilation processing unit 1808 and the second dilation processing unit 1809 connect pixel data of the same pixel value lying within the predetermined distance ar_th2, and image data having a linear pattern is generated.
- In step ST24, based on the image data S1809, the second degeneration processing unit 1810 sets, as the second area larger than the first area as shown for example in FIG. 13D, the smallest pixel data among the pixel data within the 3 × 3 rectangular element EL2 centered on the pixel data of interest g_att as the pixel value of the pixel of interest g_att, and generates the image data S1810 shown in FIG. 14C.
- As described above, the data processing device 1 is provided with: the first degeneration processing unit 1807, which generates image data S1807 by setting, for each of the plurality of pieces of pixel data representing pixel values that constitute the image data S1806, the minimum pixel data among the pixel data in the first area around that pixel data as the pixel data; the first dilation processing unit 1808, which generates image data S1808 by setting the maximum pixel data among the pixel data in the first area as the pixel data for each of the plurality of pieces of pixel data constituting the image data S1807; the second dilation processing unit 1809, which generates image data S1809 by setting the maximum pixel data among the pixel data in the second area, larger than the first area, around each piece of pixel data of the image data S1808 as the pixel data; and the second degeneration processing unit 1810, which generates image data S1810 by setting the minimum pixel data among the pixel data in the second area, larger than the first area, around each piece of pixel data as the pixel data. It is therefore possible to leave the linear pattern while removing minute patterns that constitute noise components.
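The four-stage sequence of the units 1807 to 1810 corresponds to a grayscale morphological opening with the cross-shaped element EL1 followed by a closing with the larger 3 × 3 element EL2. A minimal sketch assuming SciPy (the signal names in the comments follow the text; this is an illustration, not the patent's implementation):

```python
import numpy as np
from scipy import ndimage

# Structuring elements of FIGS. 13A to 13F.
EL1 = np.array([[0, 1, 0],
                [1, 1, 1],
                [0, 1, 0]], dtype=bool)   # cross-shaped element
EL2 = np.ones((3, 3), dtype=bool)         # 3x3 rectangular element

def remove_and_connect(s1806: np.ndarray) -> np.ndarray:
    """Opening with EL1 removes regions smaller than the element;
    closing with EL2 connects nearby pixel data of the same value."""
    s1807 = ndimage.grey_erosion(s1806, footprint=EL1)    # first degeneration
    s1808 = ndimage.grey_dilation(s1807, footprint=EL1)   # first dilation
    s1809 = ndimage.grey_dilation(s1808, footprint=EL2)   # second dilation
    s1810 = ndimage.grey_erosion(s1809, footprint=EL2)    # second degeneration
    return s1810
```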
- The low-pass filter unit 1811 performs, based on the image data S1810, filter processing that leaves the linear pattern, and generates image data S1811.
- Specifically, the low-pass filter unit 1811 specifies, among the frequency components in the two-dimensional Fourier space obtained by applying a two-dimensional Fourier transform to the image data S1810, the low-frequency component data below a threshold that leaves the linear pattern, and applies an inverse two-dimensional Fourier transform to the specified data to generate the image data S1811.
- FIGS. 16A to 16F are diagrams for explaining the first low-pass filter processing of the data processing device shown in FIG. 1. The operation of the low-pass filter unit 1811 will be described with reference to FIGS. 16A to 16F.
- the low-pass filter unit 1811 performs low-pass filter processing by changing the threshold value a plurality of times, for example, three times, in order to extract a linear pattern with high accuracy.
- First, the threshold of the frequency components that leaves the linear pattern will be described.
- The low-pass filter unit 1811 sets an area ar_ref serving as the reference value of the threshold in Fourier space, as shown in FIG. 16A.
- Specifically, a diamond-shaped reference area ar_ref is set on a 360 × 360 Fourier space centered on the origin O.
- As shown in FIG. 16B, an area ar_ref′ containing the reference area ar_ref is set by enlarging the reference area ar_ref by a predetermined enlargement ratio.
- A low-pass filter ar_LPF1 is set so as to cut the region ar_h indicating high-frequency components in the Fourier space.
- The region ar_h corresponds, in real space, to geometrically symmetric patterns, for example roughly circular patterns. By cutting this region ar_h, such geometrically symmetric patterns can be removed.
- For example, based on the image data S101 shown in FIG. 16D, the low-pass filter unit 1811 specifies the low-frequency component data within the area ar_LPF1 in Fourier space, as shown in FIG. 16C. When the specified low-frequency component data is subjected to inverse two-dimensional Fourier transform processing, the image S102 shown for example in FIG. 16E is obtained. When binarization processing (for example, rounding half up) is applied to the pixel values of the image data S102, the image data S103 shown in FIG. 16F is obtained.
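A sketch of this low-pass filter processing, assuming the diamond-shaped area ar_LPF1 can be written as |u| + |v| ≤ c on the centered Fourier grid; the 0.5 threshold plays the role of rounding half up for pixel data in [0, 1] (an illustration, not the patent's code):

```python
import numpy as np

def fourier_lowpass(im: np.ndarray, c: float) -> np.ndarray:
    """Keep only the frequency components inside the diamond |u| + |v| <= c
    (cf. ar_LPF1), invert the transform, and binarize."""
    F = np.fft.fftshift(np.fft.fft2(im))
    h, w = im.shape
    v, u = np.mgrid[-(h // 2):h - h // 2, -(w // 2):w - w // 2]
    F[np.abs(u) + np.abs(v) > c] = 0.0        # cut the high-frequency region ar_h
    out = np.real(np.fft.ifft2(np.fft.ifftshift(F)))
    return (out >= 0.5).astype(np.uint8)      # binarization by rounding half up
```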
- FIGS. 17A to 17E are diagrams for explaining the operation of the second low-pass filter processing of the low-pass filter unit.
- The low-pass filter unit 1811 performs the filter processing a plurality of times while enlarging the region serving as the threshold of the low-pass filter processing, that is, setting regions larger than ar_LPF1.
- Specifically, as the second low-pass filter processing, the low-pass filter unit 1811 sets an area larger than the area ar_LPF1 shown in FIG. 17A, for example the area ar_LPF2 shown in FIG. 17B.
- As the threshold, the area ar_LPF2 is set, for example, as the region on the two-dimensional Fourier space enclosed by the points (180, 156), (156, 180), (−156, 180), (−180, 156), (−180, −156), (−156, −180), (156, −180), and (180, −156), as shown in FIG. 17B.
- In the second low-pass filter processing, the low-pass filter unit 1811 specifies, based for example on the image data S102 after the first low-pass filter processing shown in FIGS. 16C and 17C, the low-frequency component data within the area ar_LPF2 on the Fourier space shown in FIG. 17B. When the specified low-frequency component data is subjected to inverse two-dimensional Fourier transform processing, the image S104 shown in FIG. 17D is obtained. When binarization processing (for example, rounding half up) is applied to the pixel values of the image data S104, the image data S105 shown in FIG. 17E is obtained.
- FIGS. 18A to 18E are diagrams for explaining the operation of the third low-pass filter processing of the low-pass filter unit.
- As the third low-pass filter processing, the low-pass filter unit 1811 sets an area larger than the area ar_LPF2 shown in FIG. 18A, for example, an area ar_LPF3 as shown in FIG. 18B.
- As the threshold value, for example, as shown in FIG. 18B, the area ar_LPF3 enclosed by the points (180, 157), (157, 180), (-157, 180), (-180, 157), (-180, -157), (-157, -180), (157, -180), (180, -157) is set.
- In the third low-pass filter processing, the low-pass filter unit 1811 specifies the low-frequency component data in the region ar_LPF3 in Fourier space, based on the image data shown in FIGS. 17D and 18A, that is, the image data after the second low-pass filter processing.
- When the specified low-frequency component data is subjected to inverse two-dimensional Fourier transform processing, the image data S106 shown in FIG. 18D is obtained. When the pixel values of the image data S106 are subjected to binarization processing (for example, rounding half up), the image data S107 shown in FIG. 18E is obtained.
- FIGS. 19A to 19F and FIGS. 20A to 20C are diagrams for explaining the operation of the low-pass filter unit of the data processing device.
- FIG. 21 is a flowchart for explaining the operation of the low-pass filter unit of the data processing device. The operation of the low-pass filter unit 1811 will be described with reference to FIGS. 14C, 19A to 19F, 20A to 20C, and 21.
- As a first low-pass filter process, the low-pass filter unit 1811 performs two-dimensional Fourier transform processing on the image data S1810 shown in FIGS. 14C and 19A, sets the area ar_LPF1 so as to cut the region ar_h of high-frequency components on the Fourier space, for example, as shown in FIG. 16C, specifies the low-frequency component data in the area ar_LPF1, and performs inverse two-dimensional Fourier transform processing to generate the image data S18011 shown in FIG. 19B (ST32). For example, when the image data S18011 is subjected to binarization processing (for example, rounding half up), the image data S18103 shown in FIG. 19C is obtained.
- In step ST33, as a second low-pass filter process, the low-pass filter unit 1811 performs two-dimensional Fourier transform processing based on the image data S18102 shown in FIGS. 19B and 19D, sets a larger area, for example, the area ar_LPF2 shown in FIG. 17B, specifies the low-frequency component data in the area ar_LPF2, and performs inverse two-dimensional Fourier transform processing to generate the image data S18014 shown in FIG. 19E (ST33).
- When the image data S18014 is subjected to binarization processing (for example, rounding half up), the image data S18104 is obtained.
- In step ST34, as a third low-pass filter process, the low-pass filter unit 1811 performs two-dimensional Fourier transform processing based on the image data S18104 shown in FIGS. 19E and 20A, sets an area larger than the area ar_LPF2, for example, the area ar_LPF3 shown in FIG. 18B (ST34), specifies the low-frequency component data in the area ar_LPF3 (ST35), and performs inverse two-dimensional Fourier transform processing to generate the image data S18106 shown in FIG. 20B. The image data S18106 is then subjected to binarization processing (for example, rounding half up) to generate the image data S1811 shown in FIG. 19F.
- As described above, the low-pass filter unit 1811 identifies, in the two-dimensional Fourier space obtained by performing a two-dimensional Fourier transform on the image data, the frequency components below a threshold set so as to leave a linear pattern in the image data, and performs inverse two-dimensional Fourier transform processing on them, so that a linear pattern can be extracted. In addition, a geometrically symmetric pattern, for example, a substantially circular pattern, can be removed.
- Further, the low-pass filter unit 1811 performs the low-pass filter processing a plurality of times while enlarging the filter area ar_LPF, so that a linear pattern can be extracted with higher accuracy.
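- Reusing the hypothetical helpers above, the three-pass scheme can be sketched as follows; the reach values are illustrative only, loosely following the vertex coordinates given for ar_LPF2 and ar_LPF3 (|u| + |v| = 336 and 337), with the value for ar_LPF1 assumed.

```python
import numpy as np

def multi_pass_lowpass(img: np.ndarray) -> np.ndarray:
    """Three low-pass passes with successively larger diamond pass-bands."""
    out = img.astype(float)
    for reach in (330, 336, 337):          # ar_LPF1 < ar_LPF2 < ar_LPF3
        out = diamond_lowpass(out, reach)  # keep low-frequency components
        out = binarize_round_half_up(out).astype(float)  # re-binarize each pass
    return out.astype(np.uint8)            # final result corresponds to S1811
```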
- FIGS. 22A to 22C are diagrams for explaining the operations of the mask unit and the skeleton unit of the data processing device.
- the data processing device 1 extracts an area used for authentication from the image data.
- the data processing device 1 extracts a region including a pattern indicating a blood vessel in the image data as a region used for authentication.
- The mask unit 1812 extracts, for example, an area P_N used for authentication from the image data S1811 shown in FIG. 20C, and removes a pattern P_ct not used for authentication.
- Specifically, the mask unit 1812 generates, based on the image data S1811, a mask pattern P_M for extracting the area P_N used for recognition and matching in the image data S1811, extracts the area indicated by the mask pattern P_M from the image data S1811, and generates, for example, the image data S1812 shown in FIG. 22B.
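- A sketch of this masking step follows. The patent does not fix how the mask pattern P_M is derived from the image data S1811; filling the holes of the binary foreground with SciPy's binary_fill_holes is one simple assumption used here for illustration.

```python
import numpy as np
from scipy.ndimage import binary_fill_holes

def apply_mask(s1811: np.ndarray) -> np.ndarray:
    """Clear pixels outside the authentication region P_N."""
    p_m = binary_fill_holes(s1811 > 0)   # assumed mask pattern P_M
    return np.where(p_m, s1811, 0)       # keep P_N, remove the pattern P_ct
```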
- the skeleton section 1813 performs skeleton processing on the basis of the image data S1812 to generate image data S1813.
- The skeleton unit 1813 outputs the image data S1813 to the authentication unit 103 as a signal S102.
- Specifically, the skeleton unit 1813 performs degeneration (erosion) processing using a morphological function based on the image data S1812 shown in FIG. 22B, narrows a pattern of interest, for example, a pattern indicating a blood vessel, and generates the image data S1813 by extracting only the central portion of the pattern, as shown in FIG. 22C.
- The image data S1813 shown in FIG. 22C is displayed with white and black inverted for simplicity of explanation.
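- A sketch of the skeleton step is given below. The patent describes erosion using a morphological function; scikit-image's skeletonize is used here as a stand-in that likewise keeps only the centre line of each pattern, so this is an approximation rather than the patent's exact procedure.

```python
import numpy as np
from skimage.morphology import skeletonize

def skeleton_step(s1812: np.ndarray) -> np.ndarray:
    """Narrow the pattern of interest to its central line (cf. S1813)."""
    return skeletonize(s1812 > 0).astype(np.uint8)
```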
- FIG. 23 is a flowchart for explaining the overall operation of the data processing device. The operation of the data processing device 1 will be briefly described with reference to FIG. 23. In the present embodiment, a case will be described in which a living body of a subject h, for example, a finger, is imaged to generate image data, a pattern indicating a finger vein in the image data is extracted, and authentication processing is performed based on the pattern.
- In step ST101, the CPU 18 causes the irradiation unit 1011 of the imaging system 101 to irradiate the finger of the subject h with near-infrared rays, for example.
- The imaging unit 11 generates RGB image data S11 based on the light transmitted through the subject h and input through the optical lens 1012.
- The grayscale conversion unit 1801 converts the RGB signal S11 into, for example, a grayscale of 256 gradations, and outputs the converted signal to the distribution data generation unit 1802 as a signal S1801 (ST102).
- the imaging system 101 generates the RGB image data S11, but this is not a limitation.
- For example, when the imaging system 101 generates grayscale image data S11, the image data S11 is output to the distribution data generation unit 1802 without performing the processing of the grayscale conversion unit 1801 in step ST102.
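- As an illustration, a minimal grayscale-conversion sketch in Python/NumPy follows; the standard luma coefficients are an assumption, since the patent does not fix the conversion formula.

```python
import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Convert an (H, W, 3) RGB image to a 256-gradation grayscale image."""
    weights = np.array([0.299, 0.587, 0.114])   # assumed luma coefficients
    return (rgb.astype(float) @ weights).astype(np.uint8)   # cf. signal S1801
```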
- In step ST103, based on the signal S1801, the distribution data generation unit 1802 generates, as distribution data d1, a histogram for the pixel data indicating pixel values in the range of 256 gradations as the first range r1, for example, as shown in FIG. 4C, taking the gradation value (also called the pixel value) on the horizontal axis and the number of pixel data (also called the frequency) on the vertical axis.
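- A minimal sketch of generating the distribution data d1 follows; the function name is hypothetical.

```python
import numpy as np

def distribution_data(s1801: np.ndarray) -> np.ndarray:
    """Histogram d1: gradation value on the x-axis, frequency on the y-axis."""
    hist, _ = np.histogram(s1801, bins=256, range=(0, 256))
    return hist   # hist[c] = number of pixel data with pixel value c
```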
- In step ST104, based on the signal S1802, the specifying unit 1803 specifies, as a second range r2, the range at or below the smallest pixel value r11 among the pixel values r11, r12, r13, and r14 at which the frequency takes the number th, for example, as shown in FIG. 5A, and outputs it as a signal S1803.
- The mapping unit 1804 maps the pixel data in the second range r2 specified by the specifying unit 1803 onto the first range r1 based on the signal S1803, generates second image data composed of the mapped pixel data, and outputs it to the Gaussian filter 1805 as a signal S1804.
- Specifically, the mapping unit 1804 performs mapping by enlarging the pixel data to the first range r1, which is the range of pixel values from 0 to 256, and generates the image data S1804 (ST105).
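- The specify-and-map steps might be sketched as follows; the rule for picking r11 (the smallest gradation value whose frequency reaches the number th) is a simplification of the text, and the function name is hypothetical.

```python
import numpy as np

def specify_and_map(img: np.ndarray, hist: np.ndarray, th: int) -> np.ndarray:
    """Specify the second range r2 = [0, r11], then stretch it over r1."""
    crossings = np.nonzero(hist >= th)[0]   # gradation values with freq >= th
    r11 = int(crossings.min()) if crossings.size else 255
    stretched = img.astype(float) * (255.0 / max(r11, 1))   # map r2 onto r1
    return np.clip(stretched, 0, 255).astype(np.uint8)      # cf. signal S1804
```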
- The selection unit 1814 detects the noise distribution characteristic of the signal S1804 and, based on the detection result, outputs to the noise removal filter 1815 a signal S1814 for selecting, from among a plurality of noise removal filters, one (at least one) noise removal filter suited to the noise characteristic.
- In the present embodiment, the selection unit 1814 outputs, to the noise removal filter 1815, a signal S1814 for selecting the Gaussian filter 1815-1 and the Gaussian Laplacian filter 1815-10 as the noise removal processing.
- The noise removal filter 1815 selects the noise removal filters according to the signal S1814, for example, the Gaussian filter 1815-1 and the Gaussian Laplacian filter 1815-10. For convenience of explanation, these will be described below as the Gaussian filter 1805 and the Gaussian Laplacian filter 1806, respectively.
- The Gaussian filter 1805 performs, for example, the noise removal processing shown in Expressions (1) and (3) based on the signal S1804 shown in FIG. 10A, generates the image data S1805, and outputs it to the Gaussian Laplacian filter 1806.
- The Gaussian Laplacian filter 1806 performs edge enhancement processing based on, for example, the signal S1805, and generates the binarized image data S1806 shown in FIG. 10C.
- When performing the binarization processing, the Gaussian Laplacian filter 1806 performs it based on, for example, a threshold defined within the first range r1 shown in FIG. 4C.
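- This stage might be sketched with SciPy as follows: gaussian_filter stands in for the Gaussian filter 1805 and gaussian_laplace for the Gaussian Laplacian filter 1806; the sigma value and the sign convention of the binarization threshold are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, gaussian_laplace

def denoise_and_enhance(s1804: np.ndarray, sigma: float = 2.0) -> np.ndarray:
    """Gaussian smoothing (cf. 1805), LoG edge enhancement (cf. 1806), binarize."""
    s1805 = gaussian_filter(s1804.astype(float), sigma=sigma)  # noise removal
    log = gaussian_laplace(s1805, sigma=sigma)                 # edge enhancement
    # Assumed convention: LoG is negative at the centre of bright line-like
    # structures, so thresholding at zero keeps them as foreground.
    return (log < 0.0).astype(np.uint8)                        # cf. S1806
```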
- In step ST108, based on the image data S1806 shown in FIG. 10C, the first degeneration processing unit 1807 takes, as the pixel value of the target pixel g_att, the minimum pixel data among the pixel data within the cross-shaped element EL1 centered on the target pixel as a first area, and generates the image data S1807 as shown in FIG. 14A.
- In step ST109, based on the image data S1807 shown in FIG. 14A, the first dilation processing unit 1808 takes, as the pixel value of the target pixel g_att, the maximum pixel data among the pixel data within the cross-shaped element EL1 centered on the target pixel as the first area, and generates the image data S1808 shown in FIG. 14B.
- In step ST110, based on the image data S1808 shown in FIG. 14B, the second dilation processing unit 1809 takes, as the pixel value of the target pixel g_att, the maximum pixel data among the pixel data within the 3 × 3 rectangular element EL2 centered on the target pixel data g_att, as a second area larger than the first area, and generates the image data S1809.
- In step ST111, based on the image data S1809, the second degeneration processing unit 1810 takes, as the pixel value of the target pixel g_att, the minimum pixel data among the pixel data within the 3 × 3 rectangular element EL2 centered on the target pixel, as the second area larger than the first area, and generates the image data S1810 as shown in FIG. 14C.
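- The four steps ST108 to ST111 can be sketched with SciPy's grey-scale morphology as follows; erosion takes the minimum and dilation the maximum over the footprint, matching the text's description of the value given to the target pixel g_att.

```python
import numpy as np
from scipy.ndimage import grey_erosion, grey_dilation

EL1 = np.array([[0, 1, 0],
                [1, 1, 1],
                [0, 1, 0]], bool)   # cross-shaped first area
EL2 = np.ones((3, 3), bool)         # 3 x 3 rectangular second area

def morphology_chain(s1806: np.ndarray) -> np.ndarray:
    s1807 = grey_erosion(s1806, footprint=EL1)    # ST108: minimum over EL1
    s1808 = grey_dilation(s1807, footprint=EL1)   # ST109: maximum over EL1
    s1809 = grey_dilation(s1808, footprint=EL2)   # ST110: maximum over EL2
    return grey_erosion(s1809, footprint=EL2)     # ST111: minimum over EL2 -> S1810
```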
- As the first low-pass filter processing, the low-pass filter unit 1811 performs two-dimensional Fourier transform processing on, for example, the image data S1810 shown in FIGS. 14C and 19A, sets the region ar_LPF1 so as to cut the region ar_h of high-frequency components in the Fourier space, for example, as shown in FIG. 16C, specifies the low-frequency component data in the region ar_LPF1, and performs inverse two-dimensional Fourier transform processing to generate the image data S18011 shown in FIG. 19B.
- As the second low-pass filter processing, the low-pass filter unit 1811 performs two-dimensional Fourier transform processing based on the image data S18102 shown in FIGS. 19B and 19D, sets an area larger than the area ar_LPF1, for example, the area ar_LPF2 shown in FIG. 17B, specifies the low-frequency component data in the area ar_LPF2, and performs inverse two-dimensional Fourier transform processing to generate the image data S18014 shown in FIG. 19E.
- As the third low-pass filter processing, the low-pass filter unit 1811 performs two-dimensional Fourier transform processing based on the image data S18104 shown in FIGS. 19E and 20A, sets an area larger than the area ar_LPF2, for example, the area ar_LPF3 shown in FIG. 18B, specifies the low-frequency component data in the area ar_LPF3, and performs inverse two-dimensional Fourier transform processing to generate the image data S18016 shown in FIG. 20B. The image data S18016 is then binarized (for example, rounding half up) (ST113) to generate the image data S1811 shown in FIG. 19F.
- In step ST114, the mask unit 1812 generates the mask pattern P_M based on the image data S1811, as shown in FIG. 22A, extracts the area indicated by the mask pattern P_M from the image data S1811, and generates, for example, the image data S1812 shown in FIG. 22B.
- The skeleton unit 1813 performs degeneration (erosion) processing using a morphological function based on, for example, the image data S1812 shown in FIG. 22B, narrows a pattern of interest, for example, a pattern indicating a blood vessel, generates the image data S1813 in which only the central part of the pattern is extracted, as shown in FIG. 22C, and outputs it to the authentication unit 103 as a signal S102.
- The authentication unit 103 performs matching processing between the signal S102 and, for example, the registered image data D_P stored in advance in the storage unit 17.
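- A hedged sketch of the matching step follows; the patent only states that the signal S102 is compared with the registered image data D_P, so the normalized overlap score and the acceptance threshold used here are assumptions for illustration.

```python
import numpy as np

def authenticate(s102: np.ndarray, d_p: np.ndarray, accept: float = 0.8) -> bool:
    """Compare the skeleton image S102 with registered image data D_P."""
    a, b = s102 > 0, d_p > 0
    union = np.logical_or(a, b).sum()
    score = np.logical_and(a, b).sum() / max(int(union), 1)  # Jaccard overlap
    return score >= accept
```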
- As described above, in the data processing device 1, the distribution data generation unit 1802 generates the distribution data as shown, for example, in FIG. 5A, the specifying unit 1803 specifies the second range r2, the mapping unit 1804 maps the second range onto the first range, and the constituent elements 1805 to 1810 and so on generate the third image data by binarization based on a threshold defined within the first range r1; therefore, even if the pixel value distribution data d1 differs for each subject h, the binarization processing can be performed appropriately.
- Further, a selection unit 1814 that selects, from among a plurality of noise removal processes, the noise removal process corresponding to the noise characteristic, and a noise removal filter 1815 having a plurality of different types of noise removal filters are provided.
- The noise removal filter selected by the selection unit 1814 performs noise removal processing on the signal S1804, after which edge enhancement processing is performed by the Gaussian Laplacian filter 1806. As a result, noise caused by irregular reflection from the living body of the subject h and by the imaging unit 11 is removed from the image data S1804, and appropriate binarized image data can be generated based on the predetermined threshold of the first range r1.
- Further, there are provided: a first degeneration processing unit 1807 that, for each of a plurality of pixel data constituting the image data, generates the image data S1807 by taking the minimum pixel data among the pixel data in the first area around the pixel data as the predetermined pixel data; a first dilation processing unit 1808 that, for each of a plurality of pixel data constituting the image data S1807, generates the image data S1808 by taking the maximum pixel data among the pixel data in the first area around the pixel data as the predetermined pixel data; a second dilation processing unit 1809 that, for each of a plurality of pixel data constituting the image data S1808, generates the image data S1809 by taking the maximum pixel data among the pixel data in the second area, larger than the first area, around the pixel data as the predetermined pixel data; and a second degeneration processing unit 1810 that, for each of a plurality of pixel data constituting the image data S1809, generates the image data S1810 by taking the minimum pixel data among the pixel data in the second area, larger than the first area, around the pixel data as the predetermined pixel data. Therefore, regions smaller than a predetermined size can be removed from the image data obtained by imaging the subject, pixel data that are reasonably close together can be connected, and patterns serving as noise components can be removed while a linear pattern is left.
- In addition, the low-pass filter unit 1811 performs two-dimensional Fourier transform processing on the image data, identifies the frequency components below a threshold set so as to leave the linear pattern in the image data, and performs inverse two-dimensional Fourier transform processing on the identified low-frequency component data; therefore, a linear pattern can be extracted and geometrically symmetric patterns can be removed.
- a pattern indicating a blood vessel of the subject h can be extracted with high accuracy.
- Conventionally, complicated processing such as vascular tracing using an AI filter was performed based on blood vessel information in the image data; since the data processing device 1 according to the present embodiment can extract a pattern indicating a blood vessel with high accuracy based on image data obtained by imaging the subject h, the processing load is reduced compared with the related art.
- the skeletonoleton section 1813 extracts the central part of the pattern showing the blood vessels when performing the skeleton processing.
- Therefore, skeleton image data that is less affected by expansion and contraction of the blood vessels due to changes in the physical condition of the subject h can be generated. Since the authentication unit 103 uses this image data for the authentication processing, the authentication processing can be performed with high accuracy even if the physical condition of the subject h changes.
- FIG. 24 is a diagram for explaining a remote control device according to a second embodiment, which uses the data processing device according to the present invention.
- A remote control device 1a according to the present embodiment incorporates the data processing device 1 according to the first embodiment into a general remote controller.
- The remote control device 1a includes, for example, an imaging unit 11, an input unit 12, an output unit 13, a communication interface 14, a RAM 15, a ROM 16, a storage unit 17, and a CPU 18, as in the data processing device 1 according to the first embodiment. Only differences from the data processing device 1 according to the first embodiment will be described.
- The remote control device 1a is provided with, for example, an irradiation unit 1011, an optical lens 1012, and an imaging unit 11 as the imaging system 101 in the main body unit 100.
- Under the control of the CPU 18, the output unit 13 transmits, for example, a control signal for causing the television device m_tv to perform predetermined processing, using infrared rays as carrier waves.
- the output unit 13 is configured by an infrared light emitting element.
- The television device m_tv performs predetermined processing according to the control signal received by the light receiving unit m_r, for example, displays a predetermined image on the display unit m_m.
- The storage unit 17 stores, for example, data D_t indicating a user's preference, specifically, a preference list. The data D_t is read and written by the CPU 18 as needed, and the CPU 18 performs processing corresponding to the data D_t, for example.
- FIG. 25 is a flowchart for explaining the operation of the remote control device 1a shown in FIG. 24.
- In step ST201, it is determined whether or not the user has touched the imaging system 101 provided on the side surface of the main body unit 100. For example, when a finger touches the imaging system 101, the process proceeds to step ST202.
- In step ST202, the CPU 18 irradiates the finger of the subject h with near-infrared rays from the irradiation unit 1011 and causes the imaging unit 11 to generate image data of the finger veins based on the transmitted light.
- In the present embodiment, the light emitted from the irradiation unit 1011 is used, but the present invention is not limited to this mode. For example, the imaging unit 11 may generate image data based on light transmitted through the subject h under natural light.
- In step ST203, the CPU 18 causes the extraction unit 102 to extract image data used for authentication, for example, skeleton image data indicating a pattern of a blood vessel, in the same manner as in the first embodiment, and to output it to the authentication unit 103 as a signal S102.
- In step ST204, the CPU 18 causes the authentication unit 103 to perform authentication processing by comparing the signal S102 with the registered image data D_P of the user stored in advance in the storage unit 17.
- If it is determined in step ST205 by the authentication unit 103 that the user is not a pre-stored user, the process returns to step ST201.
- On the other hand, if the authentication unit 103 identifies in step ST205 that the user is a pre-stored user, the CPU 18 performs processing according to the user's preference data D_t stored in the storage unit 17; for example, it outputs a control signal corresponding to the data D_t to the television device m_tv.
- As described above, since the remote control device 1a includes the data processing device according to the first embodiment, the television device m_tv can be controlled based on the authentication result, for example.
- For example, information such as age is included in the data D_t. If the authentication unit 103 identifies, as a result of the authentication, that the user is a minor, the CPU 18 disables specific buttons and restricts viewing of the television device m_tv, so that an age restriction function can be realized.
- Further, the data D_t may include a program guide customized for each user (such as a favorite list and a viewing history) and a scheduled recording list, and the CPU 18 controls these data so that they can be used when authentication succeeds, thereby performing processing corresponding to each user.
- FIG. 26 is a diagram for explaining a data processing system according to a third embodiment, which uses the data processing device according to the present invention.
- The data processing system 10b includes a remote control device 1a, a recording medium (also referred to as a medium) 1b, a data processing device 1c, and a television device m_tv. Only differences from the first and second embodiments will be described.
- In the data processing system 10b, the above-described identification processing is performed by both the remote control device 1a and the recording medium 1b, and processing according to both identification results is performed; for example, predetermined data stored in the recording medium 1b is read and written.
- the remote control device 1a has substantially the same configuration as the remote control device 1a according to the second embodiment, and includes the data processing device 1 according to the first embodiment.
- the recording medium 1b includes, for example, the data processing device 1 according to the first embodiment.
- The recording medium 1b is, for example, a magnetic recording medium such as a video tape, an optical disk, a magneto-optical disk, or a data recording medium such as a semiconductor memory.
- The recording medium 1b includes an imaging unit 11, an input unit 12, an output unit 13, a communication interface 14, a RAM 15, a ROM 16, a storage unit 17, and a CPU 18. Only differences from the data processing device 1 according to the first embodiment will be described.
- The recording medium 1b includes, for example, an irradiation unit 1011, an optical lens 1012, and an imaging unit 11 as the imaging system 101 in the main body 100b.
- the imaging system 101 is provided at a position where the user touches the main body 100b.
- The imaging unit 11 may be provided not in one place but over an area of the main body 100b that may be touched by the user.
- the data processing device 1c can read and write data stored in the recording medium 1b, for example, when the authentication process is normally performed.
- the data processing device 1c includes the data processing device according to the first embodiment.
- The data processing device 1c includes an imaging unit 11, an input unit 12, an output unit 13, a communication interface 14, a RAM 15, a ROM 16, a storage unit 17, and a CPU 18, as in the first embodiment. Only differences from the data processing device 1 according to the first embodiment will be described.
- The data processing device 1c includes, for example, a holding unit m_h that holds the recording medium 1b, a driver that reads and writes data on the recording medium 1b held by the holding unit m_h, a light receiving unit m_r, and the like.
- The television device m_tv has a display unit m_m that displays an image based on data from the driver of the data processing device 1c, for example.
- FIG. 27 is a flowchart for explaining the operation of the data processing system shown in FIG. 26. The operation of the data processing system 10b will be described with reference to FIG. 27, only for the differences from the first and second embodiments.
- The operation of the remote control device 1a in steps ST301 to ST304 is the same as that in steps ST201 to ST204 of the second embodiment, and a detailed description thereof will be omitted.
- In step ST304, the CPU 18 of the remote control device 1a causes the authentication unit 103 to perform authentication processing by comparing the signal S102 with the registered image data D_P of a plurality of users stored in advance in the storage unit 17.
- If the authentication unit 103 of the remote control device 1a does not identify the user as a pre-stored user in step ST305, the process returns to step ST301. On the other hand, if it is determined in step ST305 that the user is a pre-stored user, the CPU 18 stores the identification result as A in the storage unit 17 (ST306).
- In step ST307, the user sets, for example, the recording medium 1b in the holding unit m_h of the data processing device (also referred to as a reproducing device) 1c.
- In step ST308, it is determined whether or not the user has touched the imaging system 101 provided, for example, on the side surface of the main body 100b of the recording medium 1b. If, for example, a finger touches the imaging system 101, the process proceeds to step ST309.
- In step ST309, the CPU 18 of the recording medium 1b causes the irradiation unit 1011 to irradiate the finger of the subject h with near-infrared rays, and causes the imaging unit 11 to generate image data of finger veins based on the transmitted light.
- In step ST310, the CPU 18 of the recording medium 1b causes the extraction unit 102 to extract image data used for authentication, for example, skeleton image data indicating a pattern of a blood vessel, as in the first embodiment, and outputs it as a signal S102.
- In step ST311, the CPU 18 of the recording medium 1b causes the authentication unit 103 to perform authentication processing by comparing the signal S102 with the registered image data D_P of a plurality of users stored in advance in the storage unit 17.
- If the authentication unit 103 of the recording medium 1b does not identify the user as a pre-stored user in step ST312, the process returns to step ST308. On the other hand, if the authentication unit 103 of the recording medium 1b identifies in step ST312 that the user is a pre-stored user, the CPU 18 of the recording medium 1b stores the identification result as B, for example (ST313).
- In step ST314, the identification result A from step ST306 is compared with the identification result B from step ST313 to determine whether or not the users are the same.
- This determination processing may be performed on the recording medium 1b; in this case, the recording medium 1b performs the determination based on the identification result A transmitted from the remote control device 1a and the identification result B obtained by the recording medium 1b.
- Alternatively, this determination processing may be performed by the data processing device 1c; in this case, the data processing device 1c performs the determination based on the identification result A transmitted from the remote control device 1a and the identification result B obtained by the recording medium 1b.
- If it is determined that the users are the same, the recording medium 1b permits the data processing device 1c to read and write the stored data, for example, to perform playback or recording (ST315). If it is determined that the users are not the same, the data processing device 1c is prohibited from reproducing or recording the data stored in the recording medium 1b (ST316).
- When permitted, the data processing device 1c reads the data stored in the recording medium 1b and displays an image corresponding to the data on the display unit m_m of the television device m_tv.
- As described above, since identification is performed by both the remote control device 1a and the recording medium 1b, data is stored in and read from the recording medium 1b, for example, only when the identification results indicate the same user, so that tampering with data, eavesdropping, and overwriting of data by others can be prevented.
- FIG. 28 is a diagram illustrating a portable communication device using a data processing device according to a fourth embodiment of the present invention.
- the portable communication device 1d according to the present embodiment includes the data processing device 1 according to the first embodiment.
- The portable communication device 1d has a general telephone call function, an e-mail function, an address book function, and the like.
- The portable communication device 1d includes, for example, an imaging unit 11, an input unit 12, an output unit 13, a communication interface 14, a RAM 15, a ROM 16, a storage unit 17, and a CPU 18, as in the first embodiment. Only differences from the data processing device 1 according to the first embodiment will be described.
- In the present embodiment, an imaging system 101 is provided on the call button bt or the like as the input unit 12 (all buttons bt may be so equipped).
- The portable communication device 1d captures an image of the finger veins when the button bt is operated by the user, and activates the communication function as a mobile phone when the user is identified as an individual registered in advance.
- a desired call function is executed via a base station (not shown).
- FIG. 29 is a flowchart for explaining the operation of the portable communication device shown in FIG. 28. The differences between the operation of the portable communication device 1d and that of the data processing devices according to the first to third embodiments will be described with reference to FIG. 29.
- In step ST401, it is determined whether or not the user has touched the imaging system 101 provided on the call button bt or the like as the input unit 12. If, for example, a finger touches the imaging system 101, the process proceeds to step ST402.
- In step ST402, the CPU 18 irradiates the finger of the subject h with near-infrared rays from the irradiation unit 1011 and causes the imaging unit 11 to generate image data of the finger veins based on the transmitted light.
- In step ST403, as in the first embodiment, the CPU 18 causes the extraction unit 102 to extract image data used for authentication, for example, skeleton image data indicating a pattern of a blood vessel, and outputs it to the authentication unit 103 as a signal S102.
- In step ST404, the CPU 18 causes the authentication unit 103 to perform authentication processing by comparing the signal S102 with the registered image data D_P of the user stored in advance in the storage unit 17.
- In step ST405, if the authentication unit 103 identifies that the user is a pre-stored user, the communication function as a mobile phone is activated, and use of the mobile phone by the user is permitted (ST406).
- In step ST407, it is determined whether the portable communication device 1d is to be lent to another person; if so, it is determined whether a button bt unique to the owner user has been operated (ST408).
- If the CPU 18 determines in step ST408 that the specific button bt has been operated, the CPU 18 enables the predetermined functions even when the device is used by another person.
- In this state, the user who is the owner can lend the portable communication device 1d to another person (ST409).
- If the CPU 18 determines in step ST408 that the specific button bt has not been operated by the user, or if, in step ST407, the device is not to be lent to another person, the series of processing ends.
- In step ST405, if the authentication unit 103 identifies that the user is not a pre-stored user and the identification processing has not yet failed a plurality of times (ST410), the process returns to step ST401.
- On the other hand, if the identification processing has failed a plurality of times, the CPU 18 prohibits the authentication processing (ST411) and transmits, to a data communication device PC registered in advance, a notification that the authentication processing has failed multiple times (ST412).
- As described above, the portable communication device 1d activates the predetermined functions only when the user is authenticated as the owner registered in advance as a result of the above-described authentication processing; for example, even if it is lost, it can be prevented from being used by others.
- Further, when a key is pressed by a user other than a pre-registered user (when identification fails multiple times), information such as the current position of the portable communication device 1d, obtained by GPS (Global Positioning System), can be transmitted to the pre-registered contact data communication device PC.
- Further, the user's address book data ad may be stored, separately from the portable communication device 1d, in a server device sv that is accessible through a communication network (not shown).
- The CPU 18 of the portable communication device 1d accesses the server device sv via the communication network (not shown) and downloads the user's address book data ad.
- In this way, unauthorized reading of the address book by another user can be prevented, and as long as authentication is performed correctly on the portable communication device 1d, the same address book data ad can be used.
- When the owner of the portable communication device 1d wishes to lend it to another person with his or her consent, the owner operates the dedicated button bt, which allows others to use it. That is, when the specific button bt has been operated, the CPU 18 operates the predetermined functions without performing authentication processing even when the device is used by another person.
- FIG. 30 is a diagram for explaining a fifth embodiment of the data processing device according to the present invention.
- a telephone 1e using the data processing device according to the present embodiment includes the data processing device 1 according to the first embodiment, and has a personal authentication function using a finger vein.
- Like the portable communication device 1d according to the fourth embodiment, the telephone 1e according to the present embodiment, provided in each home, has an imaging system 101 on, for example, a specific button bt or the like (all buttons, or the main body, are also possible).
- The configuration of the telephone 1e is almost the same as that of the portable communication device 1d according to the fourth embodiment. Only the differences will be described.
- the telephone 1e acquires an image of a finger vein, for example, when the button bt is pressed.
- the telephone 1e has a use restriction function and a personal identification function.
- For example, the telephone 1e has a restriction function in which a maximum available time is set in advance for each user, and a telephone call cannot be made once the predetermined time is reached.
- Further, an imaging system 101 is provided in the receiver 1e_r, and authentication processing is performed continuously by periodically imaging the veins of the finger of the subject h.
- the telephone 1 e periodically performs an authentication process and updates the registered image data in the storage unit 17.
- FIG. 31 is a flowchart for explaining the operation of the telephone shown in FIG. 30. The operation of the telephone 1e will be described with reference to FIG. 31, taking as an example a case where the telephone 1e is used by a plurality of users.
- In step ST501, when the CPU 18 receives a signal indicating a call from another telephone, the CPU 18 identifies the user based on, for example, the telephone number of the calling party, and outputs the ring tone associated with that user from the speaker.
- the CPU 18 causes the imaging system 101 to irradiate the user's finger with light to image the finger (ST504).
- In step ST505, the same processing as in the data processing device 1 according to the first embodiment is performed to generate image data including a pattern indicating a blood vessel.
- In step ST506, the CPU 18 compares the generated image data with a list of registered image data stored in the storage unit 17.
- In step ST507, it is determined whether or not an identification timeout has occurred, that is, whether or not the processing time of the identification processing exceeds a predetermined time; if it is within the processing time, it is determined whether or not the user has been identified (ST508).
- If it is determined in step ST508 that the user has been properly identified, the CPU 18 sets a callable state (ST509). If it is determined that the identification is not appropriate, the process returns to step ST504 and the measurement is repeated. If a timeout occurs in step ST507, the CPU 18 switches to, for example, a so-called answering machine function (ST510).
- Also, in step ST502, when the call is not for the user himself or herself, the function is similarly switched to the answering machine function.
- When making a call instead of receiving one, the CPU 18 determines, for example, whether or not the user's finger has touched the imaging system 101 (ST511). If, for example, a finger touches the imaging system 101, the process proceeds to step ST512.
- In step ST512, the CPU 18 irradiates the finger of the subject h with near-infrared rays from the irradiation unit 1011 and causes the imaging unit 11 to generate image data of the finger veins based on the transmitted light.
- In step ST513, as in the first embodiment, the CPU 18 causes the extraction unit 102 to extract image data used for authentication, for example, skeleton image data indicating a pattern of a blood vessel, and outputs the extracted data to the authentication unit 103 as a signal S102.
- In step ST514, the CPU 18 causes the authentication unit 103 to perform authentication processing by comparing the signal S102 with the registered image data D_P of the user stored in advance in the storage unit 17.
- If the authentication unit 103 does not recognize in step ST515 that the user is a pre-stored user, the process returns to step ST511.
- If it is determined in step ST515 that the user is a pre-stored user, the CPU 18 displays the address book data ad of the identified user on the display section of the output unit 13 (ST516) and sets a callable state (ST517).
- In step ST518, it is determined whether or not a use time has been set. If so, the CPU 18 determines whether or not the call is within the available time (ST519), and if it is, determines whether or not the call has ended (ST520).
- If the call is not within the available time in step ST519, the user is warned, for example, a warning is shown on the display unit, and the call is forcibly terminated (ST521).
- If the use time is not set in step ST518, or if the call is terminated in step ST520, the series of processes ends.
- As described above, since the telephone 1e incorporates the data processing device according to the first embodiment and the use time can be set, for example, excessively long telephone calls can be prevented.
- In the present embodiment, the imaging system 101 is provided on the button bt, but this is not a limitation; it may be provided in the receiver 1e_r or the like, or in both the button bt and the receiver 1e_r, to be used depending on the situation.
- A plurality of users, for example, a whole family, can use the same telephone 1e, and an address book for each recognized user may be displayed.
- Further, the ring tone for an incoming call can be set for each user, and the telephone can be set so that only that user can answer the call, which provides high security.
- Security is also high because personal identification is performed when the receiver is picked up, and calls can be made only by preset users.
- Further, when the preset user is away, even if another family member is at home, the telephone can switch to the answering machine, which is highly secure.
- FIG. 32 is a diagram for explaining a sixth embodiment of the data processing device according to the present invention.
- A PDA (Personal Digital Assistant) 1f according to the present embodiment includes the data processing device 1 according to the first embodiment.
- the PDA 1f is provided with an imaging system 101 on the side surface of the main body, the button bt, and the like.
- For example, private data can be displayed only when the user is identified as the owner.
- FIG. 33 is a diagram for explaining a seventh embodiment of the data processing device according to the present invention.
- the mouse 1 g according to the present embodiment is a so-called mouse as an input device of, for example, a personal computer PC, and includes the data processing device 1 according to the first embodiment.
- the mouse 1 g is provided with an imaging system 101 at the button bt or the like, for example.
- For example, the personal computer PC can be set to permit login only for the registered user; this can be used, for example, when turning on the power of the personal computer PC or when displaying a login screen.
- In the above-described embodiments, the data processing device 1 is incorporated in a remote control device, a portable communication device, and the like, but the present invention is not limited to these embodiments.
- For example, an imaging system 101 may be provided on a keyboard, the subject h may be imaged during key input, and authentication processing may be performed based on the imaging data.
- For example, when ordering a product over a network, the subject h may be imaged while the necessary items are being input, and authentication processing may be performed based on the imaging data, so that an order cannot be placed unless the person is the registered person himself or herself; identity is thereby managed doubly, and security is further improved.
- Further, an imaging system 101 may be provided in a touch panel of, for example, a bank ATM (automatic teller machine); when necessary information is input, the subject h is imaged and authentication processing is performed based on the imaging data. For example, by allowing cash to be withdrawn only when the individual is identified, security is improved.
- Further, an imaging system 101 may be provided in a house key, a mailbox, or the like; an image of the subject h is taken, and authentication processing is performed based on the imaging data. Security is thereby improved.
- Further, the data processing device 1 may be provided on a bicycle, an image of the subject h may be captured, and authentication processing may be performed based on the captured data; security is thereby further improved.
- Further, the data processing device 1 may be applied to a reader/writer for a credit card or the like and used as a substitute for a signature when using the credit card.
- the present invention is applicable to, for example, an image processing apparatus that processes image data obtained by imaging a subject.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Processing (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Image Input (AREA)
Abstract
Description
Claims
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/518,607 US7450757B2 (en) | 2003-05-06 | 2004-04-30 | Image processing device and image processing method |
EP04730682A EP1622079B1 (en) | 2003-05-06 | 2004-04-30 | Image processing device and image processing method |
US12/246,043 US7702174B2 (en) | 2003-05-06 | 2008-10-06 | Image processing apparatus and image processing method |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2003164372A JP4547869B2 (ja) | 2003-05-06 | 2003-05-06 | 画像処理方法、および画像処理装置 |
JP2003164373A JP4389489B2 (ja) | 2003-05-06 | 2003-05-06 | 画像処理方法、および画像処理装置 |
JP2003-164372 | 2003-05-06 |
Related Child Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/518,607 A-371-Of-International US7450757B2 (en) | 2003-05-06 | 2004-04-30 | Image processing device and image processing method |
US12/246,043 Division US7702174B2 (en) | 2003-05-06 | 2008-10-06 | Image processing apparatus and image processing method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2004100069A1 true WO2004100069A1 (ja) | 2004-11-18 |
Family
ID=40346603
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2004/006328 WO2004100069A1 (ja) | 2003-05-06 | 2004-04-30 | 画像処理装置、および画像処理方法 |
Country Status (4)
Country | Link |
---|---|
US (2) | US7450757B2 (ja) |
EP (1) | EP1622079B1 (ja) |
JP (2) | JP4389489B2 (ja) |
WO (1) | WO2004100069A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004329826A (ja) * | 2003-05-06 | 2004-11-25 | Sony Corp | 画像処理方法、および画像処理装置 |
CN111507310A (zh) * | 2020-05-21 | 2020-08-07 | 国网湖北省电力有限公司武汉供电公司 | 一种基于φ-otdr的光缆通道内人为触缆作业信号识别方法 |
Families Citing this family (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI297875B (en) * | 2004-09-16 | 2008-06-11 | Novatek Microelectronics Corp | Image processing method and device using thereof |
JP3877748B1 (ja) * | 2005-10-28 | 2007-02-07 | 京セラ株式会社 | 生体認証装置 |
JP4992212B2 (ja) | 2005-09-06 | 2012-08-08 | ソニー株式会社 | 画像処理装置、画像判定方法及びプログラム |
JP4305431B2 (ja) | 2005-09-06 | 2009-07-29 | ソニー株式会社 | 画像処理装置、マスク作成方法及びプログラム |
JP2007207009A (ja) | 2006-02-02 | 2007-08-16 | Fujitsu Ltd | 画像処理方法及び画像処理装置 |
JP4816321B2 (ja) * | 2006-08-14 | 2011-11-16 | ソニー株式会社 | 認証装置及び認証方法並びにプログラム |
JP5121204B2 (ja) | 2006-10-11 | 2013-01-16 | オリンパス株式会社 | 画像処理装置、画像処理方法、および画像処理プログラム |
US8068673B2 (en) * | 2006-12-01 | 2011-11-29 | Beihang University | Rapid and high precision centroiding method and system for spots image |
JP4588015B2 (ja) * | 2006-12-25 | 2010-11-24 | 京セラ株式会社 | 生体認証装置 |
JP2008287436A (ja) * | 2007-05-16 | 2008-11-27 | Sony Corp | 静脈パターン管理システム、静脈パターン登録装置、静脈パターン認証装置、静脈パターン登録方法、静脈パターン認証方法、プログラムおよび静脈データ構造 |
JP2008287428A (ja) * | 2007-05-16 | 2008-11-27 | Sony Corp | 静脈パターン管理システム、静脈パターン登録装置、静脈パターン認証装置、静脈パターン登録方法、静脈パターン認証方法、プログラムおよび静脈データ構造 |
JP2008287432A (ja) * | 2007-05-16 | 2008-11-27 | Sony Corp | 静脈パターン管理システム、静脈パターン登録装置、静脈パターン認証装置、静脈パターン登録方法、静脈パターン認証方法、プログラムおよび静脈データ構造 |
JP2009187520A (ja) * | 2008-01-09 | 2009-08-20 | Sony Corp | 認証装置、認証方法、登録装置及び登録方法 |
JP4941311B2 (ja) | 2008-01-09 | 2012-05-30 | ソニー株式会社 | マウス |
EP2291822A1 (en) * | 2008-05-21 | 2011-03-09 | Koninklijke Philips Electronics N.V. | Image resolution enhancement |
US8238654B2 (en) * | 2009-01-30 | 2012-08-07 | Sharp Laboratories Of America, Inc. | Skin color cognizant GMA with luminance equalization |
US8798385B2 (en) * | 2009-02-16 | 2014-08-05 | Raytheon Company | Suppressing interference in imaging systems |
KR101129220B1 (ko) * | 2009-11-03 | 2012-03-26 | 중앙대학교 산학협력단 | 레인지 영상의 노이즈 제거장치 및 방법 |
US20110103655A1 (en) * | 2009-11-03 | 2011-05-05 | Young Warren G | Fundus information processing apparatus and fundus information processing method |
US20120019727A1 (en) * | 2010-07-21 | 2012-01-26 | Fan Zhai | Efficient Motion-Adaptive Noise Reduction Scheme for Video Signals |
KR20120089101A (ko) * | 2011-02-01 | 2012-08-09 | 삼성전자주식회사 | 터치 패널의 멀티 터치 검출 방법 및 이를 이용한 터치 스크린 장치의 동작 방법 |
JP2013011856A (ja) * | 2011-06-01 | 2013-01-17 | Canon Inc | 撮像システムおよびその制御方法 |
JP5178898B1 (ja) * | 2011-10-21 | 2013-04-10 | 株式会社東芝 | 画像信号補正装置、撮像装置、内視鏡装置 |
US9326008B2 (en) * | 2012-04-10 | 2016-04-26 | Google Inc. | Noise reduction for image sequences |
CA2906973C (en) * | 2013-04-04 | 2020-10-27 | Illinois Tool Works Inc. | Helical computed tomography |
US9105102B1 (en) * | 2013-10-01 | 2015-08-11 | The United States Of America As Represented By The Secretary Of The Navy | Method for processing radiographic images of rapidly moving objects such as shaped charge jet particles |
CA3002902C (en) * | 2015-12-18 | 2023-05-09 | Ventana Medical Systems, Inc. | Systems and methods of unmixing images with varying acquisition properties |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06339028A (ja) * | 1993-05-27 | 1994-12-06 | Toshiba Corp | 画像表示装置 |
JPH07210655A (ja) * | 1994-01-21 | 1995-08-11 | Nikon Corp | 眼科用画像処理装置 |
JPH08287255A (ja) * | 1995-04-12 | 1996-11-01 | Nec Corp | 皮膚紋様画像の画像特徴抽出装置および画像処理装置 |
JP2001144960A (ja) * | 1999-11-18 | 2001-05-25 | Kyocera Mita Corp | 画像処理装置 |
JP2002092616A (ja) * | 2000-09-20 | 2002-03-29 | Hitachi Ltd | 個人認証装置 |
JP2002135589A (ja) * | 2000-10-27 | 2002-05-10 | Sony Corp | 画像処理装置および方法、並びに記録媒体 |
Family Cites Families (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3656695B2 (ja) * | 1997-09-30 | 2005-06-08 | 富士写真フイルム株式会社 | 骨計測方法および装置 |
JP2003098117A (ja) * | 1997-12-25 | 2003-04-03 | Nec Corp | 画像の欠陥検出装置及びその欠陥検出方法 |
JPH11224892A (ja) * | 1998-02-05 | 1999-08-17 | Nippon Inter Connection Systems Kk | テープキャリアの欠陥検出装置および欠陥検出方法 |
US6507670B1 (en) * | 1998-03-05 | 2003-01-14 | Ncr Corporation | System and process for removing a background pattern from a binary image |
US6714674B1 (en) * | 1998-04-30 | 2004-03-30 | General Electric Company | Method for converting digital image pixel values |
JP2000090290A (ja) * | 1998-07-13 | 2000-03-31 | Sony Corp | 画像処理装置および画像処理方法、並びに媒体 |
JP2000134491A (ja) * | 1998-10-26 | 2000-05-12 | Canon Inc | 画像処理装置及びその制御方法及び記憶媒体 |
JP2000134476A (ja) * | 1998-10-26 | 2000-05-12 | Canon Inc | 画像処理装置及びその制御方法及び記憶媒体 |
US6891967B2 (en) * | 1999-05-04 | 2005-05-10 | Speedline Technologies, Inc. | Systems and methods for detecting defects in printed solder paste |
JP2001076074A (ja) * | 1999-09-03 | 2001-03-23 | Canon Inc | 医療用画像処理装置 |
JP4046465B2 (ja) * | 1999-09-17 | 2008-02-13 | 株式会社リコー | 画像処理装置、画像処理方法、及び画像処理システム |
JP2001160902A (ja) * | 1999-12-01 | 2001-06-12 | Canon Inc | 画像処理装置およびその方法 |
JP4148443B2 (ja) * | 2001-05-21 | 2008-09-10 | 株式会社リコー | 画像形成装置 |
JP3742313B2 (ja) * | 2001-05-10 | 2006-02-01 | 日本電信電話株式会社 | 画像照合装置、画像照合方法、プログラム及び記録媒体 |
JP3983101B2 (ja) * | 2001-05-25 | 2007-09-26 | 株式会社リコー | 画像処理装置、画像読み取り装置、画像形成装置およびカラー複写装置 |
US7130463B1 (en) * | 2002-12-04 | 2006-10-31 | Foveon, Inc. | Zoomed histogram display for a digital camera |
JP4389489B2 (ja) * | 2003-05-06 | 2009-12-24 | ソニー株式会社 | 画像処理方法、および画像処理装置 |
-
2003
- 2003-05-06 JP JP2003164373A patent/JP4389489B2/ja not_active Expired - Fee Related
- 2003-05-06 JP JP2003164372A patent/JP4547869B2/ja not_active Expired - Fee Related
-
2004
- 2004-04-30 EP EP04730682A patent/EP1622079B1/en not_active Expired - Fee Related
- 2004-04-30 US US10/518,607 patent/US7450757B2/en active Active
- 2004-04-30 WO PCT/JP2004/006328 patent/WO2004100069A1/ja active Application Filing
-
2008
- 2008-10-06 US US12/246,043 patent/US7702174B2/en not_active Expired - Fee Related
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH06339028A (ja) * | 1993-05-27 | 1994-12-06 | Toshiba Corp | 画像表示装置 |
JPH07210655A (ja) * | 1994-01-21 | 1995-08-11 | Nikon Corp | 眼科用画像処理装置 |
JPH08287255A (ja) * | 1995-04-12 | 1996-11-01 | Nec Corp | 皮膚紋様画像の画像特徴抽出装置および画像処理装置 |
JP2001144960A (ja) * | 1999-11-18 | 2001-05-25 | Kyocera Mita Corp | 画像処理装置 |
JP2002092616A (ja) * | 2000-09-20 | 2002-03-29 | Hitachi Ltd | 個人認証装置 |
JP2002135589A (ja) * | 2000-10-27 | 2002-05-10 | Sony Corp | 画像処理装置および方法、並びに記録媒体 |
Non-Patent Citations (2)
Title |
---|
See also references of EP1622079A4 |
WIECEK B ET AL.: "Advanced thermal image processing, for medical and biological application", PROCEEDINGS OF THE 23RD. ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY. 2001 CONFERENCE PROCEEDINGS (EMBS). ISTANBUL, TURKEY, OCT. 25 - 28, 2001, vol. 3, 25 October 2001 (2001-10-25), pages 2805 - 2807, XP010592244 |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004329826A (ja) * | 2003-05-06 | 2004-11-25 | Sony Corp | 画像処理方法、および画像処理装置 |
US7702174B2 (en) | 2003-05-06 | 2010-04-20 | Sony Corporation | Image processing apparatus and image processing method |
CN111507310A (zh) * | 2020-05-21 | 2020-08-07 | 国网湖北省电力有限公司武汉供电公司 | 一种基于φ-otdr的光缆通道内人为触缆作业信号识别方法 |
CN111507310B (zh) * | 2020-05-21 | 2023-05-23 | 国网湖北省电力有限公司武汉供电公司 | 一种基于φ-otdr的光缆通道内人为触缆作业信号识别方法 |
Also Published As
Publication number | Publication date |
---|---|
EP1622079A1 (en) | 2006-02-01 |
JP2004329826A (ja) | 2004-11-25 |
US20090041351A1 (en) | 2009-02-12 |
JP4389489B2 (ja) | 2009-12-24 |
EP1622079A4 (en) | 2010-09-29 |
JP4547869B2 (ja) | 2010-09-22 |
US7450757B2 (en) | 2008-11-11 |
US20050232483A1 (en) | 2005-10-20 |
US7702174B2 (en) | 2010-04-20 |
EP1622079B1 (en) | 2012-05-02 |
JP2004329825A (ja) | 2004-11-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2004100069A1 (ja) | 画像処理装置、および画像処理方法 | |
US8275174B2 (en) | Vein pattern management system, vein pattern registration apparatus, vein pattern authentication apparatus, vein pattern registration method, vein pattern authentication method, program, and vein data configuration | |
WO2022066955A1 (en) | Method to verify identity using a previously collected biometric image/data | |
US8111878B2 (en) | Vein authentication device and vein authentication method | |
JP5045344B2 (ja) | 登録装置、登録方法、認証装置及び認証方法 | |
US20160259960A1 (en) | Texture features for biometric authentication | |
US20140241598A1 (en) | Method and Apparatus for Processing Biometric Images | |
EP2148295A1 (en) | Vein pattern management system, vein pattern registration device, vein pattern authentication device, vein pattern registration method, vein pattern authentication method, program, and vein data structure | |
WO2005052859A1 (ja) | 眼画像入力装置および認証装置ならびに画像処理方法 | |
JP2003168084A (ja) | 本人認証システム及び方法 | |
CN101465736A (zh) | 一种身份认证方法和系统 | |
JP2005062990A (ja) | 虹彩認証装置 | |
EP2148302A1 (en) | Vein pattern management system, vein pattern registration device, vein pattern authentication device, vein pattern registration method, vein pattern authentication method, program, and vein data structure | |
US20040175023A1 (en) | Method and apparatus for checking a person's identity, where a system of coordinates, constant to the fingerprint, is the reference | |
US8320639B2 (en) | Vein pattern management system, vein pattern registration apparatus, vein pattern authentication apparatus, vein pattern registration method, vein pattern authentication method, program, and vein data configuration | |
JP2005027135A (ja) | 盗撮防止システム及び盗撮防止端末機 | |
CN110930154A (zh) | 身份验证方法和装置 | |
US20200074255A1 (en) | Graphic two-dimensional barcode and creating method thereof | |
CN110858360A (zh) | 扫图支付法 | |
JP2010020670A (ja) | 認証装置 | |
JP4935213B2 (ja) | 登録装置、認証装置、画像処理方法及びプログラム | |
Mil’shtein et al. | Applications of Contactless Fingerprinting | |
JP2001052178A (ja) | データの作成方法および作成装置ならびに照合方法およびシステム | |
WO2004086292A1 (en) | Method and device for classifying field types of a digital image | |
CN110910115A (zh) | 扫物支付法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
WWE | Wipo information: entry into national phase |
Ref document number: 10518607 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2004730682 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWP | Wipo information: published in national office |
Ref document number: 2004730682 Country of ref document: EP |