WO2013145164A1 - Imaging Device - Google Patents
Imaging Device
- Publication number
- WO2013145164A1 (PCT/JP2012/058163)
- Authority
- WO
- WIPO (PCT)
Classifications
- G01B 11/002 — Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
- G01B 11/026 — Measuring length, width or thickness by measuring the distance between sensor and object
- G01J 1/08 — Arrangements of light sources specially adapted for photometry standard sources
- G01J 1/42 — Photometry using electric radiation detectors
- G01S 17/46 — Indirect determination of position data (systems using reflection of electromagnetic waves other than radio waves)
- G06V 10/145 — Illumination specially adapted for pattern recognition, e.g. using gratings
- G06V 10/147 — Details of sensors, e.g. sensor lenses
Definitions
- the present invention relates to an imaging apparatus.
- Patent Document 1 discloses an imaging device that includes a laser and a distance sensor that uses an optical triangulation method.
- Patent Document 2 (Japanese Patent Application Laid-Open No. 2004-228561) discloses a technique for photographing a subject with a camera while condensing LED light and projecting it onto the subject.
- The technique of Patent Document 1 has the problem that the laser is expensive.
- In Patent Document 2, an LED that is less expensive than a laser is used, but its degree of light collection is low. It is therefore difficult to apply the technique of Patent Document 2 to a small imaging device.
- the present invention has been made in view of the above problems, and an object of the present invention is to provide an imaging device that can be reduced in size while suppressing cost.
- An imaging apparatus disclosed in this specification includes an imaging element that images a subject and a plurality of light sources that irradiate the subject with light, the optical axis of each light source being inclined outward with respect to the optical axis of the imaging element.
- FIG. 1(a) is a schematic diagram of an imaging device according to Comparative Example 1, and FIG. 1(b) is a schematic diagram of a compact imaging device according to Comparative Example 2.
- FIG. 2 is a schematic diagram of the imaging device applied to the embodiments.
- FIGS. 3(a) and 3(b) are diagrams for explaining the definition of each variable of the imaging device.
- FIG. 4(a) is a top view of the imaging device according to the first embodiment; FIGS. 4(b) and 4(c) are side views of the light source.
- FIG. 5(a) is a schematic perspective view of a light emitting element; FIGS. 5(b) to 5(e) show examples of arrangement of the light emitting elements.
- FIG. 6(a) is a block diagram illustrating the hardware configuration of a biometric authentication apparatus to which the imaging apparatus according to the second embodiment is applied; FIG. 6(b) is a top view and FIG. 6(c) is a side view of the imaging apparatus.
- FIG. 7 is a block diagram of each function realized by executing the biometric authentication program.
- FIGS. 8(a) and 8(b) are diagrams for explaining examples of spot light images.
- FIGS. 9(a) and 9(b) are diagrams for explaining spot detection.
- FIG. 10 is an example of a calibration table acquired in advance.
- FIG. 11 is a diagram for explaining the authenticable distance and the guidable distance.
- FIG. 12 is a flowchart for performing guidance according to the distance range.
- A further figure explains the spread angle β of the spot light.
- FIG. 1A is a schematic diagram of an imaging apparatus according to Comparative Example 1.
- The imaging device according to Comparative Example 1 includes a light source that emits spot light.
- the imaging device detects a distance between the imaging element and the subject by detecting a position (hereinafter, a spot position) where the irradiated light strikes the subject.
- The light from the light source gradually spreads as it travels.
- As long as the imaging apparatus has a sufficient size, the spot lights do not interfere with each other, so the spread of the light is not a major problem.
- However, imaging devices are increasingly required to be small. For example, when an imaging device is to function in a portable device such as a smartphone rather than as a conventional stationary device, downsizing is required. In fields where downsizing is required, low cost rather than expensive components is generally also demanded. Therefore, there is a limit to the achievable degree of light collection.
- FIG. 1B is a schematic diagram of a small imaging device according to Comparative Example 2.
- the light sources are close to each other.
- the spot lights interfere with each other.
- the detection accuracy of the spot position decreases, and the detection accuracy of the distance between the image sensor and the subject decreases.
- FIG. 2 is a schematic diagram of an imaging apparatus 200 applied to the following embodiments.
- the imaging apparatus 200 has a structure in which a plurality of light sources 202 are provided around an imaging element 201.
- the imaging element 201 is not particularly limited as long as it is an imageable element, and is, for example, a CMOS (Complementary Metal Oxide Semiconductor) camera.
- The light source 202 is not particularly limited; any light source can be used as long as its emitted light spreads as it travels. For example, an LED can be used as the light source 202.
- Each light source 202 is arranged such that the irradiation direction of the spot light is inclined outward with respect to the optical axis of the image sensor 201. In this case, it is possible to suppress the spot light of the light source 202 from spreading toward the image sensor 201 side. With such a configuration, even if the imaging device 200 is downsized, interference between spot lights can be suppressed. Moreover, it is not necessary to use an expensive light source with a high degree of light collection. That is, the imaging device 200 can be reduced in size while suppressing costs. In addition, it is preferable that the inclination directions of the spot optical axes of the two light sources 202 facing each other with the image sensor 201 interposed therebetween are opposite to each other.
- As a lens for the spot light, the degree of light collection can be increased by using an aspheric lens or a combination of a plurality of lenses.
- However, such a configuration incurs an increase in cost.
- Because the accuracy required when assembling such a configuration is high, it is not suitable for a small and inexpensive imaging apparatus.
- A very high level of alignment is required and assembly accuracy must be improved, so the cost increases accordingly.
- An extra number of parts is required.
- The added height required of the imaging device itself increases its size.
- In the imaging apparatus 200 of FIG. 2, only the optical axis of the spot light is inclined outward with respect to the optical axis of the imaging element; no additional parts or improvement in mounting accuracy is required. There is therefore a great advantage for a small-sized device, because no additional component cost is incurred. Moreover, since the distance-measurement algorithm can remain the same as the current one, no new development cost is required.
- FIG. 3A and FIG. 3B are diagrams for explaining the definition of each variable of the imaging apparatus 200.
- the distance D represents the distance (mm) between the light sources 202.
- the distance x represents the distance (mm) between the image sensor 201 and the subject.
- The angle of view α represents the angle of view (rad) of the lens provided in the image sensor 201.
- The tilt angle θ represents the tilt angle (rad) of the optical axis of the spot light with respect to the optical axis of the image sensor 201.
- the shooting range W (x) represents a shooting range (a range that can be shot with an image sensor) (mm) at a distance x (mm).
- the distance L (x) represents the distance (mm) from the center of the screen to the center of the spot position.
- the distance P (x) represents a pixel distance obtained by converting the distance L (x) into a pixel.
- K represents the number of pixels of the image element of the image sensor 201.
- The distance D, the angle of view α, and the inclination angle θ are design values and fixed values.
- the distance L (x) and the distance P (x) are values determined according to the distance x at the time of shooting.
- the distance L (x) and the distance P (x) are the same length (distance from the center of the screen to the center of the spot position) expressed in mm and in pixels.
- the distance P (x) is a measurement amount observed when the spot position is imaged by the image sensor 201.
- The distance L(x) is expressed as formula (1). According to formula (1), the distance L(x) increases with the inclination angle θ of the spot light.
- The shooting range W(x) at the distance x is expressed by formula (2) using the angle of view α of the image sensor 201.
- The variable K is, for example, 480 pixels.
- The shooting range W(x) is imaged across the K pixels.
- the distance P (x) is expressed as the following formula (3).
- the following formula (3) is a formula representing the relationship between the center of the spot position and the distance x.
- the distance x can be calculated from the distance P (x) using the following formula (3).
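Formulas (1) through (3) are referenced but not reproduced in the text. The sketch below is therefore an inference from the variable definitions and standard triangulation geometry, assuming L(x) = D/2 + x·tan θ, W(x) = 2x·tan(α/2), and P(x) = K·L(x)/W(x):

```python
import math

def spot_pixel_distance(x, D, theta, alpha, K):
    """Predicted pixel distance P(x) from the screen center to the spot
    center. The expressions below are inferred from the variable
    definitions, since formulas (1)-(3) are not reproduced in the text:
      L(x) = D/2 + x * tan(theta)   # mm from optical axis to spot center
      W(x) = 2 * x * tan(alpha/2)   # mm of scene imaged across K pixels
      P(x) = K * L(x) / W(x)        # the same length in pixels
    """
    L = D / 2.0 + x * math.tan(theta)
    W = 2.0 * x * math.tan(alpha / 2.0)
    return K * L / W

def distance_from_pixels(P, D, theta, alpha, K):
    """Invert P(x) to recover the subject distance x (the analogue of
    formula (4), likewise inferred)."""
    denom = 2.0 * P * math.tan(alpha / 2.0) / K - math.tan(theta)
    return (D / 2.0) / denom

# Round-trip check with illustrative values: D = 40 mm between facing
# sources, theta = 5 deg tilt, alpha = 60 deg field of view, K = 480
# pixels, subject at x = 60 mm.
P = spot_pixel_distance(60.0, 40.0, math.radians(5), math.radians(60), 480)
x = distance_from_pixels(P, 40.0, math.radians(5), math.radians(60), 480)
print(round(x, 6))  # recovers 60.0
```

The inversion works because P(x) is monotonic in x for a fixed outward tilt, which is what makes a single spot image sufficient for range measurement.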
- FIG. 4A is a top view of the imaging apparatus 100 according to the first embodiment.
- the imaging apparatus 100 has a configuration in which a plurality of light sources 20 are provided around the imaging element 10.
- the image sensor 10 is disposed at the center of a rectangular substrate 30, and each light source 20 is disposed at each corner of the substrate 30.
- FIG. 4B is a side view of the light source 20.
- the light source 20 includes a light emitting element 21 disposed on the substrate 30 and a condenser lens 22 disposed on the light emitting element 21.
- the optical axis of the light emitting element 21 is deviated from the center of the condenser lens 22.
- As shown in FIG. 4C, when the optical axis of the light emitting element 21 coincides with the center of the condenser lens 22, the emitted light of the light emitting element 21 goes straight.
- the light emitted from the light emitting element 21 can be tilted by shifting the optical axis of the light emitting element 21 from the optical axis of the condenser lens 22.
- The irradiation direction of the light emitting element 21 can be inclined outward with respect to the optical axis of the imaging element 10 by shifting the light emitting element 21 from the center of the condenser lens 22 toward the imaging element 10.
- FIG. 5A is a schematic perspective view of the light emitting element 21.
- The light emitting element 21 has a structure in which a reflector 21b that reflects light is arranged around the light emitting portion 21a.
- a general LED element has a cubic shape or a rectangular parallelepiped shape, and the light emitting surface of the light emitting portion 21a has a rectangular shape (square shape or rectangular shape).
- the image of the spot position photographed by the image sensor 10 is a projection of the shape of the light emitting surface of the light emitting unit 21a.
- The line segment along which the diagonal of the substrate 30 passes through each light emitting element 21 may vary between elements. This is because the position of the light emitting element 21 may be shifted from the desired position during mounting. For example, when the light emitting element 21 is mounted by soldering, it is easily displaced from the desired position.
- the spot position is generally searched along the diagonal line of the substrate 30.
- the spot position detection accuracy decreases.
- Instead, the light emitting elements 21 may be arranged so that one side of each light emitting element 21 faces the imaging element 10.
- With the arrangement of FIG. 5(d), it is possible to suppress variations in the line segment along which the diagonal of the substrate 30 passes through the light emitting element 21.
- the detection accuracy of the spot position can be improved.
- By arranging each light emitting element 21 so that its side facing the imaging element 10 is perpendicular to the diagonal of the substrate 30, the line segment along which the diagonal of the substrate 30 passes becomes the same in each light emitting element 21. Thereby, the detection accuracy of the spot position can be further improved.
- FIG. 6A is a block diagram illustrating a hardware configuration of the biometric authentication device 400 to which the imaging device 100a according to the second embodiment is applied.
- FIG. 6B is a top view of the imaging apparatus 100a.
- FIG. 6C is a side view of the imaging apparatus 100a.
- the biometric authentication device 400 has a configuration in which a terminal device 300 including a CPU 101, a RAM 102, a storage device 103, a display device 104, a communication unit 105, and the like is connected to the imaging device 100a. Each device in the terminal device 300 is connected by a bus or the like.
- a CPU (Central Processing Unit) 101 is a central processing unit. The CPU 101 includes one or more cores.
- a RAM (Random Access Memory) 102 is a volatile memory that temporarily stores programs executed by the CPU 101, data processed by the CPU 101, and the like.
- the storage device 103 is a nonvolatile storage device.
- As the storage device 103, a ROM (Read Only Memory), a solid state drive (SSD) such as flash memory, a hard disk driven by a hard disk drive, or the like can be used.
- the biometric authentication program is stored in the storage device 103.
- the display device 104 is a liquid crystal display, an electroluminescence panel, or the like, and displays a result of biometric authentication.
- the communication unit 105 is an interface for transmitting / receiving signals to / from an external device.
- the terminal device 300 and the imaging device 100a are connected via the communication unit 105.
- the imaging device 100a is a device that takes a biological body of a user as a subject and acquires a biological image.
- the imaging device 100a is a device that acquires a palm vein image of a user without contact.
- the imaging apparatus 100 a has a configuration in which the imaging element 10 is arranged at the center portion on the substrate 30, and the light source 20 and the illumination light source 40 are arranged around the imaging element 10.
- the image sensor 10 is a CMOS (Complementary Metal Oxide Semiconductor) camera or the like.
- the substrate 30 has a rectangular shape.
- the plurality of light sources 20 are arranged at each corner of the substrate 30. That is, four light sources 20 are arranged.
- The illumination light sources 40 are LEDs or the like that emit near-infrared light; two illumination light sources 40 are disposed between each pair of adjacent light sources 20. That is, a total of eight illumination light sources 40 are arranged.
- the number of illumination light sources 40 is not particularly limited.
- the light source 20 has a configuration in which a light emitting element 21 is disposed on a substrate 30 and an aperture 23 and a condenser lens 22 are disposed on the light emitting element 21.
- the aperture 23 has a structure in which a hole is opened in the center portion, and has a function of increasing the degree of light collection by cutting excess light.
- the aperture 23 may be disposed on either the upper side or the lower side of the condenser lens 22 or may be disposed on both.
- the light emitting element 21 is shifted from the center of the condenser lens 22 toward the image pickup element 10. Thereby, the irradiation direction of the light emitting element 21 is inclined outward with respect to the optical axis of the imaging element 10. Note that the cost can be reduced by using a common light emitting element for the illumination light source 40 and the light emitting element 21.
- the layout of wiring on the substrate 30 can be changed relatively flexibly. Therefore, the degree of freedom of arrangement of the light emitting elements 21 on the substrate 30 is relatively high. From the above, the position of the light emitting element 21 may be shifted after the condenser lens 22 is fixed.
- the lower surface of the condenser lens 22 located on the light emitting element 21 side may have a planar shape, and the upper surface may have a spherical shape. By making the upper surface spherical, it is possible to efficiently collect the irradiation light spreading in a diffusing manner.
- the condenser lens 22 may have a spherical lower surface and a flat upper surface.
- the biometric authentication program stored in the storage device 103 is expanded in the RAM 102 so as to be executable.
- the CPU 101 executes a biometric authentication program expanded in the RAM 102.
- By executing the program, each process of the biometric authentication apparatus 400, for example the biometric data registration process and the biometric authentication process, is performed.
- the biometric data registration process is a process of registering feature data extracted from a biometric image of a new unregistered user in the database as registered feature data.
- the biometric authentication process is a process for identifying an authenticated user by personal authentication based on matching between matching feature data extracted from a biometric image acquired at the time of authentication and registered feature data.
- FIG. 7 is a block diagram of each function realized by executing the biometric authentication program.
- By executing the biometric authentication program, an overall control unit 11, an imaging unit 12, a detection unit 13, a guidance unit 14, an authentication processing unit 15, and a registration database 16 are realized.
- the overall control unit 11 controls the imaging unit 12, the detection unit 13, the guidance unit 14, and the authentication processing unit 15.
- the imaging unit 12 controls the imaging device 100a and acquires a user's biological image from the imaging device 100a.
- the detection unit 13 detects the distance between the image sensor 10 and the subject and the tilt of the subject using the spot light image acquired by the imaging device 100a.
- The detection unit 13 may instead be included in the imaging apparatus 100a, or distributed between the terminal device 300 and the imaging apparatus 100a. In that case, the detection unit on the terminal device 300 side may measure the distance with high accuracy for use by the authentication processing unit 15 in the authentication process, while the detection unit on the imaging apparatus 100a side is used only for object detection and distance guidance and can apply a simpler calculation method (e.g., with thinning processing).
- the guidance unit 14 performs guidance processing on the subject according to the detection result of the detection unit 13.
- the guide unit 14 guides the user so that the distance x and the inclination of the subject are within appropriate ranges. For example, the guide unit 14 guides the user by displaying a message for the user on the display device 104.
- the authentication processing unit 15 extracts feature data from the biological image acquired by the imaging unit 12. For example, the authentication processing unit 15 extracts a vein pattern and the like. The authentication processing unit 15 extracts registered feature data and registers it in the registration database 16 during biometric data registration processing, and extracts matching feature data during biometric authentication processing. The authentication processing unit 15 identifies the authenticated user by matching the matching feature data with the registered feature data registered in the registration database 16.
- The imaging apparatus 100a includes a control unit 50 and a storage device 60.
- the control unit 50 controls the image sensor 10, the light source 20, and the illumination light source 40 in accordance with instructions from the imaging unit 12.
- FIG. 8A and FIG. 8B are diagrams for explaining an example of a spot light image.
- FIG. 8A shows an example when the imaging device 100a is close to the subject
- FIG. 8B shows an example when the imaging device 100a is far from the subject.
- the spot light area on the screen is a spot light image.
- the spot light image is detected near the center of the screen.
- The detection unit 13 detects the position of the spot light from the image acquired by the image sensor 10. First, the detection unit 13 searches for a spot position along a diagonal line (45° line) of the substrate 30, starting from the center position O of the spot light image. Specifically, the luminance values of the image on the 45° line are read in order, and the spot is judged to start when the luminance value exceeds a predetermined threshold Th. As many spot positions are detected as there are light sources 20 mounted; in the present embodiment, since four light sources 20 are arranged, the detection unit 13 performs a total of four searches, one per light source 20.
- the detection unit 13 acquires a distance P (x) between the center of the screen and the center of the spot position.
- When determining P(x), the position of the "rise" or "fall" of the spot alone is not reliable, because the spot light itself spreads with distance.
- FIG. 9A and FIG. 9B are diagrams for explaining spot detection.
- the horizontal axis represents the distance on the 45 ° line in pixel units
- the vertical axis represents the luminance value.
- the detection unit 13 obtains a range that exceeds a predetermined threshold Th on the 45 ° line, and sets the center of the range to P (x).
- the threshold Th may be variable according to the distance from the screen center.
- the detection unit 13 calculates the distance x according to the following formula (4) using the distance P (x) obtained above.
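A minimal sketch of the spot search described above, under the assumption that the luminance samples along the 45° line have already been extracted into a list (the image-access API is not specified in the text):

```python
def find_spot_pixel_distance(luminance, threshold):
    """Return the pixel distance P(x) of a spot along the 45-degree line.

    `luminance` holds brightness samples taken along the diagonal,
    starting at the screen center O. The patent takes the center of the
    run of values exceeding the threshold Th, rather than the rising or
    falling edge alone, because the spot spreads with distance."""
    start = end = None
    for i, value in enumerate(luminance):
        if value > threshold and start is None:
            start = i
        elif value <= threshold and start is not None:
            end = i
            break
    if start is None:
        return None  # no spot found on this diagonal
    if end is None:
        end = len(luminance)  # spot runs to the edge of the samples
    return (start + end - 1) / 2.0

# Illustrative profile along the diagonal with threshold Th = 50: the
# above-threshold run covers samples 4..7, so its center is 5.5.
profile = [10, 12, 15, 30, 80, 120, 110, 70, 40, 20]
P = find_spot_pixel_distance(profile, 50)
print(P)
```

In practice this search is repeated once per mounted light source, and the resulting P(x) is fed into the distance calculation of formula (4).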
- An error may occur in the mounting position of the light source 20, which in turn causes an error in the detected spot position.
- This error has a unique value for each individual imaging device 100a. Therefore, as calibration at the time of product shipment, the correspondence between the distance P(x) and the distance x may be measured in advance and recorded in the storage device 60 of the imaging device 100a.
- For example, a subject may be set at a known distance x, and the distance P(x) measured at that time may be stored in a table.
- FIG. 10 is an example of a calibration table acquired in advance. With such a configuration, distance measurement with higher accuracy becomes possible. When a calibration table as in FIG. 10 is acquired in advance, distances other than those held in the table can be calculated by interpolation; for example, linear interpolation may be used, which approximates the values between two adjacent recorded distances by a straight line.
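The table lookup with linear interpolation can be sketched as follows; the calibration pairs are hypothetical values, not taken from FIG. 10:

```python
def distance_from_table(P, table):
    """Look up the subject distance x for a measured pixel distance P
    using a per-device calibration table of (P, x) pairs recorded at
    product shipment, linearly interpolating between adjacent entries."""
    points = sorted(table)  # order by pixel distance
    if P <= points[0][0]:
        return points[0][1]
    if P >= points[-1][0]:
        return points[-1][1]
    for (p0, x0), (p1, x1) in zip(points, points[1:]):
        if p0 <= P <= p1:
            t = (P - p0) / (p1 - p0)  # fraction between the two entries
            return x0 + t * (x1 - x0)

# Hypothetical calibration pairs (pixel distance, subject distance in mm);
# P grows as the subject gets closer, so x decreases as P increases.
calib = [(120.0, 80.0), (150.0, 60.0), (200.0, 40.0)]
print(distance_from_table(135.0, calib))  # halfway between 80 and 60 -> 70.0
```

Because the table is measured per device, this lookup absorbs the mounting error of each light source 20 that formula (4) alone cannot account for.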
- The search start position of the spot light may also be stored as calibration data.
- Normally, the search start position of the spot position is the center of the screen.
- This holds as long as the assembly accuracy of the spot light source is above a certain level.
- If the assembly accuracy of the spot light source is very low, the spot may be missed even when searching along the 45° line from the screen center. This case can be handled by storing the search start position as calibration data in advance.
- The optimum value of the inclination angle θ is determined according to the operating conditions of the imaging apparatus 100a.
- The biometric authentication process operates over two types of distance ranges: an authenticable distance and a guidable distance.
- FIG. 11 is a diagram for explaining the authenticable distance and the guidable distance.
- The authenticable distance is the distance range in which imaging with all four spot lights is guaranteed by the specifications.
- The guidable distance is the distance range in which guidance is possible with at least one of the four spot lights. Because the spot lights are inclined outward, not all four spot lights necessarily hit the subject within the guidable distance; in that case, tilt detection of the subject cannot be executed. In particular, for a person with small hands, a spot light is likely to miss the subject.
- FIG. 12 is a flowchart for performing guidance according to the distance range.
- detection unit 13 acquires distances r1 to r4 based on the spot positions of four light sources 20 (step S1).
- the detection unit 13 determines whether or not all four spot positions are within the authenticable distance range R1 (step S2).
- If it is determined as "No" in step S2, the detection unit 13 determines whether one or more spot positions are within the guidable distance range R2 (step S3). If "Yes" in step S3, the guidance unit 14 guides the subject (step S4), and step S1 is executed again.
- If it is determined as "Yes" in step S2, the detection unit 13 determines whether the distance and inclination of the subject are within a predetermined range (step S5). If "No" in step S5, the guidance unit 14 guides the subject (step S6), and step S1 is executed again. If "Yes" in step S5, the authentication processing unit 15 performs the authentication process (step S7). Through the above processing, the subject can be imaged at an appropriate distance and inclination.
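The branching of steps S1 through S7 can be sketched as follows; the range encoding and return values are a hypothetical rendering, since FIG. 12 defines only the control flow:

```python
def guidance_step(distances, R1, R2, pose_ok):
    """One pass of the guidance loop of FIG. 12 (a hypothetical encoding;
    the figure defines only the branching).

    distances -- the four spot distances r1..r4 obtained in step S1
    R1, R2    -- (min, max) of the authenticable / guidable distance ranges
    pose_ok   -- whether subject distance and inclination are in range (S5)
    """
    in_r1 = all(R1[0] <= r <= R1[1] for r in distances)
    if not in_r1:                                        # step S2 -> "No"
        if any(R2[0] <= r <= R2[1] for r in distances):  # step S3
            return 'guide'                               # step S4, then S1
        return 'retry'                                   # no usable spot
    if not pose_ok:                                      # step S5 -> "No"
        return 'guide'                                   # step S6, then S1
    return 'authenticate'                                # step S7

# Illustrative ranges: authenticable 40-60 mm, guidable 30-90 mm.
print(guidance_step([50, 52, 51, 49], (40, 60), (30, 90), True))   # authenticate
print(guidance_step([50, 52, 51, 95], (40, 60), (30, 90), True))   # guide
```

The caller would loop on 'guide' and 'retry' results, re-acquiring the spot distances each time, mirroring the return to step S1 in the flowchart.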
- the positional deviation DX that gives the inclination angle θ can be determined in advance by simulation or actual measurement. If the inclination angle θ is too small, the spot lights interfere with each other; it is therefore preferable to set the necessary minimum inclination angle θmin appropriately.
- the minimum inclination angle θmin is a value determined depending on the spread angle φ of the spot light, with reference to FIG. That is, if the inclination angle θ is set larger than the spread angle φ of the spot light, spot lights from light sources separated from each other by the distance D do not interfere.
- the spread angle φ of the spot light can be obtained with an optical simulator or by actual measurement.
- the minimum inclination angle θmin of the inclination angle θ may be set accordingly. Once the minimum inclination angle θmin is determined, the corresponding LED positional deviation amount DXmin can also be set.
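The patent obtains the DX-to-θ relationship by optical simulation or actual measurement. Purely as an illustration, a common first-order thin-lens approximation (not stated in the patent) is that an emitter offset DX near the focal plane of a condenser lens of focal length f tilts the collimated beam by about atan(DX/f). The names and the approximation itself are assumptions:

```python
import math

def offset_for_tilt(theta_deg, focal_length_mm):
    """First-order estimate of the LED offset DX that tilts the condensed
    beam by theta degrees. Assumes a thin condenser lens with the emitter
    near its focal plane (DX ~ f * tan(theta)); the patent itself obtains
    DX by simulation or measurement, so this is only a rough check.
    """
    return focal_length_mm * math.tan(math.radians(theta_deg))

def tilt_for_offset(dx_mm, focal_length_mm):
    """Inverse relation: beam tilt in degrees produced by offset DX."""
    return math.degrees(math.atan2(dx_mm, focal_length_mm))
```

For example, under this approximation a 5 mm focal length and a 10° tilt would correspond to an offset of roughly 0.88 mm.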
- the maximum inclination angle θmax of the inclination angle θ may be set based on the authenticable distance, which is an operating condition. Specifically, the maximum inclination angle θmax may be determined from the assumed maximum value R1max of the authenticable distance and the assumed minimum subject size Lmin. In the example of palm vein authentication, Lmin corresponds to the minimum palm size, but it can be set to a size at which all four spot lights are observed even when the hand position fluctuates.
- the maximum inclination angle θmax is obtained from the condition of the following equation (10), under the condition that the irradiated spot light falls within the range of the size Lmin at the position of the distance R1max.
- from θmin and θmax, the corresponding positional deviation amounts DXmin and DXmax are obtained.
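Equation (10) itself is not reproduced in this excerpt. As a sketch only, one plausible reading of the stated geometric condition is: a source offset about D/2 from the imaging axis and tilted outward by θ projects its spot to roughly D/2 + R1max·tan(θ), which must stay within Lmin/2. The function below implements that simplified condition (ignoring the spot spread φ); the numeric values in the usage note are invented for illustration:

```python
import math

def theta_max_deg(l_min, r_1max, d=0.0):
    """Largest outward tilt (degrees) keeping a spot inside a subject of
    size l_min at distance r_1max.

    Simplified reading of the condition behind equation (10): a source
    offset d/2 from the imaging axis projects its spot to
    d/2 + r_1max * tan(theta), which must stay within l_min / 2.
    Spot spread is ignored; the exact equation (10) is not reproduced here.
    """
    limit = (l_min / 2.0 - d / 2.0) / r_1max
    return math.degrees(math.atan(limit))
```

For instance, with hypothetical values Lmin = 80 mm, R1max = 100 mm, and D = 20 mm, this gives θmax = atan(30/100) ≈ 16.7°.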
- a recording medium storing a software program that implements the functions of the biometric authentication device 400 may be supplied to the biometric authentication device 400, and the CPU 101 may execute that program.
- examples of the storage medium for supplying the program include a CD-ROM, a DVD, a Blu-ray disc, and an SD card.
- in the above description, each function is realized by the CPU executing a program; however, the present invention is not limited to this, and each function may be realized using a dedicated circuit or the like.
Abstract
Description
20 light source
21 light emitting element
22 condenser lens
30 substrate
100 imaging device
Claims (6)
- An imaging device comprising: an imaging element that images a subject; and a plurality of light sources that irradiate the subject with light, wherein an optical axis of each of the light sources is inclined outward with respect to an optical axis of the imaging element.
- The imaging device according to claim 1, wherein each of the light sources includes a light emitting element and a condenser lens, and the light emitting element is arranged such that an optical axis of the light emitting element is shifted toward the imaging element side with respect to a center of the condenser lens.
- The imaging device according to claim 1 or 2, wherein a light emitting surface of the light emitting element included in the light source is rectangular, and one side of the rectangle faces the imaging element.
- The imaging device according to any one of claims 1 to 3, further comprising a detection unit that detects a distance between the imaging element and the subject by detecting a spot position that appears where light emitted from the light source strikes the subject.
- The imaging device according to claim 4, further comprising a storage unit that stores a correspondence relationship between the spot position and the distance between the imaging element and the subject.
- The imaging device according to any one of claims 1 to 5, wherein the subject is a living body.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201280071825.1A CN104220838B (zh) | 2012-03-28 | 2012-03-28 | Imaging device |
JP2014507132A JP6079772B2 (ja) | 2012-03-28 | 2012-03-28 | Imaging device |
PCT/JP2012/058163 WO2013145164A1 (ja) | 2012-03-28 | 2012-03-28 | Imaging device |
KR1020147026798A KR101630558B1 (ko) | 2012-03-28 | 2012-03-28 | Imaging device |
EP12873126.2A EP2833095B1 (en) | 2012-03-28 | 2012-03-28 | Imaging device |
US14/483,555 US9644943B2 (en) | 2012-03-28 | 2014-09-11 | Imaging device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/058163 WO2013145164A1 (ja) | 2012-03-28 | 2012-03-28 | Imaging device |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/483,555 Continuation US9644943B2 (en) | 2012-03-28 | 2014-09-11 | Imaging device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013145164A1 true WO2013145164A1 (ja) | 2013-10-03 |
Family
ID=49258530
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/058163 WO2013145164A1 (ja) | 2012-03-28 | 2012-03-28 | Imaging device |
Country Status (6)
Country | Link |
---|---|
US (1) | US9644943B2 (ja) |
EP (1) | EP2833095B1 (ja) |
JP (1) | JP6079772B2 (ja) |
KR (1) | KR101630558B1 (ja) |
CN (1) | CN104220838B (ja) |
WO (1) | WO2013145164A1 (ja) |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017038158A1 (ja) * | 2015-08-31 | 2017-03-09 | Fujifilm Corporation | Distance measuring device, distance measurement control method, and distance measurement control program |
WO2017038157A1 (ja) * | 2015-08-31 | 2017-03-09 | Fujifilm Corporation | Distance measuring device, derivation method for distance measurement, and derivation program for distance measurement |
JP2017162394A (ja) * | 2016-03-11 | 2017-09-14 | Fujitsu Limited | Biometric imaging device, biometric imaging method, and biometric imaging program |
JP2017528376A (ja) * | 2014-10-27 | 2017-09-28 | Guangzhou Xaircraft Technology Co., Ltd. | Rotorcraft and automatic landing system and method thereof |
JP2018514783A (ja) * | 2015-05-10 | 2018-06-07 | Magik Eye Inc. | Distance sensor |
JP2020024234A (ja) * | 2014-10-24 | 2020-02-13 | Magik Eye Inc. | Distance sensor |
US10885761B2 (en) | 2017-10-08 | 2021-01-05 | Magik Eye Inc. | Calibrating a sensor system including multiple movable sensors |
US10931883B2 (en) | 2018-03-20 | 2021-02-23 | Magik Eye Inc. | Adjusting camera exposure for three-dimensional depth sensing and two-dimensional imaging |
US11002537B2 (en) | 2016-12-07 | 2021-05-11 | Magik Eye Inc. | Distance sensor including adjustable focus imaging sensor |
US11019249B2 (en) | 2019-05-12 | 2021-05-25 | Magik Eye Inc. | Mapping three-dimensional depth map data onto two-dimensional images |
US11062468B2 (en) | 2018-03-20 | 2021-07-13 | Magik Eye Inc. | Distance measurement using projection patterns of varying densities |
US11199397B2 (en) | 2017-10-08 | 2021-12-14 | Magik Eye Inc. | Distance measurement using a longitudinal grid pattern |
US11320537B2 (en) | 2019-12-01 | 2022-05-03 | Magik Eye Inc. | Enhancing triangulation-based three-dimensional distance measurements with time of flight information |
US11474209B2 (en) | 2019-03-25 | 2022-10-18 | Magik Eye Inc. | Distance measurement using high density projection patterns |
US11475584B2 (en) | 2018-08-07 | 2022-10-18 | Magik Eye Inc. | Baffles for three-dimensional sensors having spherical fields of view |
US11474245B2 (en) | 2018-06-06 | 2022-10-18 | Magik Eye Inc. | Distance measurement using high density projection patterns |
US11483503B2 (en) | 2019-01-20 | 2022-10-25 | Magik Eye Inc. | Three-dimensional sensor including bandpass filter having multiple passbands |
US11580662B2 (en) | 2019-12-29 | 2023-02-14 | Magik Eye Inc. | Associating three-dimensional coordinates with two-dimensional feature points |
US11688088B2 (en) | 2020-01-05 | 2023-06-27 | Magik Eye Inc. | Transferring the coordinate system of a three-dimensional camera to the incident point of a two-dimensional camera |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9605843B2 (en) | 2011-07-11 | 2017-03-28 | Golight, Inc. | LED system and housing for use with halogen light |
TWI490526B (zh) * | 2013-07-05 | 2015-07-01 | Pixart Imaging Inc | Optical sensing module and electronic device having the optical sensing module |
US9305155B1 (en) * | 2015-02-12 | 2016-04-05 | United Services Automobile Association (Usaa) | Toggling biometric authentication |
CN111025329A (zh) * | 2019-12-12 | 2020-04-17 | Shenzhen Orbbec Co., Ltd. | Time-of-flight-based depth camera and three-dimensional imaging method |
DE102022115810A1 (de) | 2022-06-24 | 2024-01-04 | IDloop GmbH | Device for contactless capture of biometric data from skin areas |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07110430A (ja) * | 1993-10-12 | 1995-04-25 | Olympus Optical Co Ltd | Camera having a distance measuring device |
JP2000230807A (ja) | 1999-02-10 | 2000-08-22 | Micro Research:Kk | Distance measuring method using parallel light and device therefor |
JP2004354307A (ja) * | 2003-05-30 | 2004-12-16 | Sunx Ltd | Dimension measuring device |
JP2006313116A (ja) * | 2005-05-09 | 2006-11-16 | Nec Viewtechnology Ltd | Distance and tilt angle detection device and projector equipped with the detection device |
JP2007010346A (ja) | 2005-06-28 | 2007-01-18 | Fujitsu Ltd | Imaging device |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3436556A (en) * | 1966-03-24 | 1969-04-01 | Sperry Rand Corp | Optical inspection system |
US4152624A (en) * | 1978-03-16 | 1979-05-01 | Monsanto Company | Molded LED indicator |
US4577259A (en) * | 1983-05-09 | 1986-03-18 | Motorola, Inc. | Apparatus and method for retaining an electronic device |
CA1235773A (en) * | 1983-12-23 | 1988-04-26 | Shigeto Nakayama | Device for detecting road surface condition |
US5148211A (en) * | 1989-10-20 | 1992-09-15 | Fuji Photo Film Co., Ltd. | Stabilized range finder for use with an electronically controlled camera |
US5848839A (en) * | 1997-04-07 | 1998-12-15 | Savage, Jr.; John M. | LED sealing lens cap and retainer |
CN2325758Y (zh) * | 1997-07-09 | 1999-06-23 | School of Stomatology, Beijing Medical University | Laser scanning data acquisition device |
US6154279A (en) * | 1998-04-09 | 2000-11-28 | John W. Newman | Method and apparatus for determining shapes of countersunk holes |
US6410872B2 (en) * | 1999-03-26 | 2002-06-25 | Key Technology, Inc. | Agricultural article inspection apparatus and method employing spectral manipulation to enhance detection contrast ratio |
EP1126412B1 (en) * | 2000-02-16 | 2013-01-30 | FUJIFILM Corporation | Image capturing apparatus and distance measuring method |
US6377353B1 (en) * | 2000-03-07 | 2002-04-23 | Pheno Imaging, Inc. | Three-dimensional measuring system for animals using structured light |
US6618123B2 (en) * | 2000-10-20 | 2003-09-09 | Matsushita Electric Industrial Co., Ltd. | Range-finder, three-dimensional measuring method and light source apparatus |
US6979104B2 (en) * | 2001-12-31 | 2005-12-27 | R.J. Doran & Co. LTD | LED inspection lamp |
GB2395261A (en) * | 2002-11-11 | 2004-05-19 | Qinetiq Ltd | Ranging apparatus |
US6992843B2 (en) * | 2003-12-16 | 2006-01-31 | Metastable Instruments, Inc. | Precision optical wedge light beam scanner |
CN2676151Y (zh) * | 2003-12-26 | 2005-02-02 | Jinan University | Three-dimensional profile measuring device with bilateral light-sectioning and adjustable height measurement range |
USD559432S1 (en) * | 2004-12-14 | 2008-01-08 | Moriyama Sangyo Kabushiki Kaisha | Lens for LED |
US8301027B2 (en) * | 2008-05-02 | 2012-10-30 | Massachusetts Institute Of Technology | Agile-beam laser array transmitter |
FR2938908B1 (fr) * | 2008-11-24 | 2011-01-21 | Commissariat Energie Atomique | Device and method for measuring the position of at least one moving object in a three-dimensional frame of reference |
DE102009003765A1 (de) * | 2009-04-08 | 2010-10-14 | Eurodelta Gmbh | Device for capturing biometric features |
US8558161B2 (en) * | 2010-08-10 | 2013-10-15 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Lens having multiple conic sections for LEDs and proximity sensors |
2012
- 2012-03-28 CN CN201280071825.1A patent/CN104220838B/zh active Active
- 2012-03-28 JP JP2014507132A patent/JP6079772B2/ja active Active
- 2012-03-28 KR KR1020147026798A patent/KR101630558B1/ko active IP Right Grant
- 2012-03-28 EP EP12873126.2A patent/EP2833095B1/en active Active
- 2012-03-28 WO PCT/JP2012/058163 patent/WO2013145164A1/ja active Application Filing
2014
- 2014-09-11 US US14/483,555 patent/US9644943B2/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07110430A (ja) * | 1993-10-12 | 1995-04-25 | Olympus Optical Co Ltd | Camera having a distance measuring device |
JP2000230807A (ja) | 1999-02-10 | 2000-08-22 | Micro Research:Kk | Distance measuring method using parallel light and device therefor |
JP2004354307A (ja) * | 2003-05-30 | 2004-12-16 | Sunx Ltd | Dimension measuring device |
JP2006313116A (ja) * | 2005-05-09 | 2006-11-16 | Nec Viewtechnology Ltd | Distance and tilt angle detection device and projector equipped with the detection device |
JP2007010346A (ja) | 2005-06-28 | 2007-01-18 | Fujitsu Ltd | Imaging device |
Non-Patent Citations (1)
Title |
---|
See also references of EP2833095A4 |
Cited By (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2020024234A (ja) * | 2014-10-24 | 2020-02-13 | Magik Eye Inc. | Distance sensor |
JP2017528376A (ja) * | 2014-10-27 | 2017-09-28 | Guangzhou Xaircraft Technology Co., Ltd. | Rotorcraft and automatic landing system and method thereof |
US10392128B2 (en) | 2014-10-27 | 2019-08-27 | Guangzhou Xaircraft Technology Co., Ltd. | Rotorcraft and automatic landing system and method thereof |
JP2018514783A (ja) * | 2015-05-10 | 2018-06-07 | Magik Eye Inc. | Distance sensor |
WO2017038158A1 (ja) * | 2015-08-31 | 2017-03-09 | Fujifilm Corporation | Distance measuring device, distance measurement control method, and distance measurement control program |
WO2017038157A1 (ja) * | 2015-08-31 | 2017-03-09 | Fujifilm Corporation | Distance measuring device, derivation method for distance measurement, and derivation program for distance measurement |
JPWO2017038158A1 (ja) * | 2015-08-31 | 2018-04-05 | Fujifilm Corporation | Distance measuring device, distance measurement control method, and distance measurement control program |
JPWO2017038157A1 (ja) * | 2015-08-31 | 2018-04-26 | Fujifilm Corporation | Distance measuring device, derivation method for distance measurement, and derivation program for distance measurement |
US10802143B2 (en) | 2015-08-31 | 2020-10-13 | Fujifilm Corporation | Distance measurement device, deriving method for distance measurement, and deriving program for distance measurement |
US11828847B2 (en) | 2015-08-31 | 2023-11-28 | Fujifilm Corporation | Distance measurement device, deriving method for distance measurement, and deriving program for distance measurement |
JP2017162394A (ja) * | 2016-03-11 | 2017-09-14 | Fujitsu Limited | Biometric imaging device, biometric imaging method, and biometric imaging program |
US11002537B2 (en) | 2016-12-07 | 2021-05-11 | Magik Eye Inc. | Distance sensor including adjustable focus imaging sensor |
US10885761B2 (en) | 2017-10-08 | 2021-01-05 | Magik Eye Inc. | Calibrating a sensor system including multiple movable sensors |
US11199397B2 (en) | 2017-10-08 | 2021-12-14 | Magik Eye Inc. | Distance measurement using a longitudinal grid pattern |
US11062468B2 (en) | 2018-03-20 | 2021-07-13 | Magik Eye Inc. | Distance measurement using projection patterns of varying densities |
US10931883B2 (en) | 2018-03-20 | 2021-02-23 | Magik Eye Inc. | Adjusting camera exposure for three-dimensional depth sensing and two-dimensional imaging |
US11381753B2 (en) | 2018-03-20 | 2022-07-05 | Magik Eye Inc. | Adjusting camera exposure for three-dimensional depth sensing and two-dimensional imaging |
US11474245B2 (en) | 2018-06-06 | 2022-10-18 | Magik Eye Inc. | Distance measurement using high density projection patterns |
US11475584B2 (en) | 2018-08-07 | 2022-10-18 | Magik Eye Inc. | Baffles for three-dimensional sensors having spherical fields of view |
US11483503B2 (en) | 2019-01-20 | 2022-10-25 | Magik Eye Inc. | Three-dimensional sensor including bandpass filter having multiple passbands |
US11474209B2 (en) | 2019-03-25 | 2022-10-18 | Magik Eye Inc. | Distance measurement using high density projection patterns |
US11019249B2 (en) | 2019-05-12 | 2021-05-25 | Magik Eye Inc. | Mapping three-dimensional depth map data onto two-dimensional images |
US11320537B2 (en) | 2019-12-01 | 2022-05-03 | Magik Eye Inc. | Enhancing triangulation-based three-dimensional distance measurements with time of flight information |
US11580662B2 (en) | 2019-12-29 | 2023-02-14 | Magik Eye Inc. | Associating three-dimensional coordinates with two-dimensional feature points |
US11688088B2 (en) | 2020-01-05 | 2023-06-27 | Magik Eye Inc. | Transferring the coordinate system of a three-dimensional camera to the incident point of a two-dimensional camera |
Also Published As
Publication number | Publication date |
---|---|
US9644943B2 (en) | 2017-05-09 |
EP2833095A4 (en) | 2015-03-18 |
EP2833095A1 (en) | 2015-02-04 |
CN104220838A (zh) | 2014-12-17 |
CN104220838B (zh) | 2016-12-21 |
KR20140119836A (ko) | 2014-10-10 |
EP2833095B1 (en) | 2023-06-28 |
US20140376005A1 (en) | 2014-12-25 |
JP6079772B2 (ja) | 2017-02-15 |
JPWO2013145164A1 (ja) | 2015-08-03 |
KR101630558B1 (ko) | 2016-06-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6079772B2 (ja) | Imaging device | |
JP4799216B2 (ja) | Imaging device having a distance measuring function | |
JP5808502B2 (ja) | Image generating device | |
JP4566929B2 (ja) | Imaging device | |
JP4708220B2 (ja) | Illumination device and imaging device using the same | |
WO2012137674A1 (ja) | Information acquisition device, projection device, and object detection device | |
JP5138116B2 (ja) | Information acquisition device and object detection device | |
US9875525B2 (en) | Image processing device, projector, and image processing method | |
WO2012147495A1 (ja) | Information acquisition device and object detection device | |
US10667445B2 (en) | Position recognition apparatus for printed circuit board, position recognition and processing apparatus, and printed circuit board manufacturing method | |
TW201725065A (zh) | Automatic dart-scoring dartboard device and automatic dart-scoring method thereof | |
WO2023213311A1 (zh) | Capsule endoscope, and distance measuring method and device for an imaging system | |
US9807348B2 (en) | Imaging apparatus, imaging method, and imaging program | |
JP6260653B2 (ja) | Imaging device | |
WO2018030028A1 (ja) | Reading device, program, and unit | |
JPWO2008084523A1 (ja) | Position information detection device, position information detection method, and position information detection program | |
JP5883688B2 (ja) | Installation state detection system, installation state detection device, and installation state detection method | |
CN113767359A (zh) | Method and electronic device for acquiring biometric information using light from a display | |
US10607064B2 (en) | Optical projection system and optical projection method | |
CN109269404B (zh) | Image processing method and device, and fingerprint recognition apparatus | |
US10237545B1 (en) | Image pickup module test system and method | |
CN110505393B (zh) | Image processing device and method | |
TWI596360B (zh) | Imaging apparatus and imaging method | |
JP2016001153A (ja) | Inspection device and inspection method for optical components | |
WO2013031448A1 (ja) | Object detection device and information acquisition device | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12873126 Country of ref document: EP Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase |
Ref document number: 2012873126 Country of ref document: EP |
ENP | Entry into the national phase |
Ref document number: 2014507132 Country of ref document: JP Kind code of ref document: A |
ENP | Entry into the national phase |
Ref document number: 20147026798 Country of ref document: KR Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |