WO2023135640A1 - Imaging system, imaging device, imaging method, and recording medium - Google Patents
- Publication number: WO2023135640A1 (application PCT/JP2022/000532)
- Authority: WIPO (PCT)
- Prior art keywords: camera, mirror, imaging system, image, target
Classifications
- H04N23/45—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from two or more image sensors being of different type or operating in different modes, e.g. with a CMOS sensor for moving images in combination with a charge-coupled device [CCD] for still images
- G03B15/00—Special procedures for taking photographs; Apparatus therefor
- G03B19/06—Roll-film cameras adapted to be loaded with more than one film, e.g. with exposure of one or the other at will
- G03B19/07—Roll-film cameras having more than one objective
- H04N23/21—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from near infrared [NIR] radiation only
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/64—Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources
Definitions
- This disclosure relates to the technical fields of imaging systems, imaging devices, imaging methods, and recording media.
- Japanese Patent Application Laid-Open No. 2002-200000 discloses a technique of capturing an image of a subject's iris using three infrared cameras arranged at regular intervals in the vertical direction.
- Japanese Patent Application Laid-Open No. 2002-200000 discloses a technique of capturing an image of the face of a person to be authenticated using cameras having different focal lengths.
- Patent Literature 3 discloses that in an imaging device equipped with a wide camera and a narrow camera, a reflective mirror is used to change the imaging direction of the narrow camera.
- The purpose of this disclosure is to improve upon the techniques disclosed in the prior art documents.
- One aspect of the imaging system of this disclosure includes: a first camera having a first focal length; a second camera having a second focal length; a first mirror arranged to correspond to both the first camera and the second camera; and a first adjusting means for adjusting the optical positional relationship between the first camera or the second camera and the first mirror, depending on which of the first camera and the second camera is used to capture an image of the target.
- One aspect of the imaging device of this disclosure includes: a first camera having a first focal length; a second camera having a second focal length; a first mirror arranged to correspond to both the first camera and the second camera; and a first adjusting means for adjusting the optical positional relationship between the first camera or the second camera and the first mirror, depending on which of the first camera and the second camera is used to capture an image of the target.
- One aspect of the imaging method of this disclosure is a method, executed by at least one computer, for controlling an imaging system comprising a first camera having a first focal length, a second camera having a second focal length, and a first mirror arranged to correspond to both the first camera and the second camera, in which the optical positional relationship between the first camera or the second camera and the first mirror is adjusted depending on which of the first camera and the second camera is used to capture an image of the target.
- One aspect of the recording medium of this disclosure is a recording medium on which a computer program is recorded, the computer program causing at least one computer to execute an imaging method for controlling an imaging system comprising a first camera having a first focal length, a second camera having a second focal length, and a first mirror arranged to correspond to both the first camera and the second camera, the imaging method adjusting the optical positional relationship between the first camera or the second camera and the first mirror depending on which of the first camera and the second camera is used to capture an image of the target.
- FIG. 1 is a block diagram showing the hardware configuration of the imaging system according to the first embodiment;
- FIG. 2 is a block diagram showing the functional configuration of the imaging system according to the first embodiment;
- FIG. 3 is a flowchart showing the flow of imaging operation of the imaging system according to the first embodiment;
- FIG. 4 is a block diagram showing the functional configuration of the imaging system according to a modified example of the first embodiment;
- FIG. 5 is a flow chart showing the flow of imaging operation of the imaging system according to the modified example of the first embodiment;
- FIG. 6 is a side view showing the viewing angle origins of the first camera and the second camera in the imaging system according to the second embodiment;
- FIG. 7 is a side view showing rotation drive control of the first mirror by the imaging system according to the third embodiment;
- FIG. 8 is a side view showing an arrangement variation of the second camera;
- FIG. 9 is a front view showing an operation example of a driving unit that translates the first camera and the second camera;
- FIG. 10 is a side view showing a configuration example of a driving unit that rotates and moves the first camera and the second camera;
- FIG. 11 is a top view showing a configuration example of a driving unit that rotates and moves the first camera and the second camera;
- FIG. 12 is a front view showing a first combination example of cameras;
- FIG. 13 is a front view showing a second combination example of cameras;
- FIG. 14 is a front view showing a third combination example of cameras;
- FIG. 15 is a front view showing a fourth combination example of cameras;
- FIG. 16 is a block diagram showing the functional configuration of an imaging system according to a fifth embodiment;
- FIG. 17 is a conceptual diagram showing remote authentication and proximity authentication by the imaging system according to the fifth embodiment;
- FIG. 18 is a flow chart showing the flow of authentication operation of the imaging system according to the fifth embodiment;
- FIG. 19 is a conceptual diagram showing each phase and processing contents in an imaging system according to a sixth embodiment;
- FIG. 20 is a block diagram showing the functional configuration of an imaging system according to a seventh embodiment;
- FIG. 21 is a front view showing an output example of guidance information by the imaging system according to the seventh embodiment;
- FIG. 22 is a front view showing an output example of guidance information corresponding to imaging timing in an imaging system according to an eighth embodiment;
- FIG. 23 is a front view showing an output example of guidance information corresponding to timings other than the imaging timing in the imaging system according to the eighth embodiment;
- FIG. 24 is a block diagram showing a functional configuration of an imaging system according to a ninth embodiment;
- FIG. 25 is a front view showing an arrangement example of an imaging system according to the ninth embodiment;
- FIG. 26 is a flow chart showing the flow of imaging operation of the imaging system according to the ninth embodiment;
- FIG. 27 is a side view showing the viewing angle origins of the third camera and the fourth camera in the imaging system according to the tenth embodiment;
- FIG. 28 is a front view showing the viewing angle origin of each camera in the imaging system according to the tenth embodiment.
- An imaging system according to the first embodiment will be described with reference to FIGS. 1 to 3.
- FIG. 1 is a block diagram showing the hardware configuration of an imaging system according to the first embodiment.
- an imaging system 10 includes a processor 11, a RAM (Random Access Memory) 12, a ROM (Read Only Memory) 13, and a storage device 14. The imaging system 10 may further comprise an input device 15 and an output device 16. The imaging system 10 also includes an imaging unit 18. The processor 11, the RAM 12, the ROM 13, the storage device 14, the input device 15, the output device 16, and the imaging unit 18 are connected via a data bus 17.
- the processor 11 reads a computer program.
- the processor 11 is configured to read a computer program stored in at least one of the RAM 12, the ROM 13, and the storage device 14.
- the processor 11 may read a computer program stored in a computer-readable recording medium using a recording medium reader (not shown).
- the processor 11 may acquire (that is, read) a computer program from a device (not shown) arranged outside the imaging system 10 via a network interface.
- the processor 11 controls the RAM 12, the storage device 14, the input device 15 and the output device 16 by executing the read computer program.
- the processor 11 implements a functional block for executing processing for capturing an image of the object. That is, the processor 11 may function as a controller that executes each control in the imaging system 10 .
- the processor 11 includes, for example, at least one of a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an FPGA (Field-Programmable Gate Array), a DSP (Digital Signal Processor), and an ASIC (Application Specific Integrated Circuit).
- the processor 11 may be configured with one of these, or may be configured to use a plurality of them in parallel.
- the RAM 12 temporarily stores computer programs executed by the processor 11.
- the RAM 12 temporarily stores data temporarily used by the processor 11 while the processor 11 is executing the computer program.
- the RAM 12 may be, for example, a DRAM (Dynamic Random Access Memory) or an SRAM (Static Random Access Memory). Also, instead of the RAM 12, other types of volatile memory may be used.
- the ROM 13 stores computer programs executed by the processor 11 .
- the ROM 13 may also store other fixed data.
- the ROM 13 may be, for example, a PROM (Programmable Read Only Memory) or an EPROM (Erasable Programmable Read Only Memory). Also, instead of the ROM 13, other types of non-volatile memory may be used.
- the storage device 14 stores data that the imaging system 10 saves over a long period of time.
- Storage device 14 may act as a temporary storage device for processor 11 .
- the storage device 14 may include, for example, at least one of a hard disk device, a magneto-optical disk device, an SSD (Solid State Drive), and a disk array device.
- the input device 15 is a device that receives input instructions from the user of the imaging system 10 .
- Input device 15 may include, for example, at least one of a keyboard, mouse, and touch panel.
- the input device 15 may be configured as a mobile terminal such as a smart phone or a tablet.
- the input device 15 may be a device capable of voice input including, for example, a microphone.
- the output device 16 is a device that outputs information about the imaging system 10 to the outside.
- output device 16 may be a display device (eg, display) capable of displaying information about imaging system 10 .
- the output device 16 may be a speaker or the like capable of outputting information about the imaging system 10 as sound.
- the output device 16 may be configured as a mobile terminal such as a smart phone or a tablet.
- the output device 16 may be a device that outputs information in a format other than an image; for example, it may be a speaker that outputs information about the imaging system 10 by voice.
- the imaging unit 18 is configured to be able to capture an image of the target.
- the imaging unit 18 is configured including a first camera 110 , a second camera 120 and a first mirror 210 .
- the first camera 110 and the second camera 120 are cameras installed at positions capable of capturing images of the target.
- the target here is not limited to humans, and may include animals such as dogs and snakes, robots, and the like.
- the first camera 110 and the second camera 120 are cameras with different focal lengths. Specifically, the first camera 110 has a first focal length, and the second camera 120 has a second focal length. The first camera 110 and the second camera 120 also have different viewing angles.
- the first camera 110 and the second camera 120 may capture an image of the entire object, or may capture an image of a part of the object. Also, the first camera 110 and the second camera 120 may capture images of different parts of the object.
- the first camera 110 may be configured to capture an image of the target's face (hereinafter referred to as a "face image"), and the second camera 120 may be configured to capture an image including the target's eyes (hereinafter referred to as an "eye image").
- the first camera 110 and the second camera 120 may be cameras that capture still images, or may be cameras that capture moving images.
- the first camera 110 and the second camera 120 may be configured as visible light cameras or may be configured as near-infrared cameras.
- the first camera 110 and the second camera 120 may be configured as cameras of the same type.
- both the first camera 110 and the second camera 120 may be configured as visible light cameras, or both the first camera 110 and the second camera 120 may be configured as near-infrared cameras.
- first camera 110 and the second camera 120 may be configured as different types of cameras.
- the first camera 110 may be configured as a visible light camera, and the second camera 120 may be configured as a near-infrared camera.
- a plurality of first cameras 110 and a plurality of second cameras 120 may be provided.
- the first camera 110 and the second camera 120 may have a function of automatically turning off the power when not capturing an image. In this case, components with a short lifespan, such as a liquid lens or a motor, may be preferentially turned off.
- the first mirror 210 is a mirror configured to be able to reflect light (specifically, light used when the first camera 110 and the second camera 120 capture images).
- the first mirror 210 is arranged to correspond to both the first camera 110 and the second camera 120 . That is, the first camera 110 and the second camera 120 are configured to be able to image the target through the first mirror 210, respectively.
- the first camera 110 captures an image using light incident through the first mirror 210
- the second camera 120 also captures an image using light incident through the first mirror 210.
- the first camera 110 and the second camera 120 and the first mirror 210 are configured such that their optical positional relationship can be adjusted.
- the optical positional relationship here means a relative positional relationship that can affect the optical system including the first camera 110, the second camera 120, and the first mirror 210. Adjustments can be made by moving (e.g., translating or rotating) any of the first camera 110, the second camera 120, and the first mirror 210. Further, instead of moving only one of the first camera 110, the second camera 120, and the first mirror 210, a plurality of them may be moved at the same time. For example, the first camera 110 may be moved while the first mirror 210 is rotated. The adjustment of this optical positional relationship will be detailed later.
- Although FIG. 1 shows an example of the imaging system 10 including a plurality of devices, all or part of these functions may be realized by one device (imaging device).
- this imaging device may be configured with only the processor 11, the RAM 12, the ROM 13, and the imaging unit 18 described above; the other components (that is, the storage device 14, the input device 15, and the output device 16) may be provided by, for example, an external device connected to the imaging device.
- the imaging device may implement a part of arithmetic functions by an external device (for example, an external server, cloud, etc.).
- FIG. 2 is a block diagram showing the functional configuration of the imaging system according to the first embodiment.
- the imaging system 10 is configured as a system for capturing an image of a target. More specifically, the imaging system 10 is configured to be capable of imaging a moving object (for example, a pedestrian or the like).
- the use of the image captured by the imaging system 10 is not particularly limited, but it may be used for biometric authentication, for example.
- the imaging system 10 may be configured as a part of an authentication system that performs walk-through authentication by imaging a walking target and performing biometric authentication.
- the imaging system 10 may be configured as part of an authentication system that performs biometric authentication by capturing an image of a standing target.
- the imaging system 10 includes the already-described imaging unit 18 and a first adjustment unit 310 as components for realizing the functions thereof.
- the first adjuster 310 may be, for example, a processing block implemented by the above-described processor 11 (see FIG. 1).
- the first adjustment unit 310 is configured to be able to adjust the optical positional relationship between the first camera 110 or the second camera 120 and the first mirror 210. More specifically, the first adjustment unit 310 adjusts the optical positional relationship between the first camera 110 and the first mirror 210 when the first camera 110 takes an image. This enables the first camera 110 to capture an image of the object. Also, the first adjustment unit 310 adjusts the optical positional relationship between the second camera 120 and the first mirror 210 when the second camera 120 performs imaging. This enables the second camera 120 to capture an image of the object. The first adjustment unit 310 may be configured to adjust the optical positional relationship by driving at least one of the first camera 110, the second camera 120, and the first mirror 210 with a driving unit including, for example, an actuator.
- FIG. 3 is a flow chart showing the flow of imaging operation of the imaging system according to the first embodiment.
- When it is determined in step S101 that the first camera 110 is to capture the image, the first adjustment unit 310 adjusts the optical positional relationship between the first camera 110 and the first mirror 210 (step S102). After the optical positional relationship has been adjusted, the first camera 110 takes an image (step S103).
- When it is determined in step S101 that the second camera 120 is to capture the image, the first adjustment unit 310 adjusts the optical positional relationship between the second camera 120 and the first mirror 210 (step S104). After the optical positional relationship has been adjusted, the second camera 120 takes an image (step S105).
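The flow of steps S101 to S105 above can be sketched in Python as follows. This is a minimal illustration only: the `Camera` and `MirrorDriver` classes and their interfaces are hypothetical stand-ins for the first camera 110, the second camera 120, and the first mirror 210, not part of the disclosure.

```python
from dataclasses import dataclass


@dataclass
class Camera:
    """Hypothetical stand-in for the first camera 110 / second camera 120."""
    name: str
    focal_length_mm: float

    def capture(self) -> str:
        # A real system would trigger the sensor here; we just report.
        return f"image captured by {self.name}"


class MirrorDriver:
    """Hypothetical driver for the first mirror 210."""

    def __init__(self):
        self.facing = None

    def adjust_for(self, camera: Camera) -> None:
        # Adjust the optical positional relationship (step S102 or S104):
        # e.g. rotate the mirror so its surface faces the selected camera.
        self.facing = camera.name


def imaging_operation(selected: Camera, mirror: MirrorDriver) -> str:
    """Steps S101-S105 of FIG. 3: adjust the mirror, then capture."""
    mirror.adjust_for(selected)   # S102 or S104
    return selected.capture()     # S103 or S105


first_camera = Camera("first camera 110", focal_length_mm=35.0)
second_camera = Camera("second camera 120", focal_length_mm=85.0)
mirror = MirrorDriver()

print(imaging_operation(first_camera, mirror))
```

The point of the sketch is that both branches of the flowchart share one structure: adjust the common mirror for whichever camera was selected, then let that camera image through it.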
- FIG. 4 is a block diagram showing a functional configuration of an imaging system according to a modification of the first embodiment
- FIG. 5 is a flow chart showing the flow of imaging operation of the imaging system according to the modification of the first embodiment.
- the same symbols are attached to the same elements or processes as those shown in FIGS. 2 and 3, and their descriptions are omitted as appropriate.
- the imaging system 10 includes an imaging unit 18, a first adjustment unit 310, and an object detection unit 315. That is, the imaging system 10 according to the modification further includes an object detection unit 315 in addition to the configuration of the first embodiment (see FIG. 2).
- the object detection unit 315 may be, for example, a processing block implemented by the above-described processor 11 (see FIG. 1).
- the target detection unit 315 is configured to detect targets existing around the first camera 110 and the second camera 120. More specifically, the target detection unit 315 detects objects that can be imaged by the first camera 110 and the second camera 120 (for example, objects approaching the first camera 110 and the second camera 120, or objects within a predetermined distance of them). The target detection unit 315 may detect the target according to the detection result of a position sensor or a distance sensor, for example. Alternatively, the target detection unit 315 may detect objects based on the result of imaging by a camera different from the first camera 110 and the second camera 120 (for example, an overhead camera with a wider imaging range than the first camera 110 and the second camera 120).
- the target detection unit 315 may be configured to detect the positional relationship between the target and the first camera 110 and the second camera 120. This positional relationship may be used, for example, to determine which of the first camera 110 and the second camera 120 is used for imaging. The detection result of the target detection unit 315 is output to the first adjustment unit 310.
- the object detection unit 315 detects an object that can be an imaging target of the first camera 110 and the second camera 120 (step S110). Note that if the target detection unit 315 does not detect the target (step S110: NO), the subsequent processing may be omitted.
- the first adjustment unit 310 determines which of the first camera 110 and the second camera 120 is used to image the object (step S101). At this time, the first adjustment unit 310 may make this determination based on the detection result of the target detection unit 315. For example, when it is detected that the target exists at a position corresponding to the first focal length, the first adjustment unit 310 may determine that the first camera 110 is to capture the image. Similarly, when it is detected that the target exists at a position corresponding to the second focal length, the first adjustment unit 310 may determine that the second camera 120 is to capture the image.
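The camera-selection rule just described (picking the camera whose focal position better matches the target's detected distance) might look like the following sketch. The focus distances used here are made-up example values, not values from the disclosure.

```python
def select_camera(target_distance_m: float,
                  first_focus_m: float = 2.0,
                  second_focus_m: float = 0.6) -> str:
    """Step S101: choose the camera whose in-focus distance is closer
    to the detected target position. The 2.0 m / 0.6 m focus distances
    are hypothetical example values only."""
    d_first = abs(target_distance_m - first_focus_m)
    d_second = abs(target_distance_m - second_focus_m)
    return "first camera 110" if d_first <= d_second else "second camera 120"


print(select_camera(1.8))  # target near the first focal position
print(select_camera(0.5))  # target near the second focal position
```

In a real system the distance would come from the target detection unit 315 (a position or distance sensor, or an overhead camera), and the selection result would be passed to the first adjustment unit 310.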
- When it is determined in step S101 that the first camera 110 is to capture the image, the first adjustment unit 310 adjusts the optical positional relationship between the first camera 110 and the first mirror 210 (step S102). After the optical positional relationship has been adjusted, the first camera 110 takes an image (step S103).
- When it is determined in step S101 that the second camera 120 is to capture the image, the first adjustment unit 310 adjusts the optical positional relationship between the second camera 120 and the first mirror 210 (step S104). After the optical positional relationship has been adjusted, the second camera 120 takes an image (step S105).
- As described above, in the imaging system 10 according to the first embodiment, the optical positional relationship between the first camera 110 or the second camera 120 and the first mirror 210 is adjusted depending on which of the first camera 110 and the second camera 120 is used to image the object. In this way, each of the first camera 110 and the second camera 120 can capture an image via the first mirror 210. In other words, objects located at different focal lengths can be imaged via a common mirror.
- FIG. 6 is a side view showing the viewing angle origin of the first camera and the second camera in the imaging system according to the second embodiment.
- the first mirror 210 is arranged between the first camera 110 and the second camera 120 .
- when the first camera 110 performs imaging, the first mirror 210 faces the direction of the first camera 110, and light enters the first camera 110 via the first mirror 210 (see FIG. 6A).
- when the second camera 120 performs imaging, the first mirror 210 faces the direction of the second camera 120, and light enters the second camera 120 via the first mirror 210 (see FIG. 6B).
- the intersection of the optical axis of each camera and the mirror surface of the first mirror 210 is at a common position.
- this intersection point is hereinafter referred to as the "viewing angle origin".
- the position on the mirror surface serving as the center of rotation serves as the common viewing angle origin.
- although it is preferable that the viewing angle origins of the first camera 110 and the second camera 120 coincide, even if there is a slight deviation between the respective viewing angle origins, the technical effect of the present embodiment described later can still be obtained.
- in the imaging system 10 according to the second embodiment, the first camera 110 and the second camera 120 perform imaging via a common viewing angle origin.
- in this way, the path for guiding light to the first camera 110 and the second camera 120 can be shared, so the configuration of the imaging unit 18 can be simplified.
- moreover, regardless of whether the first camera 110 or the second camera 120 is used for image capture, the target's line of sight only needs to be guided to the single common viewing angle origin.
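Why a single common viewing angle origin exists can be checked with a small geometric sketch (illustrative only; the coordinates and helper functions below are assumptions, not from the disclosure): if the mirror rotates about an axis lying in its own reflecting surface, the rotation center remains on that surface at every rotation angle, so the optical axes of both cameras can be aimed at that one fixed point.

```python
import math


def mirror_normal(angle_deg: float) -> tuple:
    """Unit normal of the mirror surface after rotating it by angle_deg
    about an axis lying in the surface (rotation center at the origin),
    modeled in a 2D side-view plane."""
    a = math.radians(angle_deg)
    return (math.sin(a), math.cos(a))


def point_on_mirror(point: tuple, angle_deg: float) -> bool:
    """A point lies on the mirror plane if its offset from the rotation
    center is perpendicular to the surface normal."""
    n = mirror_normal(angle_deg)
    return abs(point[0] * n[0] + point[1] * n[1]) < 1e-9


# The rotation center (0, 0) stays on the mirror surface at every angle,
# so both cameras' optical axes can share this single fixed intersection
# point (the common viewing angle origin) regardless of mirror orientation.
assert all(point_on_mirror((0.0, 0.0), a) for a in range(0, 360, 15))
```

This is the geometric reason a slight mounting deviation only shifts, but does not destroy, the common origin: the shared point is fixed by the rotation axis, not by the mirror's instantaneous orientation.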
- An imaging system 10 according to the third embodiment will be described with reference to FIGS. 7 and 8.
- It should be noted that the third embodiment may differ from the above-described first and second embodiments only in part of its configuration and operation, and the other parts may be the same as those of the first and second embodiments. Therefore, in the following, portions different from the already described embodiments will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
- FIG. 7 is a side view showing rotation drive control of the first mirror by the imaging system according to the third embodiment.
- the imaging system 10 is arranged such that the first camera 110 and the second camera 120 sandwich the first mirror 210 .
- the first camera 110 and the second camera 120 are arranged to face each other across the mirror. More specifically, the first camera 110 is arranged so as to face the direction of the first mirror 210 (that is, directly above) from directly below.
- the second camera 120 is arranged so as to face the direction of the first mirror 210 (that is, directly below) from directly above.
- the first camera 110, the second camera 120, and the first mirror 210 according to this embodiment are not limited to this arrangement.
- the first camera 110 and the second camera 120 may be arranged so as to sandwich the first mirror 210 from the lateral direction.
- the first adjuster 310 is configured to be able to control the rotational drive of the first mirror 210 .
- the optical positional relationship between the first camera 110 and the second camera 120 and the first mirror 210 is adjusted by rotationally driving the first mirror 210 according to an instruction from the first adjustment unit 310.
- the first mirror 210 may be rotationally driven using, for example, a motor or the like.
- when the mirror surface of the first mirror 210 is rotated to face the first camera 110 side (that is, downward), light enters the first camera 110 via the first mirror 210. That is, the first camera 110 becomes capable of imaging through the first mirror 210.
- when the mirror surface of the first mirror 210 is rotated to face the second camera 120 side (that is, upward), light enters the second camera 120 via the first mirror 210. That is, the second camera 120 becomes capable of imaging through the first mirror 210.
- in this way, the first camera 110 and the second camera 120 can capture images through the common line-of-sight angle origin.
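The mirror-facing rule above can be sketched as a small control routine. This is a hypothetical illustration only: the angle values, the camera labels, and the `mirror_angle_for` function are assumptions for the sketch, not part of the specification.

```python
# Assumed convention: 0 degrees points the mirror surface at the first
# camera (below); 180 degrees points it at the second camera (above).
MIRROR_ANGLES = {"first": 0, "second": 180}

def mirror_angle_for(camera):
    """Return the mirror angle (degrees) that directs light into `camera`."""
    if camera not in MIRROR_ANGLES:
        raise ValueError("unknown camera: " + camera)
    return MIRROR_ANGLES[camera]
```

In an actual system the first adjustment unit 310 would issue the corresponding rotation command to the motor driving the first mirror 210.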
- FIG. 8 is a side view showing an arrangement variation of the second camera.
- the second camera 120 may be arranged obliquely, as shown in FIG. 8.
- in that case, the imaging range is at a relatively high position, so the subject's eyes can more easily fit within the imaging range.
- the first camera 110 may also be arranged obliquely.
- the first mirror 210 arranged between the first camera 110 and the second camera 120 is driven to rotate.
- the optical positional relationship between the first camera 110 and the second camera 120 and the first mirror 210 is adjusted.
- the optical positional relationship between the first camera 110 and the second camera 120 and the first mirror 210 can be adjusted by a relatively simple driving operation.
- the optical positional relationship can be adjusted by moving only the first mirror 210 without moving the first camera 110 and the second camera 120 .
- An imaging system 10 according to the fourth embodiment will be described with reference to FIGS. 9 to 11.
- It should be noted that the fourth embodiment may differ from the above-described first to third embodiments only in a part of configuration and operation, and other parts may be the same as those of the first to third embodiments. Therefore, in the following, portions different from the already described embodiments will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
- FIG. 9 is a front view showing an operation example of the driving unit that translates the first camera and the second camera.
- the first camera 110 and the second camera 120 are arranged side by side.
- the first camera 110 and the second camera 120 are arranged to face downward.
- a first mirror 210 is arranged below the first camera 110 and the second camera 120 .
- the first camera 110 and the second camera 120 are configured to be movable in parallel by the first driving section 410 .
- the first camera 110 and the second camera 120 may not necessarily be movable in a completely parallel state.
- the “parallel movement” here is a broad concept indicating movement in the left-right (horizontal) direction in FIG. 9.
- the first adjusting section 310 is configured to be able to control the operation of the first driving section 410 .
- the first driving unit 410 translates the first camera 110 and the second camera 120 in accordance with an instruction from the first adjusting unit 310, so that the optical positional relationship between the first camera 110 and the second camera 120 and the first mirror 210 is adjusted.
- For example, in the state shown in FIG. 9A, light is incident on the first camera 110 via the first mirror 210. That is, the first camera 110 is ready to take an image. From this state, when the first camera 110 and the second camera 120 are translated toward the right blank portion (that is, translated rightward in the drawing), the arrangement shown in FIG. 9B is obtained. In the state shown in FIG. 9B, light is incident on the second camera 120 via the first mirror 210. That is, the second camera 120 is ready to take an image. From this state, if the first camera 110 and the second camera 120 are translated back toward the left blank portion (that is, translated leftward in the drawing), the arrangement returns to that shown in FIG. 9A.
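The switching between the two arrangements can be sketched as follows. This is a hypothetical sketch: the camera spacing and the function name are illustrative assumptions, since the specification gives no dimensions.

```python
# Assumed geometry: the fixed first mirror sits at rig offset 0; sliding the
# rig rightward by the camera spacing brings the second camera over it.
CAMERA_SPACING_M = 0.10  # assumed distance between the two cameras on the rig

def rig_offset_for(camera):
    """Rig translation (metres, rightward positive) placing `camera` over the mirror."""
    if camera == "first":
        return 0.0          # FIG. 9A arrangement: first camera over the mirror
    if camera == "second":
        return CAMERA_SPACING_M  # FIG. 9B arrangement: second camera over the mirror
    raise ValueError("unknown camera: " + camera)
```

The first adjusting unit 310 would then command the first driving unit 410 to move the rig to the returned offset.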
- FIG. 10 is a side view showing a configuration example of a driving unit that rotates and moves the first camera and the second camera.
- FIG. 11 is a top view showing a configuration example of a driving unit that rotates and moves the first camera and the second camera.
- a first camera 110 and a second camera 120 are arranged side by side.
- cameras 130 and 140 other than the first camera 110 and the second camera 120 are also arranged.
- Cameras 130 and 140 are configured as cameras having different focal lengths than first camera 110 and second camera 120, but cameras 130 and 140 are not essential components.
- the first camera 110 and the second camera 120 are arranged to face downward.
- a first mirror 210 is arranged below the first camera 110 and the second camera 120 .
- the first camera 110, the second camera 120, and the separate cameras 130 and 140 are arranged in an annular shape when viewed from above.
- An annular second drive unit 420 is arranged to connect the respective cameras.
- the second driving section 420 is configured to be able to drive the first camera 110, the second camera 120, and the separate cameras 130 and 140 in a revolver format. Specifically, each camera moves along a circular path, so that the camera positions are switched clockwise or counterclockwise.
- the first adjusting section 310 is configured to be able to control the operation of the second driving section 420 . According to the instruction of the first adjusting unit 310, the second driving unit 420 moves each camera in a revolver manner, so that the optical positional relationship between the first camera 110 and the second camera 120 and the first mirror 210 is changed. will be adjusted.
- when the first camera 110 is positioned above the first mirror 210, light enters the first camera 110 via the first mirror 210; that is, the first camera 110 is ready to take an image.
- when the second camera 120 is positioned above the first mirror 210, light enters the second camera 120 via the first mirror 210. That is, the second camera 120 is ready to take an image.
- the cameras 130 and 140 are moved to be positioned above the first mirror 210, the cameras 130 and 140 are also ready for imaging.
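The revolver-style switching can be sketched as below. The ring order, step angle, and function name are assumptions for illustration; the specification only states that the cameras move in a circular motion.

```python
# Assumed ring order of the four cameras, evenly spaced on the ring.
RING = ["first", "second", "third", "fourth"]
STEP_DEG = 360 / len(RING)  # 90 degrees between adjacent camera positions

def revolver_rotation(current, target):
    """Rotation (degrees, single fixed direction) that moves `target`
    from its slot into the imaging position currently held by `current`."""
    steps = (RING.index(target) - RING.index(current)) % len(RING)
    return steps * STEP_DEG
```

Rotating always in one direction (via the modulo) matches the revolver behaviour, where positions cycle rather than reverse.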
- FIG. 12 is a front view showing a first combination example of cameras.
- in FIG. 12, the same reference numerals are given to elements that are the same as those already shown.
- translation cameras are arranged above and below the first mirror 210 .
- specifically, the first camera 110, the second camera 120, and a first driving unit 410a are arranged above the first mirror 210,
- and the third camera 130, the fourth camera 140, and a first driving unit 410b are arranged below the first mirror 210.
- images may be captured in the order of the first camera 110, the second camera 120, the third camera 130, and the fourth camera 140.
- the image may be captured by the first camera 110 first, and then captured by the second camera 120 by driving the first driving section 410a.
- thereafter, the first driving section 410b may be driven so that the third camera 130 and then the fourth camera 140 take images.
- FIG. 13 is a front view showing a second combination example of cameras.
- in FIG. 13, the same reference numerals are given to elements that are the same as those already shown.
- revolver cameras are arranged above and below the first mirror 210 .
- specifically, the first camera 110, the second camera 120, the third camera 130, and a second driving unit 420a are arranged above the first mirror 210, and the fourth camera 140, a fifth camera 150, a sixth camera 160, and a second driving section 420b are arranged below the first mirror 210.
- the first camera 110, the second camera 120, the third camera 130, the fourth camera 140, the fifth camera 150, and the sixth camera 160 capture images in this order.
- specifically, the first camera 110 captures an image, the second driving unit 420a is driven to capture an image with the second camera 120, and the second driving unit 420a is driven again to capture an image with the third camera 130.
- similarly, the second driving unit 420b may be driven so that the fourth camera 140, the fifth camera 150, and the sixth camera 160 capture images in turn.
- FIG. 14 is a front view showing a third combination example of cameras.
- in FIG. 14, the same reference numerals are given to elements that are the same as those already shown.
- a translational camera is arranged above the first mirror 210, and one normal camera is arranged below it.
- specifically, the first camera 110, the second camera 120, and the first driving unit 410 are arranged above the first mirror 210, and the third camera 130 is arranged below the first mirror 210.
- a normal camera may be arranged above the first mirror 210 and a translation camera may be arranged below the first mirror 210 .
- a revolver-type camera may be arranged instead of the translation-type camera. That is, a revolver-type camera and a normal camera may be combined.
- images may be captured in the order of the first camera 110, the second camera 120, and the third camera 130.
- the image may be captured by the first camera 110 first, and then the image may be captured by the second camera 120 by driving the first driving section 410 .
- finally, the image may be captured by the third camera 130.
- FIG. 15 is a front view showing a fourth combination example of cameras.
- elements similar to those shown in FIGS. 9 and 10 are given the same reference numerals.
- a translation camera is arranged above the first mirror 210 and a revolver camera is arranged below the first mirror 210 .
- specifically, the first camera 110, the second camera 120, and the first driving unit 410 are arranged above the first mirror 210, and the fourth camera 140, the fifth camera 150, the sixth camera 160, and the second driving section 420 are arranged below the first mirror 210.
- a revolver camera may be arranged above the first mirror 210 and a translation camera may be arranged below the first mirror 210 .
- the first camera 110, the second camera 120, the fourth camera 140, the fifth camera 150, and the sixth camera 160 may take images in this order.
- specifically, the first camera 110 may take an image first, and then the first driving section 410 may be driven to take an image with the second camera 120.
- thereafter, the fourth camera 140 takes an image, the second driving section 420 is driven to take an image with the fifth camera 150, and the second driving section 420 is driven again so that an image is captured by the sixth camera 160.
- as described above, in the imaging system 10 according to the fourth embodiment, the optical positional relationship between the first camera 110 and the second camera 120 and the first mirror 210 is adjusted by moving the first camera 110 and the second camera 120.
- the optical positional relationship between the first camera 110 and the second camera 120 and the first mirror 210 can be adjusted by a relatively simple driving operation.
- the optical positional relationship can be adjusted by moving only the first camera 110 and the second camera 120 without moving the first mirror 210 .
- An imaging system 10 according to the fifth embodiment will be described with reference to FIGS. 16 to 18.
- It should be noted that the fifth embodiment may differ from the above-described first to fourth embodiments only in a part of configuration and operation, and other parts may be the same as those of the first to fourth embodiments. Therefore, in the following, portions different from the already described embodiments will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
- FIG. 16 is a block diagram showing the functional configuration of an imaging system according to the fifth embodiment.
- the same reference numerals are given to the same components as those shown in FIG. 2.
- the imaging system 10 according to the fifth embodiment includes an imaging unit 18, a first adjustment unit 310, a position acquisition unit 320, and an authentication unit 330 as components for realizing its functions. That is, the imaging system 10 according to the fifth embodiment further includes a position acquisition section 320 and an authentication section 330 in addition to the configuration of the first embodiment (see FIG. 2). Note that each of the position acquisition unit 320 and the authentication unit 330 may be a processing block realized by, for example, the above-described processor 11 (see FIG. 1).
- the position acquisition unit 320 is configured to be able to acquire information about the position of the target imaged by the first camera 110 and the second camera 120 .
- the position acquisition unit 320 may be configured to acquire the position of the target using a wide-angle camera different from the first camera 110 and the second camera 120 . Further, the position acquisition unit 320 may be configured to be able to acquire the target position using a distance sensor, a passage sensor, a floor pressure sensor, or the like. Note that the information about the position of the object acquired by the position acquisition unit 320 is used to determine which of the first camera 110 and the second camera 120 is used to image the object, as will be described in detail later.
- the position acquisition unit 320 may be configured to have the function of performing this determination.
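The determination described above, of which camera should image the target based on its acquired position, can be sketched as follows. The distance thresholds, tolerance, and function name are illustrative assumptions; the specification only states that the choice depends on the target's position relative to the cameras' focal lengths.

```python
# Assumed imaging distances (metres): the long-range first camera and the
# close-range second camera. These values are illustrative only.
FIRST_FOCAL_M = 3.0
SECOND_FOCAL_M = 0.8

def choose_camera(distance_m, tolerance_m=0.2):
    """Return which camera should image a target at `distance_m`, or None
    if the target is not yet at either imaging position."""
    if abs(distance_m - FIRST_FOCAL_M) <= tolerance_m:
        return "first"
    if abs(distance_m - SECOND_FOCAL_M) <= tolerance_m:
        return "second"
    return None
```

A real system would feed `distance_m` from the wide-angle camera, distance sensor, passage sensor, or floor pressure sensor mentioned above.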
- the authentication unit 330 is configured to be able to perform authentication processing based on the images of the target captured by the first camera 110 and the second camera 120 .
- the authentication unit 330 may be configured to perform face authentication using a target face image.
- authentication section 330 may be configured to be able to perform iris authentication using an eye image (iris image) of the target. It should be noted that existing techniques can be appropriately adopted for a specific method of the authentication process, so detailed description thereof will be omitted here.
- FIG. 17 is a conceptual diagram showing remote authentication and proximity authentication by the imaging system according to the fifth embodiment.
- the imaging system 10 performs remote authentication, in which an image of a target located relatively far from the imaging unit 18 and the gate 25 is captured and authentication processing is executed, and proximity authentication, in which an image of a target located relatively close to the imaging unit 18 is captured and authentication processing is executed.
- remote authentication and proximity authentication may be performed using a common modality. For example, both remote authentication and proximity authentication may be performed as face authentication, or both may be performed as iris authentication. Alternatively, remote authentication and proximity authentication may be performed using different modalities. For example, remote authentication may be performed as face authentication and proximity authentication as iris authentication. In the example shown in the figure, when remote authentication or proximity authentication succeeds, the gate 25 is opened to permit passage of the target.
- Remote authentication is performed by imaging the target with the first camera 110 having the first focal length.
- the first camera 110 may be configured as a camera with a long focal length and a narrow viewing angle.
- Remote authentication may be performed when the target position acquired by the position acquisition unit 320 reaches the first focal length (that is, the focal length of the first camera 110).
- Remote authentication may be performed by capturing an image of a subject walking toward the imaging unit 18, for example.
- Proximity authentication is performed by imaging the target with the second camera 120 having the second focal length.
- the second camera 120 may be configured as a camera with a short focal length and a medium viewing angle.
- Proximity authentication may be performed when the target position acquired by the position acquisition unit 320 reaches the second focal length (that is, the focal length of the second camera 120).
- Proximity authentication may be performed, for example, by capturing an image of a target that has stopped near the imaging unit 18 (that is, in front of the gate 25).
- FIG. 18 is a flow chart showing the flow of authentication operation of the imaging system according to the fifth embodiment.
- the position acquisition unit 320 first acquires the target position (step S501). Then, the position acquisition unit 320 determines whether or not the acquired target position is the remote authentication position (that is, the position where remote authentication should be performed) (step S502). Note that the remote authentication position may be set according to the first focal length.
- If the acquired target position is not the remote authentication position (step S502: NO), the process of step S501 is executed again. On the other hand, if the acquired position of the target is the remote authentication position (step S502: YES), the first adjusting unit 310 adjusts the optical positional relationship so that the first camera 110 can image the target, and the first camera 110 takes an image of the target (step S503). The authentication unit 330 then performs remote authentication using the image captured by the first camera 110 (step S504).
- the authentication unit 330 determines whether or not the remote authentication has succeeded (step S505). Note that if the remote authentication is successful (step S505: YES), the subsequent processing may be omitted. That is, the subject may be allowed to pass without performing proximity authentication.
- Next, if the remote authentication fails (step S505: NO), the position acquisition unit 320 acquires the target position again (step S506). Then, the position acquisition unit 320 determines whether or not the acquired target position is a proximity authentication position (that is, a position where proximity authentication should be performed) (step S507). Note that the proximity authentication position may be set according to the second focal length.
- If the acquired target position is not the proximity authentication position (step S507: NO), the process of step S506 is executed again. On the other hand, if the acquired position of the target is the proximity authentication position (step S507: YES), the first adjusting unit 310 adjusts the optical positional relationship so that the second camera 120 can image the target, and the second camera 120 captures an image of the target (step S508). The authentication unit 330 then performs proximity authentication using the image captured by the second camera 120 (step S509).
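A minimal sketch of this two-stage flow (steps S501 to S509), assuming hypothetical `get_position` and `authenticate` callbacks standing in for the position acquisition unit 320 and the authentication unit 330 (whose internals the text leaves to existing techniques):

```python
def authentication_flow(get_position, authenticate):
    """Remote authentication first; proximity authentication as a fallback."""
    # Steps S501-S502: wait until the target reaches the remote-auth position.
    while get_position() != "remote":
        pass
    # Steps S503-S505: image with the first camera, try remote authentication.
    if authenticate("first"):
        return "passed_remote"  # proximity authentication may be skipped
    # Steps S506-S507: wait until the target reaches the proximity-auth position.
    while get_position() != "proximity":
        pass
    # Steps S508-S509: image with the second camera, try proximity authentication.
    return "passed_proximity" if authenticate("second") else "denied"
```

The early return after a successful remote authentication mirrors the note that subsequent processing may be omitted when step S505 succeeds.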
- If the proximity authentication succeeds, the target may be permitted to pass. On the other hand, if the proximity authentication fails, the target may be barred from passage. The series of operations up to this point may be repeatedly executed each time a new target appears. For example, if a first target is authenticated successfully and permitted to pass, the process from step S501 may be performed on a subsequent second target. When processing is executed continuously for different targets in this way, a process may be executed after the series of operations is completed to return to a state in which the first camera 110 can take an image again. That is, the positional relationship adjusted so that the first target can be imaged by the second camera 120 may be returned to the positional relationship suited to the first camera 110, so that the subsequent second target can immediately be imaged by the first camera 110.
- Such adjustment of the positional relationship may be performed immediately after the first target is photographed by the second camera 120, or may be performed after the subsequent second target is actually detected.
- In a configuration in which the positional relationship is adjusted by rotating the first mirror 210, the rotation direction of the first mirror 210 for adjusting the positional relationship to suit the first camera 110 and the rotation direction of the first mirror 210 for adjusting the positional relationship to suit the second camera 120 may be the same direction. For example, assume that the first mirror 210 is rotated counterclockwise when capturing an image with the second camera 120 after capturing an image with the first camera 110.
- In this case, when capturing an image with the first camera 110 again after capturing an image with the second camera 120, the first mirror 210 may be rotated counterclockwise once more, instead of clockwise (that is, instead of rotating in the reverse direction). By doing so, the load and the like when changing the rotation direction of the mirror can be suppressed, so that the deterioration of the motor and the like can be suppressed.
- As described above, in the imaging system 10 according to the fifth embodiment, remote authentication is first performed using the first camera 110, and if the remote authentication fails, proximity authentication is performed using the second camera 120. In this way, authentication processing using the first camera 110 and the second camera 120 (specifically, authentication processing for targets at different distances) can be appropriately executed.
- An imaging system 10 according to the sixth embodiment will be described with reference to FIG. 19.
- the sixth embodiment may differ from the first to fifth embodiments described above only in a part of the configuration and operation, and other parts may be the same as those of the first to fifth embodiments. Therefore, in the following, portions different from the already described embodiments will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
- FIG. 19 is a conceptual diagram showing each phase and processing contents in the imaging system according to the sixth embodiment.
- imaging is performed by the first camera 110 and the second camera 120 according to a plurality of phases preset according to the position or situation of the object.
- the phase may be determined, for example, by whether or not the target position has reached a preset distance. Alternatively, phase may be determined by whether the subject is walking or standing still. In the following, an example will be given in which the phase is determined using the distance, the target's eyes are imaged, and iris authentication is performed.
- the control range by the first adjuster 310 is set for remote authentication. Specifically, it is set to the control range for imaging with the first camera 110 .
- the optical positional relationship between the first camera 110 and the first mirror 210 is adjusted according to the eye position of the target, and the first camera 110 performs imaging.
- remote authentication is then performed as iris authentication using the eye image captured by the first camera 110.
- the control range by the first adjustment unit 310 is set for proximity authentication. Specifically, it is set to the control range when the second camera 120 takes an image.
- the optical positional relationship between the second camera 120 and the first mirror 210 is adjusted according to the eye position of the target, and the second camera 120 performs imaging.
- iris authentication is performed using the eye image captured by the second camera 120 .
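The phase switching of FIG. 19 can be sketched as a simple mapping from the target's distance to the control range and camera to use. The threshold value and names are assumptions; the text only states that phases are preset according to position or situation.

```python
REMOTE_THRESHOLD_M = 1.5  # assumed boundary between the two phases (metres)

def phase_for(distance_m):
    """Map the target's distance to the preset imaging phase."""
    return "remote" if distance_m > REMOTE_THRESHOLD_M else "proximity"

def camera_for_phase(phase):
    """Camera whose control range the first adjustment unit should select."""
    return {"remote": "first", "proximity": "second"}[phase]
```

Phase could equally be decided by whether the target is walking or standing still, as the text notes; this sketch uses distance for concreteness.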
- imaging is performed by the first camera 110 and the second camera 120 according to the determined phase.
- the optical positional relationship between the first camera 110 and the second camera 120 and the first mirror 210 can be adjusted at appropriate timing.
- An imaging system 10 according to the seventh embodiment will be described with reference to FIGS. 20 and 21.
- It should be noted that the seventh embodiment may differ from the first to sixth embodiments described above only in a part of configuration and operation, and other parts may be the same as those of the first to sixth embodiments. Therefore, in the following, portions different from the already described embodiments will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
- FIG. 20 is a block diagram showing the functional configuration of an imaging system according to the seventh embodiment.
- in FIG. 20, the same reference numerals are given to components that are the same as those already described.
- the imaging system 10 according to the seventh embodiment includes an imaging unit 18, a first adjustment unit 310, and a guidance information output unit 340 as components for realizing its functions. That is, the imaging system 10 according to the seventh embodiment further includes a guidance information output section 340 in addition to the configuration of the first embodiment (see FIG. 2). Note that the guidance information output unit 340 may be a processing block realized by, for example, the above-described processor 11 (see FIG. 1).
- the guidance information output unit 340 is configured to be capable of outputting guidance information that guides the target's line of sight to the common line-of-sight angle origin of the first camera 110 and the second camera 120 .
- Guidance information may be displayed using a display or projection, for example. In this case, the guidance information may be displayed directly at the line-of-sight angle origin (that is, the intersection of the optical axes of the first camera 110 and the second camera 120 with the mirror surface of the first mirror 210), at a location around it, or at a location in the direction of the line-of-sight angle origin as seen from the target. Alternatively, the guidance information may be output as audio information via a speaker or the like. In this case, the guidance information may be output so that the target hears the voice coming from the direction of the line-of-sight angle origin.
- FIG. 21 is a front view showing an output example of guidance information by the imaging system according to the seventh embodiment.
- for example, an arrow (that is, a mark) indicating the line-of-sight angle origin may be displayed as guidance information.
- a message prompting the target to look at the line-of-sight angle origin may also be displayed. That is, a message such as "Look here" may be displayed as shown in the figure.
- these guidance displays may be highlighted. For example, the guidance indicator may flash or change color.
- a visible light cut panel is arranged on the surface of the imaging unit 18 .
- the visible light cut panel does not transmit visible light, but is configured to transmit near-infrared light.
- the guidance information may be displayed on the visible light cut panel.
- When visible light is used for imaging, an opening is provided in the surface of the imaging unit 18 to let the visible light pass; when near-infrared light is used, no opening need be provided. By not providing an opening, the line of sight can be guided and imaging performed without making the subject aware of where the line-of-sight angle origin is; at the same time, however, it becomes difficult for the subject to find the line-of-sight angle origin on their own.
- the technical effects of this embodiment, which will be described below, are significantly exhibited in such a case.
- the imaging system 10 outputs guidance information for guiding the target's line of sight to the line-of-sight angle origin.
- the target's line of sight can be guided to the line-of-sight angle origin, and an image of the target's eye (iris) can be appropriately captured.
- An imaging system 10 according to the eighth embodiment will be described with reference to FIGS. 22 and 23.
- It should be noted that the eighth embodiment may differ from the above-described seventh embodiment only in part of the operation, and other parts may be the same as those of the first to seventh embodiments. Therefore, in the following, portions different from the already described embodiments will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
- FIG. 22 is a front view showing an output example of guidance information corresponding to imaging timing in the imaging system according to the eighth embodiment.
- FIG. 23 is a front view showing an output example of guidance information corresponding to timings other than the imaging timing in the imaging system according to the eighth embodiment.
- an eye mark is displayed around the line-of-sight angle origin as guidance information.
- This mark is displayed with its eye open when the target is positioned at the first focal length (that is, when the first camera 110 should image the target) and when the target is positioned at the second focal length (that is, when the second camera 120 should image the target) (see FIG. 22).
- This display mode with the eyes open is for prompting the subject to look at the line-of-sight angle origin. Therefore, it is preferable that the mark with the eyes open is displayed in a relatively conspicuous display mode.
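The open/closed behaviour of the eye mark can be sketched as follows. The distances, tolerance, and function name are illustrative assumptions; the specification ties the open state only to the imaging timings at the two focal lengths.

```python
def eye_mark_state(distance_m, first_focal_m=3.0, second_focal_m=0.8, tol_m=0.2):
    """Return 'open' when the target stands at either imaging position
    (FIG. 22), and 'closed' at all other timings (FIG. 23)."""
    at_first = abs(distance_m - first_focal_m) <= tol_m
    at_second = abs(distance_m - second_focal_m) <= tol_m
    return "open" if at_first or at_second else "closed"
```

The 'open' state could additionally be rendered conspicuously (flashing, colour change) per the guidance-display notes above.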
- FIG. 24 is a block diagram showing the functional configuration of an imaging system according to the ninth embodiment.
- the same reference numerals are given to the same components as those shown in FIG. 2.
- the imaging system 10 according to the ninth embodiment includes an imaging unit 18, a first adjustment unit 310, and a second adjustment unit 350 as components for realizing its functions. That is, the imaging system 10 according to the ninth embodiment further includes a second adjustment section 350 in addition to the configuration of the first embodiment (see FIG. 2). Note that the second adjustment unit 350 may be a processing block realized by, for example, the above-described processor 11 (see FIG. 1).
- the third camera 510 is provided as a camera for specifying the eye position of the target when the first camera 110 captures an image of the target.
- the fourth camera 520 is provided as a camera for specifying the eye position of the target when the second camera 120 captures an image of the target. Specifically, when the first camera 110 captures an image of the target, the eye position of the target is specified from an image captured by the third camera 510, and the first camera 110 performs imaging based on the specified eye position. Similarly, when the second camera 120 captures an image of the target, the eye position of the target is specified from an image captured by the fourth camera 520, and the second camera 120 performs imaging based on the specified eye position. Note that existing techniques can be appropriately adopted as the method of specifying the eye position of the target from an image, so a detailed description is omitted here.
- the second mirror 220 is a mirror that can reflect the light used when the third camera 510 and the fourth camera 520 take images.
- the second mirror 220 is arranged to correspond to both the third camera 510 and the fourth camera 520 . That is, the third camera 510 and the fourth camera 520 are configured to be able to image the target via the second mirror 220, respectively.
- the third camera 510 captures an image using light incident through the second mirror 220, and the fourth camera 520 likewise captures an image using light incident through the second mirror 220.
- the second adjuster 350 is configured to be able to adjust the optical positional relationship between the third camera 510 or the fourth camera 520 and the second mirror 220. That is, the second adjuster 350 has the same function as the first adjuster 310 already described. More specifically, the second adjustment section 350 adjusts the optical positional relationship between the third camera 510 and the second mirror 220 when imaging is performed by the third camera 510. This enables the third camera 510 to capture an image of the target. Also, the second adjustment unit 350 adjusts the optical positional relationship between the fourth camera 520 and the second mirror 220 when the fourth camera 520 performs imaging. This enables the fourth camera 520 to capture an image of the target. The second adjustment unit 350 may be configured to adjust each optical positional relationship by driving at least one of the third camera 510, the fourth camera 520, and the second mirror 220 with a drive unit including, for example, an actuator.
- the first camera 110 and the second camera 120 are arranged vertically, one above the other, with the first mirror 210 sandwiched between them. The optical positional relationship between the first camera 110 or the second camera 120 and the first mirror 210 is adjusted by the first adjuster 310 rotationally driving the first mirror 210. Further, in the ninth embodiment, a third camera 510, a fourth camera 520, and a second mirror 220 are arranged side by side with the first camera 110, the second camera 120, and the first mirror 210 described above. The third camera 510 and the fourth camera 520 are arranged so as to sandwich the second mirror 220 from above and below.
- the optical positional relationship among the third camera 510, the fourth camera 520, and the second mirror 220 is adjusted by operations similar to those of the first camera 110, the second camera 120, and the first mirror 210 described above. Specifically, the optical positional relationship between the third camera 510 or the fourth camera 520 and the second mirror 220 is adjusted by the second adjuster 350 rotationally driving the second mirror 220.
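The adjustment just described — one shared mirror rotated to serve whichever camera is selected — can be sketched as follows. The angle values and class names are assumptions for illustration; the patent only requires that a drive unit (e.g. an actuator) establish the correct optical positional relationship.

```python
from dataclasses import dataclass

# Hedged sketch of the second adjuster: a single mirror rotated to a
# per-camera angle so that each camera images the target via that mirror.
# Angle values are illustrative, not taken from the patent.

@dataclass
class MirrorAdjuster:
    angles: dict        # camera name -> mirror angle in degrees
    current: float = 0.0

    def select(self, camera: str) -> float:
        """Rotate the mirror to the preset angle for the given camera
        (a drive unit such as an actuator would do this physically)."""
        self.current = self.angles[camera]
        return self.current

second_adjuster = MirrorAdjuster(angles={"camera3": 20.0, "camera4": -20.0})
second_adjuster.select("camera3")   # serve the upper (third) camera
second_adjuster.select("camera4")   # serve the lower (fourth) camera
```

The same class would model the first adjuster 310 with the first camera, second camera, and first mirror substituted in.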
- FIG. 26 is a flow chart showing the flow of imaging operation of the imaging system according to the ninth embodiment.
- the same reference numerals are given to the same processes as those shown in FIG. 3.
- the first adjuster 310 first determines which of the first camera 110 and the second camera 120 is used to image the target (step S101). The determination result of the first adjuster 310 is output to the second adjuster 350.
- the second adjustment unit 350 adjusts the optical positional relationship between the fourth camera 520 and the second mirror 220 (step S903).
- the fourth camera 520 captures an image and identifies the eye position of the target from the image (step S902).
- the first adjuster 310 adjusts the optical positional relationship between the second camera 120 and the first mirror 210 (step S104).
- the second camera 120 takes an image (step S105).
- the third camera 510 and the fourth camera 520 are used to specify the eye position of the target.
- the optical positional relationship between the third camera 510 or the fourth camera 520 and the second mirror 220 is adjusted depending on which camera is used for imaging. In this way, each of the third camera 510 and the fourth camera 520 can capture an image via the second mirror 220. In other words, images for specifying the eye position of the target can be captured via a common mirror.
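The flow above (determine the main camera, specify the eyes with the paired auxiliary camera, then capture with the main camera) can be summarized in a minimal sketch. The distance threshold and strings are assumptions; the actual determination in step S101 may use any criterion for choosing between the cameras.

```python
# Hedged sketch of the ninth-embodiment flow: pick the main camera
# (step S101), point the second mirror at the paired auxiliary camera to
# specify the eyes, then point the first mirror at the main camera and
# capture. The distance threshold and step strings are assumptions.

def imaging_flow(target_distance, near_threshold=1.0):
    steps = []
    # Step S101: choose the main camera from the target position.
    main = "camera2" if target_distance <= near_threshold else "camera1"
    aux = {"camera1": "camera3", "camera2": "camera4"}[main]
    steps.append(f"adjust second mirror for {aux}")
    steps.append(f"{aux}: specify eye position")
    steps.append(f"adjust first mirror for {main}")
    steps.append(f"{main}: capture image")
    return steps
```

For a target 0.5 m away (below the assumed threshold), the flow routes through the fourth camera and then captures with the second camera.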
- an imaging system 10 according to the tenth embodiment will be described with reference to FIGS. 27 and 28. The tenth embodiment differs from the ninth embodiment described above only in part of its configuration and operation, and the other portions may be the same as in the first to ninth embodiments. Therefore, in the following, portions that differ from the embodiments already described are explained in detail, and descriptions of other overlapping portions are omitted as appropriate.
- FIG. 27 is a side view showing the viewing angle origin of the third camera and the fourth camera in the imaging system according to the tenth embodiment.
- FIG. 28 is a front view showing the viewing angle origin of each camera in the imaging system according to the tenth embodiment.
- the intersection of the optical axis of each camera and the mirror surface of the first mirror 210 is a common position.
- the position on the mirror surface that serves as the center of rotation becomes the common line-of-sight angle origin.
- the third camera 510 and the fourth camera 520 perform imaging via a common line-of-sight angle origin.
- since the path for guiding light to the third camera 510 and the fourth camera 520 can be shared, the configuration of the imaging unit 18 can be simplified.
- since a single common line-of-sight angle origin is used, the target's line of sight need only be guided to that one point.
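The geometric point being made — a mirror rotated about a point on its own surface reflects every camera's optical axis through that same point — can be checked with a small 2-D calculation. Coordinates and angles here are illustrative assumptions, not values from the patent.

```python
import math

# Hedged 2-D check of the common line-of-sight angle origin: reflecting any
# camera axis about a mirror line through the origin keeps the outgoing ray
# anchored at that same origin, whatever the mirror angle.

def reflect(direction, mirror_angle_deg):
    """Reflect a 2-D direction vector about a mirror line at the given
    angle passing through the origin (the mirror's rotation centre)."""
    a = math.radians(mirror_angle_deg)
    nx, ny = -math.sin(a), math.cos(a)       # unit normal of the mirror line
    dx, dy = direction
    dot = dx * nx + dy * ny
    return (dx - 2 * dot * nx, dy - 2 * dot * ny)

# A camera looking along +y toward the rotation centre, mirror tilted 45
# degrees: the outgoing ray leaves the same origin along +x. A second camera
# looking along -y with the mirror at -45 degrees leaves along +x as well,
# so both cameras share one line-of-sight angle origin.
upper_out = reflect((0.0, 1.0), 45.0)
lower_out = reflect((0.0, -1.0), -45.0)
```

Whatever the mirror angle, the reflected ray still emanates from the rotation centre; only its direction changes, which is why a single guidance point suffices.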
- the scope of each embodiment also includes a processing method in which a program for operating the configuration of each embodiment described above is recorded on a recording medium, and the program recorded on the recording medium is read as code and executed by a computer. That is, a computer-readable recording medium is also included in the scope of each embodiment. In addition to the recording medium on which the above program is recorded, the program itself is also included in each embodiment.
- a floppy (registered trademark) disk, hard disk, optical disk, magneto-optical disk, CD-ROM, magnetic tape, non-volatile memory card, and ROM can be used as recording media.
- not only a program that executes processing by itself when recorded on the recording medium, but also one that operates on an OS and executes processing in cooperation with other software or the functions of an expansion board, is included in the scope of each embodiment. Furthermore, the program itself may be stored on a server, and part or all of the program may be downloaded from the server to a user terminal.
- the imaging system according to Appendix 1 includes a first camera having a first focal length, a second camera having a second focal length, a first mirror arranged so as to correspond to both the first camera and the second camera, and first adjusting means for adjusting the optical positional relationship between the first camera or the second camera and the first mirror depending on which of the first camera and the second camera is used to image a target.
- the imaging system according to Appendix 2 is the imaging system according to Appendix 1, wherein the first camera and the second camera perform imaging via a first line-of-sight angle origin common to both.
- the imaging system according to Appendix 3 is the imaging system according to Appendix 1 or 2, wherein the first camera and the second camera are arranged to face each other across the first mirror, and the first adjusting means adjusts the optical positional relationship between the first camera or the second camera and the first mirror by rotating the first mirror.
- the imaging system according to Appendix 4 is the imaging system according to Appendix 1 or 2, wherein the first adjusting means adjusts the optical positional relationship between the first camera or the second camera and the first mirror by moving the first camera and the second camera.
- the imaging system according to Appendix 5 is the imaging system according to any one of Appendices 1 to 4, further comprising: position acquisition means for acquiring the position of the target; authentication means for performing authentication processing using images of the target captured by the first camera and the second camera; first control means for controlling, when the position of the target corresponds to the first focal length, the first camera to capture a first image and the authentication processing to be performed; and second control means for controlling, when the authentication processing with the first image fails, the second camera to capture a second image and the authentication processing to be performed after waiting for the position of the target to reach a position corresponding to the second focal length.
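The control sequence of Appendix 5 can be sketched as follows; `authenticate` is a stand-in for the real authentication means, and the position values and return strings are illustrative assumptions.

```python
# Hedged sketch of the Appendix 5 control: try the first camera at its focal
# distance; on failure, wait for the target to reach the second focal
# distance and retry with the second camera.

def run_authentication(positions, first_focal, second_focal, authenticate):
    for pos in positions:                    # target position over time
        if pos == first_focal:
            if authenticate("camera1", pos):
                return "ok@first"            # first image authenticated
        elif pos == second_focal:
            # Reached the second focal distance after a first-image failure.
            return "ok@second" if authenticate("camera2", pos) else "failed"
    return "no-capture"

# Target walks from 3.0 m to 1.0 m; only the near (second) camera matches.
result = run_authentication(
    positions=[3.0, 2.0, 1.0],
    first_focal=2.0,
    second_focal=1.0,
    authenticate=lambda cam, pos: cam == "camera2",
)
```

When the first image succeeds, the target never needs to reach the second focal distance, which is the advantage of attempting remote authentication first.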
- the imaging system according to Appendix 6 is the imaging system according to any one of Appendices 1 to 5, wherein the first adjusting means adjusts the optical positional relationship between the first camera or the second camera and the first mirror according to a plurality of phases set in advance depending on the position or situation of the target.
- the imaging system according to Appendix 7 is the imaging system according to any one of Appendices 2 to 6, further comprising guidance information output means for outputting, when the target is imaged by the first camera and the second camera, information that guides the target's line of sight to the first line-of-sight angle origin.
- the imaging system according to Appendix 8 is the imaging system according to Appendix 7, wherein the guidance information output means displays an image of eyes around the first line-of-sight angle origin, and controls the display so that the eyes are open when the target is located at the first focal length or the second focal length and closed when the target is located at neither.
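The display rule of Appendix 8 reduces to a small predicate. The tolerance parameter is an assumption added for the sketch, since a real target is never at exactly the focal distance.

```python
# Hedged sketch of the Appendix 8 display rule: the eye image around the
# line-of-sight origin opens only when the target stands at one of the two
# focal distances. The tolerance is an assumed parameter.

def eye_display_state(target_pos, first_focal, second_focal, tol=0.05):
    """Return 'open' at either focal distance, 'closed' otherwise."""
    at_focus = any(abs(target_pos - f) <= tol
                   for f in (first_focal, second_focal))
    return "open" if at_focus else "closed"
```

With `first_focal=2.0` and `second_focal=1.0`, a target at 2.0 m sees the eyes open, while a target at 1.5 m sees them closed, prompting it to keep approaching.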
- the imaging system according to Appendix 9 is the imaging system according to any one of Appendices 1 to 8, further comprising: a third camera that captures an image for specifying the eye position of the target when the first camera images the target; a fourth camera that captures an image for specifying the eye position of the target when the second camera images the target; a second mirror arranged so as to correspond to both the third camera and the fourth camera; and second adjusting means for adjusting the optical positional relationship between the third camera or the fourth camera and the second mirror depending on which of the third camera and the fourth camera is used to image the target.
- the imaging system according to Appendix 10 is the imaging system according to Appendix 9, wherein the third camera and the fourth camera perform imaging via a second line-of-sight angle origin common to both.
- the imaging device according to Appendix 11 includes a first camera having a first focal length, a second camera having a second focal length, a first mirror arranged so as to correspond to both the first camera and the second camera, and first adjusting means for adjusting the optical positional relationship between the first camera or the second camera and the first mirror depending on which of the first camera and the second camera is used to image a target.
- the imaging method according to Appendix 12 is an imaging method for controlling, with at least one computer, an imaging system including a first camera having a first focal length, a second camera having a second focal length, and a first mirror arranged so as to correspond to both the first camera and the second camera, the method adjusting the optical positional relationship between the first camera or the second camera and the first mirror depending on which of the first camera and the second camera is used to image a target.
- the recording medium according to Appendix 13 is a recording medium on which is recorded a computer program that causes at least one computer to execute an imaging method for controlling an imaging system including a first camera having a first focal length, a second camera having a second focal length, and a first mirror arranged so as to correspond to both the first camera and the second camera, the method adjusting the optical positional relationship between the first camera or the second camera and the first mirror depending on which of the first camera and the second camera is used to image a target.
- the computer program according to Appendix 14 causes at least one computer to execute an imaging method for controlling an imaging system including a first camera having a first focal length, a second camera having a second focal length, and a first mirror arranged so as to correspond to both the first camera and the second camera, the method adjusting the optical positional relationship between the first camera or the second camera and the first mirror depending on which of the first camera and the second camera is used to image a target.
10 Imaging system
11 Processor
18 Imaging unit
25 Gate
110 First camera
120 Second camera
210 First mirror
220 Second mirror
310 First adjustment unit
315 Object detection unit
320 Position acquisition unit
330 Authentication unit
340 Guidance information output unit
350 Second adjustment unit
410 First drive unit
420 Second drive unit
510 Third camera
520 Fourth camera
Abstract
Description
An imaging system according to the first embodiment will be described with reference to FIGS. 1 to 3.
First, the hardware configuration of the imaging system according to the first embodiment will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the hardware configuration of the imaging system according to the first embodiment.
Next, the functional configuration of the imaging system 10 according to the first embodiment will be described with reference to FIG. 2. FIG. 2 is a block diagram showing the functional configuration of the imaging system according to the first embodiment.
Next, the flow of the imaging operation of the imaging system 10 according to the first embodiment (that is, the operation when capturing an image of a target) will be described with reference to FIG. 3. FIG. 3 is a flowchart showing the flow of the imaging operation of the imaging system according to the first embodiment.
Next, a modification of the imaging system 10 according to the first embodiment described above will be described with reference to FIGS. 4 and 5. FIG. 4 is a block diagram showing the functional configuration of an imaging system according to the modification of the first embodiment. FIG. 5 is a flowchart showing the flow of the imaging operation of the imaging system according to the modification of the first embodiment. In FIGS. 4 and 5, elements and processes similar to those shown in FIGS. 2 and 3 are denoted by the same reference numerals.
Next, the technical effects obtained by the imaging system 10 according to the first embodiment will be described.
The imaging system 10 according to the second embodiment will be described with reference to FIG. 6. The second embodiment differs from the first embodiment described above only in part of its configuration, and the other portions may be the same as in the first embodiment. Accordingly, portions that differ from the first embodiment already described are explained in detail below, and descriptions of other overlapping portions are omitted as appropriate.
First, the line-of-sight angle origin in the imaging system 10 according to the second embodiment will be described with reference to FIG. 6. FIG. 6 is a side view showing the viewing-angle origin of the first camera and the second camera in the imaging system according to the second embodiment.
Next, the technical effects obtained by the imaging system 10 according to the second embodiment will be described.
The imaging system 10 according to the third embodiment will be described with reference to FIGS. 7 and 8. The third embodiment differs from the first and second embodiments described above only in part of its configuration and operation, and the other portions may be the same as in the first and second embodiments. Accordingly, portions that differ from the embodiments already described are explained in detail below, and descriptions of other overlapping portions are omitted as appropriate.
First, the configuration and operation of the imaging unit 18 in the imaging system 10 according to the third embodiment will be described with reference to FIG. 7. FIG. 7 is a side view showing rotational drive control of the first mirror by the imaging system according to the third embodiment.
Next, camera arrangement variations in the imaging system 10 according to the third embodiment will be described with reference to FIG. 8. FIG. 8 is a side view showing arrangement variations of the second camera.
Next, the technical effects obtained by the imaging system 10 according to the third embodiment will be described.
The imaging system 10 according to the fourth embodiment will be described with reference to FIGS. 9 to 11. The fourth embodiment differs from the first to third embodiments described above only in part of its configuration and operation, and the other portions may be the same as in the first to third embodiments. Accordingly, portions that differ from the embodiments already described are explained in detail below, and descriptions of other overlapping portions are omitted as appropriate.
First, a translation-type camera, which is one example of the imaging unit 18 in the imaging system 10 according to the fourth embodiment, will be described with reference to FIG. 9. FIG. 9 is a front view showing an operation example of a drive unit that translates the first camera and the second camera.
Next, a revolver-type camera, which is another example of the imaging unit 18 in the imaging system 10 according to the fourth embodiment, will be described with reference to FIGS. 10 and 11. FIG. 10 is a side view showing a configuration example of a drive unit that rotationally moves the first camera and the second camera. FIG. 11 is a top view showing the same configuration example.
Next, configuration examples in which the translation-type camera (see FIG. 9) and the revolver-type camera (see FIGS. 10 and 11) described above are combined will be described with reference to FIGS. 12 to 15. In the following, an example in which cameras are installed above and below the first mirror 210 is described.
First, a first combination example will be described with reference to FIG. 12. FIG. 12 is a front view showing the first combination example of cameras. In FIG. 12, elements similar to the components shown in FIG. 9 are denoted by the same reference numerals.
Next, a second combination example will be described with reference to FIG. 13. FIG. 13 is a front view showing the second combination example of cameras. In FIG. 13, elements similar to the components shown in FIG. 10 are denoted by the same reference numerals.
Next, a third combination example will be described with reference to FIG. 14. FIG. 14 is a front view showing the third combination example of cameras. In FIG. 14, elements similar to the components shown in FIG. 9 are denoted by the same reference numerals.
Next, a fourth combination example will be described with reference to FIG. 15. FIG. 15 is a front view showing the fourth combination example of cameras. In FIG. 15, elements similar to the components shown in FIGS. 9 and 10 are denoted by the same reference numerals.
Next, the technical effects obtained by the imaging system 10 according to the fourth embodiment will be described.
The imaging system 10 according to the fifth embodiment will be described with reference to FIGS. 16 to 18. The fifth embodiment differs from the first to fourth embodiments described above only in part of its configuration and operation, and the other portions may be the same as in the first to fourth embodiments. Accordingly, portions that differ from the embodiments already described are explained in detail below, and descriptions of other overlapping portions are omitted as appropriate.
First, the functional configuration of the imaging system 10 according to the fifth embodiment will be described with reference to FIG. 16. FIG. 16 is a block diagram showing the functional configuration of the imaging system according to the fifth embodiment. In FIG. 16, components similar to those shown in FIG. 2 are denoted by the same reference numerals.
Next, the remote authentication and near authentication performed by the imaging system 10 according to the fifth embodiment will be described with reference to FIG. 17. FIG. 17 is a conceptual diagram showing remote authentication and near authentication by the imaging system according to the fifth embodiment.
Next, the flow of the authentication operation by the imaging system 10 according to the fifth embodiment (that is, the operation of performing biometric authentication using captured images) will be described with reference to FIG. 18. FIG. 18 is a flowchart showing the flow of the authentication operation of the imaging system according to the fifth embodiment.
Next, the technical effects obtained by the imaging system 10 according to the fifth embodiment will be described.
The imaging system 10 according to the sixth embodiment will be described with reference to FIG. 19. The sixth embodiment differs from the first to fifth embodiments described above only in part of its configuration and operation, and the other portions may be the same as in the first to fifth embodiments. Accordingly, portions that differ from the embodiments already described are explained in detail below, and descriptions of other overlapping portions are omitted as appropriate.
First, processing according to a plurality of phases executed by the imaging system 10 according to the sixth embodiment will be described with reference to FIG. 19. FIG. 19 is a conceptual diagram showing each phase and its processing content in the imaging system according to the sixth embodiment.
Next, the technical effects obtained by the imaging system 10 according to the sixth embodiment will be described.
The imaging system 10 according to the seventh embodiment will be described with reference to FIGS. 20 and 21. The seventh embodiment differs from the first to sixth embodiments described above only in part of its configuration and operation, and the other portions may be the same as in the first to sixth embodiments. Accordingly, portions that differ from the embodiments already described are explained in detail below, and descriptions of other overlapping portions are omitted as appropriate.
First, the functional configuration of the imaging system 10 according to the seventh embodiment will be described with reference to FIG. 20. FIG. 20 is a block diagram showing the functional configuration of the imaging system according to the seventh embodiment. In FIG. 20, components similar to those shown in FIG. 2 are denoted by the same reference numerals.
Next, a specific example of the guidance information output in the imaging system 10 according to the seventh embodiment will be described with reference to FIG. 21. FIG. 21 is a front view showing an output example of guidance information by the imaging system according to the seventh embodiment.
Next, the technical effects obtained by the imaging system 10 according to the seventh embodiment will be described.
The imaging system 10 according to the eighth embodiment will be described with reference to FIGS. 22 and 23. The eighth embodiment differs from the seventh embodiment described above only in part of its operation, and the other portions may be the same as in the first to seventh embodiments. Accordingly, portions that differ from the embodiments already described are explained in detail below, and descriptions of other overlapping portions are omitted as appropriate.
First, specific examples of the guidance information output in the imaging system 10 according to the eighth embodiment will be described with reference to FIGS. 22 and 23. FIG. 22 is a front view showing an output example of guidance information corresponding to the imaging timing in the imaging system according to the eighth embodiment. FIG. 23 is a front view showing an output example of guidance information corresponding to times other than the imaging timing in the imaging system according to the eighth embodiment.
Next, the technical effects obtained by the imaging system 10 according to the eighth embodiment will be described.
The imaging system 10 according to the ninth embodiment will be described with reference to FIGS. 24 to 26. The ninth embodiment differs from the first to eighth embodiments described above only in part of its configuration and operation, and the other portions may be the same as in the first to eighth embodiments. Accordingly, portions that differ from the embodiments already described are explained in detail below, and descriptions of other overlapping portions are omitted as appropriate.
First, the functional configuration of the imaging system 10 according to the ninth embodiment will be described with reference to FIG. 24. FIG. 24 is a block diagram showing the functional configuration of the imaging system according to the ninth embodiment. In FIG. 24, components similar to those shown in FIG. 2 are denoted by the same reference numerals.
Next, the configuration and operation of the imaging unit 18 in the imaging system 10 according to the ninth embodiment will be described with reference to FIG. 25. FIG. 25 is a front view showing an arrangement example of the imaging system according to the ninth embodiment.
Next, the flow of the imaging operation of the imaging system 10 according to the ninth embodiment will be described with reference to FIG. 26. FIG. 26 is a flowchart showing the flow of the imaging operation of the imaging system according to the ninth embodiment. In FIG. 26, processes similar to those shown in FIG. 3 are denoted by the same reference numerals.
Next, the technical effects obtained by the imaging system 10 according to the ninth embodiment will be described.
The imaging system 10 according to the tenth embodiment will be described with reference to FIGS. 27 and 28. The tenth embodiment differs from the ninth embodiment described above only in part of its configuration and operation, and the other portions may be the same as in the first to ninth embodiments. Accordingly, portions that differ from the embodiments already described are explained in detail below, and descriptions of other overlapping portions are omitted as appropriate.
First, the line-of-sight angle origin in the imaging system according to the tenth embodiment will be described with reference to FIGS. 27 and 28. FIG. 27 is a side view showing the viewing-angle origin of the third camera and the fourth camera in the imaging system according to the tenth embodiment. FIG. 28 is a front view showing the viewing-angle origin of each camera in the imaging system according to the tenth embodiment.
Next, the technical effects obtained by the imaging system 10 according to the tenth embodiment will be described.
The embodiments described above may be further described as in the following supplementary notes, but are not limited to the following.
Claims (13)
- An imaging system comprising: a first camera having a first focal length; a second camera having a second focal length; a first mirror arranged so as to correspond to both the first camera and the second camera; and first adjusting means for adjusting the optical positional relationship between the first camera or the second camera and the first mirror depending on which of the first camera and the second camera is used to image a target.
- The imaging system according to claim 1, wherein the first camera and the second camera perform imaging via a first line-of-sight angle origin common to both.
- The imaging system according to claim 1 or 2, wherein the first camera and the second camera are arranged to face each other across the first mirror, and the first adjusting means adjusts the optical positional relationship between the first camera or the second camera and the first mirror by rotating the first mirror.
- The imaging system according to claim 1 or 2, wherein the first adjusting means adjusts the optical positional relationship between the first camera or the second camera and the first mirror by moving the first camera and the second camera.
- The imaging system according to any one of claims 1 to 4, further comprising: position acquisition means for acquiring the position of the target; authentication means for performing authentication processing using images of the target captured by the first camera and the second camera; first control means for controlling, when the position of the target corresponds to the first focal length, the first camera to capture a first image and the authentication processing to be performed; and second control means for controlling, when the authentication processing with the first image fails, the second camera to capture a second image and the authentication processing to be performed after waiting for the position of the target to reach a position corresponding to the second focal length.
- The imaging system according to any one of claims 1 to 5, wherein the first adjusting means adjusts the optical positional relationship between the first camera or the second camera and the first mirror according to a plurality of phases set in advance depending on the position or situation of the target.
- The imaging system according to any one of claims 2 to 6, further comprising guidance information output means for outputting, when the target is imaged by the first camera and the second camera, information that guides the target's line of sight to the first line-of-sight angle origin.
- The imaging system according to claim 7, wherein the guidance information output means displays an image of eyes around the first line-of-sight angle origin, and controls the display so that the eyes are open when the target is located at the first focal length or the second focal length and closed when the target is located at neither.
- The imaging system according to any one of claims 1 to 8, further comprising: a third camera that captures an image for specifying the eye position of the target when the first camera images the target; a fourth camera that captures an image for specifying the eye position of the target when the second camera images the target; a second mirror arranged so as to correspond to both the third camera and the fourth camera; and second adjusting means for adjusting the optical positional relationship between the third camera or the fourth camera and the second mirror depending on which of the third camera and the fourth camera is used to image the target.
- The imaging system according to claim 9, wherein the third camera and the fourth camera perform imaging via a second line-of-sight angle origin common to both.
- An imaging device comprising: a first camera having a first focal length; a second camera having a second focal length; a first mirror arranged so as to correspond to both the first camera and the second camera; and first adjusting means for adjusting the optical positional relationship between the first camera or the second camera and the first mirror depending on which of the first camera and the second camera is used to image a target.
- An imaging method for controlling, with at least one computer, an imaging system including a first camera having a first focal length, a second camera having a second focal length, and a first mirror arranged so as to correspond to both the first camera and the second camera, the method comprising adjusting the optical positional relationship between the first camera or the second camera and the first mirror depending on which of the first camera and the second camera is used to image a target.
- A recording medium on which is recorded a computer program that causes at least one computer to execute an imaging method for controlling an imaging system including a first camera having a first focal length, a second camera having a second focal length, and a first mirror arranged so as to correspond to both the first camera and the second camera, the method adjusting the optical positional relationship between the first camera or the second camera and the first mirror depending on which of the first camera and the second camera is used to image a target.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2022/000532 WO2023135640A1 (ja) | 2022-01-11 | 2022-01-11 | 撮像システム、撮像装置、撮像方法、及び記録媒体 |
JP2023573514A JPWO2023135640A1 (ja) | 2022-01-11 | 2022-01-11 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2022/000532 WO2023135640A1 (ja) | 2022-01-11 | 2022-01-11 | 撮像システム、撮像装置、撮像方法、及び記録媒体 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023135640A1 true WO2023135640A1 (ja) | 2023-07-20 |
Family
ID=87278572
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/000532 WO2023135640A1 (ja) | 2022-01-11 | 2022-01-11 | 撮像システム、撮像装置、撮像方法、及び記録媒体 |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2023135640A1 (ja) |
WO (1) | WO2023135640A1 (ja) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0359632A (ja) * | 1989-07-28 | 1991-03-14 | Canon Inc | カメラシステム |
JPH09186917A (ja) * | 1995-12-29 | 1997-07-15 | Kokusai Electric Co Ltd | 撮像装置 |
JP2009104599A (ja) | 2007-10-04 | 2009-05-14 | Toshiba Corp | 顔認証装置、顔認証方法、及び顔認証システム |
US20170272650A1 (en) * | 2016-03-21 | 2017-09-21 | Chiun Mai Communication Systems, Inc. | Multiple lens system and portable electronic device employing the same |
WO2020255244A1 (ja) | 2019-06-18 | 2020-12-24 | 日本電気株式会社 | 撮像システム、撮像方法、制御装置、コンピュータプログラム及び記録媒体 |
WO2021090366A1 (ja) | 2019-11-05 | 2021-05-14 | 日本電気株式会社 | 撮像装置 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2023135640A1 (ja) | 2023-07-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20200209961A1 (en) | Visibility improvement method based on eye tracking, machine-readable storage medium and electronic device | |
KR102378472B1 (ko) | 미러를 회전 시킬수 있는 구동 장치를 포함하는 카메라를 이용하여 이미지를 획득하는 방법 및 전자 장치 | |
US9852339B2 (en) | Method for recognizing iris and electronic device thereof | |
US20170374280A1 (en) | Methods and systems to obtain desired self-pictures with an image capture device | |
WO2019128101A1 (zh) | 一种投影区域自适应的动向投影方法、装置及电子设备 | |
US20200043488A1 (en) | Voice recognition image feedback providing system and method | |
JP2013076924A (ja) | 表示装置、表示制御方法及びプログラム | |
WO2018038158A1 (ja) | 虹彩撮像装置、虹彩撮像方法および記録媒体 | |
US20190237078A1 (en) | Voice recognition image feedback providing system and method | |
KR20200040716A (ko) | 시선 추적을 이용한 시인성 개선 방법, 저장 매체 및 전자 장치 | |
WO2021184341A1 (en) | Autofocus method and camera system thereof | |
US20160283781A1 (en) | Display device, display method, and display program | |
WO2023135640A1 (ja) | 撮像システム、撮像装置、撮像方法、及び記録媒体 | |
US11991450B2 (en) | Composition control device, composition control method, and program | |
US20230298386A1 (en) | Authentication apparatus, authentication method, and recording medium | |
JP2015091031A (ja) | 撮影装置、撮影制御方法及びプログラム | |
US20220244788A1 (en) | Head-mounted display | |
JP2018085579A (ja) | 撮像装置、制御方法、及び情報処理プログラム | |
JP6845121B2 (ja) | ロボットおよびロボット制御方法 | |
JP2017037375A (ja) | 撮像装置及びその制御方法 | |
US20220383658A1 (en) | Imaging apparatus for authentication and authentication system | |
CN111625089B (zh) | 智能眼镜控制方法、装置、存储介质及智能眼镜 | |
JP2012227830A (ja) | 情報処理装置、その処理方法、プログラム及び撮像装置 | |
KR20200111144A (ko) | 시선 추적을 이용한 시인성 개선 방법, 저장 매체 및 전자 장치 | |
WO2023181130A1 (ja) | 情報処理装置、情報処理方法、及び記録媒体 |
Legal Events
Date | Code | Title | Description
---|---|---|---
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22920166; Country of ref document: EP; Kind code of ref document: A1
 | ENP | Entry into the national phase | Ref document number: 2023573514; Country of ref document: JP; Kind code of ref document: A
 | WWE | Wipo information: entry into national phase | Ref document number: 2022920166; Country of ref document: EP
 | NENP | Non-entry into the national phase | Ref country code: DE
 | ENP | Entry into the national phase | Ref document number: 2022920166; Country of ref document: EP; Effective date: 20240812
Ref document number: 2022920166 Country of ref document: EP Effective date: 20240812 |