WO2024150283A1 - Information processing system, information processing device, information processing method, and recording medium


Info

Publication number
WO2024150283A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
image
control amount
focus
target
Application number
PCT/JP2023/000343
Other languages
English (en)
Japanese (ja)
Inventor
Ryoma Oami (亮磨 大網)
Original Assignee
NEC Corporation (日本電気株式会社)
Application filed by NEC Corporation (日本電気株式会社)
Priority to PCT/JP2023/000343
Publication of WO2024150283A1

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 - Control of cameras or camera modules
    • H04N 23/67 - Focus control based on electronic image sensor signals

Definitions

  • This disclosure relates to an information processing system, an information processing device, an information processing method, and a recording medium.
  • Patent Document 1 discloses a technique for quickly adjusting the focus when photographing a subject that is the target of biometric authentication.
  • The biometric authentication device described in Patent Document 1 includes a focus determination unit, a feature point extraction unit, a distance estimation unit, a correction unit, and a focus control unit.
  • The focus determination unit analyzes image information and performs focus determination, that is, determines whether the image is in focus. In the focus determination, a focus index is calculated that indicates whether a specific area of the image is in focus.
  • The feature point extraction unit performs face or head detection on the input image and obtains facial landmarks, which are the positions of characteristic parts of the face or head.
  • The distance estimation unit estimates the distance to the subject from the position information of the obtained landmarks.
  • The correction unit generates information about a correction distance for correcting the distance to the subject, based on the estimated distance information and the focus information.
  • The focus control unit generates control information for controlling the focus of the imaging device based on the correction distance information.
  • The operation of the correction unit can be broadly divided into a focus distance search mode and a focus tracking mode.
  • The focus distance search mode is a mode in which the focus difference distance is found.
  • The focus tracking mode is a mode in which the distance is corrected using the found focus difference distance so as to maintain the in-focus state.
  • Patent Document 1 states that the difference between the corrected distance and the estimated distance at the time of focusing in the focus distance search mode is a distance deviation arising from individual differences and the like. It also states that, because the subject can be brought into focus while taking these individual differences into account, focusing can be performed quickly from the second shot onward.
  • This disclosure aims to improve upon the techniques described in the prior art documents mentioned above.
  • An information processing system is provided comprising: a part detection means for obtaining part information relating to a predetermined part of a target based on a part image including the predetermined part of the target; and a focus control means for determining a reference control amount to be used for focus control in photographing the target, using the part information of the target and history information including information used for focus control in past photographing.
  • An information processing device is provided comprising: a part detection means for obtaining part information relating to a predetermined part of a target based on a part image including the predetermined part of the target; and a focus control means for determining a reference control amount to be used for focus control in photographing the target, using the part information of the target and history information including information used for focus control in past photographing.
  • An information processing method is provided in which one or more computers: determine part information regarding a predetermined part of a target based on a part image including the predetermined part of the target; and determine a reference control amount to be used for focus control in photographing the target, using the part information of the target and history information including information used for focus control in past photographing.
  • A recording medium is provided having recorded thereon a program for causing a computer to execute: determining part information regarding a predetermined part of a target based on a part image including the predetermined part of the target; and determining a reference control amount to be used for focus control in photographing the target, using the part information of the target and history information including information used for focus control in past photographing.
  • FIG. 1 is a diagram showing an overview of an information processing system according to the first embodiment.
  • FIG. 2 is a diagram showing an overview of an information processing device according to the first embodiment.
  • FIG. 3 is a flowchart showing an overview of an information processing method according to the first embodiment.
  • FIG. 4 is a diagram illustrating an example of the configuration of an information processing system according to the first embodiment.
  • FIG. 5 is a diagram showing an example of an iris image according to the first embodiment.
  • FIG. 6 is a diagram showing an example of a binocular image, which is a part image, according to the first embodiment.
  • FIG. 7 is a diagram illustrating an example of history information according to the first embodiment.
  • FIG. 8 is a diagram illustrating an example of the physical configuration of an information processing device according to the first embodiment.
  • FIG. 9 is a flowchart illustrating an example of information processing according to the first embodiment.
  • FIG. 10 is a flowchart illustrating an example of a detection process according to the first embodiment.
  • FIG. 11 is a diagram illustrating an example of the configuration of an information processing system according to the second embodiment.
  • FIG. 12 is a diagram illustrating an example of history information according to the second embodiment.
  • FIG. 13 is a diagram illustrating an example of the functional configuration of a focus control unit according to the second embodiment.
  • FIG. 14 is a flowchart illustrating an example of information processing according to the second embodiment.
  • FIG. 15 is a flowchart illustrating an example of a reference control amount acquisition process according to the second embodiment.
  • FIG. 16 is a flowchart illustrating an example of information processing according to the third embodiment.
  • FIG. 17 is a flowchart illustrating an example of a reference control amount acquisition process according to the third embodiment.
  • FIG. 18 is a diagram illustrating an example of the configuration of an information processing system according to the fourth embodiment.
  • FIG. 19 is a diagram illustrating an example of the functional configuration of a focus control unit according to the fourth embodiment.
  • FIG. 20 is a flowchart illustrating an example of information processing according to the fourth embodiment.
  • FIG. 21 is a flowchart illustrating an example of a reference control amount acquisition process according to the fourth embodiment.
  • FIG. 22 is a diagram illustrating an example of the configuration of an information processing system according to the fifth embodiment.
  • FIG. 23 is a flowchart illustrating an example of a reference history change process according to the fifth embodiment.
  • FIG. 24 is a diagram illustrating an example of the configuration of an information processing system according to the sixth embodiment.
  • FIG. 25 is a flowchart illustrating an example of information processing according to the sixth embodiment.
  • FIG. 26 is a flowchart illustrating an example of a detection process according to the sixth embodiment.
  • FIG. 1 is a diagram showing an overview of an information processing system 100 according to the first embodiment.
  • The information processing system 100 includes a part detection unit 111 and a focus control unit 112.
  • The part detection unit 111 obtains part information about a predetermined part of a target based on a part image including the predetermined part of the target.
  • The focus control unit 112 determines a reference control amount to be used for focus control in photographing the target, using the part information of the target and history information including information used for focus control in past photographing.
  • This information processing system 100 makes it possible to focus quickly.
  • FIG. 2 is a diagram showing an overview of the information processing device 103 according to the first embodiment.
  • The information processing device 103 includes a part detection unit 111 and a focus control unit 112.
  • The part detection unit 111 obtains part information about a predetermined part of a target based on a part image including the predetermined part of the target.
  • The focus control unit 112 determines a reference control amount to be used for focus control in photographing the target, using the part information of the target and history information including information used for focus control in past photographing.
  • This information processing device 103 makes it possible to focus quickly.
  • FIG. 3 is a flowchart showing an overview of the information processing method according to the first embodiment.
  • The part detection unit 111 obtains part information about the predetermined part of the target based on a part image including the predetermined part of the target (step S101a).
  • The focus control unit 112 determines a reference control amount to be used for focus control in photographing the target, using the part information of the target and history information including information used for focus control in past photographing (step S102).
  • This information processing method makes it possible to focus quickly.
  • Although Patent Document 1 discloses a technique for quickly adjusting the focus on the second and subsequent shots, it does not disclose a technique for quickly adjusting the focus on the first shot. A technique for faster focusing is therefore desired.
  • One example of the objective of this disclosure is to provide an information processing system, an information processing device, an information processing method, a program, and the like that solve the problem of achieving high-speed focusing.
  • FIG. 4 is a diagram showing an example of the configuration of an information processing system 100 according to the first embodiment.
  • The information processing system 100 is a system for photographing a target.
  • The information processing system 100 according to this embodiment also performs authentication using a target image obtained by photographing the target.
  • The target, the target image, and the authentication in this embodiment are a person, a facial image, and iris authentication, respectively. Accordingly, the information processing system 100 in this embodiment performs authentication using an iris image, which is the portion of the facial image that includes the iris.
  • The facial image is an image that includes at least a portion of the face, and may further include a portion of the head or the entire head.
  • FIG. 5 is a diagram showing an example of an iris image according to the first embodiment.
  • The iris image needs to include at least the iris, and may also include some or all of the pupil, the white of the eye, the eyelid, the outer corner of the eye, the inner corner of the eye, the eyelashes, and the area around the eye.
  • Although the figure shows an example of a right iris image, the iris image may be a left iris image, or an image including both the left and right irises.
  • The target is not limited to a person, and may be an animal or another object.
  • The target image is not limited to a face image, and may be any image that includes at least a portion of the target (e.g., a portion appropriate for the purpose, such as authentication).
  • The authentication performed by the information processing system 100 is one example of a use of the target image, and the target image may be used for purposes other than authentication.
  • The authentication is not limited to iris authentication, and may be any authentication performed using an image.
  • The authentication may be, for example, biometric authentication other than iris authentication, or may be other than biometric authentication.
  • Examples of biometric authentication other than iris authentication include face authentication, fingerprint authentication, eye authentication, and biometric authentication using images of palm prints, veins, and the like.
  • Face authentication is biometric authentication that uses a face image.
  • Eye authentication is biometric authentication that uses an eye image that includes the entirety of the left eye, the right eye, or both eyes.
  • Fingerprint authentication is biometric authentication that uses a fingerprint image that includes a fingerprint.
  • The information processing system 100 includes an imaging device 101, a target detection sensor 102, and an information processing device 103.
  • The imaging device 101 photographs a target and generates a target image that includes at least a portion of the target.
  • In this embodiment, the imaging device 101 photographs a person and generates a facial image that includes the person's face and head.
  • The imaging device 101 is, for example, a camera such as a near-infrared camera.
  • The imaging device 101 according to this embodiment includes a liquid lens in its lens system.
  • The liquid lens can adjust its focus according to the applied voltage.
  • The imaging device 101 only needs to be able to adjust the focus of its lens system in accordance with the output control amount from the information processing device 103, and the lens system is not limited to a liquid lens.
  • For example, the lens system may be composed of one or more solid lenses made of a material such as resin or glass.
  • In this case, the imaging device 101 may be provided with a drive mechanism that moves the solid lens along the optical axis in order to adjust the focus of the lens system.
  • The drive mechanism may include a motor whose rotation angle can be controlled.
  • The imaging device 101 may also be incorporated into the information processing device 103.
  • The information processing system 100 may include an imaging device (not shown) separate from the imaging device 101.
  • For example, the separate imaging device may capture a face image, and the imaging device 101 may photograph an eye position estimated based on this face image. In this way, the imaging device 101 can capture an eye image as the target image.
  • The separate imaging device may be, for example, a visible-light camera.
  • The target detection sensor 102 is a sensor for detecting the presence of a target in a trigger area.
  • The trigger area may be a predetermined linear, planar, or three-dimensional area, and the target detection sensor 102 is, for example, an infrared sensor, an ultrasonic sensor, or an area sensor.
  • When the target detection sensor 102 detects the presence of a target in the trigger area, it outputs a trigger signal to the imaging device 101.
  • The imaging device 101 may photograph a predetermined imaging area upon receiving this trigger signal. By appropriately setting the trigger area and the imaging area, the imaging device 101 can photograph the target detected by the target detection sensor 102 and generate a target image.
  • For example, when photographing a target that has stopped in the trigger area, the trigger area may be set to be part of the imaging area. When photographing a target that passes through the trigger area and moves in a predetermined direction, the trigger area may be set at a position a predetermined distance away from the imaging area in the direction opposite to that predetermined direction.
  • The method of detecting the presence of a target in the trigger area is not limited to this.
  • For example, the imaging device 101 may continuously photograph the imaging area, and the information processing device 103 may detect the presence of a target in the imaging area based on the image generated by each shot.
  • In this case, the information processing system 100 need not be equipped with the target detection sensor 102.
  • The target detection sensor 102 may also be a sensor that reads target identification information (hereinafter also referred to as a "target ID") of the target from a card or the like that holds the target ID.
  • In this case, the target detection sensor 102 is positioned so that a target whose card is being read is located in the trigger area, which makes it possible to detect that the target is in the trigger area. Accordingly, when the target detection sensor 102 reads the target ID from the card, it may output a trigger signal to the imaging device 101.
  • The information processing device 103 is a device for causing the imaging device 101 to photograph a target.
  • The information processing device 103 according to this embodiment also performs authentication using a target image generated by the imaging device 101. Note that the authentication may be performed by a device other than the information processing system 100.
  • The information processing device 103 functionally comprises a part detection unit 111, a focus control unit 112, a focus determination unit 113, a history storage unit 114, a storage control unit 115, and an authentication unit 116.
  • The part detection unit 111 acquires a target image from the imaging device 101.
  • The part detection unit 111 detects a part image including a predetermined part of the target from the target image.
  • The part detection unit 111 then obtains part information based on the part image.
  • The predetermined part in this embodiment is, for example, both eyes. Therefore, the part image is a binocular image.
  • The binocular image needs to include at least both eyes, and may include, in addition to the pupils, irises, and whites of the eyes, some or all of the eyelids, the outer corners of the eyes, the inner corners of the eyes, the eyelashes, and the areas around the eyes.
  • FIG. 6 is a diagram showing an example of a binocular image, which is a part image, according to the first embodiment.
  • The predetermined part is not limited to both eyes, and may be any part included in the target. If the target image is intended for authentication, the predetermined part may include a part used for that authentication.
  • In this case, the part image may include at least a portion of an image including the part used for authentication (in this embodiment, an iris image), and the target image may include at least a portion of the part image.
  • The part information is information about the predetermined part of the target, and includes a dimension of the predetermined part.
  • The dimension here represents a length in the image, not a length in real space.
  • The part information according to this embodiment includes, as a dimension related to the predetermined part, the distance between characteristic points included in the predetermined part.
  • The part information according to this embodiment further includes the positions of the characteristic points.
  • The characteristic points may be chosen as appropriate, and different characteristic points may be used for the distance and for the positions.
  • The distance between characteristic points in this embodiment is, for example, the interocular distance, which is the distance between characteristic points included in the two eyes.
  • For example, the interocular distance is the distance between the centers of the pupils; it may instead be the distance between the inner corners of the eyes, or the distance between the outer corners of the eyes.
  • The distance may be expressed as a distance in the target image, for example, based on the distance between pixels in the target image.
  • The position of a characteristic point in this embodiment is, for example, an eye position, that is, the position of a characteristic point included in an eye.
  • For example, the eye position is the position of the pupil center; it may instead be the position of the iris, or both the position of the pupil center and that of the iris.
  • The position may be expressed as a position in the target image, for example, the position of a pixel in the target image.
  • The dimension of the predetermined part is not limited to the above examples, and may be, for example, the outer diameter of the iris. Furthermore, if the predetermined part is the face, the dimension of the predetermined part may be the distance between the eyebrows and the tip of the nose, the width of the mouth, or the like.
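  • To make the dimension concrete: the interocular distance can be computed directly from the pixel coordinates of two characteristic points. The sketch below (Python) uses the pupil centers; the landmark format and the sample coordinates are assumptions for illustration, not taken from the disclosure.

```python
import math

def interocular_distance(left_pupil, right_pupil):
    # Image-space distance in pixels between two characteristic points;
    # left_pupil / right_pupil are assumed (x, y) pixel coordinates.
    dx = right_pupil[0] - left_pupil[0]
    dy = right_pupil[1] - left_pupil[1]
    return math.hypot(dx, dy)

# Example: pupil centers detected at (412, 310) and (604, 318)
# give an interocular distance of about 192.2 px.
de = interocular_distance((412, 310), (604, 318))
```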
  • The focus control unit 112 determines a reference control amount used for focus control in photographing a target.
  • The focus control is control for focusing the imaging device 101 on the target.
  • The reference control amount is a control amount that serves as a reference for determining the output control amount, which is the control amount to be output to the imaging device 101.
  • The focus control unit 112 determines the reference control amount to be used for focus control in photographing the target using, for example, the part information of the target obtained by the part detection unit 111 and history information 114a stored in the history storage unit 114, which will be described in detail later. The imaging device 101 may then photograph the target by adjusting the focus of its lens system according to control using the reference control amount.
  • For example, the focus control unit 112 may determine an output control amount based on the determined reference control amount, and output the determined output control amount to the imaging device 101.
  • The output control amount may be the reference control amount itself, or may be determined by a predetermined method using the reference control amount.
  • The imaging device 101 may adjust the focus of its lens system according to the output control amount from the focus control unit 112 to photograph the target.
  • In this embodiment, each of the reference control amount and the output control amount is expressed as an applied voltage value, which is the magnitude of the voltage applied to the liquid lens.
  • However, each of the reference control amount and the output control amount is not limited to an applied voltage value, and may be expressed as an estimated distance to the target, a motor rotation angle, or the like. If a motor rotation angle is used, the rotation angle may be expressed as a change from a reference position.
  • The focus determination unit 113 determines whether the target image is a focused image.
  • A focused image is an image that is in focus to a certain degree or more.
  • For example, the focus determination unit 113 may obtain a focus score indicating the degree of focus, and determine whether the target image is a focused image based on whether the focus score satisfies a predetermined focus condition.
  • The focus score in this case is a value indicating the degree to which a predetermined area in the target image is in focus.
  • The focus score may be expressed, for example, by the magnitude of the high-frequency components contained in the target image, the sharpness of the edge components in the target image, or the like.
  • When the focus score increases as the target comes into focus, the focus condition is, for example, that the focus score is equal to or greater than a threshold value.
  • The focus condition may be determined according to the use of the target image. Since the use of the target image in this embodiment is iris authentication, the focus condition need only be a condition under which the image is clear enough to perform iris authentication.
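  • As one possible realization of such a score, the variance of the Laplacian is a widely used sharpness measure based on high-frequency content. The disclosure does not fix a particular score, so the sketch below is only an illustration, and the threshold value is an arbitrary placeholder.

```python
import cv2

def focus_score(gray_roi):
    # Variance of the Laplacian: a common sharpness measure whose value
    # grows with the high-frequency content of the region of interest.
    return cv2.Laplacian(gray_roi, cv2.CV_64F).var()

def is_focused_image(gray_roi, threshold=120.0):
    # Focus condition of the form "score >= threshold"; the threshold
    # value here is an assumed placeholder, not from the disclosure.
    return focus_score(gray_roi) >= threshold
```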
  • The history storage unit 114 is a storage unit for storing history information 114a.
  • The history information 114a includes information used for focus control in past shooting.
  • The storage control unit 115 generates the history information 114a and stores it in the history storage unit 114.
  • FIG. 7 is a diagram showing an example of the history information 114a according to the first embodiment.
  • The history information 114a is information that associates the shooting time, the focusing voltage value, the interocular distance (which is part information), and the target ID.
  • The focusing voltage value and the interocular distance are examples of information used for focus control in past shooting.
  • The storage control unit 115 generates the history information 114a shown in FIG. 7 by associating the shooting time, the focusing voltage value, the interocular distance, and the target ID.
  • The focusing voltage value is the magnitude of the voltage applied to the liquid lens to capture a focused target image.
  • A focused target image is a target image that the focus determination unit 113 has determined to be a focused image.
  • The interocular distance included in the history information 114a is the part information used to determine the associated focusing voltage value.
  • The shooting time included in the history information 114a is information that indicates the time when the focused target image was captured.
  • The shooting time includes at least a portion of the date and the time of day.
  • The time of day is at least a portion of the hour, minute, and second; the seconds are not limited to units of one second, and may be in any appropriate unit such as 1/10 second or 1/100 second.
  • The target ID included in the history information 114a is the target ID of the target included in the focused target image.
  • The target ID may be obtained from the authentication unit 116, for example, when authentication by the authentication unit 116, which will be described later, is successful. When the target detection sensor 102 reads the target ID, the target ID may instead be obtained from the target detection sensor 102.
  • The history information 114a illustrated in FIG. 7 includes, for example, a history in which the focusing voltage value, interocular distance, and target ID of a target photographed at shooting time "T1" are "V1", "DE1", and "P1", respectively.
  • The history information 114a may be stored in the history storage unit 114 in chronological order according to the shooting times T1 to Tn, as illustrated in FIG. 7.
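  • For illustration, a record of the history information 114a in FIG. 7 could be represented as follows; the field types (for example, the time format) are assumptions, since the patent only requires that at least part of the date and time be recorded.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class HistoryRecord:
    shooting_time: str           # e.g. "2023-01-10 09:15:03.21" (assumed format)
    focusing_voltage: float      # voltage (V) that yielded a focused image
    interocular_distance: float  # image-space distance (px)
    target_id: str               # ID of the photographed target

# Kept in chronological order, matching T1..Tn in FIG. 7.
history_114a: List[HistoryRecord] = []
```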
  • The authentication unit 116 performs authentication using the target image.
  • For example, the authentication unit 116 authenticates the target using a part image included in the target image.
  • The authentication unit 116 according to this embodiment performs iris authentication of a person using an iris image included in a face image.
  • Note that the part image may be an image captured separately from the target image.
  • For example, the authentication unit 116 acquires from the part detection unit 111 a part image detected based on a focused target image that the focus determination unit 113 has determined to be a focused image, and the part information obtained based on that target image.
  • The authentication unit 116 acquires an iris image based on the acquired part image and part information.
  • The authentication unit 116 extracts features from the acquired iris image, and compares the extracted features with the features of iris images registered in advance.
  • The authentication unit 116 performs authentication based on the result of this comparison, and outputs the authentication result.
  • For example, if the similarity between the extracted features and the registered features is equal to or greater than a threshold, the authentication unit 116 determines that the target is already registered (authentication successful); if the similarity is less than the threshold, it determines that the target is not registered (authentication failed).
  • When there are multiple registrants, the authentication unit 116 may find the registrant with the highest similarity and, if that similarity is equal to or greater than the threshold, determine that the target is the found registrant (authentication successful); otherwise, it may determine that the target is none of the registrants (authentication failed).
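  • A minimal sketch of this 1:N decision logic is shown below. The cosine-similarity comparator is a placeholder (a real iris matcher would more likely use a normalized Hamming distance over iris codes), and the threshold is an assumed value.

```python
import math

def similarity(a, b):
    # Placeholder comparator: cosine similarity between feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def authenticate(probe_features, gallery, threshold=0.9):
    # 1:N matching as described above: pick the registrant with the
    # highest similarity and succeed only if it clears the threshold.
    best_id = max(gallery, key=lambda rid: similarity(probe_features, gallery[rid]))
    if similarity(probe_features, gallery[best_id]) >= threshold:
        return best_id  # authentication successful
    return None         # target matches none of the registrants
```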
  • FIG. 8 is a diagram showing an example of the physical configuration of the information processing device 103 according to the first embodiment.
  • The information processing device 103 physically includes, for example, a bus 1010, a processor 1020, a memory 1030, a storage device 1040, a communication interface 1050, an input interface 1060, and an output interface 1070.
  • The processor 1020 is realized by a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), or the like.
  • The memory 1030 is a main storage device realized by a RAM (Random Access Memory) or the like.
  • The storage device 1040 is an auxiliary storage device realized by a hard disk drive (HDD), a solid-state drive (SSD), a memory card, a read-only memory (ROM), or the like.
  • The storage device 1040 stores program modules for realizing the functions of the information processing device 103.
  • The processor 1020 loads each of these program modules into the memory 1030 and executes it, thereby realizing the function corresponding to that program module.
  • The communication interface 1050 is an interface for connecting the information processing device 103 to a communication line.
  • The input interface 1060 is an interface through which the user inputs information.
  • The input interface 1060 is composed of, for example, a touch panel, a keyboard, a mouse, and the like.
  • The output interface 1070 is an interface for presenting information to the user.
  • The output interface 1070 is composed of, for example, a liquid crystal panel, an organic EL (Electro-Luminescence) panel, and the like.
  • The information processing device 103 is preferably housed in a single housing and configured as an integrated unit. However, the physical configuration of the information processing device 103 is not limited to this.
  • The information processing device 103 may be configured from multiple devices that are physically or logically divided. In this case, the functions of each device may be determined as appropriate.
  • The information processing system 100 executes information processing for causing the imaging device 101 to photograph a target.
  • The information processing according to this embodiment further includes authentication processing for authenticating the target.
  • FIG. 9 is a flowchart showing an example of the information processing according to the first embodiment.
  • When the imaging device 101 receives a trigger signal from the target detection sensor 102, it photographs the target and generates a target image.
  • When the information processing device 103 receives the target image generated in response to the trigger signal, it starts the photographing process. Note that the method for starting the photographing process is not limited to this.
  • The part detection unit 111 performs a detection process (step S101) based on, for example, the target image generated in response to the trigger signal.
  • FIG. 10 is a flowchart showing an example of the detection process (step S101) according to the first embodiment.
  • The part detection unit 111 detects a part image based on the target image (step S101a).
  • For example, the part detection unit 111 may detect the part image by using a detection model with the target image as input.
  • The detection model is a machine learning model that has been trained to detect a part image from a target image.
  • For this training, for example, supervised learning may be performed using training data in which a correct-answer label indicating the position of the part image is attached to a target image. Such training may be performed by the information processing device 103 or by another device.
  • The part detection unit 111 generates part information based on the part image detected in step S101a (step S101b), and returns to the photographing process.
  • In this embodiment, the target image and the part image are a face image and a binocular image, respectively, so the part detection unit 111 detects the binocular image from the face image. The part detection unit 111 then determines, for example, the interocular distance and the eye positions in the detected binocular image, and generates part information including these.
  • The focus control unit 112 uses the part information generated in step S101b and the history information 114a stored in the history storage unit 114 to determine the reference control amount to be used for focus control in photographing the target (step S102).
  • For example, the focus control unit 112 determines the reference control amount from the interocular distance included in the part information. In determining this reference control amount, the focus control unit 112 performs a correction using the history information 114a.
  • Hereinafter, the pre-correction value found from the part information is called the intermediate control amount, and is distinguished from the reference control amount. That is, the intermediate control amount is first found from the part information, and the intermediate control amount is then corrected using the history information 114a to find the reference control amount.
  • The focus control unit 112 determines an output control amount based on the reference control amount determined in step S102 (step S103), and outputs it to the imaging device 101.
  • If there are multiple output control amounts, the focus control unit 112 may output them one at a time, outputting the next one based on the result of the focus determination in step S106, or may output them sequentially in response to each shot without waiting for the focus determination in step S106.
  • For example, the focus control unit 112 may determine the output control amount based on predetermined output pattern information.
  • The output pattern information is information that defines the method for determining the output control amount from the reference control amount.
  • The output pattern information specifies, for example, that multiple output control amounts are determined within a search range defined based on the reference control amount.
  • For example, the output pattern information specifies that M output control amounts are determined in increments of Δ, with the reference control amount R as the median.
  • Here, M is an integer equal to or greater than 1.
  • In other words, it specifies that multiple output control amounts are determined within a search range of size Δ(M-1) centered on the reference control amount R. Note that when M is 1, the output control amount is the same as the reference control amount R.
  • In this case, the focus control unit 112 may determine M output control amounts, each differing by Δ, within the search range from R-Δ((M-1)/2) to R+Δ((M-1)/2). For example, the focus control unit 112 first outputs one output control amount within the search range.
  • The output control amount determined in the first execution of step S103 may be, for example, the minimum value R-Δ((M-1)/2) or the maximum value R+Δ((M-1)/2) of the search range. As described later, in subsequent executions of step S103, output control amounts that have not yet been selected from those defined by the output pattern information are output in turn.
  • The output pattern information is not limited to this, and the method of determining the output control amount based on the reference control amount may be changed as appropriate. For example, if a focused image cannot be obtained with the output control amounts determined within the search range defined by the output pattern information, the output pattern information may be redefined so that the next search range covers the sections before and after the original search range, as described below. Output control amounts may then be further output from within the redefined search range.
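  • The candidate generation described by this output pattern information can be sketched as follows; the concrete values of R, Δ, and M are assumptions for illustration, and an odd M is assumed so that R is exactly the median.

```python
def output_control_amounts(r, delta, m):
    # M candidate voltages in increments of delta with the reference
    # control amount R as the median, i.e. the search range
    # [R - delta*(M-1)/2, R + delta*(M-1)/2]. With m == 1 the single
    # candidate equals the reference control amount itself.
    start = r - delta * (m - 1) / 2
    return [start + delta * i for i in range(m)]

# Example with assumed values: R = 42.0 V, delta = 0.5 V, M = 5
# -> [41.0, 41.5, 42.0, 42.5, 43.0]
candidates = output_control_amounts(42.0, 0.5, 5)
```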
  • When the imaging device 101 acquires the output control amount determined in step S103, it photographs the target according to the acquired output control amount (step S104).
  • The imaging device 101 generates a target image by photographing the target using the output control amount.
  • The imaging device 101 may also generate image information including this target image.
  • The image information may include, in addition to the target image, at least one of the following: the image ID of the target image, the time the target image was captured, and the applied voltage value used to capture the target image.
  • The image ID is information for identifying the target image.
  • When the part detection unit 111 acquires the target image generated in step S104 from the imaging device 101, it executes a detection process (step S105) based on the acquired target image.
  • The details of the detection process (step S105) may be similar to those of the detection process (step S101), so a detailed description is omitted here.
  • The focus determination unit 113 determines whether the target image generated in step S104 is a focused image (step S106).
  • For example, the focus determination unit 113 may acquire the part image detected in step S105 from the part detection unit 111 and calculate a focus score for the acquired part image. The focus determination unit 113 may then determine whether the target image is a focused image based on whether the calculated focus score satisfies the focus condition.
  • If the target image is determined not to be a focused image (step S106; No), the focus control unit 112 determines an output control amount different from the one determined immediately before, based on the output pattern information described above (step S103).
  • For example, the focus control unit 112 may change the output control amount sequentially from a small value to a large value, in which case it may determine a value larger than the current value by Δ as the next output control amount. Alternatively, the focus control unit 112 may change the output control amount sequentially from a large value to a small value, in which case it may determine a value smaller than the current value by Δ as the next output control amount.
  • When the search range has been exhausted, the focus control unit 112 may examine how the focus score changed within the range indicated by the output pattern information and determine the next output pattern information. The focus control unit 112 may then determine the output control amount based on the next output pattern information in the same manner as described above.
  • For example, the focus control unit 112 may define, as new output pattern information, values that are greater than or equal to R-ΔN and less than or equal to R-Δ((M-1)/2+1), where N is an integer greater than or equal to (M-1)/2+1, and that differ by Δ, and determine one of these values as the output control amount.
  • Alternatively, the focus control unit 112 may define, as new output pattern information, values that are greater than or equal to R+Δ((M-1)/2+1) and less than or equal to R+ΔN, and that differ by Δ, and determine one of these values as the output control amount.
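  • A sketch of this search-range extension, assuming an odd M as above so that (M-1)/2 is an integer:

```python
def extended_search_range(r, delta, m, n, side):
    # Redefined output pattern once the original range is exhausted,
    # with N an integer >= (M-1)/2 + 1:
    #   side "below": from R - delta*N up to R - delta*((M-1)/2 + 1)
    #   side "above": from R + delta*((M-1)/2 + 1) up to R + delta*N
    half = (m - 1) // 2
    if side == "below":
        return [r - delta * k for k in range(n, half, -1)]
    return [r + delta * k for k in range(half + 1, n + 1)]

# Example with M = 5, N = 4: extends the search two steps below the
# original range -> [40.0, 40.5] for R = 42.0 V and delta = 0.5 V.
lower = extended_search_range(42.0, 0.5, 5, 4, "below")
```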
  • When the imaging device 101 acquires the output control amount determined in step S103, it photographs the target again in accordance with the acquired output control amount (step S104). Steps S105 to S106 are then executed again.
  • In step S103, the focus control unit 112 may output the output control amounts sequentially in response to shooting. For example, if there are multiple output control amounts, the focus control unit 112 may output them one after another in response to each shot. In this case, steps S103 to S105 may be repeated a number of times corresponding to the number of output control amounts.
  • In step S106, the focus determination unit 113 may receive the detection result of step S105 and perform the focus determination for each target image.
  • When at least one of the multiple target images captured according to the multiple output control amounts is a focused image, the focus determination unit 113 may determine that a focused image has been obtained (step S106; Yes). In this case, the focus determination unit 113 may output not only the determination result of whether the image is in focus, but also ID information that identifies the focused image. On the other hand, when none of the multiple target images captured according to the multiple output control amounts is a focused image, the focus determination unit 113 may determine that no focused image has been obtained (step S106; No).
  • Alternatively, the focus determination unit 113 may calculate focus scores for all of the multiple target images captured according to the multiple output control amounts, and then perform the focus determination. For example, if the focus scores plotted against the output control amounts have a peak and the peak satisfies a predetermined condition (for example, if it is equal to or greater than a predetermined threshold), the focus determination unit 113 may determine that the target image corresponding to the peak is a focused image (step S106; Yes). If there is no such peak, the focus determination unit 113 may determine that no focused image has been obtained (step S106; No). In that case, the focus control unit 112 may redefine the output pattern information in step S103 as described above, and the processes of steps S104 to S106 may be repeated.
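  • The peak-based variant of the focus determination could look like the following sketch; treating only interior local maxima as peaks is an assumption, since the disclosure does not define "peak" precisely.

```python
def pick_focused_image(scored, threshold):
    # scored: list of (image_id, focus_score) pairs ordered by output
    # control amount. An interior local maximum that also clears the
    # threshold is treated as the focused image; otherwise None is
    # returned and the search range would be redefined (step S103).
    for i in range(1, len(scored) - 1):
        _, prev_score = scored[i - 1]
        image_id, score = scored[i]
        _, next_score = scored[i + 1]
        if score >= prev_score and score >= next_score and score >= threshold:
            return image_id
    return None
```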
  • If the target image is determined to be a focused image (step S106; Yes), the storage control unit 115 generates history information 114a and stores it in the history storage unit 114 (step S108).
  • For example, the storage control unit 115 associates the output control amount that was determined in step S103 and used to capture the focused target image, the interocular distance determined in step S105, and the shooting time included in the image information. In this way, the storage control unit 115 generates the history information 114a.
  • The interocular distance included in the history information 114a is the interocular distance obtained in step S105 for the image determined to be a focused image. In other words, the interocular distance at the time the image was actually in focus is found and stored as history information.
  • The authentication unit 116 authenticates the target using the iris image included in the part image detected in step S105 (that is, in the process equivalent to step S101a executed within step S105) (step S109), and ends the information processing.
  • When the authentication is successful, the storage control unit 115 may, for example, obtain the target ID registered for the target from the authentication unit 116. The storage control unit 115 may then store the obtained target ID in association with the history information 114a stored in step S108.
  • When the control amount is determined from the part information alone, individual differences, environmental factors, and the like can make the discrepancy between the determined control amount and the focusing control amount large, which can result in a longer focusing time.
  • Here, the focusing control amount is the control amount for generating an image that is actually in focus.
  • The focusing voltage value described above is an example of the focusing control amount.
  • The focusing time is the length of time required for the target to come into focus when photographing the target, in other words, for a focused image to be generated.
  • A liquid lens can generally adjust its focus by changing its refractive index in response to the applied voltage, but a characteristic of liquid lenses is that the amount of change in refractive index relative to the applied voltage is easily affected by temperature. Therefore, when photographing a target at the same distance from the liquid lens, even if a focusing voltage value obtained at a certain temperature is applied to the liquid lens, the image may not be in focus once the temperature changes. Even when a solid lens is used, focusing may be affected by temperature and other factors owing to the characteristics of the lens system, drive mechanism, and the like.
  • In this embodiment, the correction is performed using the history information 114a, so the reference control amount, which is the corrected value, can be used from the first shot in step S104.
  • Although the history information 114a is not limited to that relating to a specific target, when the histories of multiple target IDs are used, the effect of differences in interocular distance between targets is canceled out, and the correction can reduce deviations caused mainly by the characteristics of the imaging device 101 and environmental factors such as temperature. As a result, a control amount close to the focusing control amount is more likely to be determined from the first shot, and the focusing time can therefore be shortened.
  • The history information 114a used for the correction may be history information 114a related to multiple different targets among the history information 114a stored in the history storage unit 114, or it may be history information 114a related to the same target as the one being photographed.
  • The latter can be achieved, for example, by extracting the history information 114a using a target ID read from a card. In this case, deviations caused by individual differences in the dimensions of the predetermined part, such as the interocular distance, can also be reduced. This makes it possible to further shorten the focusing time.
  • The information processing system 100 includes the part detection unit 111 and the focus control unit 112.
  • The part detection unit 111 obtains part information about a predetermined part of the target based on a part image including the predetermined part of the target.
  • The focus control unit 112 determines a reference control amount to be used for focus control in photographing the target, using the part information of the target and history information 114a including information used for focus control in past shooting.
  • The predetermined part is a part included in the target.
  • The part information includes a dimension related to the predetermined part.
  • In this way, the reference control amount can be determined using the dimension of a part included in the target. The predetermined part can therefore be chosen such that a target image focused on an area appropriate for the application of the target image is easily obtained, and the reference control amount can be determined accordingly. This makes it possible to focus quickly.
  • The information processing system 100 according to this embodiment includes the imaging device 101 and the authentication unit 116.
  • The imaging device 101 photographs a target to generate a target image including at least a portion of the target.
  • The authentication unit 116 authenticates the target using an image included in the target image.
  • The imaging device 101 includes a liquid lens.
  • The imaging device 101 photographs the target by adjusting the focus of the liquid lens according to control using the reference control amount.
  • In the second embodiment, a correction amount is determined based on a model defined from history information, and the control amount determined from the part information is corrected with the determined correction amount.
  • FIG. 11 is a diagram showing an example of the configuration of an information processing system 200 according to the second embodiment.
  • The information processing system 200 includes an information processing device 203 in place of the information processing device 103. Except for this, the information processing system 200 may be configured similarly to the information processing system 100 according to the first embodiment.
  • The information processing device 203 includes a focus control unit 212 and a history storage unit 214 in place of the focus control unit 112 and the history storage unit 114 of the first embodiment. Except for these, the information processing device 203 may be configured similarly to the information processing device 103 according to the first embodiment.
  • The history storage unit 214 stores history information 214a that differs from the history information 114a according to the first embodiment. Except for this, the history storage unit 214 may be similar to the history storage unit 114 according to the first embodiment.
  • FIG. 12 is a diagram showing an example of the history information 214a according to the second embodiment.
  • The history information 214a is similar to the history information 114a according to the first embodiment, except that it includes a "difference" instead of the focusing voltage value according to the first embodiment. That is, the history information 214a is information that associates the shooting time, the difference between the intermediate control amount (the control amount before correction; the corrected value becomes the reference control amount, as described in detail below) and the actual focusing voltage value, the interocular distance (which is part information), and the target ID.
  • The difference and the interocular distance are examples of information used for focus control in past shooting.
  • The storage control unit 115 generates the history information 214a shown in FIG. 12 by associating the shooting time, the difference between the intermediate control amount and the actual focusing voltage value, the interocular distance, and the target ID.
  • The history information 214a illustrated in FIG. 12 includes, for example, a history in which the difference, interocular distance, and target ID of a target photographed at shooting time "T1" are "DF1", "DE1", and "P1", respectively.
  • The history information 214a may be stored in the history storage unit 214 in chronological order according to the shooting times T1 to Tn, as illustrated in FIG. 12.
  • The history information 214a is not limited to this, and may further be associated with, for example, the focusing voltage value.
  • The focus control unit 212 is a detailed example of the focus control unit 112 according to the first embodiment. Therefore, the outline of the focus control unit 212 is similar to that of the focus control unit 112 according to the first embodiment.
  • FIG. 13 is a diagram showing an example of the functional configuration of the focus control unit 212 according to the second embodiment.
  • The focus control unit 212 includes a calculation unit 212a, a correction amount acquisition unit 212b, a correction parameter acquisition unit 212c, a correction unit 212d, and an output unit 213e.
  • The calculation unit 212a calculates an intermediate control amount for photographing the target based on the part information of the target and a first model that takes part information as input and outputs an intermediate control amount.
  • The first model takes part information as input and outputs an intermediate control amount; for example, it is a model that represents the relationship between part information and the intermediate control amount.
  • In this embodiment, the intermediate control amount is expressed as a voltage value for controlling the focal length of the liquid lens. That is, the calculation unit 212a according to this embodiment determines an intermediate voltage value, which is the intermediate control amount for photographing the target, based on the interocular distance (which is part information) and the first model.
  • For example, the first model is a function f that takes the interocular distance, which is part information, as input, and outputs the intermediate voltage value, which is the intermediate control amount.
  • This function f is defined so as to represent the relationship between the interocular distance DE in an image and the focusing voltage value FV when, for example, a person with a standard interocular distance in real space (e.g., about 63 mm) is photographed in a certain environment at various distances from the imaging device. The function may be found by actual measurement, or may be found computationally using information on the lens characteristics, camera sensor information, and the like.
  • For example, the function f represents a straight line, a curve, or the like that approximates the relationship between the interocular distance DE and the focusing voltage value FV.
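  • One way to realize the function f by actual measurement is to fit a line to calibration pairs, as sketched below; all sample values are made-up placeholders, and the degree-1 fit is an assumption (the patent also allows curves).

```python
import numpy as np

# Calibration pairs as described above: interocular distance DE (px) in
# the image vs. focusing voltage value FV (V) for a subject with a
# standard real-space interocular distance (about 63 mm) photographed
# at several distances. All numbers are assumed placeholders.
DE_samples = np.array([120.0, 160.0, 200.0, 240.0, 280.0])
FV_samples = np.array([38.5, 40.2, 41.8, 43.1, 44.6])

# First model f as a least-squares straight line FV = p*DE + q.
p, q = np.polyfit(DE_samples, FV_samples, 1)

def f(de):
    # Intermediate voltage value (intermediate control amount) for a
    # given interocular distance DE.
    return p * de + q
```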
  • The correction amount acquisition unit 212b obtains a correction amount for photographing the target based on the part information of the target and a second model that takes part information as input and outputs a correction amount.
  • The second model is a model defined based on the relationship between the part information and the difference.
  • Here, the difference is the difference between the intermediate control amount and the actual focusing voltage value.
  • In other words, the second model is a model that yields, as the correction amount, the difference between the intermediate control amount used for focus control in past shooting and the actual focusing voltage value.
  • In this embodiment, the correction amount acquisition unit 212b obtains a correction voltage value, which is the correction amount for photographing the target, based on the interocular distance (which is part information) and the second model.
  • For example, the second model is a function g that takes the interocular distance, which is part information, as input, and outputs the correction voltage value, which is the correction amount.
  • This function g is determined, for example, using the interocular distances DE used for focus control in past shooting and the differences between the intermediate control amounts and the actual focusing voltage values.
  • For example, the function g represents a straight line, a curve, or the like that approximates the relationship between the interocular distance DE and the correction voltage value CV.
  • The correction parameter acquisition unit 212c determines the values of the parameters included in the second model based on the history information 214a. The correction amount acquisition unit 212b therefore determines the correction amount using the second model to which the parameter values determined by the correction parameter acquisition unit 212c are applied.
  • In this embodiment, the "difference" included in the history information 214a corresponds to the correction voltage value CV. Therefore, the correction parameter acquisition unit 212c, for example, takes the "difference" included in the history information 214a as CV and the "interocular distance" included in the history information 214a as DE, and finds the coefficients a and b of the function g by finding a straight line that approximates the relationship between the difference and the interocular distance.
  • the correction unit 212d uses the correction amount determined using the history information 214a to correct the intermediate control amount in the shooting of the target, thereby determining the reference control amount in the shooting of the target.
  • the second model is defined using the history information 214a, and the correction amount acquisition unit 212b uses this second model to determine the correction amount. Therefore, the correction amount determined using the history information 214a is the correction amount determined by the correction amount acquisition unit 212b.
  • the reference control amount is expressed as a voltage value for controlling the focal length of the liquid lens.
  • the correction unit 212d calculates the reference voltage value for photographing the target by, for example, adding or subtracting the correction voltage value calculated by the correction amount acquisition unit 212b to or from the intermediate voltage value calculated by the calculation unit 212a.
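Continuing the two sketches above, the correction step itself reduces to a single addition or subtraction. The sign convention here is an assumption, since the text allows either adding or subtracting the correction voltage value.

```python
def reference_voltage(interocular_distance_px: float) -> float:
    # Subtraction assumes the stored "difference" is (intermediate voltage - actual
    # focusing voltage); with the opposite storage convention this becomes an addition.
    return first_model(interocular_distance_px) - second_model(interocular_distance_px)
```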
  • the correction parameter acquisition unit 212c may determine the values of the parameters included in the first model based on the history information 214a. Since a correction is applied to the first model, the intermediate control amount itself becomes a value that takes into account the correction, and the intermediate control amount can be used as a reference control amount. In this case, the correction amount acquisition unit 212b and the correction unit 212d do not need to be included.
  • the output unit 213e determines an output control amount that the imaging device 101 uses for focus control when imaging the target based on a reference control amount for imaging the target, and outputs the determined output control amount to the imaging device 101. Since the imaging device 101 includes a liquid lens, the output control amount is expressed as a voltage value for controlling the focal length of the liquid lens.
  • the method by which the output unit 213e determines the output control amount based on the reference control amount may be the same as that of the focus control unit 112 according to the first embodiment. That is, the output unit 213e may determine the output control amount by applying output pattern information similar to that of the first embodiment to the reference control amount determined by the correction unit 212d, for example.
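A minimal sketch of such output pattern information follows: evenly spaced candidate voltages within a search range centred on the reference control amount. The range width, the number of steps, and the ordering are illustrative assumptions, not values from this disclosure.

```python
def output_pattern(reference_v: float, half_range_v: float = 1.0, steps: int = 5) -> list[float]:
    """Return candidate output voltages spread across a search range around reference_v."""
    if steps == 1:
        return [reference_v]
    step = 2.0 * half_range_v / (steps - 1)
    return [reference_v - half_range_v + i * step for i in range(steps)]

# Example: output_pattern(35.0) -> [34.0, 34.5, 35.0, 35.5, 36.0]
```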
  • the information processing system 200 according to this embodiment may be physically configured in the same manner as the information processing system 100 according to the first embodiment.
  • Fig. 14 is a flowchart showing an example of information processing according to the second embodiment.
  • Fig. 14 shows a difference from the information processing according to the first embodiment. That is, the information processing according to this embodiment includes steps S202 to S203 instead of steps S102 to S103 according to the first embodiment. Except for these, the information processing according to this embodiment may be similar to the information processing according to the first embodiment.
  • After step S101, which is the same as in embodiment 1, the focus control unit 212 uses the part information generated in step S101b and the history information 214a stored in the history storage unit 214 to determine a reference control amount to be used for focus control in photographing the target (step S202).
  • FIG. 15 is a flowchart showing an example of the reference control amount acquisition process (step S202) according to the second embodiment.
  • the calculation unit 212a calculates the intermediate control amount for imaging the target based on the target body part information and the first model (step S202a).
  • the calculation unit 212a uses the interocular distance generated in step S101b as an input and calculates the intermediate voltage value for photographing the subject using the first model.
  • the correction parameter acquisition unit 212c determines the values of the parameters included in the second model based on the history information 214a stored in the history storage unit 214 (step S202b).
  • the found coefficients a and b are used as the values of the parameters a and b to be applied to the second model.
  • the correction amount acquisition unit 212b determines the amount of correction in photographing the subject based on the subject's part information and the second model (step S202c).
  • the correction amount acquisition unit 212b inputs the interocular distance generated in step S101b into a second model to which the value obtained in step S202b is applied, and obtains a correction voltage value for photographing the subject.
  • the correction unit 212d uses the correction amount calculated in step S202c to correct the intermediate control amount calculated in step S202a to calculate the reference control amount for shooting the target (step S202d), and returns to information processing.
  • the output unit 213e determines an output control amount based on the reference control amount calculated in step S202d (step S203) and outputs the output control amount to the imaging device 101.
  • the detailed processing in step S203 may be similar to that in step S103 according to the first embodiment.
  • Steps S104 to S106 are executed in the same manner as in embodiment 1.
  • the output unit 213e determines an output control amount different from the output control amount determined immediately before, based on the output pattern information described above (step S203).
  • correction is performed using the history information 214a, so that the corrected reference control amount can be obtained from the first shooting in step S104.
  • Since the history information 214a is not limited to that relating to a specific target, when the histories of multiple target IDs are used, the effects of differences in interocular distance between targets cancel out, and this correction can reduce deviations caused mainly by the characteristics of the imaging device 101 and by environmental factors such as temperature. As a result, there is a high possibility that a control amount close to the focus control amount can be calculated from the first shooting. Therefore, the focusing time can be reduced.
  • the focus control unit 212 includes a calculation unit 212a and a correction unit 212d.
  • the calculation unit 212a calculates an intermediate control amount in photographing the object based on part information of the object and a first model that inputs the part information and outputs an intermediate control amount.
  • the correction unit 212d calculates a reference control amount in photographing the object by correcting the intermediate control amount in photographing the object using a correction amount calculated using the history information 214a.
  • Each of the intermediate control amount and the reference control amount is represented by a voltage value for controlling the focal length of the liquid lens.
  • The reference control amount is a value corrected using the history information 214a, and thus it becomes possible to achieve high-speed focusing, as in embodiment 1.
  • the focus control unit 212 includes an output unit 213e that determines an output control amount that the imaging device 101 uses for focus control when imaging the target based on a reference control amount for imaging the target, and outputs the determined output control amount to the imaging device 101.
  • the output control amount is represented by a voltage value for controlling the focal length of the liquid lens.
  • the imaging device 101 can image the subject according to the output control amount determined based on the reference control amount. The subject can therefore be imaged according to an appropriate output control amount, and since the focal length of the liquid lens can be controlled accordingly, it becomes possible to focus the liquid lens at high speed.
  • the output unit 213e determines the output control amount by applying output pattern information that specifies determining a plurality of output control amounts within a search range that is defined based on a reference control amount.
  • the imaging device 101 can image the target according to the output control amount determined based on the reference control amount. Therefore, the target can be imaged according to the output control amount within an appropriate search range. This makes it possible to focus the liquid lens at high speed.
  • the history information 214a is information that associates part information used for focus control in past shooting with the difference between the intermediate control amount and the actual focus voltage value.
  • the focus control unit 212 further includes a correction amount acquisition unit 212b that determines the correction amount in shooting the target based on the part information of the target and a second model that takes the part information as input and outputs the correction amount.
  • the second model is defined based on the history information 214a.
  • the correction amount can be calculated using the history information 214a, so there is no need to install additional sensors or the like. This makes it possible to prevent the configuration from becoming complicated or large in order to focus the liquid lens at high speed.
  • the second model is a model that is defined based on the relationship between the part information and the difference.
  • the focus control unit 212 further includes a correction parameter acquisition unit 212c that determines the values of the parameters included in the second model based on the history information 214a.
  • the correction amount acquisition unit 212b uses the second model to which the determined parameter values are applied.
  • the second model for determining the correction amount can be appropriately determined using the history information 214a, so there is no need to provide additional sensors or the like. This makes it possible to prevent the configuration from becoming complicated or large in size in order to focus the liquid lens at high speed.
  • ⁇ Embodiment 3> In general, shooting may continue even after a target image in which the target is focused is obtained. In shooting after focusing, a correction amount when the target is focused may be applied. In this embodiment, an example in which such shooting after focusing is applied to the information processing system 200 according to the second embodiment will be described.
  • the information processing system according to the third embodiment may be configured functionally in the same way as the information processing system according to the second embodiment (see FIG. 11).
  • the correction unit 212d further includes a function for taking an image after a target image in which the target is in focus has been obtained.
  • the correction unit 212d corrects the intermediate control amount in photographing the target after the target image in focus on the target has been obtained, using the correction amount in photographing the target that generated the target image in focus on the target. In this way, the correction unit 212d determines the reference control amount in photographing the target after the target image in focus on the target has been obtained.
  • the amount of correction in the shooting that generated the target image in focus on the target is the amount of correction applied by the correction unit 212d to obtain the previous reference control amount. Therefore, it is recommended that the correction unit 212d hold the amount of correction applied immediately before.
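A sketch of this hold-and-reuse behaviour, under the same hypothetical sign convention as the earlier sketches:

```python
class CorrectionUnit:
    """Keeps the correction amount that produced the in-focus image and reuses it
    for subsequent shots of the same target (embodiment 3 behaviour, sketched)."""

    def __init__(self) -> None:
        self.last_correction_v: float | None = None

    def correct(self, intermediate_v: float, correction_v: float) -> float:
        self.last_correction_v = correction_v  # remember the amount just applied
        return intermediate_v - correction_v

    def correct_after_focus(self, intermediate_v: float) -> float:
        # After focus has been achieved, reuse the held correction instead of re-deriving it.
        assert self.last_correction_v is not None, "no correction has been applied yet"
        return intermediate_v - self.last_correction_v
```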
  • the information processing system according to this embodiment may be physically configured in the same manner as the information processing system 100 according to the first embodiment.
  • Fig. 16 is a flowchart showing an example of information processing according to the third embodiment.
  • Fig. 16 shows the differences from the information processing according to the second embodiment. That is, the information processing according to this embodiment further includes steps S310 to S314 in addition to the processing in the information processing according to the second embodiment (see Fig. 14). Except for these, the information processing according to this embodiment may be similar to the information processing according to the second embodiment.
  • the body part detection unit 111 acquires the target image determined to be in focus from the imaging device 101 and executes a detection process (step S310) based on the acquired target image.
  • the immediately preceding image capture is the image capture performed in step S105 or S313. Details of the detection process (step S310) may be similar to those of the detection process (step S101), so a detailed description will be omitted here.
  • the focus control unit 212 uses the part information generated in step S310 (i.e., in the process equivalent to step S101b) and the correction amount used in the shooting that generated the target image in focus on the target to determine the reference control amount to be used for focus control in shooting the target (step S311).
  • FIG. 17 is a flowchart showing an example of the reference control amount acquisition process (step S311) according to the third embodiment.
  • the calculation unit 212a calculates an intermediate control amount for imaging the target based on the target body part information and the first model, similar to step S202a in embodiment 2 (step S311a).
  • In step S311a, the interocular distance generated in step S105 (i.e., in the process equivalent to step S101b within step S105) is used as input to determine the intermediate voltage value for photographing the subject using the first model.
  • the correction unit 212d uses the correction amount used in the shooting that generated the target image in focus on the target to correct the intermediate control amount calculated in step S311a, thereby calculating the reference control amount used in the shooting of the target (step S311d), and returns to information processing.
  • the output unit 213e determines an output control amount based on the reference control amount calculated in step S311d (step S312), and outputs the output control amount to the image capturing apparatus 101.
  • Detailed processing in step S312 may be similar to that in step S103 according to the first embodiment.
  • the image capturing device 101 acquires the output control amount determined in step S312, it captures an image of the target again according to the acquired output control amount (step S313).
  • the correction unit 212d determines whether the termination condition is met (step S314).
  • the termination condition is a condition for terminating image capture after an image of the subject in focus has been obtained. It is preferable that the termination condition be determined in advance.
  • the termination condition may be, for example, a predetermined number of times that the target has been photographed after a target image in focus on the target has been obtained, a predetermined amount of time has elapsed since the target image in focus on the target was obtained, or an instruction to terminate has been received from the user.
  • the termination condition may also be that a local image cannot be detected in step S310 (i.e., the process equivalent to step S101a in step S310). The termination condition is not limited to these.
  • Among the target images obtained by continued shooting after focusing, the image to be used may be, for example, the clearest image (e.g., the image with the highest focus score) or an image selected based on iris size (e.g., the image with the largest iris area, or an image with an iris area equal to or larger than a predetermined value).
  • the information processing system further includes the focus determination unit 113 that determines whether the target image is a focused image.
  • the correction unit 212d corrects the intermediate control amount in photographing the target by using the correction amount in photographing the target that generated the target image in focus on the target.
  • FIG. 18 is a diagram showing an example of the configuration of an information processing system 400 according to the fourth embodiment.
  • the information processing system 400 includes an information processing device 403 that replaces the information processing device 103. Except for this, the information processing system 400 may be configured similarly to the information processing system 100 according to the first embodiment.
  • the information processing device 403 includes a focus control unit 412 and a history storage unit 214 in place of the focus control unit 112 and the history storage unit 114 of the first embodiment. Except for these, the information processing device 403 may be configured similarly to the information processing device 103 of the first embodiment.
  • the history storage unit 214 may be generally similar to that in embodiment 2, so a detailed explanation will be omitted here.
  • the focus control unit 412 is a detailed example of the focus control unit 112 according to the first embodiment. Therefore, the outline of the focus control unit 412 is similar to that of the focus control unit 112 according to the first embodiment.
  • FIG. 19 is a diagram showing an example of the functional configuration of the focus control unit 412 according to the fourth embodiment.
  • the focus control unit 412 includes a calculation unit 212a, a correction unit 212d, and an output unit 213e similar to those in the second embodiment, and a correction amount acquisition unit 412b that replaces the correction amount acquisition unit 212b in the second embodiment.
  • the correction amount acquisition unit 412b obtains the average value of the differences contained in the history information 214a as the correction amount for the target photograph.
  • the correction amount acquisition unit 412b may change the method for determining the correction amount depending on whether the number of past photographs, i.e., the number of histories included in the history information 214a, satisfies a predetermined lower limit condition.
  • the lower limit condition is, for example, equal to or greater than a predetermined threshold, or exceeds a predetermined threshold. Note that the lower limit condition is not limited to these.
  • the correction amount acquisition unit 412b obtains the average value of the differences included in the history information 214a as the correction amount for shooting the target when the number of past shootings meets the lower limit condition. Also, when the number of past shootings does not meet the predetermined lower limit condition, the correction amount acquisition unit 412b obtains the correction amount for shooting the target using a value obtained by dividing the sum of the differences in the past shootings by a predetermined value.
  • the information processing system 400 according to this embodiment may be physically configured in the same manner as the information processing system 100 according to the first embodiment.
  • Fig. 20 is a flowchart showing an example of information processing according to the fourth embodiment.
  • Fig. 20 shows the part that differs from the information processing according to the second embodiment. That is, the information processing according to this embodiment includes step S402 instead of step S202 in the information processing according to the second embodiment (see Fig. 14). Except for this, the information processing according to this embodiment may be similar to the information processing according to the second embodiment.
  • After step S101, which is the same as in embodiment 1, the focus control unit 412 uses the part information generated in step S101b and the history information 214a stored in the history storage unit 214 to determine a reference control amount to be used for focus control in photographing the target (step S402).
  • FIG. 21 is a flowchart showing an example of the reference control amount acquisition process (step S402) according to the fourth embodiment.
  • After step S202a, which is the same as in embodiment 2, the correction amount acquisition unit 412b determines the correction amount for photographing the target (step S402c).
  • When the number of past shootings satisfies the lower limit condition, the correction amount acquisition unit 412b obtains the average value of the differences included in the history information 214a as the correction amount for shooting the target. When the number of past shootings does not satisfy the predetermined lower limit condition, the correction amount acquisition unit 412b obtains the correction amount for shooting the target using a value obtained by dividing the sum of the differences in the past shootings by a predetermined value.
  • As an example, suppose that the threshold included in the lower limit condition and the value by which the total is divided are both 10.
  • Note that the threshold included in the lower limit condition and the value by which the total is divided are not limited to 10 and may be changed as appropriate. Also, they may be the same as or different from each other.
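The following sketch captures this rule; the lower limit value 10 follows the example above, and the choice of the "equal to or greater than" variant of the lower limit condition is an assumption.

```python
def correction_from_history(differences: list[float], n_min: int = 10) -> float:
    """Mean of the stored differences when enough history exists; otherwise the sum
    divided by a fixed value, which damps the correction toward zero."""
    if len(differences) >= n_min:
        return sum(differences) / len(differences)
    return sum(differences) / n_min
```

Note that the sparse-history branch equals sum/max(n, n_min): with few, possibly noisy entries, the correction is deliberately kept small rather than trusting a mean computed from a handful of shots.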
  • the correction unit 212d executes step S202d, which is the same as in embodiment 2, and then returns to information processing.
  • the correction is performed using the history information 214a, so that the corrected reference control amount can be obtained from the first shooting in step S104. Since the history information 214a is not limited to information relating to a specific target, this correction can reduce deviations that are mainly caused by the characteristics of the imaging device 101. As a result, there is a high possibility that a control amount close to the focus control amount can be calculated from the first shooting. Therefore, the focusing time can be reduced.
  • the focus control unit 412 includes a correction amount acquisition unit 412b.
  • When the number of past shootings satisfies the predetermined lower limit condition, the correction amount acquisition unit 412b obtains an average value of the differences included in the history information 214a as the correction amount for shooting the target.
  • When the number of past shootings does not satisfy the predetermined lower limit condition, the correction amount acquisition unit 412b obtains the correction amount for shooting the target using a value obtained by dividing the sum of the differences in the past shootings by a predetermined value.
  • the total is divided by a predetermined value to set the correction amount, thereby reducing the influence of the characteristics of each history. Therefore, even when the number of past shots is small, it is more likely that a reference control amount close to the focus control amount can be determined. Therefore, it becomes possible to focus quickly even when the number of past shots is small.
  • FIG. 22 is a diagram showing an example of the configuration of an information processing system 500 according to the fifth embodiment.
  • the information processing system 500 includes an information processing device 503 that replaces the information processing device 203. Except for this, the information processing system 500 may be configured similarly to the information processing system 200 according to the second embodiment.
  • the information processing device 503 further includes a history selection unit 521. Except for this, the information processing device 503 may be configured similarly to the information processing device 203 according to the second embodiment.
  • the history selection unit 521 selects the history information 214a to be referenced to obtain the correction amount from among the history information 214a stored in the history storage unit 214. This selection makes it possible to change part or all of the history information 214a that is referenced to obtain the correction amount.
  • the selection is performed, for example, when a predetermined change condition is satisfied. The change conditions include, for example, at least one of (1) temperature conditions, (2) focusing time conditions, and (3) frequency conditions.
  • the temperature condition is a predetermined condition regarding at least one temperature change among the environmental temperature and the temperature of the image capturing device 101.
  • the temperature condition is that at least one temperature change among the environmental temperature and the temperature of the image capturing device 101 is equal to or greater than a predetermined temperature threshold.
  • the information processing system 500 may further include a temperature sensor.
  • the focusing time condition is a predetermined condition regarding the focusing time.
  • the focusing time is the length of time required for a focused image to be generated when photographing a subject.
  • the focusing time condition is that the focusing time is equal to or greater than a predetermined time threshold.
  • the frequency condition is a predetermined condition regarding the number of times photographs are taken within a specified time.
  • the frequency condition is that the number of times photographs are taken within a specified time is equal to or greater than a predetermined frequency threshold.
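The three change conditions can be sketched as a single predicate; every threshold below is an illustrative assumption, since the text only requires the conditions to be predetermined.

```python
def change_condition_met(temp_delta_c: float,
                         focusing_time_s: float,
                         shots_in_window: int,
                         temp_threshold_c: float = 5.0,
                         time_threshold_s: float = 2.0,
                         freq_threshold: int = 30) -> bool:
    """True when at least one of the (1) temperature, (2) focusing time, or
    (3) frequency conditions is satisfied."""
    return (temp_delta_c >= temp_threshold_c        # (1) temperature condition
            or focusing_time_s >= time_threshold_s  # (2) focusing time condition
            or shots_in_window >= freq_threshold)   # (3) frequency condition
```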
  • the focus control unit 212 may, for example, use the part information of the target obtained by the part detection unit 111 and the history information 214a selected by the history selection unit 521 to obtain a reference control amount to be used for focus control in photographing the target.
  • the correction parameter acquisition unit 212c may obtain the value of a parameter included in the second model based on the history information 214a selected by the history selection unit 521.
  • the information processing system 500 according to this embodiment may be physically configured in the same manner as the information processing system 100 according to the first embodiment.
  • the information processing executed by the information processing system 500 according to the fifth embodiment further includes a reference history change process.
  • the reference history change process is a process for changing the history information 214a that is referenced to obtain the correction amount.
  • the information processing device 503 may repeatedly execute the reference history change process during operation.
  • FIG. 23 is a flowchart showing an example of a reference history change process according to embodiment 5.
  • the history selection unit 521 determines whether the change conditions are met (step S501).
  • If it is determined that the change condition is not met (step S501; No), the history selection unit 521 repeats step S501.
  • If it is determined that the change condition is met (step S501; Yes), the history selection unit 521 selects the history information 214a to be referenced to obtain the correction amount from the history information 214a stored in the history storage unit 214 (step S502).
  • In step S502, it is desirable for the history selection unit 521 to select history information 214a that matches the elements used in the change condition that was determined to be satisfied.
  • the elements used in the change condition are the environmental temperature, the temperature of the image capture device, the focusing time, and the frequency.
  • the history selection unit 521 may select history information 214a for a predetermined time period when the image was taken, such as from 12:00 to 14:00, during which the temperature is relatively high.
  • the history information 214a may further include an element used in the change condition.
  • the history selection unit 521 may select history information 214a in which the value of the element used in the change condition that was determined to be satisfied is within a predetermined range of the current state.
  • the history information 214a referenced to determine the correction amount can be changed to a more appropriate one, thereby reducing the discrepancy between the reference control amount and the focus control amount.
  • the change in the history information 214a referenced to obtain the correction amount is described as being applied to the second embodiment, but it can also be applied to other embodiments.
  • the correction amount acquisition unit 412b can refer to the history information 214a selected by the history selection unit 521.
  • the information processing system 500 further includes the history storage unit 214 for storing the history information 214a, and the history selection unit 521.
  • the history selection unit 521 selects the history information 214a stored in the history storage unit 214 that is to be referenced to determine the correction amount.
  • the temperature condition is a predetermined condition regarding at least one of the temperature changes of the environmental temperature and the temperature of the imaging device 101.
  • the focusing time condition is a predetermined condition regarding the focusing time, which is the length of time required for a focused image to be generated when photographing a subject.
  • the frequency condition is a predetermined condition regarding the number of times photographing is performed within a specified period of time.
  • the characteristics of a liquid lens are easily affected by changes in temperature.
  • the history information 214a referenced to determine the correction amount can be changed to more appropriate information according to the characteristics of the liquid lens, thereby reducing the discrepancy between the reference control amount and the focusing control amount. This makes it possible to focus quickly.
  • When the focusing time becomes long, the history information 214a currently referenced to determine the correction amount may be inappropriate.
  • By changing such history information 214a to more appropriate information, it is possible to reduce the discrepancy between the reference control amount and the focusing control amount. This makes it possible to focus quickly.
  • When photographing is performed frequently, the temperature of the image capture device 101 increases due to heat generated during processing, and the characteristics of the liquid lens may change.
  • By changing the history information 214a to a more appropriate one, it is possible to reduce the discrepancy between the reference control amount and the focus control amount. This makes it possible to focus quickly.
  • the interocular distance in a binocular image may change depending on whether or not the subject is wearing something. For example, if the subject is wearing glasses, the interocular distance may differ from the interocular distance according to the distance between the subject and the image capture device 101 due to distortion caused by the lenses.
  • the interocular distance in a binocular image may change depending on the direction of the subject's face. For example, even if the subject is at the same distance from the image capture device 101, the interocular distance in a binocular image will usually differ depending on whether the subject is facing directly ahead or at an angle to the image capture device 101.
  • FIG. 24 is a diagram showing an example of the configuration of an information processing system 600 according to the sixth embodiment.
  • the information processing system 600 includes an information processing device 603 that replaces the information processing device 103. Except for this point, the information processing system 600 may be configured similarly to the information processing system 100 according to the first embodiment.
  • the information processing device 603 includes a body part detection unit 611 that replaces the body part detection unit 111 of the first embodiment.
  • the information processing device 603 further includes an analysis unit 617. Except for these, the information processing device 603 may be configured similarly to the information processing device 103 of the first embodiment.
  • the analysis unit 617 acquires a target image from the image capture device 101.
  • the analysis unit 617 analyzes the target image to acquire at least one of the presence or absence of an item worn by the target and the facial orientation.
  • Examples of worn items include glasses, sunglasses, eye patches, goggles, masks, and niqabs.
  • General techniques such as pattern matching and machine learning models may be used to analyze the target image.
  • the analysis unit 617 may use an analysis model with a target image as input to obtain at least one of the presence or absence of a worn item of the subject and the facial direction.
  • the analysis model is a machine learning model that has been trained to analyze the target image and output at least one of the presence or absence of a worn item of the subject and the facial direction.
  • For example, supervised learning may be performed using training data in which a correct answer label for at least one of the presence or absence of a worn item of the subject and the facial direction is attached to the target image.
  • When the part detection unit 611 acquires a target image from the image capture device 101, it detects a part image including a specific part of the target from the target image.
  • the part detection unit 611 also acquires the analysis result of the analysis unit 617. Then, the part detection unit 611 obtains part information based on the detected part image and at least one of the presence or absence of a worn item of the target and the facial direction.
  • the part detection unit 611 corrects the dimensions found from the part image (i.e., the dimensions relating to the specific part in the part image) using at least one of the presence or absence of a worn item of the subject and the facial orientation, and obtains part information including the corrected dimensions.
  • the information processing system according to this embodiment may be physically configured in the same manner as the information processing system 100 according to the first embodiment.
  • Fig. 25 is a flowchart showing an example of information processing according to the sixth embodiment.
  • Fig. 25 shows a difference from the information processing according to the first embodiment. That is, the information processing according to this embodiment includes step S601 instead of step S101 according to the first embodiment. Except for this, the information processing according to this embodiment may be similar to the information processing according to the first embodiment.
  • the analysis unit 617 and the part detection unit 611 execute the detection process (step S601) based on the target image generated in response to the trigger signal, for example.
  • FIG. 26 is a flowchart showing an example of detection processing (step S601) according to embodiment 6.
  • When the analysis unit 617 acquires the target image from the image capture device 101, it analyzes the target image (step S601c). As a result, the analysis unit 617 acquires at least one of the presence or absence of a worn item of the subject and the facial orientation as the analysis result.
  • the body part detection unit 611 executes step S101a similar to that in embodiment 1.
  • the part detection unit 611 generates part information based on the analysis results obtained in step S601c and the part images detected in step S101a (step S601b), and returns to the imaging process.
  • Suppose, for example, that the part detection unit 611 acquires an analysis result from the analysis unit 617 indicating that glasses are being worn, and that the part detection unit 611 obtains the interocular distance. In this case, for example, the part detection unit 611 obtains the corrected interocular distance by multiplying the interocular distance in the part image by a value that is predetermined for the wearing of glasses.
  • Also suppose that the part detection unit 611 acquires an analysis result from the analysis unit 617 indicating that the subject is facing at an angle of θ degrees with respect to the image capture device 101, and that the part detection unit 611 determines the interocular distance. In this case, for example, the part detection unit 611 determines the corrected interocular distance by dividing the interocular distance in the part image by cos θ.
  • the correction methods are not limited to these.
  • the dimensions of a specific part in the part image can be corrected to values that are closer to the actual dimensions. This makes it possible to improve the accuracy of the reference control amount calculated using those dimensions.
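Both corrections can be sketched in a few lines; the glasses factor of 1.05 is a hypothetical placeholder for the predetermined value mentioned above, while the division by cos θ follows the description directly.

```python
import math

def corrected_interocular_distance(de_px: float,
                                   wearing_glasses: bool,
                                   yaw_deg: float,
                                   glasses_factor: float = 1.05) -> float:
    """Correct the interocular distance measured in the part image for a worn item
    (glasses) and for the facial orientation (yaw angle in degrees)."""
    if wearing_glasses:
        de_px *= glasses_factor      # predetermined factor for glasses (hypothetical value)
    return de_px / math.cos(math.radians(yaw_deg))  # undo foreshortening of an oblique face
```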
  • the information processing system 600 further includes an analysis unit 617 that acquires at least one of the presence or absence of an attachment of the target and the facial orientation based on the target image.
  • the body part detection unit 611 obtains body part information on a specific body part of the target based on a body part image including the specific body part of the target and at least one of the attachment and the facial orientation of the target.
1. An information processing system comprising: a part detection means for obtaining part information regarding a predetermined part of a target based on a part image including the predetermined part of the target; and a focus control means for determining a reference control amount to be used for focus control in imaging of the target, using the part information of the target and history information including information used for focus control in past imaging.
2. The information processing system according to 1., wherein the predetermined part is a part included in the target, and the part information includes dimensions related to the predetermined part.
3. The information processing system according to 1. or 2., further comprising: an imaging device that images the target and generates a target image including at least a part of the target; and an authentication means for authenticating the target using an image included in the target image, wherein the imaging device includes a liquid lens and images the target while adjusting a focus of the liquid lens in accordance with control using the reference control amount.
4. The information processing system according to 3., wherein the focus control means includes: a calculation means for obtaining an intermediate control amount in imaging of the target based on the part information of the target and a first model that receives the part information as an input and outputs an intermediate control amount; and a correction means for obtaining a reference control amount in imaging of the target by correcting the intermediate control amount in imaging of the target using a correction amount obtained using the history information, and each of the intermediate control amount and the reference control amount is expressed as a voltage value for controlling a focal length of the liquid lens.
5. The information processing system according to 4., wherein the focus control means further includes an output means for determining an output control amount used by the imaging device for focus control in imaging of the target based on the reference control amount in imaging of the target, and outputting the determined output control amount to the imaging device.
6. The information processing system according to 5., wherein the output means determines the output control amount by applying output pattern information that specifies determining a plurality of output control amounts within a search range defined based on the reference control amount.
7. The information processing system according to any one of 4. to 6., wherein the history information is information that associates the part information used for focus control in past imaging with a difference between the intermediate control amount and an actual focusing voltage value, the focus control means further includes a correction amount acquisition means for obtaining the correction amount in imaging of the target based on the part information of the target and a second model that receives the part information as an input and outputs a correction amount, and the second model is defined based on the history information.
8. The information processing system according to 7., wherein the second model is a model defined based on a relationship between the part information and the difference, the focus control means further includes a correction parameter acquisition means for determining values of parameters included in the second model based on the history information, and the correction amount acquisition means uses the second model to which the determined parameter values are applied.
9. The information processing system according to any one of 4. to 8., further comprising a focus determination means for determining whether the target image is a focused image, wherein the correction means corrects the intermediate control amount in imaging of the target after the target image in focus on the target has been obtained, using the correction amount in the imaging that generated the target image in focus on the target.
10. The information processing system according to any one of 4. to 9., wherein the focus control means further includes a correction amount acquisition means for obtaining an average value of the differences included in the history information as the correction amount in imaging of the target when the number of past imagings satisfies a predetermined lower limit condition, and for obtaining the correction amount in imaging of the target using a value obtained by dividing the sum of the differences in the past imagings by a predetermined value when the number of past imagings does not satisfy the predetermined lower limit condition.
11. The information processing system according to any one of 4. to 10., further comprising: a history storage means for storing the history information; and a history selection means for selecting, from among the history information stored in the history storage means, history information to be referenced for obtaining the correction amount when at least one of the following is satisfied: (1) a predetermined temperature condition regarding at least one temperature change among an environmental temperature and a temperature of the imaging device; (2) a predetermined focusing time condition regarding a focusing time that is a length of time required for a focused image to be generated in imaging of the target; and (3) a predetermined frequency condition regarding the number of times imaging is performed within a specified time.
12. The information processing system according to any one of 1. to 11., further comprising an analysis means for obtaining at least one of the presence or absence of a worn item of the target and a facial orientation of the target based on the target image, wherein the part detection means obtains the part information regarding the predetermined part of the target based on the part image including the predetermined part of the target and at least one of the worn item and the facial orientation of the target.
14. An information processing method comprising, by one or more computers: obtaining part information regarding a predetermined part of a target based on a part image including the predetermined part of the target; and determining a reference control amount to be used for focus control in imaging of the target, using the part information of the target and history information including information used for focus control in past imaging.
15. The information processing method according to 14., wherein the predetermined part is a part included in the target, and the part information includes dimensions related to the predetermined part.
16. The information processing method according to 14. or 15., further comprising: imaging the target with an imaging device including a liquid lens, while adjusting a focus of the liquid lens in accordance with control using the reference control amount, to generate a target image including at least a part of the target; and authenticating the target using an image included in the target image.
17. The information processing method according to 16., wherein determining the reference control amount used for the focus control includes: determining an intermediate control amount in imaging of the target based on the part information of the target and a first model that receives the part information as an input and outputs an intermediate control amount; and determining a reference control amount in imaging of the target by correcting the intermediate control amount in imaging of the target using a correction amount determined using the history information, and each of the intermediate control amount and the reference control amount is expressed as a voltage value for controlling a focal length of the liquid lens.
18. The information processing method according to 17., wherein determining the reference control amount used for the focus control further includes determining an output control amount used by the imaging device for focus control in imaging of the target based on the reference control amount in imaging of the target, and outputting the determined output control amount to the imaging device, and the output control amount is expressed as a voltage value for controlling a focal length of the liquid lens.
19. The information processing method according to 18., wherein the output control amount is determined by applying output pattern information that specifies determining a plurality of output control amounts within a search range defined based on the reference control amount.
20. The information processing method according to any one of 17. to 19., wherein the history information is information that associates the part information used for focus control in past imaging with a difference between the intermediate control amount and an actual focusing voltage value, determining the reference control amount used for the focus control further includes determining a correction amount in imaging of the target based on the part information of the target and a second model that receives the part information as an input and outputs a correction amount, and the second model is defined based on the history information.
21. The information processing method according to 20., wherein the second model is a model defined based on a relationship between the part information and the difference, determining the reference control amount used for the focus control further includes determining values of parameters included in the second model based on the history information, and the second model to which the determined parameter values are applied is used in determining the correction amount in imaging of the target.
22. The information processing method according to any one of 17. to 21., further comprising determining whether the target image is a focused image, wherein in imaging of the target after the target image in focus on the target is obtained, the intermediate control amount in imaging of the target is corrected using the correction amount in the imaging that generated the target image in focus on the target.
23. The information processing method according to any one of 17. to 22., wherein determining the reference control amount used for the focus control further includes determining an average value of the differences included in the history information as the correction amount in imaging of the target when the number of past imagings satisfies a predetermined lower limit condition, and determining a value obtained by dividing the sum of the differences in the past imagings by a predetermined value as the correction amount in imaging of the target when the number of past imagings does not satisfy the predetermined lower limit condition.
27. The information processing method according to any one of 14. to 26., further comprising obtaining at least one of the presence or absence of a worn item of the target and a facial orientation of the target based on the target image, wherein obtaining the part information includes obtaining the part information regarding the predetermined part of the target based on the part image including the predetermined part of the target and at least one of the worn item and the facial orientation of the target.
28. A recording medium having recorded thereon a program for causing one or more computers to execute: obtaining part information regarding a predetermined part of a target based on a part image including the predetermined part of the target; and determining a reference control amount to be used for focus control in imaging of the target, using the part information of the target and history information including information used for focus control in past imaging.
29. The recording medium according to 28., wherein the predetermined part is a part included in the target, and the part information includes dimensions related to the predetermined part.
30. The recording medium according to 28. or 29., wherein the program further causes the one or more computers to execute: imaging the target with an imaging device to generate a target image including at least a part of the target; and authenticating the target using an image included in the target image, and the imaging device includes a liquid lens and images the target while adjusting a focus of the liquid lens in accordance with control using the reference control amount.
31. The recording medium according to 30., wherein determining the reference control amount used for the focus control includes: determining an intermediate control amount in imaging of the target based on the part information of the target and a first model that receives the part information as an input and outputs an intermediate control amount; and determining a reference control amount in imaging of the target by correcting the intermediate control amount in imaging of the target using a correction amount determined using the history information, and each of the intermediate control amount and the reference control amount is expressed as a voltage value for controlling a focal length of the liquid lens.
32. The recording medium according to 31., wherein the program further causes the one or more computers to execute determining an output control amount used by the imaging device for focus control in imaging of the target based on the reference control amount in imaging of the target, and outputting the determined output control amount to the imaging device, and the output control amount is expressed as a voltage value for controlling a focal length of the liquid lens.
33. The recording medium according to 32., wherein the output control amount is determined by applying output pattern information that specifies determining a plurality of output control amounts within a search range defined based on the reference control amount.
34. The recording medium according to any one of 31. to 33., wherein the history information is information that associates the part information used for focus control in past imaging with a difference between the intermediate control amount and an actual focusing voltage value, determining the reference control amount used for the focus control further includes determining a correction amount in imaging of the target based on the part information of the target and a second model that receives the part information as an input and outputs a correction amount, and the second model is defined based on the history information.
35. The recording medium according to 34., wherein the second model is a model defined based on a relationship between the part information and the difference, determining the reference control amount used for the focus control further includes determining values of parameters included in the second model based on the history information, and the second model to which the determined parameter values are applied is used in determining the correction amount in imaging of the target.
36. The recording medium according to any one of 31. to 35., wherein the program further causes the one or more computers to execute determining whether the target image is a focused image, and in determining the reference control amount in imaging of the target, in imaging of the target after the target image in focus on the target is obtained, the intermediate control amount in imaging of the target is corrected using the correction amount in the imaging that generated the target image in focus on the target.
37. The recording medium according to any one of 31. to 36., wherein determining the reference control amount used for the focus control further includes determining an average value of the differences included in the history information as the correction amount in imaging of the target when the number of past imagings satisfies a predetermined lower limit condition, and determining a value obtained by dividing the sum of the differences in the past imagings by a predetermined value as the correction amount in imaging of the target when the number of past imagings does not satisfy the predetermined lower limit condition.
38. The recording medium according to any one of 31. to 37., wherein the program further causes the one or more computers to execute selecting, from the history information, the history information to be referenced for determining the correction amount when at least one of the following is satisfied: (1) a predetermined temperature condition regarding at least one temperature change among an environmental temperature and a temperature of the imaging device; (2) a predetermined focusing time condition regarding a focusing time that is a length of time required for a focused image to be generated in imaging of the target; and (3) a predetermined frequency condition regarding the number of times imaging is performed within a specified time.
39. The recording medium according to any one of 28. to 38., wherein the program further causes the one or more computers to obtain at least one of the presence or absence of a worn item of the target and a facial orientation of the target based on the target image, and obtaining the part information includes obtaining the part information regarding the predetermined part of the target based on the part image including the predetermined part of the target and at least one of the worn item and the facial orientation of the target.
  • 100, 200, 400, 500, 600 Information processing system 101 Image capture device 102 Object detection sensor 103, 203, 403, 503, 603 Information processing device 111, 611 Part detection unit 112, 212, 412 Focus control unit 113 Focus determination unit 114, 214 History storage unit 114a, 214a History information 115 Storage control unit 116 Authentication unit 212a Calculation unit 212b, 412b Correction amount acquisition unit 212c Correction parameter acquisition unit 212d Correction unit 213e Output unit 521 History selection unit 617 Analysis unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Automatic Focus Adjustment (AREA)
  • Focusing (AREA)

Abstract

An information processing system (100) comprises a part detection unit (111) and a focus control unit (112). The part detection unit (111) acquires part information regarding a predetermined part of a target on the basis of a part image that includes the predetermined part of the target. The focus control unit (112) determines a reference control amount to be used for focus control in imaging of the target, using the part information of the target and history information (114a) that includes information used for focus control in past imaging.
PCT/JP2023/000343 2023-01-10 2023-01-10 Système de traitement d'informations, dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement WO2024150283A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/000343 WO2024150283A1 (fr) 2023-01-10 2023-01-10 Système de traitement d'informations, dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2023/000343 WO2024150283A1 (fr) 2023-01-10 2023-01-10 Système de traitement d'informations, dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement

Publications (1)

Publication Number Publication Date
WO2024150283A1 true WO2024150283A1 (fr) 2024-07-18

Family

ID=91896566

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/000343 WO2024150283A1 (fr) 2023-01-10 2023-01-10 Système de traitement d'informations, dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement

Country Status (1)

Country Link
WO (1) WO2024150283A1 (fr)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004038181A (ja) * 2002-07-11 2004-02-05 Campus Create Co Ltd 自動焦点調節補助眼鏡とそのキャリブレーション方法
CN112312016A (zh) * 2020-10-28 2021-02-02 维沃移动通信有限公司 拍摄处理方法、装置、电子设备和可读存储介质
WO2022208606A1 (fr) * 2021-03-29 2022-10-06 日本電気株式会社 Système d'entraînement, système d'authentification, procédé d'entraînement, programme d'ordinateur, dispositif de génération de modèle d'apprentissage et dispositif d'estimation

Similar Documents

Publication Publication Date Title
Hansen et al. In the eye of the beholder: A survey of models for eyes and gaze
US9733703B2 (en) System and method for on-axis eye gaze tracking
JP6873918B2 (ja) 傾斜シフト虹彩撮像
US10521661B2 (en) Detailed eye shape model for robust biometric applications
JP7542563B2 (ja) 眼追跡待ち時間向上
WO2017013913A1 (fr) Dispositif de détection du regard, terminal de lunetterie, procédé de détection du regard et programme
JP5001930B2 (ja) 動作認識装置及び方法
EP3676688A1 (fr) Modèle détaillé de forme d' il pour applications biométriques robustes
MX2012010602A (es) Aparato para el reconocimiento de la cara y metodo para el reconocimiento de la cara.
US10430644B2 (en) Blended iris and facial biometric system
JP5776323B2 (ja) 角膜反射判定プログラム、角膜反射判定装置および角膜反射判定方法
JP7439980B2 (ja) 生体認証装置、生体認証方法、および生体認証用プログラム
US11163994B2 (en) Method and device for determining iris recognition image, terminal apparatus, and storage medium
WO2022244357A1 (fr) Système et procédé d'authentification de parties du corps
EP2198391A1 (fr) Système et procédé biométrique multimodal longue distance
WO2018220963A1 (fr) Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP5971712B2 (ja) 監視装置及び方法
US10817722B1 (en) System for presentation attack detection in an iris or face scanner
Perra et al. Adaptive eye-camera calibration for head-worn devices
WO2024150283A1 (fr) Système de traitement d'informations, dispositif de traitement d'informations, procédé de traitement d'informations et support d'enregistrement
JP5688514B2 (ja) 視線計測システム、方法およびプログラム
JP2021179815A (ja) 瞳孔径変化を用いた視線計測用キャリブレーション方法及び装置、並びに視線計測装置及びカメラ装置
CN118279966B (zh) 一种视线追踪方法、装置、电子设备及存储介质
WO2024057508A1 (fr) Dispositif de traitement d'informations, système de traitement d'informations, procédé de traitement d'informations et support d'enregistrement
Middendorff et al. Multibiometrics using face and ear

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23915907

Country of ref document: EP

Kind code of ref document: A1