US20230326247A1 - Information processing device - Google Patents

Information processing device

Info

Publication number
US20230326247A1
US20230326247A1 (application US18/209,904)
Authority
US
United States
Prior art keywords
gate
distance
matching
captured image
person
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US18/209,904
Inventor
Taketo Kochi
Kenji Saito
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to US18/209,904
Publication of US20230326247A1
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/172 - Classification, e.g. identification
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/60 - Analysis of geometric attributes
    • G06T 7/62 - Analysis of geometric attributes of area, perimeter, diameter or volume
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/50 - Context or environment of the image
    • G06V 20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V 40/168 - Feature extraction; Face representation
    • G - PHYSICS
    • G07 - CHECKING-DEVICES
    • G07C - TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C 9/00 - Individual registration on entry or exit
    • G07C 9/10 - Movable barriers with registering means

Definitions

  • the present invention relates to an information processing device, an information processing system, a program, and an information processing method.
  • As a means for restricting and managing persons who enter and exit a specific place such as an office or an event venue, a matching system is used that checks whether or not a person who is about to pass through is a previously registered person.
  • a walk-through face authentication system that performs face authentication based on the face image of a person captured by a camera installed at a gate has come into use, owing to the development of face authentication technology.
  • a walk-through face authentication system needs to perform matching of persons who are in a line at a gate in order and to open and close the gate so that the persons can smoothly pass through the gate.
  • there are various persons who are going to pass through the gate, and it is difficult to properly determine their sequence. As a result, there arises a problem that it is difficult for persons to pass through the gate smoothly.
  • an object of the present invention is to solve the abovementioned problem that it is difficult for persons to pass through the gate smoothly.
  • An information processing device includes: an image processing means that extracts a feature value of an object within a captured image obtained by capturing a pre-passing region of a gate, and stores matching information relating to matching of the object based on the feature value; a distance estimating means that estimates a distance from the gate to the object within the captured image; and a matching means that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • an information processing system includes: a capturing means that acquires a captured image obtained by capturing a pre-passing region of a gate; an image processing means that extracts a feature value of an object within the captured image, and stores matching information relating to matching of the object based on the feature value; a distance estimating means that estimates a distance from the gate to the object within the captured image; and a matching means that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • a program includes instructions for causing an information processing device to realize: an image processing means that extracts a feature value of an object within a captured image obtained by capturing a pre-passing region of a gate, and stores matching information relating to matching of the object based on the feature value; a distance estimating means that estimates a distance from the gate to the object within the captured image; and a matching means that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • an information processing method includes: extracting a feature value of an object within a captured image obtained by capturing a pre-passing region of a gate, and storing matching information relating to matching of the object based on the feature value; estimating a distance from the gate to the object within the captured image; and executing matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • the present invention can provide an information processing device which enables a person to pass through a gate smoothly.
  • FIG. 1 is a view showing a situation in which a face authentication system in a first example embodiment of the present invention is used;
  • FIG. 2 is a block diagram showing the configuration of the face authentication system in the first example embodiment of the present invention
  • FIG. 3 is a view showing a situation in which an image is captured by the face authentication system disclosed in FIG. 1 ;
  • FIG. 4 is a view showing a situation in which the face authentication system disclosed in FIG. 1 is used;
  • FIG. 5 is a flowchart showing a processing operation by the face authentication system disclosed in FIG. 1 ;
  • FIG. 6 is a flowchart showing a processing operation by the face authentication system disclosed in FIG. 1 ;
  • FIG. 7 is a block diagram showing the configuration of an information processing system in a second example embodiment of the present invention.
  • FIG. 8 is a block diagram showing the configuration of an information processing system in a third example embodiment of the present invention.
  • FIG. 9 is a block diagram showing the configuration of an information processing device in the third example embodiment of the present invention.
  • FIG. 1 is a view showing a situation in which a face authentication system is used.
  • FIG. 2 is a view showing the configuration of the face authentication system.
  • FIGS. 3 to 6 are views for describing a processing operation by the face authentication system.
  • a face authentication system 10 (an information processing system) according to the present invention is a system used for restricting and managing entry/exit of a person (an object) in a specific place such as an office or an event venue.
  • a capture device C included by the face authentication system 10 is installed, for each gate that is opened/closed when a person enters/exits, near a place in which the gate is set up.
  • a plurality of gates G are arranged in parallel and adjacent to each other, and configured so that persons pass in a direction indicated by an arrow from the right side of FIG. 1 to the gates G. Therefore, a region on the right side in FIG. 1 with reference to each of the gates G is a region where persons remain before passing through the gate (a pre-passing side region).
  • In the pre-passing side regions of the gates G, lanes in which persons who are going to pass through the gates G line up and pass are located in parallel so as to correspond to the gates G, respectively.
  • the lanes may or may not be partitioned by some member.
  • the number of the gates G may be one.
  • the capture device C included by the face authentication system 10 in this example embodiment is installed near the corresponding gate G and on the right side in view of a person heading to the gate G.
  • a position to install the capture device is not limited to the position shown in FIG. 1 , and may be any position such as on the left side of the gate or above the gate.
  • the face authentication system 10 also includes display devices D in the vicinity of the respective capture devices C.
  • the face authentication system 10 captures an image of a person heading to the gate G by the capture device C included in the system. Then, the face authentication system 10 executes a process of checking, based on the face image of the person, whether or not the person shown in the captured image is a previously registered person and, when the matching succeeds, opening the gate G so that the person can pass through.
  • the configuration of the face authentication system 10 will be described in detail.
  • the face authentication system 10 in this example embodiment is an information processing device including an arithmetic device and a storage device, configured integrally with the capture device C (a camera) and the display device D (a display).
  • the capture device C is equipped with the information processing device (including the arithmetic device and the storage device) that executes the face authentication process, and with the display device D.
  • the face authentication system 10 is not necessarily limited to being configured integrally with the capture device C and the display device D.
  • the capture device C, the display device D, and the information processing device processing a captured image may be configured by separate devices and installed in separate places.
  • the face authentication system 10 includes the capture device C and the display device D, and also includes a feature value extraction unit 11 , a distance measurement unit 12 , a matching unit 13 and a gate control unit 14 that are structured by execution of a program by the arithmetic device. Moreover, the face authentication system 10 includes a feature value storage unit 15 and a matching data storage unit 16 that are structured in the storage device.
  • the abovementioned capture device C (a capturing means) includes a camera and a camera control unit that acquire captured images of the pre-passing side region with reference to the gate G, that is, a region in front of the gate in the corresponding lane, at predetermined frame rates.
  • a range sandwiched between lines denoted by reference symbol Ca is a capture region.
  • a captured image captured by the capture device C is as shown in the upper view of FIG. 3 .
  • a captured image is set to be substantially in focus within a range of a preset distance in the perspective direction with reference to the capture device C.
  • the feature value extraction unit 11 first executes a process of extracting a person within the captured image in order to extract the feature value of the person.
  • the feature value extraction unit 11 extracts a person within the captured image, targets all the extracted persons, and generates a feature value necessary for matching from the face region of each of the persons.
  • the feature value is, for example, information used by the matching unit 13 later to calculate a matching score such as the degree of similarity to the feature value of a person previously registered in the matching data storage unit 16 and execute a matching process.
  • the feature value may be a feature value used in the existing face matching technique, or may be a feature value calculated by another method.
  • the feature value extraction unit 11 stores the extracted feature value as information relating to person matching, that is, information used for the matching process (matching information) into the feature value storage unit 15 .
  • the feature value is stored into the feature value storage unit 15 in association with information for identifying the person within the captured image.
  • the person within the captured image may be tracked in a subsequent captured image. In such a case, the tracked person and the feature value in the feature value storage unit 15 are associated with each other.
  • the feature value extraction unit 11 may store a plurality of feature values with respect to one person by extracting feature values from different captured images, respectively. Moreover, the feature value extraction unit 11 may store only one feature value when extracting feature values from different captured images. That is, in the case of extracting a feature value every time a captured image is acquired with respect to the same person, the feature value extraction unit 11 may update and store only one feature value. For example, the feature value extraction unit 11 may judge the qualities of the extracted feature values and store only one feature value of the highest quality in association with the person.
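  • As an illustrative sketch only (the disclosure does not prescribe a concrete data structure), the feature value storage unit 15 could be modelled as a per-person store that keeps a single highest-quality feature value, as below; the person identifier and the quality score are assumptions introduced here for illustration.

      from dataclasses import dataclass

      import numpy as np


      @dataclass
      class StoredFeature:
          feature: np.ndarray  # face feature value (embedding) of the tracked person
          quality: float       # assumed quality score of the extraction


      class FeatureValueStore:
          """Sketch of the feature value storage unit 15: one entry per tracked person."""

          def __init__(self) -> None:
              self._store: dict[int, StoredFeature] = {}

          def update(self, person_id: int, feature: np.ndarray, quality: float) -> None:
              # Keep only the highest-quality feature value observed so far for this person.
              current = self._store.get(person_id)
              if current is None or quality > current.quality:
                  self._store[person_id] = StoredFeature(feature, quality)

          def get(self, person_id: int):
              entry = self._store.get(person_id)
              return entry.feature if entry is not None else None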
  • the feature value extraction unit 11 may execute the matching process by using the matching unit 13 to be described later.
  • the feature value extraction unit 11 issues a matching instruction to the matching unit 13 to execute the matching process of matching between the feature value of the person within the captured image extracted as described above and a previously registered person.
  • the feature value extraction unit 11 may acquire the result of the matching by the matching unit 13 and store the matching result as the matching information relating to person matching into the feature value storage unit 15 .
  • the feature value is stored into the feature value storage unit 15 in association with the information for identifying the person within the captured image.
  • the person within the captured image may be tracked in a subsequent captured image. In such a case, the tracked person and the matching result in the feature value storage unit 15 are associated with each other.
  • the feature value extraction and the matching process may be executed on a plurality of captured images. In such a case, only one matching result may be updated and stored in association with the person.
  • the abovementioned distance measurement unit 12 (a distance estimating means) measures the distance from the gate G to the person within the captured image for which the feature value has been extracted as described above. At this time, in this example embodiment, the distance measurement unit 12 uses an image portion of the person within the captured image to measure the distance from the gate G to the person. To be specific, the distance measurement unit 12 measures the distance in the following manner.
  • the distance measurement unit 12 first sets a reference value which is necessary for measurement of the distance to the person within the captured image. To be specific, the distance measurement unit 12 extracts an image portion of the face region of the person to be processed from the captured image. The extraction of the face region of the person is performed, for example, by judging from the position of a moving person with reference to the entire image, the color of the person, or the like. Then, the distance measurement unit 12 executes an attribute analysis process of identifying the attribute of the person from the image portion of the face region.
  • the attribute of the person is, for example, gender, age (generation, adult, child), and race.
  • the distance measurement unit 12 identifies the attribute of the person by extracting attribute identification information that is information necessary for identifying the attribute from the image portion of the face region, and comparing the extracted attribute identification information with previously registered attribute identification reference information.
  • the attribute identification information is, for example, information representing a physical characteristic that generally appears in the face region of a person for each attribute such as gender or age. Because the attribute analysis process of identifying the attribute such as gender or age (generation) of the person can be realized by the existing technique, the detailed description of the process will be omitted.
  • the attribute that can be identified is not limited to the abovementioned attributes, and may be any attribute.
  • the distance measurement unit 12 sets a reference value corresponding to the identified attribute of the person.
  • the reference value is previously registered in the storage device included by the face authentication system 10 .
  • a reference value of an eye-to-eye distance representing the distance between both the eyes of a person is registered for each attribute.
  • an eye-to-eye distance that is the reference value of an attribute “female” is set to a smaller value than the reference value of the attribute “male”.
  • the reference value is a value corresponding to a general physical constitution of the attribute of a person. Then, the distance measurement unit 12 sets the reference value registered corresponding to the attribute identified with respect to the person extracted from the captured image, as the reference value of the person.
  • the distance measurement unit 12 measures the distance to the person by using the reference value set for the person as described above.
  • the distance measurement unit 12 firstly detects, as object information representing the feature of the person within the captured image, an eye-to-eye distance representing the distance between both the eyes of the person from an image portion of the face region of the person.
  • the distance measurement unit 12 detects the eye-to-eye distances of the persons P 10 , P 11 and P 12 within the captured image, as denoted by reference numerals d 10 , d 11 and d 12 in the lower view of FIG. 3 .
  • the distance measurement unit 12 compares the eye-to-eye distances d 10 , d 11 and d 12 having been detected with the reference values set for the persons P 10 , P 11 and P 12 , and thereby measures the distances from the gate G to the respective persons. For example, the distance measurement unit 12 measures the distance between the gate G and the person based on the difference between the reference value set for the person and the eye-to-eye distance detected from the person or on the ratio of the difference. The distance measurement unit 12 may measure the relative distance of the person within the captured image, that is, the person's order with respect to the gate G, as the distance to the gate G.
  • the persons P 10 , P 11 and P 12 are in a line in the order of the person P 10 , the person P 11 and the person P 12 to the gate G, and the captured image is captured as shown in the lower view of FIG. 3 .
  • the eye-to-eye distances d 10 , d 11 and d 12 of the persons P 10 , P 11 and P 12 are generally d 10 >d 11 >d 12 .
  • the attribute of the person P 10 is identified to be “child”, a reference value of a smaller value is set, and the distance is measured using the reference value and the detected eye-to-eye distance d 10 .
  • the attributes of the persons P 11 and P 12 are identified to be “adult”, a reference value larger than the reference value for a child is set, and the distances are measured using the reference value and the detected eye-to-eye distances d 11 and d 12 . Consequently, as shown in FIG. 4 , the distances D 10 , D 11 and D 12 to the respective persons can be measured so that the order of the persons P 10 , P 11 and P 12 with reference to the gate G matches their actual order in front of the gate G.
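  • A minimal numerical sketch of this comparison, assuming the distance is taken to scale inversely with the detected eye-to-eye distance in pixels (a pinhole-camera approximation); the reference values, the focal-length constant and the pixel measurements below are illustrative assumptions, not values from the disclosure.

      # Assumed reference eye-to-eye distances (millimetres) per identified attribute.
      REFERENCE_EYE_DISTANCE_MM = {"adult": 62.0, "child": 48.0}

      # Assumed camera constant (focal length in pixels), obtained by calibration in practice.
      FOCAL_LENGTH_PX = 1400.0


      def estimate_distance_mm(eye_distance_px: float, attribute: str) -> float:
          """Estimate the distance to a person from the detected eye-to-eye distance.

          Uses the pinhole relation distance ~ focal_length * real_size / pixel_size,
          where real_size is the reference value set for the identified attribute.
          """
          reference_mm = REFERENCE_EYE_DISTANCE_MM[attribute]
          return FOCAL_LENGTH_PX * reference_mm / eye_distance_px


      # Illustrative inputs: (attribute, detected eye-to-eye distance in pixels).
      persons = {"P10": ("child", 80.0), "P11": ("adult", 95.0), "P12": ("adult", 60.0)}
      distances = {pid: estimate_distance_mm(px, attr) for pid, (attr, px) in persons.items()}
      order = sorted(distances, key=distances.get)  # ["P10", "P11", "P12"]: closest to the gate first

  • With these assumed numbers, P 10 is ranked closest to the gate even though P 11 shows the larger eye-to-eye distance in pixels, because the smaller reference value for the attribute “child” is applied to P 10 ; this mirrors the situation of FIGS. 3 and 4.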
  • the distance measurement unit 12 associates the measured distance with the person within the captured image, so that the distance is also associated with the feature value of the same person stored in the feature value storage unit 15 .
  • the person within the captured image may be tracked in a subsequent captured image and, in such a case, the measurement of the distance is performed on the tracked person in the same manner as described above, and the distance associated with the person is updated.
  • the correspondence of persons within captured images that are temporally different from each other can be realized by tracking feature points, or the like.
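  • The disclosure leaves the tracking method open (“tracking feature points, or the like”); purely as an assumed stand-in, face regions can be associated across consecutive frames by greatest bounding-box overlap (IoU), as sketched below.

      def iou(a, b):
          """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
          ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
          ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
          inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
          union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
          return inter / union if union else 0.0


      def associate(tracked: dict, detections: list, threshold: float = 0.3) -> dict:
          """Match newly detected face boxes to previously tracked person IDs by highest IoU."""
          matches = {}
          for box in detections:
              best_id, best_score = None, threshold
              for person_id, previous_box in tracked.items():
                  score = iou(previous_box, box)
                  if score > best_score:
                      best_id, best_score = person_id, score
              if best_id is not None:
                  matches[best_id] = box  # the tracked person keeps its stored feature value
          return matches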
  • the method for detecting the eye-to-eye distance by the distance measurement unit 12 is not limited to the method described above, and any method may be used. Moreover, the distance measurement unit 12 may measure the distance by detecting the size of another site of the person or another characteristic of the person as the object information, instead of the eye-to-eye distance. In this case, the reference value described above also becomes a value corresponding to the object information.
  • the distance measurement unit 12 is not necessarily limited to measuring the distance from the gate G to a person.
  • the distance measurement unit 12 may estimate the relative positional relation between persons with reference to the gate.
  • the distance measurement unit 12 may estimate the proximity of each of the persons to the gate, that is, the perspective relation between the persons with reference to the gate G, based on the object information such as the eye-to-eye distance described above and the reference value.
  • the matching unit 13 executes a process of matching between a person within a captured image and a previously registered person. At this time, the matching unit 13 executes the matching process based on the distance to the person measured as described above. For example, in a case where a person is located within a predetermined range set immediately before the gate G at a preset distance from the gate G, and the person is the closest to the gate G within the captured image, the matching unit 13 executes the matching process on that person.
  • the matching unit 13 may execute the matching process on a person simply when the person is located within the predetermined range set immediately before the gate G at a preset distance from the gate G, or may execute the matching process on a person based on another criterion in accordance with the distance to the person. Moreover, in a case where the distance measurement unit 12 estimates only the relative positional relation between persons with reference to the gate G as described above, the matching unit 13 may execute the matching process on the person who is the closest to the gate G based on the positional relation.
  • the matching process by the matching unit 13 is executed using a feature value stored in the feature value storage unit 15 of a person to be processed based on the distance as described above. That is, the matching unit 13 does not newly generate a feature value from the face region of a person within a captured image.
  • the matching unit 13 executes the matching by calculating a matching score such as the degree of similarity between the feature value stored in the feature value storage unit 15 and the feature value of a person previously registered in the matching data storage unit 16 , and determining whether or not the matching score is higher than a threshold value. In a case where the matching score is higher than the threshold value, the matching unit 13 determines that the matching succeeds, and determines that the person who is about to pass through the gate G is the previously registered person.
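  • A hedged sketch of this matching determination: the stored feature value of the person who is closest to the gate and within a preset range is compared against every previously registered feature value, and the best score is tested against a threshold. Cosine similarity, the range of 1000 mm and the threshold of 0.6 are assumptions chosen for illustration; the disclosure only requires “a matching score such as the degree of similarity”.

      import numpy as np

      MATCHING_RANGE_MM = 1000.0  # assumed "immediately before the gate" range
      SCORE_THRESHOLD = 0.6       # assumed matching-score threshold


      def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
          return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


      def matching_determination(person_id, distances, stored_features, registered_features) -> bool:
          """Return True if matching succeeds for the person closest to the gate."""
          # Only the person closest to the gate and within the preset range is processed.
          if person_id != min(distances, key=distances.get):
              return False
          if distances[person_id] > MATCHING_RANGE_MM:
              return False
          stored = stored_features[person_id]  # feature value kept in the feature value storage unit 15
          best_score = max(cosine_similarity(stored, registered) for registered in registered_features)
          return best_score > SCORE_THRESHOLD  # success permits the gate to be opened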
  • the matching method may be any method.
  • in a case where the matching process has already been executed, according to an instruction by the feature value extraction unit 11 in the abovementioned manner, on a person to be processed based on a distance, and the result of the matching has been stored in the feature value storage unit 15 , the matching unit 13 only confirms the result of the matching. That is, the matching unit 13 checks success or failure of the matching result, as the stored matching information about the person to be processed based on the distance, to determine whether or not the matching succeeds.
  • the gate control unit 14 (a gate controlling means) first determines whether or not a person is permitted to pass through the gate G based on the result of the matching by the matching unit 13 . To be specific, the gate control unit 14 determines a person successfully matched by the matching unit 13 to be permitted to pass through. Moreover, the gate control unit 14 has a function to display the matching result, that is, success or failure of the matching onto the display device D. Moreover, the gate control unit 14 has a gate control function to open and close the gate G, and controls the gate G to open for a person determined to be permitted to pass through.
  • the display device D is placed with its display screen facing the pre-passing side region of the gate G so that a person who is about to pass through the gate G can visually recognize it. Meanwhile, the display device D does not necessarily have to be installed.
  • the capture device C corresponding to the gate G keeps capturing images of the pre-passing side region of the gate G. Then, the face authentication system 10 executes the following process at all times on the captured images having been captured.
  • the feature value extraction unit 11 extracts the persons (objects) P 10 , P 11 and P 12 to be processed from the captured image (step S 1 ).
  • the feature value extraction unit 11 generates a feature value that is necessary for the matching from a face region of the person (step S 3 ).
  • the feature value extraction unit 11 stores the extracted feature value into the feature value storage unit 15 in association with the person within the captured image.
  • when it is not possible to sufficiently generate the feature value of the person from the captured image because, for example, the captured image does not clearly show the face region of the person or does not show the front of the face (No at step S 2 ), the feature value extraction unit 11 generates the feature value from a subsequent captured image.
  • the distance measurement unit 12 measures the distance from the gate G to the person within the captured image for which the feature value has been extracted (step S 4 ).
  • the process of measuring the distance by the distance measurement unit 12 will be described with reference to the flowchart of FIG. 6 .
  • the distance measurement unit 12 executes the attribute analysis process on image portions of the face regions of the persons within the captured image, and identifies the attributes of the persons. For example, in the example shown by FIGS. 1 and 3 , it is assumed that the attribute of the person P 10 is identified as child and the attributes of the persons P 11 and P 12 are identified as adult. Then, the distance measurement unit 12 sets reference values previously registered in the face authentication system 10 corresponding to the identified attributes of the persons, as the reference values of the persons (step S 11 ). For example, in the example shown by FIGS. 3 and 4 , the distance measurement unit 12 sets a reference value corresponding to child for the person P 10 , and sets a reference value corresponding to adult for the persons P 11 and P 12 .
  • the distance measurement unit 12 detects, as object information necessary for measuring the distance from a person in a captured image to the gate G, the size of a predetermined site that is a feature of the person; here, it detects the eye-to-eye distance of the person (step S 12 ).
  • the distance measurement unit 12 detects the eye-to-eye distances of the persons P 10 , P 11 and P 12 as shown by reference numerals d 10 , d 11 and d 12 in the lower view of FIG. 3 .
  • the distance measurement unit 12 measures the distances from the gate G to the persons P 10 , P 11 and P 12 by comparing the reference values set for the persons P 10 , P 11 and P 12 as described above with the eye-to-eye distances d 10 , d 11 and d 12 of the persons P 10 , P 11 and P 12 for each person (step S 13 ). For example, in the example shown by the lower view of FIG. 3 , the eye-to-eye distance d 11 of the person P 11 , who is the second from the gate among the three persons, appears the largest.
  • the distances D 10 , D 11 and D 12 to the persons are measured so that, as shown in FIG. 4 , the order of the persons P 10 , P 11 and P 12 with reference to the gate G matches their actual order in front of the gate G.
  • the matching unit 13 executes the matching process on the respective persons based on the distances to the persons P 10 , P 11 and P 12 measured as described above.
  • when a person is located within the predetermined range immediately before the gate G and is the closest to the gate G (Yes at step S 5 ), the matching unit 13 executes the matching process on that person (step S 6 ). Therefore, in the example shown by FIG. 4 , the matching unit 13 executes the matching process on the person P 10 who is the closest to the gate G.
  • the matching unit 13 calculates a matching score such as the degree of similarity between the feature value stored in the feature value storage unit 15 in association with the person within the captured image to be processed and a feature value of a person previously registered in the matching data storage unit 16 , and determines whether or not the matching score is higher than a threshold value (step S 6 ). When the matching score is higher than the threshold value, the matching unit 13 determines that the matching succeeds (Yes at step S 7 ), and determines that the person who is about to pass through the gate G is the previously registered person.
  • the gate control unit 14 permits the person P 10 to pass through the gate G, and controls the gate G to open (step S 8 ). At this time, the gate control unit 14 displays “permitted to pass” on the display device D.
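  • Putting steps S 1 to S 8 together, one possible organisation of the per-frame processing is sketched below. Every component here is injected as a callable placeholder standing in for the units of FIG. 2 (feature value extraction unit 11, distance measurement unit 12, matching unit 13, gate control unit 14); none of these names or signatures comes from the disclosure.

      from dataclasses import dataclass, field
      from typing import Callable


      @dataclass
      class GateAuthenticator:
          """Orchestrates the flow of FIGS. 5 and 6 with injected components (illustrative only)."""

          extract_persons: Callable    # image -> iterable of person IDs              (step S1)
          extract_feature: Callable    # (image, person_id) -> feature value or None  (steps S2-S3)
          estimate_distance: Callable  # (image, person_id) -> distance to the gate   (steps S4, S11-S13)
          match: Callable              # stored feature value -> bool                 (steps S6-S7)
          open_gate: Callable          # () -> None                                   (step S8)
          matching_range: float = 1000.0
          stored: dict = field(default_factory=dict)  # stands in for feature value storage unit 15

          def process_frame(self, image) -> None:
              distances = {}
              for person_id in self.extract_persons(image):
                  feature = self.extract_feature(image, person_id)
                  if feature is not None:
                      self.stored[person_id] = feature          # store matching information
                  distances[person_id] = self.estimate_distance(image, person_id)

              if not distances:
                  return
              closest = min(distances, key=distances.get)
              if distances[closest] <= self.matching_range and closest in self.stored:  # step S5
                  if self.match(self.stored[closest]):
                      self.open_gate()                          # permit passage through the gate G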
  • a feature value is extracted from the face region of a person beforehand in the pre-passing side region of the gate G, and the matching process is executed immediately before the gate, so that it is possible to properly open and close the gate for the person. As a result, the person can pass through the gate G smoothly.
  • the feature value of a person is extracted and stored from a captured image captured not only immediately before the gate G but also at any timing while the person heads to the gate G, so that it is possible to extract a highly reliable feature value and execute the matching with accuracy. As a result, the person can pass through the gate G smoothly.
  • the object is not limited to a person and may be any object.
  • the object may be an object such as baggage.
  • information used for measuring the distance from the gate G such as the reference value and the object information representing the feature of the object that are described above, may be information representing any feature that can be detected from an object.
  • any feature value that can be detected from an object may be used.
  • FIG. 7 is a block diagram showing the configuration of a face authentication system.
  • the face authentication system 10 in this example embodiment is configured almost the same as the face authentication system 10 in the first example embodiment, but is configured to measure the distance from the gate G to a person without using a captured image for extracting a feature value. Below, the configuration different from that of the first example embodiment will be described in detail.
  • the face authentication system 10 in this example embodiment includes a rangefinder E in addition to the configuration of the face authentication system 10 described in the first example embodiment.
  • the rangefinder E is an infrared depth sensor, and is a device which can measure the distance to a person who is in the pre-passing side region of the gate G by using a measured depth, which is information different from the captured image used for extracting the feature value of a person as described above.
  • the rangefinder E is installed near the corresponding gate G, on the right side in view of a person heading to the gate G or above the gate in the same manner as the capture device C.
  • a single rangefinder may be installed, or a plurality of rangefinders may be installed.
  • the rangefinder E may be a device that measures the distance to a person by another method.
  • the distance measurement unit 12 in this example embodiment acquires the measured depth from the rangefinder E, and associates the depth with a person within a captured image.
  • the distance measurement unit 12 specifies a depth at each position in the pre-passing side region of the gate G, and makes the depth correspond to a position in the captured image. With this, it is possible to specify the distance to each person located in the captured image, and associate the distance with the person.
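  • One way to make the measured depth correspond to positions in the captured image (an assumption; the disclosure does not specify how) is to sample the depth map of the rangefinder E over each detected face region, for example taking the median depth inside the face bounding box, provided the depth sensor and the camera are registered to the same view.

      import numpy as np


      def depths_for_persons(depth_map: np.ndarray, face_boxes: dict) -> dict:
          """Associate a measured depth with each person detected in the captured image.

          depth_map:  H x W array of depths from the rangefinder E, assumed to be aligned
                      with the captured image.
          face_boxes: person_id -> (x1, y1, x2, y2) face region in image coordinates.
          Returns     person_id -> estimated distance (median depth over the face region).
          """
          distances = {}
          for person_id, (x1, y1, x2, y2) in face_boxes.items():
              region = depth_map[y1:y2, x1:x2]
              valid = region[region > 0]  # discard missing depth readings
              if valid.size:
                  distances[person_id] = float(np.median(valid))
          return distances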
  • the person within the captured image may be tracked in a subsequent captured image and, in such a case, the measurement of the distance is performed on the tracked person in the same manner as described above, and the distance associated with the person is updated.
  • the correspondence of persons within captured images that are temporally different from each other can be realized by tracking feature points, or the like.
  • the matching unit 13 in this example embodiment executes the matching process based on the measured distance to the person as described above. Such a matching process is as in the first example embodiment.
  • the matching unit 13 retrieves the stored feature value on the person to execute the matching process.
  • the rangefinder E and the distance measurement unit 12 are not necessarily limited to measuring the distance from the gate G to a person.
  • the rangefinder E and the distance measurement unit 12 may estimate the relative positional relation between persons with reference to the gate G. That is, the rangefinder E and the distance measurement unit 12 may estimate the perspective relation between persons with reference to the gate G.
  • the matching unit 13 may execute the matching process on the person who is the closest to the gate G based on the estimated relative positional relation between the persons with reference to the gate G.
  • FIG. 8 is a block diagram showing the configuration of an information processing system in the third example embodiment.
  • FIG. 9 is a block diagram showing the configuration of an information processing device in the third example embodiment. In this example embodiment, the overview of the configuration of the face authentication system described in the first and second example embodiments will be illustrated.
  • an information processing system 100 in this example embodiment includes: a capturing means 110 that acquires a captured image obtained by capturing a pre-passing side region with reference to a gate; an image processing means 120 that extracts a feature value of an object within the captured image and stores matching information used for matching of the object based on the feature value; a distance estimating means 130 that estimates a distance from the gate to the object within the captured image; and a matching means 140 that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • the capturing means 110 may be eliminated from the information processing system 100 shown in FIG. 8 .
  • an information processing device 200 in this example embodiment includes: an image processing means 220 that extracts a feature value of an object within a captured image obtained by capturing a pre-passing side region with reference to a gate and stores matching information used for matching of the object based on the feature value; a distance estimating means 230 that estimates a distance from the gate to the object within the captured image; and a matching means 240 that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • the image processing means 120 , 220 , the distance estimating means 130 , 230 and the matching means 140 , 240 described above may be structured by execution of a program by an arithmetic device, or may be structured by electronic circuits.
  • the information processing system 100 and the information processing device 200 with the above configurations each operate to execute: a process of extracting a feature value of an object within a captured image obtained by capturing a pre-passing side region with reference to a gate and storing matching information used for matching of the object based on the feature value; a process of estimating a distance from the gate to the object within the captured image; and a process of executing matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
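  • For illustration, the three means of the information processing device 200 in FIG. 9 can be expressed as an abstract interface; the method names below are chosen here for the sketch and are not defined by the disclosure.

      from abc import ABC, abstractmethod


      class InformationProcessingDevice(ABC):
          """Sketch of the information processing device 200 shown in FIG. 9."""

          @abstractmethod
          def process_image(self, captured_image) -> None:
              """Image processing means 220: extract feature values and store matching information."""

          @abstractmethod
          def estimate_distances(self, captured_image) -> dict:
              """Distance estimating means 230: estimate the distance from the gate to each object."""

          @abstractmethod
          def matching_determination(self, distances: dict) -> bool:
              """Matching means 240: decide based on the estimated distances and the stored matching information."""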
  • the information processing system 100 and the information processing device 200 described above each extract a feature value from a face region of an object beforehand in a pre-passing side region with reference to a gate, execute a matching process beforehand using the feature value, and execute matching determination immediately before the gate, so that it is possible to properly open and close the gate for the person.
  • the information processing system 100 and the information processing device 200 each extract a feature value of a person from a captured image captured at any timing while the person heads to the gate and store the feature value, so that it is possible to extract a highly reliable feature value and execute matching with accuracy. As a result, the person can pass through the gate smoothly.
  • An information processing device comprising:
  • the information processing device according to Supplementary Note 1, wherein the matching means executes a matching process of matching between the feature value that is the stored matching information of the object that the distance has been estimated and a previously registered feature value, based on the estimated distance.
  • An information processing system comprising:
  • the matching means executes a matching process of matching between the stored feature value of the object located at a preset distance to the gate and the previously registered feature value.
  • the information processing system according to Supplementary Note 12 or 12.1, wherein the matching means executes a matching process of matching between the stored feature value of the object located closest to the gate and the previously registered feature value.
  • a program comprising instructions for causing an information processing device to realize:
  • An information processing method comprising:
  • the program described above can be stored using various types of non-transitory computer-readable mediums and supplied to a computer.
  • the non-transitory computer-readable mediums include various types of tangible storage mediums.
  • Examples of the non-transitory computer-readable mediums include a magnetic recording medium (for example, a flexible disk, a magnetic tape, a hard disk drive), a magneto-optical recording medium (for example, a magneto-optical disk), a CD-ROM (Read Only Memory), a CD-R, a CD-R/W, and a semiconductor memory (for example, a mask ROM, a PROM (Programmable ROM), an EPROM (Erasable PROM), a flash ROM, a RAM (Random Access Memory)).
  • the program may be supplied to a computer by various types of transitory computer-readable mediums.
  • Examples of the transitory computer-readable mediums include electric signals, optical signals, and electromagnetic waves.
  • the transitory computer-readable medium can supply a program to a computer via a wired communication path such as an electric line and an optical fiber, or a wireless communication path.

Abstract

An information processing device of the present invention includes: an image processing means that extracts a feature value of an object within a captured image obtained by capturing a pre-passing region of a gate, and stores matching information relating to matching of the object based on the feature value; a distance estimating means that estimates a distance from the gate to the object within the captured image; and a matching means that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a continuation application of U.S. patent application Ser. No. 17/717,266 filed on Apr. 11, 2022, which is a continuation application of U.S. patent application Ser. No. 16/965,097 filed on Jul. 27, 2020, which issued as U.S. Pat. No. 11,335,125, which is a National Stage Entry of international application PCT/JP2019/002335 filed on Jan. 24, 2019, which claims the benefit of priority from Japanese Patent Application 2018-014275 filed on Jan. 31, 2018, the disclosures of all of which are incorporated in their entirety by reference herein.
  • TECHNICAL FIELD
  • The present invention relates to an information processing device, an information processing system, a program, and an information processing method.
  • BACKGROUND ART
  • As a means for restricting and managing persons who enter and exit a specific place such as an office or an event venue, a matching system is used that checks whether or not a person who is about to pass through is a previously registered person. In particular, in recent years, a walk-through face authentication system that performs face authentication based on the face image of a person captured by a camera installed at a gate has come into use, owing to the development of face authentication technology.
    • Patent Document 1: Japanese Unexamined Patent Application Publication No. JP-A 2016-083225
  • A walk-through face authentication system needs to perform matching of persons who are in a line at a gate in order, and to open and close the gate so that the persons can pass through it smoothly. However, there are various persons who are going to pass through the gate, and it is difficult to properly determine their sequence. As a result, there arises a problem that it is difficult for persons to pass through the gate smoothly.
  • SUMMARY
  • Accordingly, an object of the present invention is to solve the abovementioned problem that it is difficult for persons to pass through the gate smoothly.
  • An information processing device according to an aspect of the present invention includes: an image processing means that extracts a feature value of an object within a captured image obtained by capturing a pre-passing region of a gate, and stores matching information relating to matching of the object based on the feature value; a distance estimating means that estimates a distance from the gate to the object within the captured image; and a matching means that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • Further, an information processing system according to another aspect of the present invention includes: a capturing means that acquires a captured image obtained by capturing a pre-passing region of a gate; an image processing means that extracts a feature value of an object within the captured image, and stores matching information relating to matching of the object based on the feature value; a distance estimating means that estimates a distance from the gate to the object within the captured image; and a matching means that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • Further, a program according to another aspect of the present invention includes instructions for causing an information processing device to realize: an image processing means that extracts a feature value of an object within a captured image obtained by capturing a pre-passing region of a gate, and stores matching information relating to matching of the object based on the feature value; a distance estimating means that estimates a distance from the gate to the object within the captured image; and a matching means that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • Further, an information processing method according to another aspect of the present invention includes: extracting a feature value of an object within a captured image obtained by capturing a pre-passing region of a gate, and storing matching information relating to matching of the object based on the feature value; estimating a distance from the gate to the object within the captured image; and executing matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • With the configurations as described above, the present invention can provide an information processing device which enables a person to pass through a gate smoothly.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a view showing a situation in which a face authentication system in a first example embodiment of the present invention is used;
  • FIG. 2 is a block diagram showing the configuration of the face authentication system in the first example embodiment of the present invention;
  • FIG. 3 is a view showing a situation in which an image is captured by the face authentication system disclosed in FIG. 1 ;
  • FIG. 4 is a view showing a situation in which the face authentication system disclosed in FIG. 1 is used;
  • FIG. 5 is a flowchart showing a processing operation by the face authentication system disclosed in FIG. 1 ;
  • FIG. 6 is a flowchart showing a processing operation by the face authentication system disclosed in FIG. 1 ;
  • FIG. 7 is a block diagram showing the configuration of an information processing system in a second example embodiment of the present invention;
  • FIG. 8 is a block diagram showing the configuration of an information processing system in a third example embodiment of the present invention; and
  • FIG. 9 is a block diagram showing the configuration of an information processing device in the third example embodiment of the present invention.
  • EXAMPLE EMBODIMENTS First Example Embodiment
  • A first example embodiment of the present invention will be described with reference to FIGS. 1 to 6 . FIG. 1 is a view showing a situation in which a face authentication system is used. FIG. 2 is a view showing the configuration of the face authentication system. FIGS. 3 to 6 are views for describing a processing operation by the face authentication system.
  • [General Configuration]
  • A face authentication system 10 (an information processing system) according to the present invention is a system used for restricting and managing entry/exit of a person (an object) in a specific place such as an office or an event venue. For example, a capture device C included by the face authentication system 10 is installed, for each gate that is opened/closed when a person enters/exits, near a place in which the gate is set up.
  • In an example shown in FIG. 1 , a plurality of gates G are arranged in parallel and adjacent to each other, and configured so that persons pass in a direction indicated by an arrow from the right side of FIG. 1 to the gates G. Therefore, a region on the right side in FIG. 1 with reference to each of the gates G is a region where persons remain before passing through the gate (a pre-passing side region). In the pre-passing side regions of the gates G, lanes in which persons who are going to pass through the gates G line up and pass are located in parallel so as to correspond to the gates G, respectively. The lanes may or may not be partitioned by some member. In addition, although a case in which a plurality of gates G are arranged adjacent to each other is illustrated in this example embodiment, the number of the gates G may be one.
  • In the situation shown in FIG. 1 , the capture device C included by the face authentication system 10 in this example embodiment is installed near the corresponding gate G and on the right side in view of a person heading to the gate G. However, a position to install the capture device is not limited to the position shown in FIG. 1 , and may be any position such as on the left side of the gate or above the gate. Besides, the face authentication system 10 also includes display devices D in the vicinity of the respective capture devices C.
  • The face authentication system 10 captures an image of a person heading to the gate G by the capture device C included in the system. Then, the face authentication system 10 executes a process of checking, based on the face image of the person, whether or not the person shown in the captured image is a previously registered person and, when the matching succeeds, opening the gate G so that the person can pass through. Below, the configuration of the face authentication system 10 will be described in detail.
  • [Configuration of Face Authentication System]
  • The face authentication system 10 in this example embodiment is an information processing device including an arithmetic device and a storage device, configured integrally with the capture device C (a camera) and the display device D (a display). In other words, the capture device C is equipped with the information processing device (including the arithmetic device and the storage device) that executes the face authentication process, and with the display device D. However, the face authentication system 10 is not necessarily limited to being configured integrally with the capture device C and the display device D. For example, the capture device C, the display device D, and the information processing device processing a captured image may be configured as separate devices and installed in separate places.
  • To be specific, as shown in FIG. 2 , the face authentication system 10 includes the capture device C and the display device D, and also includes a feature value extraction unit 11, a distance measurement unit 12, a matching unit 13 and a gate control unit 14 that are structured by execution of a program by the arithmetic device. Moreover, the face authentication system 10 includes a feature value storage unit 15 and a matching data storage unit 16 that are structured in the storage device.
  • The abovementioned capture device C (a capturing means) includes a camera and a camera control unit that acquire captured images of the pre-passing side region with reference to the gate G, that is, a region in front of the gate in the corresponding lane, at predetermined frame rates. For the capture device C, for example, as shown in FIG. 1 , a range sandwiched between lines denoted by reference symbol Ca is a capture region. For example, in a case where three persons P10, P11, and P12 are in a lane as shown in FIG. 1 , a captured image captured by the capture device C is as shown in the upper view of FIG. 3 . A captured image is set to be substantially in focus within a range of a preset distance in the perspective direction with reference to the capture device C.
  • When a captured image is captured by the capture device C, the feature value extraction unit 11 (an image processing means) first executes a process of extracting a person within the captured image in order to extract the feature value of the person. To be specific, the feature value extraction unit 11 extracts a person within the captured image, targets all the extracted persons, and generates a feature value necessary for matching from the face region of each of the persons. The feature value is, for example, information used by the matching unit 13 later to calculate a matching score such as the degree of similarity to the feature value of a person previously registered in the matching data storage unit 16 and execute a matching process. The feature value may be a feature value used in the existing face matching technique, or may be a feature value calculated by another method.
  • Then, the feature value extraction unit 11 stores the extracted feature value as information relating to person matching, that is, information used for the matching process (matching information) into the feature value storage unit 15. At this time, the feature value is stored into the feature value storage unit 15 in association with information for identifying the person within the captured image. The person within the captured image may be tracked in a subsequent captured image. In such a case, the tracked person and the feature value in the feature value storage unit 15 are associated with each other.
  • Further, the feature value extraction unit 11 may store a plurality of feature values with respect to one person by extracting feature values from different captured images, respectively. Moreover, the feature value extraction unit 11 may store only one feature value when extracting feature values from different captured images. That is, in the case of extracting a feature value every time a captured image is acquired with respect to the same person, the feature value extraction unit 11 may update and store only one feature value. For example, the feature value extraction unit 11 may judge the qualities of the extracted feature values and store only one feature value of the highest quality in association with the person.
  • The feature value extraction unit 11 may execute the matching process by using the matching unit 13 to be described later. In this case, the feature value extraction unit 11 issues a matching instruction to the matching unit 13 to execute the matching process of matching between the feature value of the person within the captured image extracted as described above and a previously registered person. Then, the feature value extraction unit 11 may acquire the result of the matching by the matching unit 13 and store the matching result as the matching information relating to person matching into the feature value storage unit 15. At this time, the feature value is stored into the feature value storage unit 15 in association with the information for identifying the person within the captured image. The person within the captured image may be tracked in a subsequent captured image. In such a case, the tracked person and the matching result in the feature value storage unit 15 are associated with each other. At this time, the feature value extraction and the matching process may be executed on a plurality of captured images. In such a case, only one matching result may be updated and stored in association with the person.
  • The abovementioned distance measurement unit 12 (a distance estimating means) measures the distance from the gate G to the person within the captured image from which the feature value has been extracted as described above. At this time, in this example embodiment, the distance measurement unit 12 uses an image portion of the person within the captured image to measure the distance from the gate G to the person. To be specific, the distance measurement unit 12 measures the distance in the following manner.
  • The distance measurement unit 12 first sets a reference value which is necessary for measurement of the distance to the person within the captured image. To be specific, the distance measurement unit 12 extracts an image portion of the face region of the person to be processed from the captured image. The extraction of the face region of the person is performed, for example, by judging from the position of a moving person with reference to the entire image, the color of the person, or the like. Then, the distance measurement unit 12 executes an attribute analysis process of identifying the attribute of the person from the image portion of the face region. Herein, the attribute of the person is, for example, gender, age (generation, adult, child), and race.
  • In the attribute analysis process, for example, the distance measurement unit 12 identifies the attribute of the person by extracting attribute identification information that is information necessary for identifying the attribute from the image portion of the face region, and comparing the extracted attribute identification information with previously registered attribute identification reference information. Herein, the attribute identification information is, for example, information representing a physical characteristic that generally appears in the face region of a person for each attribute such as gender or age. Because the attribute analysis process of identifying the attribute such as gender or age (generation) of the person can be realized by the existing technique, the detailed description of the process will be omitted. The attribute that can be identified is not limited to the abovementioned attributes, and may be any attribute.
  • Then, the distance measurement unit 12 sets a reference value corresponding to the identified attribute of the person. Herein, the reference value is previously registered in the storage device included in the face authentication system 10. For example, in this example embodiment, a reference value of an eye-to-eye distance representing the distance between both eyes of a person is registered for each attribute. As an example, in a case where a certain numerical value is registered as the eye-to-eye distance that is the reference value of the attribute “male”, the eye-to-eye distance that is the reference value of the attribute “female” is set to a smaller value than the reference value of the attribute “male”. Moreover, for example, in a case where a certain numerical value is registered as the eye-to-eye distance that is the reference value of the attribute “adult”, covering ages from 15 into the 70s, the eye-to-eye distance that is the reference value of the attribute “child”, covering ages under 15, is set to a smaller value than the reference value of the attribute “adult”. Thus, the reference value is a value corresponding to the general physical constitution associated with the attribute of a person. Then, the distance measurement unit 12 sets the reference value registered for the attribute identified with respect to the person extracted from the captured image as the reference value of that person.
  • Furthermore, the distance measurement unit 12 measures the distance to the person by using the reference value set for the person as described above. To be specific, the distance measurement unit 12 first detects, as object information representing a feature of the person within the captured image, the eye-to-eye distance representing the distance between both eyes of the person from an image portion of the face region of the person. For example, the distance measurement unit 12 detects the eye-to-eye distances of the persons P10, P11 and P12 within the captured image as denoted by reference numerals d10, d11 and d12 in the lower view of FIG. 3. Then, the distance measurement unit 12 compares the detected eye-to-eye distances d10, d11 and d12 with the reference values set for the persons P10, P11 and P12, and thereby measures the distances from the gate G to the respective persons. For example, the distance measurement unit 12 measures the distance between the gate G and a person based on the difference between the reference value set for the person and the eye-to-eye distance detected from the person, or on the ratio between the two values; a minimal sketch of such a ratio-based estimation is given below. The distance measurement unit 12 may instead measure the relative distance of the person within the captured image, that is, the person's order of proximity to the gate G, as the distance to the gate G.
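  • As a hedged illustration only (the specification does not fix a concrete formula), the ratio-based reading above can be sketched with a simple pinhole-camera assumption: the detected eye-to-eye pixel distance shrinks roughly in inverse proportion to the distance from the camera. The attribute table, the calibration distance and all numeric values below are assumptions introduced for this sketch, not values from the document.

```python
# Hypothetical reference eye-to-eye distances, in pixels, measured per attribute
# at a known calibration distance from the gate-side camera (assumed values).
REFERENCE_EYE_DISTANCE_PX = {
    "adult": 60.0,
    "child": 45.0,
}
CALIBRATION_DISTANCE_M = 2.0  # distance at which the reference values were taken (assumed)


def estimate_distance_m(attribute: str, detected_eye_distance_px: float) -> float:
    """Estimate the gate-to-person distance from the ratio of the attribute's
    reference eye-to-eye distance to the eye-to-eye distance detected in the image."""
    reference_px = REFERENCE_EYE_DISTANCE_PX[attribute]
    return CALIBRATION_DISTANCE_M * reference_px / detected_eye_distance_px


# Example matching the scenario of FIG. 3 / FIG. 4 (detections assumed): the child
# P10 shows a smaller detected eye-to-eye distance than the farther adult P12, yet
# the attribute-specific reference value makes P10 the closer of the two.
print(estimate_distance_m("child", 38.0))  # ~2.37 m
print(estimate_distance_m("adult", 40.0))  # 3.00 m
```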
  • Herein, an example of measurement of the distances from the gate G to the persons P10, P11 and P12 will be described. In the example of FIG. 1, the persons P10, P11 and P12 are in a line, in the order of the person P10, the person P11 and the person P12 with reference to the gate G, and the captured image is captured as shown in the lower view of FIG. 3. At this time, when the physical constitutions and face sizes of the persons P10, P11 and P12 are almost the same, the eye-to-eye distances d10, d11 and d12 of the persons P10, P11 and P12 generally satisfy d10>d11>d12. Meanwhile, when the person P10 is a child and the persons P11 and P12 are adults, a child generally has a smaller face and a shorter eye-to-eye distance, so the actually detected eye-to-eye distances may satisfy d11>d12>d10. According to this example embodiment, in such a situation, the attribute of the person P10 is identified to be “child”, a reference value of a smaller value is set, and the distance is measured using that reference value and the detected eye-to-eye distance d10. Then, the attributes of the persons P11 and P12 are identified to be “adult”, a reference value larger than the reference value for a child is set, and the distances are measured using that reference value and the detected eye-to-eye distances d11 and d12. Consequently, as shown in FIG. 4, the distances D10, D11 and D12 to the respective persons can be measured so that, with reference to the gate G, they are in the order of the person P10, the person P11 and the person P12, which is the same as the actual order.
  • Then, the distance measurement unit 12 associates the measured distance with the person within the captured image, so that the distance is also associated with the feature value of the same person stored in the feature value storage unit 15. The person within the captured image may be tracked in a subsequent captured image and, in such a case, the measurement of the distance is performed on the tracked person in the same manner as described above, and the distance associated with the person is updated. The correspondence of persons within captured images that are temporally different from each other can be realized by tracking feature points, or the like.
  • The method for detecting the eye-to-eye distance by the distance measurement unit 12 is not limited to the method described above, and any method may be used. Moreover, the distance measurement unit 12 may measure the distance by detecting the size of another site of the person or another characteristic of the person as the object information, instead of the eye-to-eye distance. In this case, the reference value described above also becomes a value corresponding to the object information.
  • The distance measurement unit 12 is not necessarily limited to measuring the distance from the gate G to a person. For example, the distance measurement unit 12 may estimate the relative positional relation between persons with reference to the gate G. As an example, the distance measurement unit 12 may estimate the proximity of each of the persons to the gate, that is, the perspective relation between the persons with reference to the gate G, based on the object information such as the eye-to-eye distance described above and the reference value.
  • The matching unit 13 (a matching means) executes a process of matching between a person within a captured image and a previously registered person. At this time, the matching unit 13 executes the matching process based on the distance to the person measured as described above. For example, in a case where a person is located within a predetermined range set immediately before the gate G, at a preset distance from the gate G, and that person is the closest to the gate G within the captured image, the matching unit 13 executes the matching process on the person. The matching unit 13 may execute the matching process on a person simply when the person is located within such a predetermined range immediately before the gate G, or may execute the matching process on a person based on another criterion in accordance with the distance to the person. Moreover, in a case where the distance measurement unit 12 estimates only the relative positional relation between persons with reference to the gate G as described above, the matching unit 13 may execute the matching process on the person who is the closest to the gate G based on that positional relation.
  • The matching process by the matching unit 13 is executed using the feature value, stored in the feature value storage unit 15, of the person selected for processing based on the distance as described above. That is, the matching unit 13 does not newly generate a feature value from the face region of a person within a captured image. The matching unit 13 executes the matching by calculating a matching score, such as the degree of similarity between the feature value stored in the feature value storage unit 15 and the feature value of a person previously registered in the matching data storage unit 16, and determining whether or not the matching score is higher than a threshold value; a minimal sketch of this threshold test is given below. In a case where the matching score is higher than the threshold value, the matching unit 13 determines that the matching succeeds, and determines that the person who is about to pass through the gate G is the previously registered person. The matching method may be any method.
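  • The sketch below illustrates only the threshold test described above; the specification leaves the concrete score open ("the degree of similarity"), so the use of cosine similarity and the threshold value are assumptions.

```python
from typing import Iterable

import numpy as np

MATCH_THRESHOLD = 0.6  # illustrative value, not specified in the document


def matching_succeeds(stored_feature: np.ndarray,
                      registered_features: Iterable[np.ndarray]) -> bool:
    """Return True if the stored feature matches any previously registered feature."""
    for registered in registered_features:
        # cosine similarity used here as one example of a matching score
        score = float(np.dot(stored_feature, registered) /
                      (np.linalg.norm(stored_feature) * np.linalg.norm(registered)))
        if score > MATCH_THRESHOLD:
            return True
    return False
```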
  • In a case where the matching process has already been executed on the person to be processed based on the distance, according to an instruction by the feature value extraction unit 11 in the abovementioned manner, or the result of the matching has already been stored in the feature value storage unit 15, the matching unit 13 only confirms the result of the matching. That is, the matching unit 13 checks the stored matching information about the person to be processed based on the distance for success or failure of the matching, and thereby determines whether or not the matching succeeds.
  • The gate control unit 14 (a gate controlling means) first determines whether or not a person is permitted to pass through the gate G based on the result of the matching by the matching unit 13. To be specific, the gate control unit 14 determines a person successfully matched by the matching unit 13 to be permitted to pass through. Moreover, the gate control unit 14 has a function to display the matching result, that is, success or failure of the matching onto the display device D. Moreover, the gate control unit 14 has a gate control function to open and close the gate G, and controls the gate G to open for a person determined to be permitted to pass through.
  • The display device D is placed with its display screen facing the pre-passing side region of the gate G so that a person who is about to pass through the gate G can visually recognize the screen. The display device D, however, does not necessarily have to be installed.
  • [Operation]
  • Next, an operation of the abovementioned face authentication system 10 will be described with reference to the flowcharts of FIGS. 5 and 6 . Herein, the operation of the face authentication system 10 corresponding to the gate G will be described using, as an example, a case where persons are in a line with reference to the gate G as shown in FIGS. 3 and 4 .
  • The capture device C corresponding to the gate G keeps capturing images of the pre-passing side region of the gate G. Then, the face authentication system 10 executes the following process at all times on the captured images having been captured.
  • First, the feature value extraction unit 11 extracts the persons (objects) P10, P11 and P12 to be processed from the captured image (step S1). When it is possible to extract the feature value of the extracted person under the situation of the captured image (Yes at step S2), the feature value extraction unit 11 generates a feature value that is necessary for the matching from a face region of the person (step S3). Then, the feature value extraction unit 11 stores the extracted feature value into the feature value storage unit 15 in association with the person within the captured image. When it is not possible to sufficiently generate the feature value of the person from the captured image because, for example, the captured image does not clearly show the face region of the person or the front of the face is not shown (No at step S2), the feature value extraction unit 11 generates the feature value from a subsequent captured image.
  • Subsequently, the distance measurement unit 12 measures the distance from the gate G to the person within the captured image from which the feature value has been extracted (step S4). Herein, the process of measuring the distance by the distance measurement unit 12 will be described with reference to the flowchart of FIG. 6 .
  • The distance measurement unit 12 executes the attribute analysis process on image portions of the face regions of the persons within the captured image, and identifies the attributes of the persons. For example, in the example shown by FIGS. 1 and 3 , it is assumed that the attribute of the person P10 is identified as child and the attributes of the persons P11 and P12 are identified as adult. Then, the distance measurement unit 12 sets reference values previously registered in the face authentication system 10 corresponding to the identified attributes of the persons, as the reference values of the persons (step S11). For example, in the example shown by FIGS. 3 and 4 , the distance measurement unit 12 sets a reference value corresponding to child for the person P10, and sets a reference value corresponding to adult for the persons P11 and P12.
  • Subsequently, the distance measurement unit 12 detects, as object information necessary for measuring the distance from the gate G to a person in the captured image, the size of a predetermined site that characterizes the person; herein, it detects the eye-to-eye distance of the person (step S12). For example, the distance measurement unit 12 detects the eye-to-eye distances of the persons P10, P11 and P12 as shown by reference numerals d10, d11 and d12 in the lower view of FIG. 3 .
  • Then, the distance measurement unit 12 measures the distances from the gate G to the persons P10, P11 and P12 by comparing, for each person, the reference value set for the persons P10, P11 and P12 as described above with the eye-to-eye distances d10, d11 and d12 of the persons P10, P11 and P12 (step S13). For example, in the example shown by the lower view of FIG. 3 , the eye-to-eye distance d11 of the person P11, who is the second from the gate among the three persons, appears the largest. However, since the reference values set for the persons P10, P11 and P12 are different, the distances D10, D11 and D12 to the persons are measured so that, as shown in FIG. 4 , they are in the order of the person P10, the person P11 and the person P12, which is the same as the actual order with reference to the gate G.
  • Subsequently, the matching unit 13 executes the matching process on the respective persons based on the distances to the persons P10, P11 and P12 measured as described above. At this time, in a case where a person is located within the preset distance from the gate G, that is, in the range set immediately before the gate G, and that person is the closest to the gate G in the captured image, the matching unit 13 executes the matching process on the person (Yes at step S5, step S6); a minimal sketch of this selection rule is given after this paragraph. Therefore, in the example shown by FIG. 4 , the matching unit 13 executes the matching process on the person P10, who is the closest to the gate G. To be specific, the matching unit 13 calculates a matching score, such as the degree of similarity between the feature value stored in the feature value storage unit 15 in association with the person within the captured image to be processed and a feature value of a person previously registered in the matching data storage unit 16, and determines whether or not the matching score is higher than a threshold value (step S6). When the matching score is higher than the threshold value, the matching unit 13 determines that the matching succeeds (Yes at step S7), and determines that the person who is about to pass through the gate G is the previously registered person.
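  • As a hedged sketch only, the selection rule of step S5 can be expressed as follows; the trigger distance and the use of track IDs are assumptions introduced for illustration.

```python
TRIGGER_DISTANCE_M = 1.0  # preset range "immediately before the gate" (assumed value)


def select_person_to_match(distances_m: dict[str, float]) -> str | None:
    """Pick the tracked person to match: the one closest to the gate, and only
    if that person is within the preset range immediately before the gate."""
    if not distances_m:
        return None
    closest_id = min(distances_m, key=distances_m.get)
    return closest_id if distances_m[closest_id] <= TRIGGER_DISTANCE_M else None


# Example with distances in the spirit of FIG. 4 (values assumed): only P10 qualifies.
print(select_person_to_match({"P10": 0.8, "P11": 2.1, "P12": 3.4}))  # -> "P10"
```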
  • When the matching of the person P10 located immediately before the gate G succeeds as a result of the matching process by the matching unit 13 (Yes at step S7), the gate control unit 14 permits the person P10 to pass through the gate G, and controls the gate G to open (step S8). At this time, the gate control unit 14 displays “permitted to pass” on the display device D.
  • Thus, according to the face authentication system 10 in this example embodiment, a feature value is extracted from the face region of a person beforehand, in the pre-passing side region of the gate G, and the matching process is executed immediately before the gate, so that the gate can be opened and closed properly for the person. As a result, a person can pass through the gate G smoothly.
  • Further, the feature value of a person is extracted and stored from captured images captured not only immediately before the gate G but also at any timing while the person heads toward the gate G, so that a highly reliable feature value can be extracted and the matching can be executed with accuracy. As a result, a person can pass through the gate G smoothly.
  • Although a case in which an object that is about to pass through the gate G is a person is illustrated above, the object is not limited to a person and may be any object. For example, the object may be an object such as baggage. In accordance with this, information used for measuring the distance from the gate G, such as the reference value and the object information representing the feature of the object that are described above, may be information representing any feature that can be detected from an object. Moreover, when executing the matching process, any feature value that can be detected from an object may be used.
  • Second Example Embodiment
  • Next, a second example embodiment of the present invention will be described with reference to FIG. 7 . FIG. 7 is a block diagram showing the configuration of a face authentication system.
  • The face authentication system 10 in this example embodiment is configured almost the same as the face authentication system 10 in the first example embodiment, but is configured to measure the distance from the gate G to a person without using a captured image for extracting a feature value. Below, the configuration different from that of the first example embodiment will be described in detail.
  • As shown in FIG. 7 , the face authentication system 10 in this example embodiment includes a rangefinder E in addition to the configuration of the face authentication system 10 described in the first example embodiment. For example, the rangefinder E is an infrared depth sensor, and is a device which can measure the distance to a person who is in the pre-passing side region of the gate G by using a measured depth, that is, information different from the captured image used for extracting the feature value of a person as described above.
  • For example, the rangefinder E is installed near the corresponding gate G, on the right side in view of a person heading to the gate G or above the gate in the same manner as the capture device C. In a case where the rangefinder E is configured by a single device, a single rangefinder is installed. In a case where the rangefinder E is configured by a plurality of devices, a plurality of rangefinders are installed. The rangefinder E may be a device that measures the distance to a person by another method.
  • The distance measurement unit 12 in this example embodiment acquires the measured depth from the rangefinder E, and associates the depth with a person within a captured image. For example, the distance measurement unit 12 specifies a depth at each position in the pre-passing side region of the gate G, and makes the depth correspond to a position in the captured image. With this, it is possible to specify the distance to each person located in the captured image, and associate the distance with the person. Thus, as in the first example embodiment, it is also possible to associate the measured distance with the feature value of the same person stored in the feature value storage unit 15.
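  • The association of the measured depth with persons in the captured image can be sketched as below; it assumes, purely for illustration, that the depth map has already been registered (aligned) to the camera image so that image coordinates can index it directly.

```python
import numpy as np


def distance_for_person(depth_map_m: np.ndarray,
                        face_box: tuple[int, int, int, int]) -> float:
    """Return the gate-to-person distance for a detected face bounding box
    (x, y, w, h) as the median depth over that region of the registered depth map."""
    x, y, w, h = face_box
    region = depth_map_m[y:y + h, x:x + w]
    return float(np.median(region))
```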
  • The person within the captured image may be tracked in a subsequent captured image and, in such a case, the measurement of the distance is performed on the tracked person in the same manner as described above, and the distance associated with the person is updated. The correspondence of persons within captured images that are temporally different from each other can be realized by tracking feature points, or the like.
  • The matching unit 13 in this example embodiment executes the matching process based on the measured distance to the person as described above. Such a matching process is as in the first example embodiment. When the person is located immediately before the gate G, the matching unit 13 retrieves the stored feature value on the person to execute the matching process.
  • The rangefinder E and the distance measurement unit 12 are not necessarily limited to measuring the distance from the gate G to a person. For example, the rangefinder E and the distance measurement unit 12 may estimate the relative positional relation between persons with reference to the gate G. That is, the rangefinder E and the distance measurement unit 12 may estimate the perspective relation between persons with reference to the gate G. Then, the matching unit 13 may execute the matching process on the person who is the closest to the gate G based on the estimated relative positional relation between the persons with reference to the gate G.
  • With this, it is also possible to properly open and close the gate for the person immediately before the gate, so that the person can pass through the gate G smoothly.
  • Third Example Embodiment
  • Next, a third example embodiment of the present invention will be described with reference to FIGS. 8 and 9 . FIG. 8 is a block diagram showing the configuration of an information processing system in the third example embodiment. FIG. 9 is a block diagram showing the configuration of an information processing device in the third example embodiment. In this example embodiment, the overview of the configuration of the face authentication system described in the first and second example embodiments will be illustrated.
  • As shown in FIG. 8 , an information processing system 100 in this example embodiment includes: a capturing means 110 that acquires a captured image obtained by capturing a pre-passing side region with reference to a gate; an image processing means 120 that extracts a feature value of an object within the captured image and stores matching information used for matching of the object based on the feature value; a distance estimating means 130 that estimates a distance from the gate to the object within the captured image; and a matching means 140 that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • Further, in this example embodiment, the capturing means 110 may be eliminated from the information processing system 100 shown in FIG. 8 .
  • That is to say, an information processing device 200 in this example embodiment includes: an image processing means 220 that extracts a feature value of an object within a captured image obtained by capturing a pre-passing side region with reference to a gate and stores matching information used for matching of the object based on the feature value; a distance estimating means 230 that estimates a distance from the gate to the object within the captured image; and a matching means 240 that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
  • The image processing means 120, 220, the distance estimating means 130, 230 and the matching means 140, 240 described above may be structured by execution of a program by an arithmetic device, or may be structured by electronic circuits.
  • Then, the information processing system 100 and the information processing device 200 with the above configurations each operate to execute: a process of extracting a feature value of an object within a captured image obtained by capturing a pre-passing side region with reference to a gate, and storing matching information used for matching of the object based on the feature value; a process of estimating a distance from the gate to the object within the captured image; and a process of executing matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
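  • A minimal sketch of this composition is given below; the class and method names are illustrative assumptions, and the three collaborators stand for the image processing means, the distance estimating means and the matching means described above.

```python
class InformationProcessingDevice:
    """Illustrative composition of the three means of the device 200 (names assumed)."""

    def __init__(self, image_processor, distance_estimator, matcher):
        self.image_processor = image_processor        # image processing means 220
        self.distance_estimator = distance_estimator  # distance estimating means 230
        self.matcher = matcher                        # matching means 240

    def process(self, captured_image):
        # extract feature values and store matching information per tracked object
        matching_info = self.image_processor.extract_and_store(captured_image)
        # estimate the gate-to-object distance for each object in the image
        distances = self.distance_estimator.estimate(captured_image)
        # matching determination based on the distances and the stored information
        return self.matcher.determine(distances, matching_info)
```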
  • The information processing system 100 and the information processing device 200 described above each extract a feature value from the face region of an object beforehand, in the pre-passing side region with reference to the gate, execute a matching process beforehand using the feature value, and execute the matching determination immediately before the gate, so that the gate can be opened and closed properly for the person. Moreover, the information processing system 100 and the information processing device 200 each extract the feature value of a person from a captured image captured at any timing while the person heads toward the gate and store the feature value, so that a highly reliable feature value can be extracted and the matching can be executed with accuracy. As a result, the person can pass through the gate smoothly.
  • <Supplementary Notes>
  • The whole or part of the example embodiments disclosed above can be described as the following supplementary notes. Below, the overview of the configurations of the information processing device, the information processing system, the program and the information processing method according to the present invention will be described. However, the present invention is not limited to the following configurations.
  • (Supplementary Note 1)
  • An information processing device comprising:
      • an image processing means that extracts a feature value of an object within a captured image obtained by capturing a pre-passing region of a gate, and stores matching information relating to matching of the object based on the feature value;
      • a distance estimating means that estimates a distance from the gate to the object within the captured image; and
      • a matching means that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
    (Supplementary Note 2)
  • The information processing device according to Supplementary Note 1, wherein the matching means executes a matching process of matching between the feature value that is the stored matching information of the object for which the distance has been estimated and a previously registered feature value, based on the estimated distance.
  • (Supplementary Note 3)
  • The information processing device according to Supplementary Note 2, wherein
      • the matching means executes a matching process of matching between the stored feature value of the object located at a preset distance to the gate and the previously registered feature value.
    (Supplementary Note 4)
  • The information processing device according to Supplementary Note 2 or 3, wherein
      • the matching means executes a matching process of matching between the stored feature value of the object located closest to the gate and the previously registered feature value.
    (Supplementary Note 5)
  • The information processing device according to Supplementary Note 1, wherein:
      • the image processing means executes a matching process of matching between the stored feature value and a previously registered feature value, and stores a result of the matching as the matching information; and
      • the matching means executes the matching determination based on the estimated distance and the result of the matching that is the stored matching information of the object for which the distance has been estimated.
    (Supplementary Note 6)
  • The information processing device according to any of Supplementary Notes 1 to 5, wherein
      • the image processing means performs extraction of the feature value of the object in a plurality of captured images, and updates and stores the matching information of the object based on the feature value.
    (Supplementary Note 7)
  • The information processing device according to any of Supplementary Notes 1 to 6, wherein
      • the distance estimating means estimates the distance to the object within the pre-passing region of the gate by using information which is different from the captured image for extracting the feature value.
    (Supplementary Note 8)
  • The information processing device according to any of Supplementary Notes 1 to 6, wherein
      • the distance estimating means estimates the distance to the object within the captured image by using an image portion of the object.
    (Supplementary Note 9)
  • The information processing device according to Supplementary Note 8, wherein
      • the distance estimating means identifies an attribute of the object within the captured image, sets a reference value corresponding to the attribute, and estimates the distance to the object within the captured image by using the reference value.
    (Supplementary Note 10)
  • The information processing device according to Supplementary Note 9, wherein
      • the distance estimating means detects object information representing a feature of the object within the captured image, and estimates the distance to the object within the captured image based on the reference value and the object information.
    (Supplementary Note 11)
  • The information processing device according to Supplementary Note 10, wherein
      • the distance estimating means detects a size of a predetermined site of the object within the captured image as the object information, and estimates the distance to the object within the captured image based on the size of the predetermined site of the object with reference to the reference value.
    (Supplementary Note 12)
  • An information processing system comprising:
      • a capturing means that acquires a captured image obtained by capturing a pre-passing region of a gate;
      • an image processing means that extracts a feature value of an object within the captured image, and stores matching information relating to matching of the object based on the feature value;
      • a distance estimating means that estimates a distance from the gate to the object within the captured image; and
      • a matching means that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
    (Supplementary Note 12.1)
  • The information processing system according to Supplementary Note 12, wherein the matching means executes a matching process of matching between the stored feature value of the object located at a preset distance to the gate and the previously registered feature value.
  • (Supplementary Note 12.2)
  • The information processing system according to Supplementary Note 12 or 12.1, wherein the matching means executes a matching process of matching between the stored feature value of the object located closest to the gate and the previously registered feature value.
  • (Supplementary Note 13)
  • A program comprising instructions for causing an information processing device to realize:
      • an image processing means that extracts a feature value of an object within a captured image obtained by capturing a pre-passing region of a gate, and stores matching information relating to matching of the object based on the feature value;
      • a distance estimating means that estimates a distance from the gate to the object within the captured image; and
      • a matching means that executes matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
    (Supplementary Note 13.1)
  • The program according to Supplementary Note 13, wherein
      • the matching means executes a matching process of matching between the stored feature value of the object located at a preset distance to the gate and the previously registered feature value.
    (Supplementary Note 13.2)
  • The program according to Supplementary Note 13 or 13.1, wherein
      • the matching means executes a matching process of matching between the stored feature value of the object located closest to the gate and the previously registered feature value.
    (Supplementary Note 14)
  • An information processing method comprising:
      • extracting a feature value of an object within a captured image obtained by capturing a pre-passing region of a gate, and storing matching information relating to matching of the object based on the feature value;
      • estimating a distance from the gate to the object within the captured image; and
      • executing matching determination based on the estimated distance and the stored matching information of the object for which the distance has been estimated.
    (Supplementary Note 15)
  • The information processing method according to Supplementary Note 14, comprising
      • executing a matching process of matching between the feature value that is the stored matching information of the object for which the distance has been estimated and a previously registered feature value, based on the estimated distance.
    (Supplementary Note 16)
  • The information processing method according to Supplementary Note 15, comprising
      • executing a matching process of matching between the stored feature value of the object located at a preset distance to the gate and the previously registered feature value.
    (Supplementary Note 17)
  • The information processing method according to Supplementary Note 15 or 16, comprising
      • executing a matching process of matching between the stored feature value of the object located closest to the gate and the previously registered feature value.
    (Supplementary Note 18)
  • The information processing method according to any of Supplementary Notes 15 to 17, comprising
      • estimating the distance to the object within the pre-passing region of the gate by using information which is different from the captured image for extracting the feature value.
    (Supplementary Note 19)
  • The information processing method according to any of Supplementary Notes 15 to 17, comprising
      • estimating the distance to the object within the captured image by using an image portion of the object.
    (Supplementary Note 19.1)
  • The information processing method according to Supplementary Note 19, comprising
      • identifying an attribute of the object within the captured image, setting a reference value corresponding to the attribute, and estimating the distance to the object within the captured image by using the reference value.
    (Supplementary Note 19.2)
  • The information processing method according to Supplementary Note 19.1, comprising
      • detecting object information representing a feature of the object within the captured image, and estimating the distance to the object within the captured image based on the reference value and the object information.
    (Supplementary Note 19.3)
  • The information processing method according to Supplementary Note 19.2, comprising
      • detecting a size of a predetermined site of the object within the captured image as the object information, and estimating the distance to the object within the captured image based on the size of the predetermined site of the object with reference to the reference value.
  • The program described above can be stored using various types of non-transitory computer-readable mediums and supplied to a computer. The non-transitory computer-readable mediums include various types of tangible storage mediums. Examples of the non-transitory computer-readable mediums include a magnetic recording medium (for example, a flexible disk, a magnetic tape, a hard disk drive), a magneto-optical recording medium (for example, a magneto-optical disk), a CD-ROM (Read Only Memory), a CD-R, a CD-R/W, and a semiconductor memory (for example, a mask ROM, a PROM (Programmable ROM), an EPROM (Erasable PROM), a flash ROM, a RAM (Random Access Memory)). The program may be supplied to a computer by various types of transitory computer-readable mediums. Examples of the transitory computer-readable mediums include electric signals, optical signals, and electromagnetic waves. The transitory computer-readable medium can supply a program to a computer via a wired communication path such as an electric line and an optical fiber, or a wireless communication path.
  • Although the present invention has been described above with reference to the example embodiments, the present invention is not limited to the example embodiments described above. The configurations and details of the present invention can be changed in various manners that can be understood by one skilled in the art within the scope of the present invention.
  • The present invention is based upon and claims the benefit of priority from Japanese patent application No. 2018-014275, filed on Jan. 31, 2018, the disclosure of which is incorporated herein in its entirety by reference.
  • DESCRIPTION OF NUMERALS
      • 10 face authentication system
      • 11 feature value extraction unit
      • 12 distance measurement unit
      • 13 matching unit
      • 14 gate control unit
      • 15 feature value storage unit
      • 16 matching data storage unit
      • 100 information processing system
      • 200 information processing device
      • 110 capturing means
      • 120, 220 image processing means
      • 130, 230 distance estimating means
      • 140, 240 matching means
      • C capture device
      • D display device
      • E rangefinder
      • G gate

Claims (15)

1. An information processing method comprising:
with respect to each of a plurality of captured images in which an area of a gate is captured, extracting a feature amount of a same object in the captured image;
storing the feature amount;
estimating a distance from the gate to the object in the captured image; and
performing matching determination based on the estimated distance and the stored feature amount of the object from which the distance is estimated.
2. The information processing method according to claim 1, further comprising
updating and storing one of the feature amounts extracted.
3. The information processing method according to claim 1, further comprising
determining quality of the feature amounts extracted; and
updating and storing one of the feature amounts based on the quality of the feature amounts.
4. The information processing method according to claim 1, further comprising
performing matching determination between the stored feature value of the object located closest to the gate and the previously registered feature value.
5. The information processing method according to claim 1, further comprising
determining an attribute of the object in the captured image; and
estimating the distance from the gate to the object in the captured image based on a reference value corresponding to the attribute.
6. An information processing device comprising:
at least one memory configured to store instructions; and
at least one processor configured to execute instructions to:
with respect to each of a plurality of captured images in which an area of a gate is captured, extract a feature amount of a same object in the captured image;
store the feature amount;
estimate a distance from the gate to the object in the captured image; and
perform matching determination based on the estimated distance and the stored feature amount of the object from which the distance is estimated.
7. The information processing device according to claim 6, wherein the at least one processor is configured to execute the instructions to
update and store one of the feature amounts extracted.
8. The information processing device according to claim 6, wherein the at least one processor is configured to execute the instructions to
determine quality of the feature amounts extracted; and
update and store one of the feature amounts based on the quality of the feature amounts.
9. The information processing device according to claim 6, wherein the at least one processor is configured to execute the instructions to
perform matching determination between the stored feature value of the object located closest to the gate and the previously registered feature value.
10. The information processing device according to claim 6, wherein the at least one processor is configured to execute the instructions to
determine an attribute of the object in the captured image; and
estimate the distance from the gate to the object in the captured image based on a reference value corresponding to the attribute.
11. A non-transitory computer-readable medium storing thereon a program comprising instructions for causing a computer to execute processing to:
with respect to each of a plurality of captured images in which an area of a gate is captured, extract a feature amount of a same object in the captured image;
store the feature amount;
estimate a distance from the gate to the object in the captured image; and
perform matching determination based on the estimated distance and the stored feature amount of the object from which the distance is estimated.
12. The non-transitory computer-readable medium storing thereon the program according to claim 11, wherein the program comprises instructions for causing the computer to execute the processing to
update and store one of the feature amounts extracted.
13. The non-transitory computer-readable medium storing thereon the program according to claim 11, wherein the program comprises instructions for causing the computer to execute the processing to
determine quality of the feature amounts extracted; and
update and store one of the feature amounts based on the quality of the feature amounts.
14. The non-transitory computer-readable medium storing thereon the program according to claim 11, wherein the program comprises instructions for causing the computer to execute the processing to
perform matching determination between the stored feature value of the object located closest to the gate and the previously registered feature value.
15. The non-transitory computer-readable medium storing thereon the program according to claim 11, wherein the program comprises instructions for causing the computer to execute the processing to
determine an attribute of the object in the captured image; and
estimate the distance from the gate to the object in the captured image based on a reference value corresponding to the attribute.
US18/209,904 2018-01-31 2023-06-14 Information processing device Pending US20230326247A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/209,904 US20230326247A1 (en) 2018-01-31 2023-06-14 Information processing device

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2018-014275 2018-01-31
JP2018014275A JP6601513B2 (en) 2018-01-31 2018-01-31 Information processing device
PCT/JP2019/002335 WO2019151117A1 (en) 2018-01-31 2019-01-24 Information processing device
US202016965097A 2020-07-27 2020-07-27
US17/717,266 US11727723B2 (en) 2018-01-31 2022-04-11 Information processing device
US18/209,904 US20230326247A1 (en) 2018-01-31 2023-06-14 Information processing device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US17/717,266 Continuation US11727723B2 (en) 2018-01-31 2022-04-11 Information processing device

Publications (1)

Publication Number Publication Date
US20230326247A1 true US20230326247A1 (en) 2023-10-12

Family

ID=67478713

Family Applications (5)

Application Number Title Priority Date Filing Date
US16/965,097 Active US11335125B2 (en) 2018-01-31 2019-01-24 Information processing device
US17/717,266 Active US11727723B2 (en) 2018-01-31 2022-04-11 Information processing device
US18/209,901 Pending US20230326246A1 (en) 2018-01-31 2023-06-14 Information processing device
US18/209,904 Pending US20230326247A1 (en) 2018-01-31 2023-06-14 Information processing device
US18/209,887 Pending US20230326245A1 (en) 2018-01-31 2023-06-14 Information processing device

Family Applications Before (3)

Application Number Title Priority Date Filing Date
US16/965,097 Active US11335125B2 (en) 2018-01-31 2019-01-24 Information processing device
US17/717,266 Active US11727723B2 (en) 2018-01-31 2022-04-11 Information processing device
US18/209,901 Pending US20230326246A1 (en) 2018-01-31 2023-06-14 Information processing device

Family Applications After (1)

Application Number Title Priority Date Filing Date
US18/209,887 Pending US20230326245A1 (en) 2018-01-31 2023-06-14 Information processing device

Country Status (4)

Country Link
US (5) US11335125B2 (en)
EP (1) EP3748576A4 (en)
JP (1) JP6601513B2 (en)
WO (1) WO2019151117A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6601513B2 (en) * 2018-01-31 2019-11-06 日本電気株式会社 Information processing device
WO2021059537A1 (en) * 2019-09-27 2021-04-01 日本電気株式会社 Information processing device, terminal device, information processing system, information processing method, and recording medium
JP7400975B2 (en) 2020-06-17 2023-12-19 日本電気株式会社 Face recognition method
CN111768542B (en) * 2020-06-28 2022-04-19 浙江大华技术股份有限公司 Gate control system, method and device, server and storage medium
WO2022034921A1 (en) * 2020-08-14 2022-02-17 Nec Corporation Information processing apparatus, information processing method, and storage medium

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006236244A (en) 2005-02-28 2006-09-07 Toshiba Corp Face authenticating device, and entering and leaving managing device
JP2007146986A (en) 2005-11-29 2007-06-14 Yoshinobu Wada Gear shift device for bicycle
JP2007148988A (en) * 2005-11-30 2007-06-14 Toshiba Corp Face authentication unit, face authentication method, and entrance/exit management device
JP2007148987A (en) 2005-11-30 2007-06-14 Toshiba Corp Face authentication system, and entrance and exit management system
JP4836633B2 (en) * 2006-03-31 2011-12-14 株式会社東芝 Face authentication device, face authentication method, and entrance / exit management device
JP2007328572A (en) 2006-06-08 2007-12-20 Matsushita Electric Ind Co Ltd Face authentication device and face authentication method
JP2007334623A (en) * 2006-06-15 2007-12-27 Toshiba Corp Face authentication device, face authentication method, and access control device
US20080080748A1 (en) * 2006-09-28 2008-04-03 Kabushiki Kaisha Toshiba Person recognition apparatus and person recognition method
JP4638899B2 (en) 2007-07-09 2011-02-23 株式会社クボタ Passenger rice transplanter
JP2009053914A (en) * 2007-08-27 2009-03-12 Seiko Epson Corp Image processor and image processing method
US8116713B2 (en) * 2008-11-26 2012-02-14 Visteon Global Technologies, Inc. Automatic bandwidth control with high-deviation detection
KR101740231B1 (en) * 2010-11-17 2017-05-26 삼성전자주식회사 Method and apparatus for estimating 3d face position
JP6151582B2 (en) * 2013-06-14 2017-06-21 セコム株式会社 Face recognition system
EP3072293B1 (en) * 2013-11-18 2017-03-08 Koninklijke Philips N.V. Video surveillance for mri safety monitoring
JP6396171B2 (en) 2014-10-27 2018-09-26 株式会社ソニー・インタラクティブエンタテインメント Information processing device
JP2018014275A (en) 2016-07-22 2018-01-25 三菱ケミカル株式会社 Porous electrode substrate
FR3055161B1 (en) * 2016-08-19 2023-04-07 Safran Identity & Security MONITORING METHOD BY MEANS OF A MULTI-SENSOR SYSTEM
JP6601513B2 (en) * 2018-01-31 2019-11-06 日本電気株式会社 Information processing device

Also Published As

Publication number Publication date
EP3748576A1 (en) 2020-12-09
JP6601513B2 (en) 2019-11-06
EP3748576A4 (en) 2021-03-31
US11727723B2 (en) 2023-08-15
US20230326246A1 (en) 2023-10-12
US20230326245A1 (en) 2023-10-12
US20210117655A1 (en) 2021-04-22
US20220230470A1 (en) 2022-07-21
JP2019133364A (en) 2019-08-08
WO2019151117A1 (en) 2019-08-08
US11335125B2 (en) 2022-05-17

Similar Documents

Publication Publication Date Title
US11727723B2 (en) Information processing device
US11978295B2 (en) Collation system
US9953225B2 (en) Image processing apparatus and image processing method
US8325981B2 (en) Human tracking apparatus, human tracking method, and human tracking processing program
US20070291998A1 (en) Face authentication apparatus, face authentication method, and entrance and exit management apparatus
EP3435665A1 (en) Monitoring device and monitoring system
US20220145690A1 (en) Information processing device
JP7006668B2 (en) Information processing equipment
CN110892412B (en) Face recognition system, face recognition method, and face recognition program
JP6915673B2 (en) Information processing system
US20230267788A1 (en) Face authentication method
JP7279774B2 (en) Information processing equipment
JP6774036B2 (en) Collation system
JP2020201999A (en) Collation system
JP2021193268A (en) Information processing system
US20230062785A1 (en) Estimation apparatus, estimation method, and computer program product

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER