WO2022230117A1 - Information processing system, information processing method, and recording medium - Google Patents
Information processing system, information processing method, and recording medium
- Publication number
- WO2022230117A1 (PCT/JP2021/017010)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- information processing
- processing system
- line
- score
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/77—Determining position or orientation of objects or cameras using statistical methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/012—Head tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/167—Detection; Localisation; Normalisation using comparisons between temporally consecutive images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/40—Spoof detection, e.g. liveness detection
- G06V40/45—Detection of the body part being alive
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- This disclosure relates to the technical field of an information processing system, an information processing method, and a recording medium for making determinations about a user.
- Japanese Patent Application Laid-Open No. 2002-200002 discloses that authentication fails when the orientation of the detected face is greater than or equal to a predetermined angle.
- Patent Document 2 discloses analyzing the positional relationship between the estimated line of sight and the display to determine whether or not the image is a spoofed image.
- Patent Document 3 discloses that when the detected line-of-sight direction is the authentication-permitted line-of-sight direction, it is determined that authentication has been performed correctly.
- Patent Literature 4 discloses determining the possibility that a face represented by a face image sequence is impersonation based on information on changes in line of sight over time.
- JP 2007-148968 A, JP 2008-015800 A, JP 2017-142859 A, and JP 2020-194608 A
- the purpose of this disclosure is to improve upon the techniques disclosed in the prior art documents described above.
- One aspect of the information processing system disclosed herein includes: face orientation acquisition means for acquiring the face orientation of a user; line-of-sight direction acquisition means for acquiring the line-of-sight direction of the user; determination means for determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on the difference between the face orientation and the line-of-sight direction; and output means for outputting the result of the determination.
- Another aspect of the information processing system of this disclosure includes: face orientation acquisition means for acquiring the face orientation of a user; line-of-sight direction acquisition means for acquiring the line-of-sight direction of the user; calculation means for calculating, from the difference between the face orientation and the line-of-sight direction, a score indicating the likelihood that the user is a living body; correction means for correcting the score according to the magnitude of the face orientation; determination means for determining whether or not the user is a living body based on the corrected score; and output means for outputting the result of the determination.
- One aspect of the information processing method of this disclosure acquires a user's face orientation, acquires the user's line-of-sight direction, determines, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on the difference between the face orientation and the line-of-sight direction, and outputs the result of the determination.
- One aspect of the recording medium of this disclosure records a computer program that causes a computer to execute an information processing method of acquiring a user's face orientation, acquiring the user's line-of-sight direction, determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on the difference between the face orientation and the line-of-sight direction, and outputting the result of the determination.
- FIG. 1 is a block diagram showing the hardware configuration of the information processing system according to the first embodiment.
- FIG. 2 is a block diagram showing the functional configuration of the information processing system according to the first embodiment.
- FIG. 3 is a flowchart showing the flow of operations of the information processing system according to the first embodiment.
- FIG. 4 is a block diagram showing the functional configuration of the information processing system according to the second embodiment.
- FIG. 5 is a flowchart showing the flow of operations of the information processing system according to the second embodiment.
- FIG. 6 is a graph showing an example distribution of the scores calculated by the information processing system according to the third embodiment.
- FIG. 7 is a graph showing an example of a score correction method in the information processing system according to the third embodiment.
- FIG. 8 is a block diagram showing the functional configuration of the information processing system according to the fourth embodiment.
- FIG. 9 is a flowchart showing the flow of operations of the information processing system according to the fourth embodiment.
- FIG. 10 is a block diagram showing the functional configuration of the information processing system according to the fifth embodiment.
- FIG. 11 is a flowchart showing the flow of operations of the information processing system according to the fifth embodiment.
- FIG. 12 is a diagram showing a display example by the information processing system according to the fifth embodiment.
- FIG. 13 is a graph showing acquisition timings of the face orientation and line-of-sight direction by the information processing system according to the sixth embodiment.
- FIG. 14 is a conceptual diagram showing a score calculation method by the information processing system according to the sixth embodiment.
- FIG. 15 is a block diagram showing the functional configuration of the information processing system according to the seventh embodiment.
- FIG. 16 is a flowchart showing the flow of operations of the information processing system according to the seventh embodiment.
- FIG. 17 is a diagram (part 1) showing a display example by the information processing system according to the seventh embodiment.
- FIG. 18 is a diagram (part 2) showing a display example by the information processing system according to the seventh embodiment.
- FIG. 19 is a block diagram showing the functional configuration of the information processing system according to the eighth embodiment.
- FIG. 20 is a flowchart showing the flow of operations of the information processing system according to the eighth embodiment.
- FIG. 1 is a block diagram showing the hardware configuration of an information processing system according to the first embodiment.
- an information processing system 10 includes a processor 11, a RAM (Random Access Memory) 12, a ROM (Read Only Memory) 13, and a storage device 14.
- Information processing system 10 may further include input device 15 , output device 16 and camera 20 .
- Processor 11 , RAM 12 , ROM 13 , storage device 14 , input device 15 , output device 16 and camera 20 are connected via data bus 17 .
- the processor 11 reads a computer program.
- processor 11 is configured to read a computer program stored in at least one of the RAM 12, the ROM 13, and the storage device 14.
- the processor 11 may read a computer program stored in a computer-readable recording medium using a recording medium reader (not shown).
- the processor 11 may acquire (that is, read) a computer program from a device (not shown) arranged outside the information processing system 10 via a network interface.
- the processor 11 controls the RAM 12, the storage device 14, the input device 15 and the output device 16 by executing the read computer program.
- a functional block for determining whether or not the user is a living body is implemented within the processor 11 .
- the processor 11 may be configured as, for example, a CPU (Central Processing Unit), a GPU (Graphics Processing Unit), an FPGA (Field-Programmable Gate Array), a DSP (Digital Signal Processor), or an ASIC (Application Specific Integrated Circuit).
- the processor 11 may be configured with one of these, or may be configured to use a plurality of them in parallel.
- the RAM 12 temporarily stores computer programs executed by the processor 11.
- the RAM 12 temporarily stores data temporarily used by the processor 11 while the processor 11 is executing the computer program.
- the RAM 12 may be, for example, a D-RAM (Dynamic RAM).
- the ROM 13 stores computer programs executed by the processor 11 .
- the ROM 13 may also store other fixed data.
- the ROM 13 may be, for example, a P-ROM (Programmable ROM).
- the storage device 14 stores data that the information processing system 10 saves for a long period of time.
- Storage device 14 may act as a temporary storage device for processor 11 .
- the storage device 14 may include, for example, at least one of a hard disk device, a magneto-optical disk device, an SSD (Solid State Drive), and a disk array device.
- the input device 15 is a device that receives input instructions from the user of the information processing system 10 .
- Input device 15 may include, for example, at least one of a keyboard, mouse, and touch panel.
- the output device 16 is a device that outputs information about the information processing system 10 to the outside.
- the output device 16 may be a display device (eg, display) capable of displaying information regarding the information processing system 10 .
- the camera 20 is a camera installed at a location where an image of the user (for example, an image including the user's face) can be captured.
- the camera 20 may be a camera that captures still images, or may be a camera that captures moving images.
- the camera 20 may be configured as a visible light camera or as a near-infrared camera.
- FIG. 2 is a block diagram showing the functional configuration of the information processing system according to the first embodiment.
- the information processing system 10 includes, as processing blocks for realizing its functions, a face orientation acquisition unit 110, a line-of-sight direction acquisition unit 120, a determination unit 130, and an output unit 140.
- Each of the face orientation acquisition unit 110, the line-of-sight direction acquisition unit 120, the determination unit 130, and the output unit 140 may be implemented by, for example, the above-described processor 11 (see FIG. 1).
- the output unit 140 may be configured to output its output via the output device 16 (see FIG. 1) described above.
- the face direction acquisition unit 110 is configured to be able to acquire the user's face direction (that is, the direction in which the user's face is facing).
- the user's face orientation may be acquired as information indicating, for example, how many degrees it deviates from a reference direction.
- for example, the state in which the user is directly facing the camera may be taken as the front, and the face orientation may be acquired as information indicating how many degrees it deviates from that front.
- the face orientation acquisition unit 110 may be configured, for example, to acquire (estimate) the user's face orientation from the user's face image. Note that existing techniques can be appropriately adopted for a specific method of acquiring the face orientation from the image, so detailed description thereof will be omitted here.
- the line-of-sight direction acquisition unit 120 is configured to be able to acquire the user's line-of-sight direction (that is, the direction in which the user's line of sight is directed).
- the user's line-of-sight direction may be acquired as information indicating how many degrees it deviates from a reference direction, for example, in the same way as the face orientation.
- for example, the state in which the user is directly facing the camera may be taken as the front, and the line-of-sight direction may be acquired as information indicating how many degrees it deviates from that front.
- the line-of-sight direction acquisition unit 120 may be configured to acquire (estimate) the user's line-of-sight direction from, for example, the user's face image (more specifically, an image including the eye area). It should be noted that existing techniques can be appropriately adopted for a specific method of acquiring the line-of-sight direction from the image, so detailed description thereof will be omitted here.
- the determination unit 130 is configured to be able to determine whether or not the user is a living body (in other words, whether or not spoofing is being attempted).
- the determination section 130 includes a threshold determination section 131 and a difference calculation section 132 .
- the threshold determination unit 131 is configured to be able to determine whether or not the user's face orientation acquired by the face orientation acquisition unit 110 is greater than or equal to a predetermined threshold.
- the "predetermined threshold" is a threshold for determining whether the face orientation is large enough (that is, whether the user is facing sufficiently sideways) to determine whether or not the user is a living body; an optimum value may be obtained in advance by simulation or the like. Further, the predetermined threshold may be set so that it can determine not only the magnitude of the user's face orientation, but also whether or not the face is turned in a predetermined direction.
- for example, when the user is instructed to turn to the right, the threshold may be set so that it can be determined that the user is facing not some other direction but the right.
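As a minimal sketch, a threshold check that gates on both the magnitude and the direction of the face orientation might look like the following; the signed-yaw convention (positive = turned right) and the 20-degree value are illustrative assumptions, not values given in this disclosure.

```python
# Hypothetical sketch of the threshold determination unit 131: the check can
# gate on both the magnitude and the direction of the face orientation.
# Signed yaw (positive = turned right) and 20 degrees are assumed values.

THRESHOLD_DEG = 20.0  # assumed "predetermined threshold"

def passes_threshold(face_yaw_deg: float, instructed_direction: str = "any") -> bool:
    """Return True when the face is turned far enough in the instructed direction."""
    if instructed_direction == "right":
        return face_yaw_deg >= THRESHOLD_DEG
    if instructed_direction == "left":
        return face_yaw_deg <= -THRESHOLD_DEG
    return abs(face_yaw_deg) >= THRESHOLD_DEG  # magnitude-only check
```

With this shape, a user instructed to turn right who instead turns left fails the check even though the magnitude of the turn is sufficient.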
- the determination unit 130 is configured to determine whether or not the user is a living body when the threshold value determination unit 131 determines that the face orientation of the user is greater than or equal to a predetermined threshold value. In other words, the determination unit 130 is configured not to determine whether or not the user is a living body when it is determined that the face direction of the user is less than the predetermined threshold.
- the difference calculation unit 132 is configured to be able to calculate the difference between the user's face direction acquired by the face direction acquisition unit 110 and the user's line-of-sight direction acquired by the line-of-sight direction acquisition unit 120 .
- the determination unit 130 is configured to determine whether the user is a living body based on the difference between the face direction and the line-of-sight direction. A specific information processing method using the difference between the face direction and the line-of-sight direction will be described in detail in another embodiment described later.
- the output unit 140 is configured to be able to output the determination result of the determination unit 130 (that is, the determination result as to whether or not the user is a living body).
- the output mode of the output unit 140 is not particularly limited, but the output unit 140 may output an image using, for example, a display. Alternatively, the output unit 140 may output audio using a speaker.
- FIG. 3 is a flow chart showing the operation flow of the information processing system according to the first embodiment.
- the face orientation acquisition unit 110 first acquires the user's face orientation (step S101). Also, the line-of-sight direction acquisition unit 120 acquires the line-of-sight direction of the user (step S102). Note that the processes of steps S101 and S102 may be executed one after the other in either order, or in parallel at the same time.
- the threshold determination unit 131 determines whether or not the user's face orientation acquired by the face orientation acquisition unit 110 is equal to or greater than a predetermined threshold (step S103). Note that if it is determined that the user's face orientation is not equal to or greater than the predetermined threshold (step S103: NO), subsequent processing is omitted, and the series of processing ends. On the other hand, when it is determined that the user's face orientation is greater than or equal to the predetermined threshold (step S103: YES), the difference calculation unit 132 calculates the difference between the user's face orientation and the line-of-sight direction (step S104).
- the determination unit 130 determines whether or not the user is a living body, based on the difference between the user's face orientation and line-of-sight direction (step S105). Then, the output unit 140 outputs the determination result of the determination unit 130 (step S106).
- as described above, in the information processing system 10 according to the first embodiment, whether or not the user is a living body is determined only when the face orientation is equal to or greater than a predetermined threshold. In this way, it is possible to prevent an erroneous determination result from being output when the user's face orientation is small. Therefore, it is possible to determine more accurately whether or not the user is a living body.
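The first embodiment's flow (steps S101 to S106) can be sketched as follows. The threshold value and the liveness rule are hypothetical placeholders; the disclosure defers the concrete determination method using the face/gaze difference to later embodiments.

```python
# Sketch of the first embodiment's decision flow. The 20-degree face
# threshold and the 10-degree difference rule are assumptions for
# illustration only.

FACE_THRESHOLD_DEG = 20.0

def determine_liveness(face_deg: float, gaze_deg: float):
    """Return True/False for the liveness determination, or None when the
    face orientation is too small for any determination to be made."""
    # Step S103: no determination when the face orientation is below threshold
    if abs(face_deg) < FACE_THRESHOLD_DEG:
        return None
    # Step S104: difference between face orientation and line-of-sight direction
    difference = abs(face_deg - gaze_deg)
    # Step S105: a live user who turns the head while keeping the gaze on the
    # camera produces a large difference (hypothetical rule)
    return difference > 10.0
```

Returning None for the below-threshold case mirrors the flowchart, where the subsequent processing is simply omitted.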
- An information processing system 10 according to the second embodiment will be described with reference to FIGS. 4 and 5.
- The second embodiment may differ from the above-described first embodiment only in a part of its configuration and operation, and the other parts may be the same as those of the first embodiment. For this reason, in the following, portions different from the already described first embodiment will be described in detail, and descriptions of overlapping portions will be omitted as appropriate.
- FIG. 4 is a block diagram showing the functional configuration of an information processing system according to the second embodiment.
- the same reference numerals are given to the same elements as those shown in FIG.
- the information processing system 10 according to the second embodiment includes, as processing blocks for realizing its functions, a face orientation acquisition unit 110, a line-of-sight direction acquisition unit 120, a determination unit 130, and an output unit 140.
- the determination unit 130 according to the second embodiment particularly includes a threshold determination unit 131, a difference calculation unit 132, a score calculation unit 133, a score correction unit 134, and a determination processing unit 135. That is, the determination unit 130 according to the second embodiment further includes a score calculation unit 133, a score correction unit 134, and a determination processing unit 135 in addition to the configuration of the first embodiment (see FIG. 2).
- the score calculation unit 133 is configured to be able to calculate, from the difference between the user's face orientation and line-of-sight direction, a score indicating how likely the user is to be a living body. For example, the higher the score, the higher the possibility that the user is a living body. A specific score calculation method will be described in detail in another embodiment described later.
- the score correction unit 134 is configured to be able to correct the score calculated by the score calculation unit 133 according to the size of the user's face orientation. Correction of the score is performed so that an erroneous determination result is less likely to be obtained in the determination unit 130 . A specific method for correcting the score will be described in detail in another embodiment described later.
- the determination processing unit 135 is configured to be able to determine whether or not the user is a living body based on the score corrected by the score correction unit 134 .
- the determination processing unit 135 may, for example, hold a determination threshold (that is, a threshold for determining whether or not the user is a living body) and determine whether or not the user is a living body by comparing the corrected score with that determination threshold.
- FIG. 5 is a flow chart showing the operation flow of the information processing system according to the second embodiment.
- the same reference numerals are assigned to the same processes as those shown in FIG.
- the face orientation acquisition unit 110 first acquires the user's face orientation (step S101). Also, the line-of-sight direction acquisition unit 120 acquires the line-of-sight direction of the user (step S102).
- the threshold determination unit 131 determines whether or not the user's face orientation acquired by the face orientation acquisition unit 110 is equal to or greater than a predetermined threshold (step S103). Note that if it is determined that the user's face orientation is not equal to or greater than the predetermined threshold (step S103: NO), subsequent processing is omitted, and the series of processing ends. On the other hand, when it is determined that the user's face orientation is greater than or equal to the predetermined threshold (step S103: YES), the difference calculation unit 132 calculates the difference between the user's face orientation and the line-of-sight direction (step S104).
- the score calculation unit 133 calculates a score from the difference between the user's face orientation and line-of-sight direction (step S201). Then, the score correction unit 134 corrects the score according to the size of the user's face orientation (step S202). After that, the determination processing unit 135 determines whether or not the user is a living body based on the corrected score (step S203). Then, the output unit 140 outputs the determination result of the determination unit 130 (step S106).
- as described above, in the information processing system 10 according to the second embodiment, the score calculated from the difference between the face orientation and the line-of-sight direction is corrected according to the magnitude of the face orientation, and whether or not the user is a living body is determined based on the corrected score. By doing so, it is possible to determine whether or not the user is a living body more accurately than when the score is not corrected.
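A minimal sketch of this pipeline, assuming simple placeholder forms for the score and the correction (the disclosure defers both to later embodiments):

```python
# Hypothetical second-embodiment pipeline: score calculation (unit 133),
# correction by face orientation (unit 134), and threshold comparison
# (unit 135). All numeric forms are illustrative assumptions.

JUDGMENT_THRESHOLD = 0.0  # corrected score above this -> living body (assumed)

def liveness_score(face_deg: float, gaze_deg: float) -> float:
    # Score grows with the face/gaze difference (placeholder mapping)
    return abs(face_deg - gaze_deg) / 10.0 - 1.0

def corrected_score(face_deg: float, score: float) -> float:
    # Larger face orientations subtract a larger correction amount
    return score - 0.02 * abs(face_deg)

def is_living_body(face_deg: float, gaze_deg: float) -> bool:
    score = liveness_score(face_deg, gaze_deg)
    return corrected_score(face_deg, score) > JUDGMENT_THRESHOLD
```

Here a user whose gaze stays near the camera while the head is turned scores high, while a 3D mask whose "gaze" follows the face scores low.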
- An information processing system 10 according to the third embodiment will be described with reference to FIGS. 6 and 7.
- The third embodiment describes a specific example of the correction in the above-described second embodiment, and other configurations and operations may be the same as those of the first and second embodiments. Therefore, in the following, portions different from the already described embodiments will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
- FIG. 6 is a graph showing an example distribution of scores calculated by the information processing system according to the third embodiment.
- in the example of FIG. 6, the score for a living body is expected to be calculated as a positive value, while the score for a 3D mask (that is, spoofing) is expected to be calculated as a negative value.
- however, the larger the user's face orientation, the higher the score tends to be. For this reason, as can be seen from the figure, when the face orientation is relatively small, the score may become a negative value even for a living body. Conversely, owing to estimation errors in the face orientation and the line-of-sight direction, the score may become a positive value even for a non-living body.
- the information processing system 10 corrects the score according to the size of the face orientation.
- FIG. 7 is a graph showing an example of a score correction method in the information processing system according to the third embodiment.
- correction is performed such that the larger the face orientation, the larger the correction amount subtracted from the score. As a result, scores at small face orientations are reduced only slightly, while scores at large face orientations are reduced substantially.
- Such correction can be performed using, for example, the following correction formula (1).
- corrected score = A × face orientation angle + B × score before correction + C ... (1), where A, B, and C in the above formula are predetermined coefficients obtained in advance.
- after this correction, the scores for non-living bodies are all negative values. Therefore, it is possible to prevent a user who is not a living body from being erroneously determined to be a living body.
- the corrected score for a living body is still a negative value in some regions where the face orientation is small, but within the determination target region at or above the predetermined threshold (that is, the region actually subject to determination by the threshold determination unit 131), the scores are all positive values. Therefore, it is possible to prevent a user who is a living body from being erroneously determined not to be a living body.
- as described above, in the information processing system 10 according to the third embodiment, the correction is performed such that the larger the user's face orientation, the larger the correction amount subtracted from the score (that is, the smaller the corrected score).
- the calculated score is corrected to an appropriate value, so it is possible to more accurately determine whether or not the user is a living body.
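Correction formula (1) can be exercised with the following sketch; the coefficient values are hypothetical, since the disclosure states only that A, B, and C are predetermined coefficients obtained in advance (e.g. by simulation).

```python
# corrected score = A * face orientation angle + B * score before correction + C
# Coefficients below are assumed for illustration: a negative A makes the
# subtracted correction amount grow with the face orientation angle.

A, B, C = -0.04, 1.0, 0.5

def correct_score(face_angle_deg: float, raw_score: float) -> float:
    return A * face_angle_deg + B * raw_score + C

# With a fixed raw score, a larger face orientation yields a smaller
# corrected score, matching the behavior described for FIG. 7.
```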
- An information processing system 10 according to the fourth embodiment will be described with reference to FIGS. 8 and 9.
- It should be noted that the fourth embodiment may differ from the above-described first to third embodiments only in a part of its configuration and operation, and the other parts may be the same as those of the first to third embodiments. Therefore, in the following description, descriptions of portions that overlap with the already described embodiments will be omitted as appropriate.
- FIG. 8 is a block diagram showing the functional configuration of an information processing system according to the fourth embodiment.
- elements similar to those shown in FIGS. 2 and 4 are given the same reference numerals.
- the information processing system 10 according to the fourth embodiment includes, as processing blocks for realizing its functions, a face orientation acquisition unit 110, a line-of-sight direction acquisition unit 120, a determination unit 130, and an output unit 140.
- the score correction unit 134 particularly includes a correction formula storage unit 1341 and a correction formula selection unit 1342 .
- the correction formula storage unit 1341 stores a plurality of correction formulas for correcting scores.
- the correction formula storage unit 1341 may store a plurality of correction formulas such as the above formula (1) with different coefficients.
- the correction formula selection unit 1342 is configured to be able to select one correction formula to be used for score correction from a plurality of correction formulas stored in the correction formula storage unit 1341 .
- the correction formula selection unit 1342 selects a correction formula according to information related to the user.
- information related to the user is, for example, information indicating the user's characteristics, attributes, or state.
- the information related to the user may be, for example, information including numerical values such as the position of the eyes (for example, how far apart the eyes are), or information indicating facial tendencies (for example, information on race). There may be.
- the information related to the user may be information including the sex of the user, the size of the face, and the like. Information related to the user may be obtained from the user's image, or may be obtained from other means (eg, input by the user, pre-registration information, etc.).
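The storage-and-selection scheme described above can be sketched as follows. The quadratic form of the correction, every coefficient value, and the use of inter-eye distance as the user-related cue are assumptions made for illustration; they are not values taken from the specification.

```python
# Sketch of a correction-formula store (unit 1341) keyed on user-related
# information, and a selector (unit 1342). All numbers are illustrative.

def make_formula(a, b, c):
    """Return a correction formula: the score minus an amount that grows
    with the face-orientation angle (in degrees)."""
    def correct(score, face_angle):
        return score - (a * face_angle ** 2 + b * face_angle + c)
    return correct

# A plurality of stored correction formulas with different coefficients.
FORMULAS = {
    "narrow_eyes": make_formula(0.0004, 0.002, 0.0),
    "wide_eyes":   make_formula(0.0002, 0.004, 0.0),
}

def select_formula(inter_eye_px):
    """Pick one formula from user-related information (here, a
    hypothetical inter-eye distance measured in pixels)."""
    return FORMULAS["wide_eyes" if inter_eye_px >= 60 else "narrow_eyes"]

corrected = select_formula(72)(0.8, 30.0)  # score 0.8 at 30 deg of yaw
```

Because each formula subtracts more as the face angle grows, the selector only decides *how fast* the subtraction grows for a given kind of face.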
- FIG. 9 is a flow chart showing the operation flow of the information processing system according to the fourth embodiment.
- in FIG. 9, the same reference numerals are assigned to the same processes as those shown in FIGS. 3 and 5.
- the face orientation acquisition unit 110 first acquires the user's face orientation (step S101). Also, the line-of-sight direction acquisition unit 120 acquires the line-of-sight direction of the user (step S102).
- the threshold determination unit 131 determines whether or not the user's face orientation acquired by the face orientation acquisition unit 110 is equal to or greater than a predetermined threshold (step S103). Note that if it is determined that the user's face orientation is not equal to or greater than the predetermined threshold (step S103: NO), subsequent processing is omitted, and the series of processing ends. On the other hand, when it is determined that the user's face orientation is greater than or equal to the predetermined threshold (step S103: YES), the difference calculation unit 132 calculates the difference between the user's face orientation and the line-of-sight direction (step S104).
- the score calculation unit 133 calculates a score from the difference between the user's face orientation and line-of-sight direction (step S201).
- the correction formula selection unit 1342 selects one correction formula to be used for score correction from a plurality of correction formulas stored in the correction formula storage unit 1341 (step S401).
- the score correction unit 134 corrects the score using the correction formula selected by the correction formula selection unit 1342 (step S402).
- the determination processing unit 135 determines whether or not the user is a living body based on the corrected score (step S203). Then, the output unit 140 outputs the determination result of the determination unit 130 (step S106).
- the score correction unit stores a plurality of correction formulas, and corrects the score by selecting a correction formula suited to the user.
- An information processing system 10 according to the fifth embodiment will be described with reference to FIGS. 10 to 12.
- It should be noted that the fifth embodiment may differ from the first to fourth embodiments described above only in part of its configuration and operation, and the other parts may be the same as those of the first to fourth embodiments. Therefore, in the following, portions different from the embodiments already described will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
- FIG. 10 is a block diagram showing the functional configuration of an information processing system according to the fifth embodiment.
- in FIG. 10, elements similar to those shown in FIG. 4 are given the same reference numerals.
- the information processing system 10 includes, as processing blocks for realizing its functions, a face direction acquisition unit 110, a line-of-sight direction acquisition unit 120, a determination unit 130, an output unit 140, and a notification unit 150. That is, the information processing system 10 according to the fifth embodiment further includes a notification unit 150 in addition to the configuration of the second embodiment (see FIG. 4).
- the notification unit 150 is configured to be able to make a notification according to the determination result of the determination processing unit 135. More specifically, when the determination result of the determination processing unit 135 changes before and after the score is corrected, the notification unit 150 is configured to be able to notify that effect.
- the determination processing unit 135 according to the fifth embodiment is configured to perform not only determination based on the corrected score but also determination based on the score before correction (hereinafter referred to as “provisional determination” as appropriate).
- FIG. 11 is a flow chart showing the operation flow of the information processing system according to the fifth embodiment.
- in FIG. 11, the same reference numerals are given to the same processes as those shown in FIG. 5.
- the face orientation acquisition unit 110 first acquires the user's face orientation (step S101). Also, the line-of-sight direction acquisition unit 120 acquires the line-of-sight direction of the user (step S102).
- the threshold determination unit 131 determines whether or not the user's face orientation acquired by the face orientation acquisition unit 110 is equal to or greater than a predetermined threshold (step S103). Note that if it is determined that the user's face orientation is not equal to or greater than the predetermined threshold (step S103: NO), subsequent processing is omitted, and the series of processing ends. On the other hand, when it is determined that the user's face orientation is greater than or equal to the predetermined threshold (step S103: YES), the difference calculation unit 132 calculates the difference between the user's face orientation and the line-of-sight direction (step S104).
- the score calculation unit 133 calculates a score from the difference between the user's face orientation and line-of-sight direction (step S201).
- the determination processing unit 135 temporarily determines whether or not the user is a living body based on the score before correction (step S501).
- the score correction unit 134 corrects the score according to the size of the user's face orientation (step S202). Then, the determination processing unit 135 determines whether or not the user is a living body based on the corrected score (step S203).
- the notification unit 150 determines whether the determination result of the determination processing unit 135 has changed before and after the score is corrected (step S502). If it is determined that the determination result has changed before and after the score correction (step S502: YES), the notification unit 150 notifies that the determination result has changed due to the correction (step S503). Then, the output unit 140 outputs the determination result of the determination unit 130 (step S106).
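The provisional-versus-final flow above can be sketched as follows. The decision threshold is an assumed value; the specification does not state one.

```python
# Sketch of the fifth embodiment's flow: a provisional determination on
# the raw score (step S501), a final one on the corrected score (step
# S203), and a notification when the two disagree (steps S502/S503).

THRESHOLD = 0.5  # decision threshold on the score (assumed value)

def decide(score):
    """Determination processing unit 135: living body iff score >= threshold."""
    return score >= THRESHOLD

def judge_with_notification(raw_score, corrected_score):
    provisional = decide(raw_score)      # provisional determination (S501)
    final = decide(corrected_score)      # final determination (S203)
    notice = None
    if provisional != final:             # result changed across correction
        notice = "Results have changed due to score correction"
    return final, notice
```

A disagreement can only occur when the raw score lies near the threshold, which is exactly the low-confidence situation the notification is meant to flag.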
- the notification by the notification unit 150 may be performed together with the output by the output unit 140.
- the notification by the notification unit 150 and the output by the output unit 140 may be performed using, for example, a common display.
- the notification by the notification unit 150 may be made to the user, or may be made to the system administrator or the like.
- FIG. 12 is a diagram showing a display example by the information processing system according to the fifth embodiment.
- the determination result by the determination unit 130 (that is, the output by the output unit 140) may be displayed together with the user's face image.
- the user's determination result is output as "OK" (that is, living body).
- a notification by the notification unit 150 may be displayed in addition to the determination result.
- a message "Results have changed due to score correction" is displayed below the judgment results. Note that instead of such a message, for example, a message suggesting that the reliability is low or a message prompting redoing the determination may be displayed.
- as described above, in the information processing system 10 according to the fifth embodiment, when the determination result changes before and after the score correction, a process is executed to notify that effect. In this way, the notification is made in a situation where the original score is near the determination threshold, so it is possible to know that the reliability of the determination may be low (in other words, that there is a possibility the determination is incorrect).
- An information processing system 10 according to the sixth embodiment will be described with reference to FIGS. 13 and 14.
- It should be noted that the sixth embodiment may differ from the first to fifth embodiments described above only in part of its configuration and operation, and the other parts may be the same as those of the first to fifth embodiments. Therefore, in the following, portions different from the embodiments already described will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
- FIG. 13 is a graph showing acquisition timings of the face direction and line-of-sight direction by the information processing system according to the sixth embodiment.
- the face direction and line-of-sight direction of the user are acquired while the user shakes his or her head sideways. More specifically, a plurality of images are captured while the user is shaking his or her head, and the user's face orientation and line-of-sight direction are acquired (estimated) from these images.
- the user's face direction and line-of-sight direction are acquired at the timings (P1, P2, P3, and P4 in the figure) when the user's face direction peaks.
- the user's face orientation and line-of-sight direction are obtained four times in total, two times at the peak in the right direction and two times at the peak in the left direction.
- the number of acquisitions here is an example, and the user's face direction and line-of-sight direction may be acquired a different number of times, for example more than four times.
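The peak-timing selection described above can be sketched as follows. The sign convention (right positive, left negative) and the sample values are assumptions for illustration; only the idea of picking frames where the face orientation peaks comes from the text.

```python
# Sketch of selecting the acquisition timings P1..P4: the frames at
# which the sampled face-yaw signal reaches a local extremum while the
# user shakes their head sideways.

def peak_indices(yaw):
    """Return indices of local extrema (slope sign changes) of a yaw
    series - the frames at which the face orientation peaks."""
    peaks = []
    for i in range(1, len(yaw) - 1):
        if (yaw[i] - yaw[i - 1]) * (yaw[i + 1] - yaw[i]) < 0:
            peaks.append(i)
    return peaks

# Hypothetical per-frame yaw (degrees) over two full head shakes:
yaw = [0, 20, 35, 20, 0, -18, -30, -15, 0, 22, 36, 18, 0, -19, -31, -16, 0]
```

On this sample the function finds two right-direction peaks and two left-direction peaks, matching the four timings P1 through P4 in FIG. 13.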
- FIG. 14 is a conceptual diagram showing a score calculation method by the information processing system according to the sixth embodiment.
- a plurality of differences are calculated from the face orientations and line-of-sight directions obtained at the plurality of timings. Therefore, as shown in FIG. 13, when the user's face direction and line-of-sight direction are obtained at the four timings P1, P2, P3, and P4, the difference is computed as containing four values corresponding to P1, P2, P3, and P4 (for example, as a four-dimensional vector).
- the difference calculated in this manner is input to the score calculation unit 133, and the score is calculated as its output.
- the score calculation unit 133 in this case may be configured as a learned SVM (Support Vector Machine), for example.
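The text suggests a trained SVM as the score calculator; as a minimal stand-in for its decision function, the sketch below applies a linear model to the four per-timing differences. The weights and bias are made-up placeholders for learned parameters, not values from the specification.

```python
# Stand-in for the score calculation unit 133: a linear decision
# function over the 4-dimensional difference vector (degrees at P1..P4).
# W and B are hypothetical learned parameters.

W = [0.9, 0.9, 0.9, 0.9]   # hypothetical per-timing weights
B = -0.5                   # hypothetical bias

def score_from_differences(diffs):
    """Map the four |face - gaze| differences to a scalar liveness
    score, as a linear SVM decision function would."""
    assert len(diffs) == len(W)
    return sum(w * d for w, d in zip(W, diffs)) + B
```

A live user who keeps looking at the camera while shaking the head produces large face/gaze differences at the peaks, so larger differences push the score toward "living body".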
- as described above, in the information processing system 10 according to the sixth embodiment, the face direction and line-of-sight direction are acquired at a plurality of timings from a plurality of images captured while the user is shaking his or her head. By doing so, it is possible to determine whether or not the user is a living body more accurately than when the face direction and line-of-sight direction are obtained only once.
- An information processing system 10 according to the seventh embodiment will be described with reference to FIGS. 15 to 18.
- It should be noted that the seventh embodiment may differ from the sixth embodiment described above only in part of its operation, and the other parts may be the same as those of the first to sixth embodiments. Therefore, in the following, portions different from the embodiments already described will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
- FIG. 15 is a block diagram showing the functional configuration of an information processing system according to the seventh embodiment.
- in FIG. 15, elements similar to those shown in FIG. 2 are given the same reference numerals.
- the information processing system 10 according to the seventh embodiment includes, as processing blocks for realizing its functions, a face direction acquisition unit 110, a line-of-sight direction acquisition unit 120, a determination unit 130, an output unit 140, and a head swing instruction unit 160.
- the information processing system 10 according to the seventh embodiment further includes a head swing instruction unit 160 in addition to the configuration of the first embodiment (see FIG. 2).
- the determination unit 130 according to the seventh embodiment includes a head swing determination unit 136 in addition to the threshold determination unit 131 and the difference calculation unit 132.
- the head swing determination unit 136 is configured to be able to determine whether or not the head swing (see FIG. 13 of the sixth embodiment) is performed appropriately when acquiring the user's face direction and line-of-sight direction.
- the head swing determination unit 136 determines, for example, whether the user is shaking his or her head sufficiently (for example, whether the peak face direction exceeds a predetermined angle), and whether the user is shaking his or her head in an appropriate direction (for example, whether the head is swung in another direction, such as downward, at a timing when it should be swung to the right).
- the head swing instruction unit 160 is configured to be able to output an instruction to the user to swing his or her head appropriately when the head swing determination unit 136 determines that the user's head swing is inappropriate.
- the instruction by the head swing instruction unit 160 may be performed together with the output by the output unit 140.
- the instruction by the head swing instruction unit 160 and the output by the output unit 140 may be performed using, for example, a common display.
- instructions may be output for only some of the motions. For example, if, out of P1, P2, P3, and P4 in FIG. 14, only the face orientation obtained at P3 is not appropriate, the head swing instruction unit 160 may output only the instruction for the swing motion corresponding to P3. Specifically, an instruction to shake the head more greatly in the direction corresponding to the motion at P3 may be output.
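The per-peak check and per-peak instruction described above can be sketched as follows. The 25-degree minimum amplitude, the right/left/right/left expected pattern, and the message wording are assumptions for illustration.

```python
# Sketch of the head swing determination unit 136 plus the per-motion
# instruction of unit 160: each expected peak P1..P4 is checked for the
# expected direction and for sufficient amplitude. Signed yaw in
# degrees, right positive / left negative (assumed convention).

MIN_PEAK_DEG = 25.0
EXPECTED = ["right", "left", "right", "left"]  # assumed pattern for P1..P4

def check_swings(peak_yaws):
    """Return an instruction message for each inadequate peak only."""
    problems = []
    for name, expect, yaw in zip(("P1", "P2", "P3", "P4"), EXPECTED, peak_yaws):
        direction = "right" if yaw > 0 else "left"
        if direction != expect:
            problems.append(f"{name}: please shake your head to the {expect}")
        elif abs(yaw) < MIN_PEAK_DEG:
            problems.append(f"{name}: please shake your head a little more to the {expect}")
    return problems
```

Only the peaks that fail a check produce an instruction, so a user whose P3 swing was too shallow is told to redo just that motion.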
- FIG. 16 is a flow chart showing the operation flow of the information processing system according to the seventh embodiment.
- in FIG. 16, the same reference numerals are given to the same processes as those shown in FIG. 3.
- the face orientation acquisition unit 110 first acquires the user's face orientation (step S101). Also, the line-of-sight direction acquisition unit 120 acquires the line-of-sight direction of the user (step S102).
- the head swing determination unit 136 determines whether or not the user has swung his or her head appropriately (step S701). If it is determined that the user has not swung his or her head appropriately (step S701: NO), the head swing instruction unit 160 outputs an instruction to the user to swing his or her head appropriately. Then, the processing of steps S101 and S102 (that is, acquisition of the face direction and line-of-sight direction) is executed again.
- on the other hand, if the user's head swing is determined to be appropriate (step S701: YES), the threshold determination unit 131 determines whether or not the user's face orientation acquired by the face orientation acquisition unit 110 is equal to or greater than the predetermined threshold (step S103). Note that if it is determined that the user's face orientation is not equal to or greater than the predetermined threshold (step S103: NO), the subsequent processing is omitted and the series of processing ends. On the other hand, when it is determined that the user's face orientation is equal to or greater than the predetermined threshold (step S103: YES), the difference calculation unit 132 calculates the difference between the user's face orientation and the line-of-sight direction (step S104).
- the determination unit 130 determines whether or not the user is a living body, based on the difference between the user's face orientation and line-of-sight direction (step S105). Then, the output unit 140 outputs the determination result of the determination unit 130 (step S106).
- FIG. 17 is a diagram (Part 1) showing a display example by the information processing system according to the seventh embodiment.
- FIG. 18 is a diagram (part 2) showing a display example by the information processing system according to the seventh embodiment.
- a message prompting the user to shake his or her head more greatly may be displayed together with the image of the user. For example, as in the example of FIG. 17, a message "Please shake your head a little more" may be displayed. In this way, it can be expected that the user will shake his/her head greatly, and as a result, it is possible to acquire the user's face orientation and line-of-sight direction in a state in which the face orientation is sufficiently large.
- a message indicating the direction in which the user should shake his or her head may be displayed together with the image of the user. For example, as in the example of FIG. 18, the message "Please shake your head a little more to the left" may be displayed. In this way, when, for example, the swing to the right is sufficiently large but the swing to the left is insufficient, a more accurate instruction can be issued to the user.
- An information processing system 10 according to the eighth embodiment will be described with reference to FIGS. 19 and 20.
- It should be noted that the information processing system 10 according to the eighth embodiment may differ from the above-described first to seventh embodiments only in part of its operation, and the other parts may be the same as those of the first to seventh embodiments. Therefore, in the following, portions different from the embodiments already described will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
- FIG. 19 is a block diagram showing the functional configuration of an information processing system according to the eighth embodiment.
- in FIG. 19, elements similar to those shown in FIGS. 2 and 4 are given the same reference numerals.
- the information processing system 10 includes, as processing blocks for realizing its functions, a face direction acquisition unit 110, a line-of-sight direction acquisition unit 120, a determination unit 130, and an output unit 140.
- the determination unit 130 according to the eighth embodiment particularly includes a difference calculation unit 132 , a score calculation unit 133 , a score correction unit 134 and a determination processing unit 135 .
- the information processing system 10 is not provided with the threshold determination unit 131, unlike the first to seventh embodiments described above. Therefore, determination by the determining unit 130 is performed regardless of whether the face direction of the user is equal to or greater than the predetermined threshold. A specific operation of the determination unit 130 will be described in detail below.
- FIG. 20 is a flow chart showing the operation flow of the information processing system according to the eighth embodiment.
- in FIG. 20, the same reference numerals are given to the same processes as those shown in FIGS. 3 and 5.
- the face orientation acquisition unit 110 first acquires the user's face orientation (step S801). Also, the line-of-sight direction acquisition unit 120 acquires the line-of-sight direction of the user (step S802).
- the difference calculation unit 132 calculates the difference between the user's face orientation and line-of-sight direction (step S803). Then, the score calculation unit 133 calculates a score from the difference between the user's face orientation and line-of-sight direction (step S804). After that, the score correction unit 134 corrects the score according to the size of the user's face orientation (step S805).
- the determination processing unit 135 determines whether or not the user is a living body based on the corrected score (step S806). Then, the output unit 140 outputs the determination result of the determination unit 130 (step S807).
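The eighth embodiment's flow, with no threshold gate in front of the determination, can be sketched end to end as follows. The score scaling, the linear correction, and the decision threshold are all illustrative assumptions; the specification only fixes that the correction grows with the face-orientation magnitude.

```python
# End-to-end sketch of the eighth embodiment: score from the face/gaze
# difference (S804), correction by the size of the face orientation
# (S805), then the liveness decision (S806). Constants are assumptions.

def calc_score(face_deg, gaze_deg):
    """Score calculation unit 133 (assumed scaling to roughly [0, 1])."""
    return abs(face_deg - gaze_deg) / 90.0

def correct_score(score, face_deg):
    """Score correction unit 134: subtract more for larger face angles
    (assumed linear correction)."""
    return score - 0.004 * abs(face_deg)

def is_living_body(face_deg, gaze_deg, threshold=0.2):
    """Determination processing unit 135 on the corrected score."""
    s = correct_score(calc_score(face_deg, gaze_deg), face_deg)  # S803-S805
    return s >= threshold                                        # S806
```

With these assumed constants, a large face turn with the gaze held near the camera passes, while a face and gaze turning together (as a rotated photograph would) fails.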
- the score calculated from the difference between the face direction and the line-of-sight direction is corrected according to the size of the face direction. Then, it is determined whether or not the user is a living body based on the corrected score. In this way, since the score becomes an appropriate value by correction, it becomes possible to more accurately determine whether or not the user is a living body.
- a processing method in which a program for operating the configuration of each embodiment so as to realize the functions described above is recorded on a recording medium, and the program recorded on the recording medium is read as code and executed on a computer, is also included in the scope of each embodiment. That is, a computer-readable recording medium is also included in the scope of each embodiment. In addition, not only the recording medium on which the above program is recorded, but also the program itself is included in each embodiment.
- a floppy (registered trademark) disk, hard disk, optical disk, magneto-optical disk, CD-ROM, magnetic tape, non-volatile memory card, and ROM can be used as recording media.
- the scope of each embodiment includes not only a program recorded on the recording medium that executes processing by itself, but also a program that operates on an OS and executes processing in cooperation with other software or the functions of an expansion board.
- the information processing system according to Supplementary Note 1 is an information processing system comprising: face orientation acquisition means for acquiring a face orientation of a user; line-of-sight direction acquisition means for acquiring a line-of-sight direction of the user; determination means for determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on the difference between the face orientation and the line-of-sight direction; and output means for outputting the result of the determination.
- the information processing system according to Supplementary Note 2 is the information processing system according to Supplementary Note 1, wherein the determination means includes: calculation means for calculating, from the difference between the face orientation and the line-of-sight direction, a score indicating how likely the user is to be a living body; correction means for correcting the score according to the magnitude of the face orientation; and determination processing means for determining whether or not the user is a living body based on the corrected score.
- the information processing system according to Supplementary Note 3 is the information processing system according to Supplementary Note 2, wherein the correcting means corrects such that the larger the face direction, the larger the correction amount to be subtracted from the score.
- the information processing system according to Supplementary Note 4 is the information processing system according to Supplementary Note 2 or 3, wherein the correction means stores a plurality of correction formulas for correcting the score, and corrects the score by selecting a correction formula according to information related to the user.
- the information processing system according to Supplementary Note 5 is the information processing system according to any one of Supplementary Notes 2 to 4, wherein the determination processing means determines whether or not the user is a living body based on the corrected score and also determines whether or not the user is a living body based on the score before correction, and the information processing system further comprises notification means for notifying, when the determination result of the determination processing means has changed before and after the correction, that the determination result has changed.
- the information processing system according to Supplementary Note 6 is the information processing system according to any one of Supplementary Notes 1 to 5, wherein the face orientation acquisition means and the line-of-sight direction acquisition means acquire the face orientation and the line-of-sight direction at a plurality of timings from a plurality of images captured while the user shakes his or her head sideways, and the determination means determines whether or not the user is a living body based on the differences between the face orientation and the line-of-sight direction at the plurality of timings.
- the information processing system according to Supplementary Note 7 is the information processing system according to Supplementary Note 6, further comprising instruction means for instructing the user in an appropriate way of shaking the head when it is detected that the user's way of shaking the head is not appropriate.
- the information processing system according to Supplementary Note 8 is an information processing system comprising: face orientation acquisition means for acquiring a face orientation of a user; line-of-sight direction acquisition means for acquiring a line-of-sight direction of the user; calculation means for calculating, from the difference between the face orientation and the line-of-sight direction, a score indicating how likely the user is to be a living body; correction means for correcting the score according to the magnitude of the face orientation; determination processing means for determining whether or not the user is a living body based on the corrected score; and output means for outputting the result of the determination.
- the information processing method according to Supplementary Note 9 is an information processing method of acquiring a user's face orientation, acquiring the user's line-of-sight direction, determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on the difference between the face orientation and the line-of-sight direction, and outputting the result of the determination.
- the recording medium according to Supplementary Note 10 is a recording medium on which is recorded a computer program that causes a computer to execute an information processing method of acquiring a user's face orientation, acquiring the user's line-of-sight direction, determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on the difference between the face orientation and the line-of-sight direction, and outputting the result of the determination.
- the computer program according to Supplementary Note 11 is a computer program that causes a computer to execute an information processing method of acquiring a user's face orientation, acquiring the user's line-of-sight direction, determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on the difference between the face orientation and the line-of-sight direction, and outputting the result of the determination.
Abstract
Description
The information processing system according to the first embodiment will be described with reference to FIGS. 1 to 3.
First, the hardware configuration of the information processing system 10 according to the first embodiment will be described with reference to FIG. 1. FIG. 1 is a block diagram showing the hardware configuration of the information processing system according to the first embodiment.
Next, the functional configuration of the information processing system 10 according to the first embodiment will be described with reference to FIG. 2. FIG. 2 is a block diagram showing the functional configuration of the information processing system according to the first embodiment.
Next, the operation flow of the information processing system 10 according to the first embodiment will be described with reference to FIG. 3. FIG. 3 is a flowchart showing the operation flow of the information processing system according to the first embodiment.
Next, the technical effects obtained by the information processing system 10 according to the first embodiment will be described.
The information processing system 10 according to the second embodiment will be described with reference to FIGS. 4 and 5. The second embodiment differs from the above-described first embodiment only in part of its configuration and operation, and the other parts may be the same as those of the first embodiment. Therefore, in the following, portions different from the first embodiment already described will be described in detail, and descriptions of overlapping portions will be omitted as appropriate.
First, the functional configuration of the information processing system 10 according to the second embodiment will be described with reference to FIG. 4. FIG. 4 is a block diagram showing the functional configuration of the information processing system according to the second embodiment. In FIG. 4, elements similar to those shown in FIG. 2 are given the same reference numerals.
Next, the operation flow of the information processing system 10 according to the second embodiment will be described with reference to FIG. 5. FIG. 5 is a flowchart showing the operation flow of the information processing system according to the second embodiment. In FIG. 5, processes similar to those shown in FIG. 3 are given the same reference numerals.
Next, the technical effects obtained by the information processing system 10 according to the second embodiment will be described.
The information processing system 10 according to the third embodiment will be described with reference to FIGS. 6 and 7. The third embodiment describes a specific example of the correction in the above-described second embodiment, and its other configuration and operation may be the same as those of the first and second embodiments. Therefore, in the following, portions different from the embodiments already described will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
First, the score calculated in the information processing system 10 according to the third embodiment will be described specifically with reference to FIG. 6. FIG. 6 is a graph showing an example distribution of scores calculated by the information processing system according to the third embodiment.
Next, the score correction in the information processing system 10 according to the third embodiment will be described specifically with reference to FIG. 7. FIG. 7 is a graph showing an example of a score correction method in the information processing system according to the third embodiment.
Note that A, B, and C in the above formula are predetermined coefficients, which are assumed to be obtained in advance.
Next, the technical effects obtained by the information processing system 10 according to the third embodiment will be described.
The information processing system 10 according to the fourth embodiment will be described with reference to FIGS. 8 and 9. The fourth embodiment differs from the above-described first to third embodiments only in part of its configuration and operation, and the other parts may be the same as those of the first to third embodiments. Therefore, in the following, descriptions of portions that overlap with the embodiments already described will be omitted as appropriate.
First, the functional configuration of the information processing system 10 according to the fourth embodiment will be described with reference to FIG. 8. FIG. 8 is a block diagram showing the functional configuration of the information processing system according to the fourth embodiment. In FIG. 8, elements similar to those shown in FIGS. 2 and 4 are given the same reference numerals.
Next, the operation flow of the information processing system 10 according to the fourth embodiment will be described with reference to FIG. 9. FIG. 9 is a flowchart showing the operation flow of the information processing system according to the fourth embodiment. In FIG. 9, processes similar to those shown in FIGS. 3 and 5 are given the same reference numerals.
Next, the technical effects obtained by the information processing system 10 according to the fourth embodiment will be described.
The information processing system 10 according to the fifth embodiment will be described with reference to FIGS. 10 to 12. The fifth embodiment differs from the above-described first to fourth embodiments only in part of its configuration and operation, and the other parts may be the same as those of the first to fourth embodiments. Therefore, in the following, portions different from the embodiments already described will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
First, the functional configuration of the information processing system 10 according to the fifth embodiment will be described with reference to FIG. 10. FIG. 10 is a block diagram showing the functional configuration of the information processing system according to the fifth embodiment. In FIG. 10, elements similar to the components shown in FIG. 4 are given the same reference numerals.
Next, the operation flow of the information processing system 10 according to the fifth embodiment will be described with reference to FIG. 11. FIG. 11 is a flowchart showing the operation flow of the information processing system according to the fifth embodiment. In FIG. 11, processes similar to those shown in FIG. 5 are given the same reference numerals.
Next, a display example in the information processing system 10 according to the fifth embodiment (specifically, a display example reflecting the notification by the notification unit 150) will be described with reference to FIG. 12. FIG. 12 is a diagram showing a display example by the information processing system according to the fifth embodiment.
Next, the technical effects obtained by the information processing system 10 according to the fifth embodiment will be described.
The information processing system 10 according to the sixth embodiment will be described with reference to FIGS. 13 and 14. The sixth embodiment differs from the above-described first to fifth embodiments only in part of its configuration and operation, and the other parts may be the same as those of the first to fifth embodiments. Therefore, in the following, portions different from the embodiments already described will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
First, the method of acquiring the face direction and line-of-sight direction in the information processing system 10 according to the sixth embodiment will be described specifically with reference to FIG. 13. FIG. 13 is a graph showing the acquisition timings of the face direction and line-of-sight direction by the information processing system according to the sixth embodiment.
Next, the score calculation method in the information processing system 10 according to the sixth embodiment will be described specifically with reference to FIG. 14. FIG. 14 is a conceptual diagram showing the score calculation method by the information processing system according to the sixth embodiment.
Next, the technical effects obtained by the information processing system 10 according to the sixth embodiment will be described.
The information processing system 10 according to the seventh embodiment will be described with reference to FIGS. 15 to 18. The seventh embodiment differs from the above-described sixth embodiment only in part of its operation, and the other parts may be the same as those of the first to sixth embodiments. Therefore, in the following, portions different from the embodiments already described will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
First, the functional configuration of the information processing system 10 according to the seventh embodiment will be described with reference to FIG. 15. FIG. 15 is a block diagram showing the functional configuration of the information processing system according to the seventh embodiment. In FIG. 15, elements similar to the components shown in FIG. 2 are given the same reference numerals.
Next, the operation flow of the information processing system 10 according to the seventh embodiment will be described with reference to FIG. 16. FIG. 16 is a flowchart showing the operation flow of the information processing system according to the seventh embodiment. In FIG. 16, processes similar to those shown in FIG. 3 are given the same reference numerals.
Next, display examples in the information processing system 10 according to the seventh embodiment (specifically, display examples reflecting instructions by the head swing instruction unit 160) will be described with reference to FIGS. 17 and 18. FIG. 17 is a diagram (part 1) showing a display example by the information processing system according to the seventh embodiment. FIG. 18 is a diagram (part 2) showing a display example by the information processing system according to the seventh embodiment.
Next, the technical effects obtained by the information processing system 10 according to the seventh embodiment will be described.
The information processing system 10 according to the eighth embodiment will be described with reference to FIGS. 19 and 20. The information processing system 10 according to the eighth embodiment differs from the above-described first to seventh embodiments only in part of its operation, and the other parts may be the same as those of the first to seventh embodiments. Therefore, in the following, portions different from the embodiments already described will be described in detail, and descriptions of other overlapping portions will be omitted as appropriate.
First, the functional configuration of the information processing system 10 according to the eighth embodiment will be described with reference to FIG. 19. FIG. 19 is a block diagram showing the functional configuration of the information processing system according to the eighth embodiment. In FIG. 19, elements similar to the components shown in FIGS. 2 and 4 are given the same reference numerals.
Next, the operation flow of the information processing system 10 according to the eighth embodiment will be described with reference to FIG. 20. FIG. 20 is a flowchart showing the operation flow of the information processing system according to the eighth embodiment. In FIG. 20, processes similar to those shown in FIGS. 3 and 5 are given the same reference numerals.
Next, the technical effects obtained by the information processing system 10 according to the eighth embodiment will be described.
The embodiments described above may be further described as in the following supplementary notes, but are not limited thereto.
The information processing system according to Supplementary Note 1 is an information processing system comprising: face orientation acquisition means for acquiring a face orientation of a user; line-of-sight direction acquisition means for acquiring a line-of-sight direction of the user; determination means for determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on the difference between the face orientation and the line-of-sight direction; and output means for outputting the result of the determination.
The information processing system according to Supplementary Note 2 is the information processing system according to Supplementary Note 1, wherein the determination means includes: calculation means for calculating, from the difference between the face orientation and the line-of-sight direction, a score indicating how likely the user is to be a living body; correction means for correcting the score according to the magnitude of the face orientation; and determination processing means for determining whether or not the user is a living body based on the corrected score.
The information processing system according to Supplementary Note 3 is the information processing system according to Supplementary Note 2, wherein the correction means performs correction such that the larger the face orientation, the larger the correction amount subtracted from the score.
The information processing system according to Supplementary Note 4 is the information processing system according to Supplementary Note 2 or 3, wherein the correction means stores a plurality of correction formulas for correcting the score, and corrects the score by selecting a correction formula according to information related to the user.
The information processing system according to Supplementary Note 5 is the information processing system according to any one of Supplementary Notes 2 to 4, wherein the determination processing means determines whether or not the user is a living body based on the corrected score and also determines whether or not the user is a living body based on the score before correction, and the system further comprises notification means for notifying, when the determination result of the determination processing means has changed before and after the correction, that the determination result has changed.
The information processing system according to Supplementary Note 6 is the information processing system according to any one of Supplementary Notes 1 to 5, wherein the face orientation acquisition means and the line-of-sight direction acquisition means acquire the face orientation and the line-of-sight direction at a plurality of timings from a plurality of images captured while the user shakes his or her head sideways, and the determination means determines whether or not the user is a living body based on the differences between the face orientation and the line-of-sight direction at the plurality of timings.
The information processing system according to Supplementary Note 7 is the information processing system according to Supplementary Note 6, further comprising instruction means for instructing the user in an appropriate way of shaking the head when it is detected that the user's way of shaking the head is not appropriate.
The information processing system according to Supplementary Note 8 is an information processing system comprising: face orientation acquisition means for acquiring a face orientation of a user; line-of-sight direction acquisition means for acquiring a line-of-sight direction of the user; calculation means for calculating, from the difference between the face orientation and the line-of-sight direction, a score indicating how likely the user is to be a living body; correction means for correcting the score according to the magnitude of the face orientation; determination processing means for determining whether or not the user is a living body based on the corrected score; and output means for outputting the result of the determination.
The information processing method according to Supplementary Note 9 is an information processing method of acquiring a user's face orientation, acquiring the user's line-of-sight direction, determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on the difference between the face orientation and the line-of-sight direction, and outputting the result of the determination.
The recording medium according to Supplementary Note 10 is a recording medium on which is recorded a computer program that causes a computer to execute an information processing method of acquiring a user's face orientation, acquiring the user's line-of-sight direction, determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on the difference between the face orientation and the line-of-sight direction, and outputting the result of the determination.
The computer program according to Supplementary Note 11 is a computer program that causes a computer to execute an information processing method of acquiring a user's face orientation, acquiring the user's line-of-sight direction, determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on the difference between the face orientation and the line-of-sight direction, and outputting the result of the determination.
11 Processor
16 Output device
20 Camera
110 Face orientation acquisition unit
120 Line-of-sight direction acquisition unit
130 Determination unit
131 Threshold determination unit
132 Difference calculation unit
133 Score calculation unit
134 Score correction unit
1341 Correction formula storage unit
1342 Correction formula selection unit
135 Determination processing unit
136 Head swing determination unit
140 Output unit
150 Notification unit
160 Head swing instruction unit
Claims (10)
- An information processing system comprising: face orientation acquisition means for acquiring a face orientation of a user; line-of-sight direction acquisition means for acquiring a line-of-sight direction of the user; determination means for determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on a difference between the face orientation and the line-of-sight direction; and output means for outputting a result of the determination.
- The information processing system according to claim 1, wherein the determination means includes: calculation means for calculating, from the difference between the face orientation and the line-of-sight direction, a score indicating how likely the user is to be a living body; correction means for correcting the score according to a magnitude of the face orientation; and determination processing means for determining whether or not the user is a living body based on the corrected score.
- The information processing system according to claim 2, wherein the correction means performs correction such that the larger the face orientation, the larger the correction amount subtracted from the score.
- The information processing system according to claim 2 or 3, wherein the correction means stores a plurality of correction formulas for correcting the score, and corrects the score by selecting a correction formula according to information related to the user.
- The information processing system according to any one of claims 2 to 4, wherein the determination processing means determines whether or not the user is a living body based on the corrected score and also determines whether or not the user is a living body based on the score before correction, and the information processing system further comprises notification means for notifying, when the determination result of the determination processing means has changed before and after the correction, that the determination result has changed.
- The information processing system according to any one of claims 1 to 5, wherein the face orientation acquisition means and the line-of-sight direction acquisition means acquire the face orientation and the line-of-sight direction at a plurality of timings from a plurality of images captured while the user shakes his or her head sideways, and the determination means determines whether or not the user is a living body based on the differences between the face orientation and the line-of-sight direction at the plurality of timings.
- The information processing system according to claim 6, further comprising instruction means for instructing the user in an appropriate way of shaking the head when it is detected that the user's way of shaking the head is not appropriate.
- An information processing system comprising: face orientation acquisition means for acquiring a face orientation of a user; line-of-sight direction acquisition means for acquiring a line-of-sight direction of the user; calculation means for calculating, from a difference between the face orientation and the line-of-sight direction, a score indicating how likely the user is to be a living body; correction means for correcting the score according to a magnitude of the face orientation; determination processing means for determining whether or not the user is a living body based on the corrected score; and output means for outputting a result of the determination.
- An information processing method comprising: acquiring a face orientation of a user; acquiring a line-of-sight direction of the user; determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on a difference between the face orientation and the line-of-sight direction; and outputting a result of the determination.
- A recording medium on which is recorded a computer program that causes a computer to execute an information processing method comprising: acquiring a face orientation of a user; acquiring a line-of-sight direction of the user; determining, when the face orientation is equal to or greater than a predetermined threshold, whether or not the user is a living body based on a difference between the face orientation and the line-of-sight direction; and outputting a result of the determination.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/017010 WO2022230117A1 (ja) | 2021-04-28 | 2021-04-28 | Information processing system, information processing method, and recording medium |
US18/272,564 US20240078704A1 (en) | 2021-04-28 | 2021-04-28 | Information processing system, information processing method, and recording medium |
EP21939278.4A EP4332880A4 (en) | 2021-04-28 | 2021-04-28 | INFORMATION PROCESSING SYSTEM, INFORMATION PROCESSING METHOD AND RECORDING MEDIUM |
JP2023516960A JPWO2022230117A1 (ja) | 2021-04-28 | 2021-04-28 | |
CN202180097472.1A CN117256007A (zh) | 2021-04-28 | 2021-04-28 | Information processing system, information processing method, and recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/017010 WO2022230117A1 (ja) | 2021-04-28 | 2021-04-28 | Information processing system, information processing method, and recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022230117A1 true WO2022230117A1 (ja) | 2022-11-03 |
Family
ID=83847902
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/017010 WO2022230117A1 (ja) | 2021-04-28 | 2021-04-28 | 情報処理システム、情報処理方法、及び記録媒体 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240078704A1 (ja) |
EP (1) | EP4332880A4 (ja) |
JP (1) | JPWO2022230117A1 (ja) |
CN (1) | CN117256007A (ja) |
WO (1) | WO2022230117A1 (ja) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007148968A (ja) | 2005-11-30 | 2007-06-14 | Omron Corp | 顔認証装置、セキュリティ強度変更方法およびプログラム |
JP2008015800A (ja) | 2006-07-06 | 2008-01-24 | Omron Corp | なりすまし検知装置 |
JP2014206932A (ja) * | 2013-04-15 | 2014-10-30 | オムロン株式会社 | 認証装置、認証方法、制御プログラムおよび記録媒体 |
WO2016059786A1 (ja) * | 2014-10-15 | 2016-04-21 | 日本電気株式会社 | なりすまし検知装置、なりすまし検知方法、および、記憶媒体 |
JP2017142859A (ja) | 2017-05-11 | 2017-08-17 | オムロン株式会社 | 認証装置、認証方法、制御プログラムおよび記録媒体 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
ES2495425T3 (es) * | 2011-07-11 | 2014-09-17 | Accenture Global Services Limited | Liveness detection |
US10997396B2 (en) * | 2019-04-05 | 2021-05-04 | Realnetworks, Inc. | Face liveness detection systems and methods |
2021
- 2021-04-28: JP application JP2023516960 (published as JPWO2022230117A1), status: pending
- 2021-04-28: CN application 202180097472.1 (published as CN117256007A), status: pending
- 2021-04-28: US application 18/272,564 (published as US20240078704A1), status: pending
- 2021-04-28: WO application PCT/JP2021/017010 (published as WO2022230117A1), application filing
- 2021-04-28: EP application 21939278.4 (published as EP4332880A4), status: pending
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007148968A (ja) | 2005-11-30 | 2007-06-14 | Omron Corp | 顔認証装置、セキュリティ強度変更方法およびプログラム |
JP2008015800A (ja) | 2006-07-06 | 2008-01-24 | Omron Corp | なりすまし検知装置 |
JP2014206932A (ja) * | 2013-04-15 | 2014-10-30 | オムロン株式会社 | 認証装置、認証方法、制御プログラムおよび記録媒体 |
WO2016059786A1 (ja) * | 2014-10-15 | 2016-04-21 | 日本電気株式会社 | なりすまし検知装置、なりすまし検知方法、および、記憶媒体 |
JP2020194608A (ja) | 2014-10-15 | 2020-12-03 | 日本電気株式会社 | 生体検知装置、生体検知方法、および、生体検知プログラム |
JP2017142859A (ja) | 2017-05-11 | 2017-08-17 | オムロン株式会社 | 認証装置、認証方法、制御プログラムおよび記録媒体 |
Non-Patent Citations (1)
Title |
---|
See also references of EP4332880A4 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022230117A1 (ja) | 2022-11-03 |
EP4332880A4 (en) | 2024-05-29 |
US20240078704A1 (en) | 2024-03-07 |
EP4332880A1 (en) | 2024-03-06 |
CN117256007A (zh) | 2023-12-19 |
Legal Events
- 121: EP: the EPO has been informed by WIPO that EP was designated in this application. Ref document number: 21939278; Country of ref document: EP; Kind code of ref document: A1
- ENP: Entry into the national phase. Ref document number: 2023516960; Country of ref document: JP; Kind code of ref document: A
- WWE: WIPO information: entry into national phase. Ref document number: 18272564; Country of ref document: US
- WWE: WIPO information: entry into national phase. Ref document number: 202180097472.1; Country of ref document: CN
- WWE: WIPO information: entry into national phase. Ref document number: 2021939278; Country of ref document: EP
- NENP: Non-entry into the national phase. Ref country code: DE
- ENP: Entry into the national phase. Ref document number: 2021939278; Country of ref document: EP; Effective date: 20231128