CN112966666A - Living body identification method and device, electronic equipment and server

Living body identification method and device, electronic equipment and server

Info

Publication number
CN112966666A
CN112966666A
Authority
CN
China
Prior art keywords
human body
picture
acquiring
key points
positions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110357582.9A
Other languages
Chinese (zh)
Inventor
徐炎 (Xu Yan)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alipay Hangzhou Information Technology Co Ltd
Original Assignee
Alipay Hangzhou Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alipay Hangzhou Information Technology Co Ltd filed Critical Alipay Hangzhou Information Technology Co Ltd
Priority to CN202110357582.9A
Publication of CN112966666A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/40 Spoof detection, e.g. liveness detection
    • G06V40/45 Detection of the body part being alive
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

Abstract

The embodiments of this specification provide a living body identification method and device, an electronic device, and a server. In the living body identification method, after detecting a trigger operation of a user, the electronic device responds to the trigger operation and acquires pictures through its camera, stops acquiring pictures once a predetermined acquisition-stop condition is met, and sends the acquired pictures to the server; the server then performs living body identification on the user according to the pictures. In this way, injection attacks can be identified from images acquired without the user's awareness, and the injection attacks can be intercepted.

Description

Living body identification method and device, electronic equipment and server
[ technical field ]
The embodiment of the specification relates to the technical field of internet, in particular to a living body identification method and device, electronic equipment and a server.
[ background of the invention ]
In an eKYC (electronic Know Your Customer) identity-verification scenario, a face-scan login scenario, or a face-scan payment scenario, it is necessary to check whether the acquired face image and the comparison source correspond to the same face, and to determine whether the acquired face image is an attack, for example a copy, a printout, and/or a mask. In currently known living body identification algorithms, the server may interact with the client to require the user to perform actions such as blinking and/or shaking the head, or may perform living body recognition using a silent liveness model.
However, existing living body identification methods cannot solve the problem of injection attacks. An injection attack means that an attacker hooks the front camera of the electronic device and injects a picture or video prepared in advance into an application program (APP) to achieve an illegal purpose. Therefore, there is a need for a living body identification scheme that defends against and intercepts injection attacks.
[ summary of the invention ]
The embodiments of this specification provide a living body identification method and device, an electronic device, and a server, so that injection attacks can be identified from images acquired without the user's awareness, thereby achieving the purpose of intercepting injection attacks.
In a first aspect, an embodiment of the present specification provides a living body identification method, including: after the triggering operation of a user is detected, responding to the triggering operation, and acquiring a picture through a camera of the electronic equipment; stopping capturing the picture if a predetermined capture stop condition is met; and sending the acquired picture to a server so that the server can identify the living body of the user according to the picture.
In the above living body identification method, after detecting a trigger operation of a user, the electronic device responds to the trigger operation and acquires pictures through its camera; once a predetermined acquisition-stop condition is met, it stops acquiring pictures and sends the acquired pictures to the server, and the server then performs living body identification on the user according to the pictures. In this way, injection attacks can be identified from pictures acquired without the user's awareness and intercepted.
In one possible implementation manner, the acquiring the picture by the camera of the electronic device includes: and acquiring pictures according to a preset acquisition frequency through a camera of the electronic equipment.
In one possible implementation manner, the predetermined condition for stopping acquisition includes: detecting that the user triggers an operation of acquiring a face image; or after detecting that the user triggers the operation of acquiring the face image, acquiring the face image meeting the preset image requirement.
In a second aspect, an embodiment of the present specification provides a living body identification method, including: acquiring a picture acquired by electronic equipment, and acquiring a target picture including a human body in the picture; acquiring key points of the human body from the target picture; detecting whether the positions of key points of the human body between two continuous frames of target pictures are changed or not; and if the positions of the key points of the human body are changed and the change directions of the positions of the key points of the human body are consistent, determining that the human body in the target picture is a living body image.
In the above living body identification method, after the server obtains the pictures acquired by the electronic device and the target pictures containing a human body among them, it obtains the key points of the human body from the target pictures and detects whether the positions of these key points change between two consecutive frames of target pictures. If the positions of the key points change and the directions of change are consistent, the server determines that the human body in the target pictures is a living body image. In this way, injection attacks can be identified from pictures acquired without the user's awareness and intercepted.
In one possible implementation manner, after detecting whether the position of the key point of the human body between two consecutive target pictures changes, the method further includes: and if the positions of the key points of the human body are not changed or the positions of the key points of the human body are changed but the change directions of the positions of the key points of the human body are inconsistent, determining that the target picture is an injection attack.
In one possible implementation manner, the key points of the human body include one or a combination of the following: nose, eyes, shoulders and arms.
In a third aspect, embodiments of the present specification provide a living body identification apparatus provided in an electronic device, the living body identification apparatus including: the detection module is used for detecting the triggering operation of a user; the acquisition module is used for responding to the trigger operation after the detection module detects the trigger operation of the user and acquiring pictures through a camera of the electronic equipment; and stopping capturing the picture when a predetermined capture stop condition is met; and the sending module is used for sending the picture acquired by the acquisition module to a server so that the server can identify the living body of the user according to the picture.
In one possible implementation manner, the acquisition module is specifically configured to acquire the picture according to a predetermined acquisition frequency through a camera of the electronic device.
In one possible implementation manner, the predetermined condition for stopping acquisition includes: detecting that the user triggers an operation of acquiring a face image; or after detecting that the user triggers the operation of acquiring the face image, acquiring the face image meeting the preset image requirement.
In a fourth aspect, an embodiment of the present specification provides a living body identification apparatus provided in a server, the living body identification apparatus including: the acquisition module is used for acquiring pictures acquired by electronic equipment and acquiring target pictures including human bodies in the pictures; acquiring key points of a human body from a target picture comprising the human body; the detection module is used for detecting whether the positions of key points of the human body between two continuous frames of target pictures are changed or not; and the determining module is used for determining that the human body in the target picture is a living body image when the positions of the key points of the human body are changed and the change directions of the positions of the key points of the human body are consistent.
In one possible implementation manner, the determining module is further configured to determine that the target picture is an injection attack when the position of the key point of the human body is not changed or the position of the key point of the human body is changed but the change directions of the positions of the key points of the human body are not consistent.
In a fifth aspect, an embodiment of the present specification provides an electronic device, including: at least one processor; and at least one memory communicatively coupled to the processor, wherein: the memory stores program instructions executable by the processor, the processor calling the program instructions to be able to perform the method provided by the first aspect.
In a sixth aspect, embodiments of the present specification provide a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the method provided in the first aspect.
In a seventh aspect, an embodiment of the present specification provides a server, including: at least one processor; and at least one memory communicatively coupled to the processor, wherein: the memory stores program instructions executable by the processor, the processor calling the program instructions to be able to perform the method provided by the second aspect.
In an eighth aspect, embodiments of the present specification provide a non-transitory computer-readable storage medium storing computer instructions that cause the computer to perform the method provided by the second aspect.
It should be understood that the third, fifth, and sixth aspects of the embodiments in this specification are consistent with the technical solution of the first aspect of the embodiments in this specification, and beneficial effects achieved by various aspects and corresponding possible implementation manners are similar and will not be described again;
the fourth, seventh and eighth aspects of the embodiments in this specification are consistent with the technical solution of the second aspect of the embodiments in this specification, and the beneficial effects obtained by each aspect and the corresponding possible implementation are similar, and are not described again.
[ description of the drawings ]
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and it is obvious for those skilled in the art to obtain other drawings based on the drawings without creative efforts.
FIG. 1 is a flow chart of a method for living body identification provided in one embodiment of the present description;
FIG. 2 is a flow chart of a method for identifying a living body according to another embodiment of the present disclosure;
FIG. 3 is a schematic structural diagram of a living body identification device according to an embodiment of the present disclosure;
FIG. 4 is a schematic structural diagram of a living body identification device according to another embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
[ detailed description of the embodiments ]
For better understanding of the technical solutions in the present specification, the following detailed description of the embodiments of the present specification is provided with reference to the accompanying drawings.
It should be understood that the described embodiments are only a few embodiments of the present specification, and not all embodiments. All other embodiments obtained by a person skilled in the art based on the embodiments in the present specification without any inventive step are within the scope of the present specification.
The terminology used in the embodiments of the specification is for the purpose of describing particular embodiments only and is not intended to be limiting of the specification. As used in the specification examples and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
An injection attack means that an attacker hooks the front camera of an electronic device and injects a picture or video prepared in advance into an APP to achieve an illegal purpose. Because the living body identification methods provided in the prior art cannot solve the problem of injection attacks, the embodiments of this specification provide a living body identification method.
Fig. 1 is a flowchart of a living body identification method according to an embodiment of the present disclosure, and as shown in fig. 1, the living body identification method may include:
step 102, after detecting a trigger operation of a user, the electronic device responds to the trigger operation and acquires a picture through a camera of the electronic device.
In this embodiment, the trigger operation may be an operation in which the user clicks the icon of a target APP; after this operation is detected, the electronic device runs the target APP and automatically acquires pictures through its camera. Alternatively, the trigger operation may be an operation in which the user clicks a face-scan login or login icon on a display interface of the target APP; after this operation is detected, the electronic device automatically acquires pictures through its camera. These are only two examples, and the specific form of the trigger operation is not limited in this embodiment. Because the trigger operations above are simply ordinary operations the user performs when using the target APP, the user does not consciously trigger picture acquisition and is therefore unaware that the electronic device is acquiring pictures.
The camera of the electronic device may be a front camera or a rear camera in the electronic device, which is not limited in this embodiment, but this embodiment is described by taking the front camera as an example.
In a specific implementation, acquiring pictures through the camera of the electronic device may be: acquiring pictures at a predetermined acquisition frequency through the camera of the electronic device. The predetermined acquisition frequency may be set according to system performance and/or implementation requirements and is not limited in this embodiment; for example, it may be one frame of picture per second. That is, in this embodiment, to reduce resource consumption of the electronic device and avoid affecting its performance, the electronic device may be configured to acquire one frame per second at a relatively low resolution, for example 120 × 150.
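As an illustration of the acquisition step just described (a minimal sketch, not the patent's implementation), the following Python fragment captures one low-resolution frame per second through the camera. It assumes an OpenCV-accessible camera at device index 0, and the time-based stop condition in the usage line is only a stand-in for the acquisition-stop condition described below.

    import time
    import cv2  # assumes OpenCV and a camera exposed as device index 0


    def capture_frames(should_stop, capture_interval=1.0, size=(120, 150)):
        """Silently capture low-resolution frames until should_stop() returns True."""
        cap = cv2.VideoCapture(0)  # front camera, assumed to be device 0
        frames = []
        try:
            while not should_stop():
                ok, frame = cap.read()
                if ok:
                    # keep the resolution low (e.g. 120 x 150) to limit resource consumption
                    frames.append(cv2.resize(frame, size))
                time.sleep(capture_interval)  # predetermined acquisition frequency: one frame per second
        finally:
            cap.release()
        return frames


    # Example: capture for at most 10 seconds (time-based stop used here only for demonstration).
    deadline = time.time() + 10
    frames = capture_frames(should_stop=lambda: time.time() > deadline)

The frames returned here would then be sent to the server for living body identification, as in step 106 below.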
And 104, if the preset collection stopping condition is met, stopping collecting the picture by the electronic equipment.
The predetermined acquisition-stop condition may include: detecting that the user triggers an operation of acquiring a face image; or, after detecting that the user triggers the operation of acquiring the face image, acquiring a face image that meets a predetermined image requirement. The operation of triggering face-image acquisition may be the user clicking an image acquisition button on the current display interface of the electronic device, or another form of trigger operation; its specific form is not limited in this embodiment. The predetermined image requirement may be set according to system performance and/or implementation requirements and is not limited in this embodiment; for example, it may include one or a combination of the following: the face in the face image is clear, the face is frontal (i.e., not turned or lowered), and the facial features are clear and unoccluded.
That is, in a specific implementation, the electronic device may stop acquiring pictures after the user clicks the image acquisition button on its current display interface, or after the user clicks that button and the electronic device has acquired a face image that meets the predetermined image requirement.
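To connect this stop condition to the capture sketch above, the hedged helper below expresses the two variants of the predetermined acquisition-stop condition. CaptureState and its fields are hypothetical stand-ins for the APP's actual UI state and image-quality check, not names from the patent.

    from dataclasses import dataclass


    @dataclass
    class CaptureState:
        """Hypothetical client-side state, used only for this illustration."""
        capture_button_clicked: bool = False    # the user triggered the face-image acquisition operation
        qualifying_face_captured: bool = False  # a face image meeting the predetermined requirement was acquired
        require_qualifying_face: bool = True    # second variant: also wait for a qualifying face image


    def should_stop_capture(state: CaptureState) -> bool:
        """Predetermined acquisition-stop condition, in either of the two variants above."""
        if not state.capture_button_clicked:
            return False
        if state.require_qualifying_face:
            return state.qualifying_face_captured
        return True


    # Used with the earlier capture sketch:
    #   state = CaptureState()
    #   frames = capture_frames(should_stop=lambda: should_stop_capture(state))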
And 106, the electronic equipment sends the acquired picture to a server so that the server can identify the living body of the user according to the picture.
In the above living body identification method, after detecting a trigger operation of a user, the electronic device responds to the trigger operation and acquires pictures through its camera; once the predetermined acquisition-stop condition is met, it stops acquiring pictures and sends the acquired pictures to the server, and the server then performs living body identification on the user according to the pictures. In this way, injection attacks can be identified from pictures acquired without the user's awareness and intercepted.
Fig. 2 is a flowchart of a living body identification method according to another embodiment of the present disclosure, and as shown in fig. 2, the living body identification method may include:
step 202, the server acquires a picture acquired by the electronic equipment and acquires a target picture including a human body in the picture.
And step 204, the server acquires the key points of the human body from the target picture.
Wherein, the key points of the human body may include one or a combination of the following: nose, eyes, shoulders and arms.
In step 206, the server detects whether the positions of the key points of the human body between two consecutive target pictures are changed. Then step 208 or step 210 is performed.
In a specific implementation, the server may use a Convolutional Neural Network (CNN) to detect whether the positions of the key points of the human body change between two consecutive target pictures. Specifically, the server may input multiple frames of target pictures acquired by the electronic device into the prediction module of the CNN, whose output is a multi-label result; for example, if the position of the nose changes, the direction of that change is output as well.
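The patent performs this step with a CNN multi-label predictor; purely to illustrate the comparison being made, the sketch below instead compares keypoint coordinates between consecutive frames directly. The keypoint names and the idea of obtaining (x, y) coordinates from some detector are assumptions made for this illustration.

    from typing import Dict, Tuple

    Point = Tuple[float, float]
    KEYPOINTS = ("nose", "left_eye", "right_eye",
                 "left_shoulder", "right_shoulder", "left_arm", "right_arm")


    def keypoint_motion(prev: Dict[str, Point], curr: Dict[str, Point],
                        eps: float = 1.0) -> Dict[str, Point]:
        """Return the (dx, dy) displacement of each key point between two consecutive target pictures.

        `prev` and `curr` map keypoint names to (x, y) positions produced by some
        keypoint detector; displacements smaller than `eps` pixels count as "no change".
        """
        motion = {}
        for name in KEYPOINTS:
            if name in prev and name in curr:
                dx = curr[name][0] - prev[name][0]
                dy = curr[name][1] - prev[name][1]
                if abs(dx) > eps or abs(dy) > eps:
                    motion[name] = (dx, dy)
        return motion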
And step 208, if the positions of the key points of the human body are changed and the change directions of the positions of the key points of the human body are consistent, the server determines that the human body in the target picture is a living body image.
Specifically, when detecting whether the positions of the key points of the human body change, the server may detect whether the positions of one or more key points among the nose, eyes, shoulders, and arms change. The server may determine that the human body in the target picture is a living body image when the position of a single one of these key points changes and its direction of change is consistent, or it may determine that the human body in the target picture is a living body image when the positions of multiple of these key points change and the directions of change of the multiple key points are consistent.
Step 210, if the position of the key point of the human body is not changed or the position of the key point of the human body is changed but the change directions of the positions of the key points of the human body are not consistent, determining that the target picture is an injection attack.
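As a sketch of the decision rule in steps 208 and 210 (again not the patent's CNN-based implementation), the function below treats the frames as showing a living body only when at least one key point moved and all moving key points moved in a consistent direction. Measuring consistency by displacement angle, and the pi/4 tolerance, are assumptions made for illustration.

    import math


    def is_live(motion: dict) -> bool:
        """Decide living body vs. injection attack from keypoint displacements.

        `motion` maps keypoint names to (dx, dy) displacements, e.g. as returned by
        keypoint_motion() above. No movement, or movement in inconsistent directions,
        is treated as an injection attack (returns False).
        """
        if not motion:
            return False  # no key point moved: treat as injection attack
        angles = [math.atan2(dy, dx) for dx, dy in motion.values()]
        reference = angles[0]
        # all displacement directions must stay within an assumed pi/4 tolerance of each other
        return all(abs(math.atan2(math.sin(a - reference), math.cos(a - reference))) < math.pi / 4
                   for a in angles)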
In the above living body identification method, after the server obtains the pictures acquired by the electronic device and the target pictures containing a human body among them, it obtains the key points of the human body from the target pictures and detects whether the positions of these key points change between two consecutive frames of target pictures. If the positions of the key points change and the directions of change are consistent, the server determines that the human body in the target pictures is a living body image. In this way, injection attacks can be identified from pictures acquired without the user's awareness and intercepted.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
Fig. 3 is a schematic structural diagram of a living body identification device provided in an electronic apparatus according to an embodiment of the present disclosure, and as shown in fig. 3, the living body identification device may include: a detection module 31, an acquisition module 32 and a sending module 33;
the detection module 31 is configured to detect a trigger operation of a user;
the acquisition module 32 is configured to, after the detection module 31 detects a trigger operation of a user, respond to the trigger operation and acquire a picture through a camera of the electronic device; when a preset collection stopping condition is met, stopping collecting the picture;
and a sending module 33, configured to send the picture acquired by the acquisition module 32 to a server, so that the server performs living body identification on the user according to the picture.
In this embodiment, the collecting module 32 is specifically configured to collect the pictures according to a predetermined collecting frequency through a camera of the electronic device.
Wherein the predetermined stop collecting condition includes: detecting that a user triggers an operation of acquiring a face image; or after detecting that the user triggers the operation of acquiring the face image, acquiring the face image meeting the preset image requirement.
The living body identification apparatus provided by the embodiment shown in fig. 3 can be used for executing the technical scheme of the method embodiment shown in fig. 1 in the present specification, and the implementation principle and the technical effect thereof can be further referred to the related description in the method embodiment.
Fig. 4 is a schematic structural diagram of a living body identification device provided in a server according to another embodiment of the present disclosure, and as shown in fig. 4, the living body identification device may include: an acquisition module 41, a detection module 42 and a determination module 43;
the acquiring module 41 is configured to acquire a picture acquired by an electronic device and acquire a target picture including a human body in the picture; acquiring key points of the human body from the target picture;
a detection module 42, configured to detect whether positions of the key points of the human body between two consecutive frames of target pictures change;
and a determining module 43, configured to determine that the human body in the target picture is a living body image when the positions of the key points of the human body are changed and the changing directions of the positions of the key points of the human body are consistent.
Further, the determining module 43 is further configured to determine that the target picture is an injection attack when the position of the key point of the human body is not changed or the position of the key point of the human body is changed but the change directions of the positions of the key points of the human body are not consistent.
The living body identification apparatus provided in the embodiment shown in fig. 4 can be used to implement the technical solution of the method embodiment shown in fig. 2 of this specification; for its implementation principle and technical effects, reference may be made to the related description in that method embodiment.
Fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present specification, and as shown in fig. 5, the electronic device may include at least one processor; and at least one memory communicatively coupled to the processor, wherein: the memory stores program instructions executable by the processor, and the processor calls the program instructions to execute the living body identification method provided by the embodiment shown in fig. 1 in the present specification.
The electronic device may be an intelligent terminal device such as a smart phone, a tablet computer, or a smart watch, and the form of the electronic device is not limited in this embodiment.
FIG. 5 illustrates a block diagram of an exemplary electronic device suitable for use in implementing embodiments of the present specification. The electronic device shown in fig. 5 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present specification.
As shown in fig. 5, the electronic device is in the form of a general purpose computing device. Components of the electronic device may include, but are not limited to: one or more processors 410, a communication interface 420, a memory 430, and a communication bus 440 that connects the various components (including the memory 430, the communication interface 420, and the processing unit 410).
Communication bus 440 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, or a local bus using any of a variety of bus architectures. For example, communication bus 440 may include, but is not limited to, an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus.
Electronic devices typically include a variety of computer system readable media. Such media may be any available media that is accessible by the electronic device and includes both volatile and nonvolatile media, removable and non-removable media.
Memory 430 may include computer system readable media in the form of volatile memory, such as Random Access Memory (RAM) and/or cache memory. Memory 430 may include at least one program product having a set (e.g., at least one) of program modules configured to carry out the functions of the embodiment shown in fig. 1 of this specification.
A program/utility having a set (at least one) of program modules may be stored in memory 430; such program modules include, but are not limited to, an operating system, one or more application programs, other program modules, and program data, each of which, or some combination of which, may include an implementation of a network environment. The program modules generally carry out the functions and/or methods of the embodiment described in fig. 1 of this specification.
The processor 410 executes various functional applications and data processing by executing programs stored in the memory 430, for example, implementing the living body identification method provided by the embodiment shown in fig. 1 in this specification.
Embodiments of the present specification further provide a server, where the server may include at least one processor; and at least one memory communicatively coupled to the processor, wherein: the memory stores program instructions executable by the processor, and the processor calls the program instructions to execute the living body identification method provided by the embodiment shown in fig. 2 in the present specification. The server may be a cloud server, and the type of the server is not limited in this embodiment. The server in this embodiment may be implemented by using the structure shown in fig. 5, and is not described herein again.
Embodiments of the present description provide a non-transitory computer-readable storage medium storing computer instructions that cause a computer to perform a liveness identification method as provided by the embodiment shown in fig. 1 of the present description.
Embodiments of the present description provide a non-transitory computer-readable storage medium storing computer instructions that cause a computer to perform a liveness identification method as provided by the embodiment shown in fig. 2 of the present description.
The non-transitory computer readable storage medium described above may take any combination of one or more computer readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM) or flash memory, an optical fiber, a portable compact disc read only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, Radio Frequency (RF), etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present description may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
In the description of the specification, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the specification. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present specification, "a plurality" means at least two, e.g., two, three, etc., unless explicitly defined otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present description in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the embodiments of the present description.
The word "if" as used herein may be interpreted as "at … …" or "when … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It should be noted that the terminal referred to in the embodiments of the present specification may include, but is not limited to, a Personal Computer (PC), a Personal Digital Assistant (PDA), a wireless handheld device, a tablet computer (tablet computer), a mobile phone, an MP3 player, an MP4 player, and the like.
In the several embodiments provided in this specification, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
In addition, functional units in the embodiments of the present description may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to enable a computer device (which may be a personal computer, a server, or a network device) or a processor (processor) to execute some steps of the methods described in the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
The above description is only a preferred embodiment of the present disclosure, and should not be taken as limiting the present disclosure, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.

Claims (15)

1. A living body identification method, comprising:
after the triggering operation of a user is detected, responding to the triggering operation, and acquiring a picture through a camera of the electronic equipment;
stopping capturing the picture if a predetermined capture stop condition is met;
and sending the acquired picture to a server so that the server can identify the living body of the user according to the picture.
2. The method of claim 1, wherein the capturing of the picture by the camera of the electronic device comprises:
and acquiring pictures according to a preset acquisition frequency through a camera of the electronic equipment.
3. The method according to claim 1 or 2, wherein the predetermined stop acquisition condition comprises: detecting that the user triggers an operation of acquiring a face image; or after detecting that the user triggers the operation of acquiring the face image, acquiring the face image meeting the preset image requirement.
4. A living body identification method, comprising:
acquiring a picture acquired by electronic equipment, and acquiring a target picture including a human body in the picture;
acquiring key points of the human body from the target picture;
detecting whether the positions of key points of the human body between two continuous frames of target pictures are changed or not;
and if the positions of the key points of the human body are changed and the change directions of the positions of the key points of the human body are consistent, determining that the human body in the target picture is a living body image.
5. The method according to claim 4, wherein after detecting whether the positions of the key points of the human body are changed between two consecutive target pictures, the method further comprises:
and if the positions of the key points of the human body are not changed or the positions of the key points of the human body are changed but the change directions of the positions of the key points of the human body are inconsistent, determining that the target picture is an injection attack.
6. The method of claim 4 or 5, wherein the key points of the human body comprise one or a combination of: nose, eyes, shoulders and arms.
7. A living body identification apparatus provided in an electronic device, the living body identification apparatus comprising:
the detection module is used for detecting the triggering operation of a user;
the acquisition module is used for responding to the trigger operation after the detection module detects the trigger operation of the user and acquiring pictures through a camera of the electronic equipment; and stopping capturing the picture when a predetermined capture stop condition is met;
and the sending module is used for sending the picture acquired by the acquisition module to a server so that the server can identify the living body of the user according to the picture.
8. The apparatus of claim 7, wherein,
the acquisition module is specifically used for acquiring pictures according to a preset acquisition frequency through a camera of the electronic equipment.
9. The apparatus of claim 7 or 8, wherein the predetermined stop acquisition condition comprises: detecting that the user triggers an operation of acquiring a face image; or after detecting that the user triggers the operation of acquiring the face image, acquiring the face image meeting the preset image requirement.
10. A living body identification apparatus provided in a server, the living body identification apparatus comprising:
the acquisition module is used for acquiring pictures acquired by electronic equipment and acquiring target pictures including human bodies in the pictures; acquiring key points of the human body from the target picture;
the detection module is used for detecting whether the positions of key points of the human body between two continuous frames of target pictures are changed or not;
and the determining module is used for determining that the human body in the target picture is a living body image when the positions of the key points of the human body are changed and the change directions of the positions of the key points of the human body are consistent.
11. The apparatus of claim 10, wherein,
the determining module is further configured to determine that the target picture is an injection attack when the positions of the key points of the human body are not changed or the positions of the key points of the human body are changed but the change directions of the positions of the key points of the human body are inconsistent.
12. An electronic device, comprising:
at least one processor; and
at least one memory communicatively coupled to the processor, wherein:
the memory stores program instructions executable by the processor, the processor invoking the program instructions to perform the method of any of claims 1 to 3.
13. A non-transitory computer-readable storage medium storing computer instructions that cause the computer to perform the method of any of claims 1-3.
14. A server, comprising:
at least one processor; and
at least one memory communicatively coupled to the processor, wherein:
the memory stores program instructions executable by the processor, the processor invoking the program instructions to perform the method of any of claims 4 to 6.
15. A non-transitory computer readable storage medium storing computer instructions that cause the computer to perform the method of any of claims 4 to 6.
CN202110357582.9A 2021-04-01 2021-04-01 Living body identification method and device, electronic equipment and server Pending CN112966666A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110357582.9A CN112966666A (en) 2021-04-01 2021-04-01 Living body identification method and device, electronic equipment and server

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110357582.9A CN112966666A (en) 2021-04-01 2021-04-01 Living body identification method and device, electronic equipment and server

Publications (1)

Publication Number Publication Date
CN112966666A true CN112966666A (en) 2021-06-15

Family

ID=76280838

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110357582.9A Pending CN112966666A (en) 2021-04-01 2021-04-01 Living body identification method and device, electronic equipment and server

Country Status (1)

Country Link
CN (1) CN112966666A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201845368U (en) * 2010-09-21 2011-05-25 北京海鑫智圣技术有限公司 Human face and fingerprint access control with living body detection function
CN109389002A (en) * 2017-08-02 2019-02-26 阿里巴巴集团控股有限公司 Biopsy method and device
CN110688878A (en) * 2018-07-06 2020-01-14 北京三快在线科技有限公司 Living body identification detection method, living body identification detection device, living body identification detection medium, and electronic device
CN112364825A (en) * 2020-11-30 2021-02-12 支付宝(杭州)信息技术有限公司 Method, apparatus and computer-readable storage medium for face recognition


Similar Documents

Publication Publication Date Title
KR20170019823A (en) Method for processing image and electronic device supporting the same
KR20180003235A (en) Electronic device and image capturing method thereof
KR20170136920A (en) Method for Outputting Screen and the Electronic Device supporting the same
US20190051147A1 (en) Remote control method, apparatus, terminal device, and computer readable storage medium
KR102482067B1 (en) Electronic apparatus and operating method thereof
KR102437698B1 (en) Apparatus and method for encoding image thereof
US10491884B2 (en) Image processing method and electronic device supporting the same
CN112712498A (en) Vehicle damage assessment method and device executed by mobile terminal, mobile terminal and medium
CN112270302A (en) Limb control method and device and electronic equipment
CN113255516A (en) Living body detection method and device and electronic equipment
CN110705356B (en) Function control method and related equipment
CN111401206A (en) Panorama sharing method, system, device and medium
KR102457247B1 (en) Electronic device for processing image and method for controlling thereof
JP7150896B2 (en) Face recognition method and device, electronic device, and storage medium
CN113342170A (en) Gesture control method, device, terminal and storage medium
CN112417209A (en) Real-time video annotation method, system, terminal and medium based on browser
CN110730305A (en) Multi-source snapshot image processing and accessing method and device based on blocking queue
CN110751120A (en) Detection method and device and electronic equipment
CN112966666A (en) Living body identification method and device, electronic equipment and server
KR102317624B1 (en) Electronic device and method for processing image of the same
CN114740975A (en) Target content acquisition method and related equipment
KR102568387B1 (en) Electronic apparatus and method for processing data thereof
CN114071024A (en) Image shooting method, neural network training method, device, equipment and medium
CN114945072A (en) Dual-camera frame synchronization processing method and device, user terminal and storage medium
CN106296722B (en) Information processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210615