CN114298081A - Target behavior identity recognition method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN114298081A (publication number); CN202111149623.1A (application number)
- Authority
- CN
- China
- Prior art keywords
- information
- coordinate
- identity
- behavior
- target behavior
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Analysis (AREA)
Abstract
The application relates to a method and apparatus for identifying the identity behind a target behavior, an electronic device, and a storage medium. The method includes: determining identity coordinate information from all the face information collected under the current personnel arrangement mode and the target behavior information matched with each piece of face information, where the target behavior information includes behavior coordinates in a camera coordinate system and the identity coordinate information includes each person's identity and the corresponding behavior coordinates in that coordinate system; when no face image matching the current target behavior can be recognized, obtaining the coordinate position of the current target behavior in the camera coordinate system; and matching that coordinate position against the identity coordinate information to obtain the identity corresponding to the current target behavior. The method increases the amount of valid behavior data captured and thereby ensures the accuracy of analysis results supported by that data.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an identity recognition method and apparatus for a target behavior, an electronic device, and a storage medium.
Background
As society develops, user identity recognition receives increasing attention. In current face recognition scenarios, however, the accuracy of a recognition result cannot be demonstrated without valid data to support it. With advances in video capture, face recognition, and behavior recognition, the target behaviors of target persons can be collected and multi-dimensional information about those persons can be judged rapidly through background image recognition, for subsequent analysis in fields such as security, attendance, and door locks. The typical process is: collect the behavior information of a target person together with the corresponding face information, compare the face information with face pictures pre-stored in a database, determine the identity corresponding to the behavior information, and finally store that identity in the database.
When identity is recognized in this way, the collected face image is compared with the face images pre-stored on the device; if the comparison succeeds, the behavior is recorded, and if it fails, the data is discarded. In practice, however, behavior data is often collected while the accompanying face image is invalid, for example when the target person keeps his or her head lowered for a period of time. In such cases the behavior data is lost, only part of the collected data remains, and statistical analysis performed on that partial data yields results of low accuracy.
Disclosure of Invention
The embodiment of the application provides an identity recognition method and device of a target behavior, electronic equipment and a storage medium, and aims to at least solve the problem of low accuracy of statistical analysis based on target behavior information in the related art.
In a first aspect, an embodiment of the present application provides an identity recognition method for a target behavior, including:
determining identity coordinate information according to all the face information acquired in the current personnel arrangement mode and target behavior information matched with each face information; the target behavior information comprises behavior coordinates under a camera coordinate system, and the identity coordinate information comprises identities and corresponding behavior coordinates of all personnel under the camera coordinate system;
when a face image matched with the current target behavior cannot be identified, acquiring a coordinate position corresponding to the current target behavior in a camera coordinate system;
and matching the coordinate position with the identity coordinate information to obtain identity information corresponding to the current target behavior.
In some embodiments, determining the identity coordinate information according to all the face information acquired in the current people arrangement mode and the target behavior information matched with each face information includes:
acquiring face information under a current personnel arrangement mode, and identifying the face information to obtain identity information;
determining a plurality of potential position coordinates corresponding to the face information based on the target behavior information matched with the face information;
and determining identity coordinate information corresponding to the identity information based on the identity information and the plurality of potential position coordinates.
In some of these embodiments, determining, based on the identity information and a plurality of the potential location coordinates, identity coordinate information corresponding to the identity information comprises:
determining a coordinate frame based on a plurality of the potential location coordinates;
and determining identity coordinate information corresponding to the identity information based on the identity information and the coordinate interval in which the coordinate frame is located.
In some of these embodiments, determining a coordinate frame based on a plurality of the potential location coordinates comprises:
determining at least two first boundary coordinates based on a plurality of the potential location coordinates;
calibrating the first boundary coordinate based on a preset offset value to obtain a corresponding second boundary coordinate;
defining a coordinate frame based on the second boundary coordinates.
In some embodiments, obtaining the coordinate position corresponding to the current target behavior in the camera coordinate system includes:
acquiring target behavior information corresponding to the current target behavior;
and inputting the target behavior information into a trained coordinate detection model to obtain a coordinate position corresponding to the current target behavior matched with the face image.
In some embodiments, matching the coordinate position with the identity coordinate information to obtain the identity information corresponding to the current target behavior includes:
matching the coordinate position with a behavior coordinate corresponding to the identity coordinate information;
and when the coordinate position is matched with the behavior coordinate, determining the identity information corresponding to the behavior coordinate as the identity information corresponding to the target behavior information.
In some of these embodiments, the target behavior information further includes at least one of:
behavior picture, behavior identification, behavior type and behavior occurrence time.
In a second aspect, an embodiment of the present application provides an apparatus for identifying a target behavior, including:
the identity coordinate information determining unit is used for determining identity coordinate information according to all the face information acquired in the current personnel arrangement mode and target behavior information matched with each face information; the target behavior information comprises behavior coordinates under a camera coordinate system, and the identity coordinate information comprises identities and corresponding behavior coordinates of all personnel under the camera coordinate system;
the coordinate position acquisition unit is used for acquiring a coordinate position corresponding to the current target behavior in a camera coordinate system when a face image matched with the current target behavior cannot be identified;
and the identity information identification unit is used for matching the coordinate position with the identity coordinate information to obtain identity information corresponding to the current target behavior.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored on the memory and executable on the processor, and when the processor executes the computer program, the processor implements the method for identifying an identity of a target behavior according to the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the method for identifying an identity of a target behavior according to the first aspect.
Compared with the prior art, the identity recognition method provided by the embodiments of this application determines identity coordinate information from all the face information collected under the current personnel arrangement mode and the target behavior information matched with each piece of face information, thereby mining the maximum potential value from the data already collected in that arrangement mode. When no face image matching the current target behavior can be recognized, the coordinate position of the current target behavior in the camera coordinate system is obtained and matched against the identity coordinate information to obtain the corresponding identity. Because target behaviors whose face images cannot be recognized are still resolved through the identity coordinate information, the identity recognition rate of target behavior information rises markedly, the amount of valid behavior data captured increases, and the accuracy of analysis results supported by that data is ensured.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below to provide a more thorough understanding of the application.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a flow chart illustrating a method for identifying a target behavior according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart illustrating the determination of identity coordinate information in one embodiment of the present application;
FIG. 3 is a schematic flow chart illustrating a process of determining identity coordinate information corresponding to the identity information according to an embodiment of the present application;
FIG. 4 is a block diagram of an apparatus for identifying a target behavior according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device in one embodiment of the present application.
Description of the drawings: 201. an identity coordinate information determination unit; 202. a coordinate position acquisition unit; 203. an identity information recognition unit; 30. a bus; 31. a processor; 32. a memory; 33. a communication interface.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application.
It is obvious that the drawings in the following description are only examples or embodiments of the present application, and that it is also possible for a person skilled in the art to apply the present application to other similar contexts on the basis of these drawings without inventive effort. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical and scientific terms used herein have the ordinary meaning understood by those of ordinary skill in the art to which this application belongs. References to "a," "an," "the," and similar words in this application do not denote a limitation of quantity and may refer to the singular or the plural. As used in this application, the terms "including," "comprising," "having," and any variations thereof are intended to cover non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. References to "connected," "coupled," and the like are not restricted to physical or mechanical connections and may include electrical connections, whether direct or indirect. The term "plurality" means two or more. "And/or" describes an association between objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, A and B together, or B alone. The character "/" generally indicates an "or" relationship between the associated objects. The terms "first," "second," "third," and the like merely distinguish similar objects and do not denote a particular order.
The method for identifying the target behavior described in the present application may be applied to various types of application scenarios, including but not limited to attendance, administrative management, performance, and the like, and the present application is not limited in any way.
The embodiment provides an identity recognition method of target behaviors. Fig. 1 is a flowchart of an identity recognition method of a target behavior according to an embodiment of the present application, where as shown in fig. 1, the flowchart includes the following steps:
step S11, determining identity coordinate information according to all face information acquired under the current personnel arrangement mode and target behavior information matched with each face information; the target behavior information comprises behavior coordinates in a camera coordinate system, and the identity coordinate information comprises identities and corresponding behavior coordinates of all people in the camera coordinate system.
Generally, under a fixed arrangement of people, the position of the same person within the same space does not change greatly, so that person's target behaviors fall within a relatively fixed range of motion. To mine the maximum potential value from existing data, the face information collected under the current personnel arrangement mode and the matching target behavior information can be used to determine identity coordinate information. Here, the current personnel arrangement mode may be any arrangement that fixes each person's position, such as assigned seats or a formation; the identity coordinate information comprises each person's identity and the corresponding coordinate information, and binding the two serves as the basis for identifying the identity behind subsequent target behaviors.
In this embodiment, after the face information is collected, the corresponding identity information may be obtained through image recognition. Meanwhile, the target behavior information matched with the face information is obtained. The target behavior information comprises behavior coordinates in the camera coordinate system; a behavior coordinate may be the coordinate of a single preset key point of the person (such as the head or the mouth), or all the coordinates, or the mean coordinate, of several preset key points (such as the left and right wrists). This can be configured as desired and is not limited here. Optionally, the target behavior information further comprises at least one of: a behavior picture, a behavior identifier (such as sitting, standing, or raising a hand), a behavior type (such as posture or face orientation), and the behavior occurrence time; recording such multi-dimensional behavior information facilitates tracing a target behavior.
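As a minimal illustrative sketch (the function name and tuple format are assumptions, not part of the patent), the mean-of-key-points option for a behavior coordinate described above could be computed as:

```python
def behavior_coordinate(keypoints):
    """Compute a behavior coordinate as the mean of several preset
    key-point coordinates, e.g. [(head_x, head_y), (wrist_x, wrist_y)].
    Each element of `keypoints` is an (x, y) pair in the camera
    coordinate system."""
    xs = [p[0] for p in keypoints]
    ys = [p[1] for p in keypoints]
    return (sum(xs) / len(xs), sum(ys) / len(ys))


# For a single-key-point configuration (e.g. only the head), the same
# function degenerates to returning that point unchanged.
print(behavior_coordinate([(1.0, 2.0), (3.0, 4.0)]))  # (2.0, 3.0)
```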
Step S12, when the face image matched with the current target behavior cannot be identified, acquiring the coordinate position corresponding to the current target behavior in the camera coordinate system.
In the related art, when target behavior information is submitted for identity recognition, face recognition frequently fails to yield a valid identity (for example, when the target person lowers the head, leans to one side, or faces away from the camera), so the collected target behavior information is treated as invalid.
In this embodiment, when no face image matching the current target behavior can be recognized, the target behavior is still treated as valid data and its coordinate position in the camera coordinate system is acquired. Specifically, a camera coordinate system is established over the camera's captured picture, and the coordinate position corresponding to the target behavior information is obtained by recognizing and analyzing the data in the target picture captured by the camera. In some embodiments, the coordinate position may be the behavior coordinate in the camera coordinate system, keeping it consistent with the identity coordinate information for matching; in other embodiments, it may be other user-defined position information derived from the current target behavior, as long as it can be matched against the identity coordinate information, and the application is not limited here.
And step S13, matching the coordinate position with the identity coordinate information to obtain identity information corresponding to the current target behavior.
Specifically, in this embodiment, the coordinate position is subjected to coordinate matching with the behavior coordinate in the identity coordinate information, and the identity information corresponding to the behavior coordinate when the coordinate positions are matched with the behavior coordinate is determined as the identity information corresponding to the current target behavior. The matching mode may be that the coordinates are completely consistent, partially consistent, or that the coordinate position corresponding to the current target behavior falls within the coordinate interval where the behavior coordinate is located, which is not specifically limited in the present application.
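A minimal sketch of the interval-based matching mode described above (the names and the dictionary layout are assumptions; the patent also allows exact or partial coordinate matches):

```python
def match_identity(position, identity_coords):
    """Return the identity whose coordinate interval contains `position`.

    `identity_coords` maps identity -> ((x1, y1), (x2, y2)), the two
    corner points of that person's behavior-coordinate interval.
    Returns None when the position falls inside no interval, i.e. the
    target behavior cannot be attributed to any known person.
    """
    x, y = position
    for identity, ((x1, y1), (x2, y2)) in identity_coords.items():
        if min(x1, x2) <= x <= max(x1, x2) and min(y1, y2) <= y <= max(y1, y2):
            return identity
    return None
```

In a deployment, the intervals would come from the identity coordinate information built in step S11; here they are supplied directly for illustration.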
The embodiments of the present application are described and illustrated below by means of preferred embodiments.
In summary, the identity recognition method provided by this embodiment determines identity coordinate information, comprising each person's identity in the camera coordinate system and the corresponding coordinate information, from all the face information collected under the current personnel arrangement mode and the target behavior information matched with each piece of face information, thereby mining the maximum potential value from the data already collected in that arrangement mode. When no face image matching the current target behavior can be recognized, the coordinate position of the current target behavior in the camera coordinate system is acquired and matched against the identity coordinate information to obtain the corresponding identity. Resolving such target behaviors through the identity coordinate information markedly raises the identity recognition rate of target behavior information, increases the amount of valid behavior data captured, and ensures the accuracy of analysis results supported by that data.
As shown in fig. 2, on the basis of the above embodiments, in some embodiments, determining the identity coordinate information according to all the face information collected in the current people arrangement mode and the target behavior information matched with each face information includes:
and step S111, obtaining the face information under the current personnel arrangement mode, and identifying the face information to obtain identity information. Specifically, the face information may be identified using an identification algorithm based on a face feature point, an identification algorithm based on an entire face image, an identification algorithm based on a template, an algorithm for identification using a neural network, or the like. The face recognition algorithm is a general algorithm, and the detailed description and limitation are not provided in the present application.
Step S112, determining a plurality of potential position coordinates corresponding to the face information based on the target behavior information matched with the face information. Specifically, even after identity coordinate information is determined, each person's position coordinate inevitably varies within a certain range under the current personnel arrangement mode. To make the recognition method tolerant of such slight movement, for face information whose identity can be recognized, a plurality of potential position coordinates may be further obtained from the matching target behavior information. Each potential position coordinate is the behavior coordinate of one target behavior of the same face information, and all potential position coordinates are selected and computed in the same way.
Step S113, determining identity coordinate information corresponding to the identity information based on the identity information and the plurality of potential location coordinates.
Specifically, in this embodiment, firstly, behavior coordinates corresponding to the current target behavior may be calculated and obtained based on a plurality of the potential position coordinates. The specific calculation method may directly obtain the behavior coordinate from an average of the plurality of potential position coordinates, or may calculate the behavior coordinate after further data screening is performed based on the plurality of potential position coordinates, which is not limited in this embodiment. And then determining identity coordinate information based on the identity information and the behavior coordinate corresponding to the current target behavior.
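A sketch of the averaging-with-optional-screening idea above, under the assumption that screening simply drops points far from the raw mean (the embodiment leaves the screening method open, so `max_dev` and the filter are hypothetical):

```python
def aggregate_position(potential_coords, max_dev=None):
    """Average a person's potential position coordinates into one
    behavior coordinate. If `max_dev` is given, points whose x or y
    deviates from the raw mean by more than `max_dev` are screened out
    before re-averaging."""
    def mean(points):
        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        return (sum(xs) / len(xs), sum(ys) / len(ys))

    center = mean(potential_coords)
    if max_dev is not None:
        kept = [p for p in potential_coords
                if abs(p[0] - center[0]) <= max_dev
                and abs(p[1] - center[1]) <= max_dev]
        if kept:  # fall back to the raw mean if everything was screened out
            center = mean(kept)
    return center
```

With `max_dev=None` this is the plain average mentioned first; with a finite `max_dev` a stray observation (e.g. the person briefly walking past another seat) no longer skews the result.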
As shown in fig. 3, on the basis of the foregoing embodiments, in some of the embodiments, determining, based on the identity information and a plurality of the potential location coordinates, identity coordinate information corresponding to the identity information includes:
step S1131, determining a coordinate frame based on the plurality of potential position coordinates.
In this embodiment, the coordinate interval containing the behavior coordinate may be derived from the plurality of potential position coordinates. Specifically, first boundary coordinates are determined from the potential position coordinates, and from at least two first boundary coordinates a coordinate interval (P1[x1, y1], P2[x2, y2]) enclosing all potential position coordinates can be derived. In some embodiments, the coordinate frame may be determined from two first boundary coordinates; in other embodiments, the maximum and minimum coordinate values along the X and Y axes are taken from three or four first boundary coordinates to obtain the corresponding interval. For example, when the potential position coordinates are A1[1, 0.5], A2[2, 4], A3[3, 2], and A4[4, 3], a coordinate frame can be determined from the boundary points A1, A2, and A4 (A3 lies inside it), and taking the maximum and minimum X and Y values yields the coordinate interval (P1[1, 4], P2[4, 0.5]).
The first boundary coordinates are then calibrated with a preset offset value to obtain the corresponding second boundary coordinates, and the coordinate frame is defined from the second boundary coordinates. Because the interval defined by the first boundary coordinates is rough, offset processing is applied to tighten the interval and ensure the accuracy of the recognition result. In some embodiments, accuracy is highest when the offset value is 1/8 to 1/4 of the interval width. Exemplarily, for the interval (P1[x1, y1], P2[x2, y2]), x1 is translated right by (x2-x1)/8, x2 is translated left by (x2-x1)/8, y1 is translated down by (y2-y1)/8, and y2 is translated up by (y2-y1)/8, so the optimized interval becomes (P1[x1+(x2-x1)/8, y1+(y2-y1)/8], P2[x2-(x2-x1)/8, y2-(y2-y1)/8]). The offset value can be configured as desired; its size depends on the distribution of persons under the current personnel arrangement mode.
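The frame construction and the 1/8-width offset can be sketched as follows. For simplicity this uses (min, min) and (max, max) corners rather than the P1/P2 top-left/bottom-right convention of the worked example, and the `shrink` parameter is an assumption standing in for the configurable offset value:

```python
def coordinate_frame(points, shrink=8):
    """Build a coordinate frame enclosing all potential position
    coordinates, then tighten it by 1/`shrink` of the interval width
    on each side (shrink=8 matches the 1/8 offset in the example)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    dx = (x_max - x_min) / shrink
    dy = (y_max - y_min) / shrink
    return ((x_min + dx, y_min + dy), (x_max - dx, y_max - dy))


# Using the potential positions from the worked example:
print(coordinate_frame([(1, 0.5), (2, 4), (3, 2), (4, 3)]))
# ((1.375, 0.9375), (3.625, 3.5625))
```

This matches the rough interval x in [1, 4], y in [0.5, 4] shrunk by 3/8 horizontally and 3.5/8 vertically, as in the text.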
Step S1132, determining identity coordinate information corresponding to the identity information based on the identity information and a coordinate interval in which the coordinate frame is located.
In this embodiment, the coordinate interval where the coordinate frame is located is determined as the behavior coordinate of the target behavior information in the camera coordinate system, and the behavior coordinate is bound with the identity information corresponding to the target behavior information to obtain the identity coordinate information.
Through the above steps, a coordinate frame is determined from the potential position coordinates, the coordinate interval in which the frame lies serves as the behavior coordinate of the target behavior information in the camera coordinate system, and that behavior coordinate is bound with the corresponding identity information, so that the method tolerates slight movement of personnel and the identity recognition rate of target behavior information is improved.
On the basis of the foregoing embodiments, in some embodiments, the acquiring a coordinate position corresponding to the current target behavior in a camera coordinate system includes:
and acquiring target behavior information corresponding to the current target behavior, inputting the target behavior information into a trained coordinate detection model, and acquiring a coordinate position corresponding to the current target behavior matched with the face image.
In this embodiment, the coordinate position corresponding to the target behavior information may be obtained by identifying through a trained coordinate detection model. The method comprises the steps of establishing a video image training sample set containing a coordinate position label corresponding to target behavior information, and training a preset detection model by using the training sample set to obtain a coordinate detection model, wherein the preset detection model is an open source model such as an OpenPose model, a Detectron model or an AlphaPose model.
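A hypothetical sketch of how the trained coordinate detection model might be wrapped at inference time; `model` is a stand-in callable, not a real OpenPose, Detectron, or AlphaPose API, and the class and method names are assumptions:

```python
class CoordinateDetector:
    """Wrap a trained coordinate detection model behind a simple
    interface. `model` stands in for a pose-estimation network: any
    callable mapping a behavior image to a list of (x, y) key points
    in the camera coordinate system."""

    def __init__(self, model):
        self.model = model

    def detect(self, behavior_image):
        """Return the coordinate position for the target behavior, or
        None when the model finds no key points in the image."""
        keypoints = self.model(behavior_image)
        if not keypoints:
            return None
        xs = [p[0] for p in keypoints]
        ys = [p[1] for p in keypoints]
        return (sum(xs) / len(xs), sum(ys) / len(ys))
```

The returned position can then be fed directly into the interval matching of step S13 to recover the identity even when face recognition has failed.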
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The embodiment also provides a target behavior identity recognition device, which is used for implementing the above embodiments and preferred embodiments, and the description of the device is omitted. As used hereinafter, the terms "module," "unit," "subunit," and the like may implement a combination of software and/or hardware for a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 4 is a block diagram of a target behavior identification apparatus according to an embodiment of the present application, and as shown in fig. 4, the apparatus includes: an identity coordinate information determination unit 201, a coordinate position acquisition unit 202, and an identity information identification unit 203.
The identity coordinate information determining unit 201 is configured to determine identity coordinate information according to all face information acquired in the current person arrangement mode and target behavior information matched with each face information; the target behavior information comprises behavior coordinates under a camera coordinate system, and the identity coordinate information comprises identities and corresponding behavior coordinates of all personnel under the camera coordinate system;
a coordinate position obtaining unit 202, configured to obtain a coordinate position corresponding to a current target behavior in a camera coordinate system when a face image matching the current target behavior cannot be identified;
and the identity information identification unit 203 is configured to match the coordinate position with the identity coordinate information to obtain identity information corresponding to the current target behavior.
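Taken together, the three units implement a build-then-fall-back flow: unit 201 builds an identity-to-coordinate map while faces are recognizable, and units 202/203 fall back to that map when face recognition fails. A minimal sketch (all function names, the tolerance value, and the data shapes are illustrative assumptions, not part of the disclosure):

```python
# Minimal sketch of the device flow: build an identity-to-coordinate map in
# the person arrangement mode (unit 201), then resolve a behavior whose face
# image cannot be identified by matching its coordinate (units 202/203).
# All helper names and the tolerance are illustrative assumptions.

def build_identity_coordinates(face_records):
    """Unit 201: map each recognized identity to its behavior coordinate."""
    identity_coords = {}
    for identity, coord in face_records:
        # (identity, (x, y)) pairs collected while faces are visible
        identity_coords[identity] = coord
    return identity_coords

def identify_behavior(face_identity, behavior_coord, identity_coords, tol=50.0):
    """Units 202/203: prefer the face result; otherwise match by coordinate."""
    if face_identity is not None:
        return face_identity
    bx, by = behavior_coord
    for identity, (x, y) in identity_coords.items():
        if abs(bx - x) <= tol and abs(by - y) <= tol:
            return identity
    return None  # no identity matches this coordinate
```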
In some of these embodiments, the target behavior information further includes at least one of:
behavior picture, behavior identification, behavior type and behavior occurrence time.
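For illustration only, these fields could be bundled into a record such as the following (the field names and types are assumptions for the sketch, not taken from the disclosure):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional, Tuple

# Hypothetical record bundling the target behavior information fields listed
# above; field names are illustrative assumptions.
@dataclass
class TargetBehavior:
    behavior_coord: Tuple[float, float]       # behavior coordinate in the camera coordinate system
    behavior_picture: Optional[bytes] = None  # captured behavior picture
    behavior_id: Optional[str] = None         # behavior identification
    behavior_type: Optional[str] = None       # e.g. "hand_raise"
    occurred_at: Optional[datetime] = None    # behavior occurrence time
```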
In some of these embodiments, the identity coordinate information determination unit 201 includes: the system comprises an identity information acquisition module, a potential position coordinate acquisition module and an identity coordinate acquisition module.
The identity information acquisition module is used for acquiring face information under the current personnel arrangement mode and identifying the face information to obtain identity information;
the potential position coordinate acquisition module is used for determining a plurality of potential position coordinates corresponding to the face information based on the target behavior information matched with the face information;
and the identity coordinate acquisition module is used for determining identity coordinate information corresponding to the identity information based on the identity information and the plurality of potential position coordinates.
In some embodiments, the identity coordinate acquisition module comprises: a coordinate frame determination module and an identity coordinate determination module.
A coordinate frame determination module for determining a coordinate frame based on a plurality of the potential location coordinates;
and the identity coordinate determination module is used for determining identity coordinate information corresponding to the identity information based on the identity information and the coordinate interval where the coordinate frame is located.
In some of these embodiments, the coordinate frame determination module comprises: the device comprises a first boundary coordinate determination module, a second boundary coordinate determination module and a coordinate frame delimiting module.
A first boundary coordinate determination module to determine at least two first boundary coordinates based on a plurality of the potential location coordinates;
the second boundary coordinate determination module is used for calibrating the first boundary coordinate based on a preset offset value to obtain a corresponding second boundary coordinate;
and the coordinate frame delineating module is used for delineating a coordinate frame based on the second boundary coordinate.
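The three modules above amount to: take the extreme coordinates of the potential positions (first boundary coordinates), widen them by a preset offset (second boundary coordinates), and delimit the frame. A minimal sketch, where the offset value is an assumed placeholder:

```python
# Sketch of the coordinate-frame logic: extreme coordinates of the potential
# positions, calibrated outward by a preset offset. The offset is an assumed
# placeholder value, not taken from the disclosure.

def coordinate_frame(potential_coords, offset=20.0):
    """Return (x_min, y_min, x_max, y_max) enclosing the coordinates plus an offset."""
    xs = [x for x, _ in potential_coords]
    ys = [y for _, y in potential_coords]
    # first boundary coordinates: the raw extremes
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    # second boundary coordinates: extremes calibrated by the preset offset
    return (x_min - offset, y_min - offset, x_max + offset, y_max + offset)
```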
In some of these embodiments, the coordinate position obtaining unit 202 includes: the device comprises a target behavior information acquisition module and a coordinate position determination module.
The target behavior information acquisition module is used for acquiring target behavior information corresponding to the current target behavior;
and the coordinate position determining module is used for inputting the target behavior information into a trained coordinate detection model to obtain a coordinate position corresponding to the current target behavior.
In some embodiments, the identity information identifying unit 203 comprises: a matching module and an identity recognition module.
The matching module is used for matching the coordinate position with the behavior coordinate corresponding to the identity coordinate information;
and the identity recognition module is used for determining the identity information corresponding to the behavior coordinate as the identity information corresponding to the target behavior information when the coordinate position is matched with the behavior coordinate.
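Assuming the identity coordinate information maps each identity to a coordinate frame (x_min, y_min, x_max, y_max), the matching step can be sketched as a point-in-frame lookup (a hypothetical illustration, not the disclosed implementation):

```python
# Hypothetical matching step: a behavior coordinate matches an identity when
# it falls inside that identity's coordinate frame. The frame representation
# (x_min, y_min, x_max, y_max) is an assumption for this sketch.

def match_identity(coord, identity_frames):
    """Return the identity whose frame contains coord, or None if no frame matches."""
    x, y = coord
    for identity, (x_min, y_min, x_max, y_max) in identity_frames.items():
        if x_min <= x <= x_max and y_min <= y <= y_max:
            return identity
    return None
```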
The above modules may be functional modules or program modules, and may be implemented by software or hardware. For modules implemented by hardware, the modules may be located in the same processor, or may be distributed among different processors in any combination.
In addition, the identity recognition method for target behaviors described in conjunction with fig. 1 in the embodiments of the present application may be implemented by an electronic device. Fig. 5 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application.
The electronic device may comprise a processor 31 and a memory 32 in which computer program instructions are stored.
Specifically, the processor 31 may include a Central Processing Unit (CPU) or an Application Specific Integrated Circuit (ASIC), or may be configured as one or more integrated circuits implementing the embodiments of the present application.
The memory 32 may be used to store or cache various data files that need to be processed and/or used for communication, as well as possible computer program instructions executed by the processor 31.
The processor 31 reads and executes the computer program instructions stored in the memory 32 to implement the identity recognition method for target behaviors of any one of the above embodiments.
In some of these embodiments, the electronic device may also include a communication interface 33 and a bus 30. As shown in fig. 5, the processor 31, the memory 32, and the communication interface 33 are connected via the bus 30 to complete mutual communication.
The communication interface 33 is used for implementing communication between the modules, apparatuses, units and/or devices in the embodiments of the present application. The communication interface 33 may also be used for data communication with other components, such as external devices, image/data acquisition devices, databases, external storage, and image/data processing workstations.
The bus 30 includes hardware, software, or both that couple the components of the electronic device to one another. The bus 30 includes, but is not limited to, at least one of the following: a data bus, an address bus, a control bus, an expansion bus, and a local bus. By way of example and not limitation, the bus 30 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association Local Bus (VLB), another suitable bus, or a combination of two or more of these. The bus 30 may include one or more buses, where appropriate. Although specific buses are described and shown in the embodiments of the application, any suitable buses or interconnects are contemplated by the present application.
The electronic device may execute the method for identifying an identity of a target behavior in the embodiment of the present application based on the obtained program instruction, so as to implement the method for identifying an identity of a target behavior described in conjunction with fig. 1.
In addition, in combination with the identity recognition method for target behaviors in the foregoing embodiments, the embodiments of the present application may be implemented by providing a computer-readable storage medium. The computer-readable storage medium has computer program instructions stored thereon; when executed by a processor, the computer program instructions implement the identity recognition method for target behaviors of any one of the above embodiments.
It should be noted that, the user information (including but not limited to user device information, user personal information, etc.) and data (including but not limited to data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party.
The technical features of the above embodiments may be combined arbitrarily. For brevity, not all possible combinations of these technical features are described; however, as long as a combination of technical features contains no contradiction, it should be considered to fall within the scope of this specification.
The above embodiments express only several implementations of the present application, and their descriptions are specific and detailed, but they should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (10)
1. An identity recognition method for target behaviors, comprising:
determining identity coordinate information according to all the face information acquired in the current personnel arrangement mode and target behavior information matched with each face information; the target behavior information comprises behavior coordinates under a camera coordinate system, and the identity coordinate information comprises identities and corresponding behavior coordinates of all personnel under the camera coordinate system;
when a face image matched with the current target behavior cannot be identified, acquiring a coordinate position corresponding to the current target behavior in a camera coordinate system;
and matching the coordinate position with the identity coordinate information to obtain identity information corresponding to the current target behavior.
2. The method for identifying the identity of the target behavior according to claim 1, wherein determining the identity coordinate information according to all the face information acquired in the current personnel arrangement mode and the target behavior information matched with each face information comprises:
acquiring face information under a current personnel arrangement mode, and identifying the face information to obtain identity information;
determining a plurality of potential position coordinates corresponding to the face information based on the target behavior information matched with the face information;
and determining identity coordinate information corresponding to the identity information based on the identity information and the plurality of potential position coordinates.
3. The method of claim 2, wherein determining identity coordinate information corresponding to the identity information based on the identity information and a plurality of the potential location coordinates comprises:
determining a coordinate frame based on a plurality of the potential location coordinates;
and determining identity coordinate information corresponding to the identity information based on the identity information and the coordinate interval in which the coordinate frame is located.
4. The method of claim 3, wherein determining a coordinate frame based on the plurality of potential location coordinates comprises:
determining at least two first boundary coordinates based on a plurality of the potential location coordinates;
calibrating the first boundary coordinate based on a preset offset value to obtain a corresponding second boundary coordinate;
defining a coordinate frame based on the second boundary coordinates.
5. The method for identifying the identity of the target behavior according to claim 1, wherein obtaining the coordinate position corresponding to the current target behavior in a camera coordinate system comprises:
acquiring target behavior information corresponding to the current target behavior;
and inputting the target behavior information into a trained coordinate detection model to obtain a coordinate position corresponding to the current target behavior.
6. The identity recognition method of the target behavior according to claim 1, wherein matching the coordinate position with the identity coordinate information to obtain the identity information corresponding to the current target behavior comprises:
matching the coordinate position with a behavior coordinate corresponding to the identity coordinate information;
and when the coordinate position is matched with the behavior coordinate, determining the identity information corresponding to the behavior coordinate as the identity information corresponding to the target behavior information.
7. The method for identifying the target behavior according to claim 1, wherein the target behavior information further comprises at least one of the following:
behavior picture, behavior identification, behavior type and behavior occurrence time.
8. An apparatus for identifying a target behavior, comprising:
the identity coordinate information determining unit is used for determining identity coordinate information according to all the face information acquired in the current personnel arrangement mode and target behavior information matched with each face information; the target behavior information comprises behavior coordinates under a camera coordinate system, and the identity coordinate information comprises identities and corresponding behavior coordinates of all personnel under the camera coordinate system;
the coordinate position acquisition unit is used for acquiring a coordinate position corresponding to the current target behavior in a camera coordinate system when a face image matched with the current target behavior cannot be identified;
and the identity information identification unit is used for matching the coordinate position with the identity coordinate information to obtain identity information corresponding to the current target behavior.
9. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is configured to execute the computer program to perform the identity recognition method for target behaviors of any one of claims 1 to 7.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the identity recognition method for target behaviors of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111149623.1A CN114298081A (en) | 2021-09-29 | 2021-09-29 | Target behavior identity recognition method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114298081A true CN114298081A (en) | 2022-04-08 |
Family
ID=80964415
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111149623.1A Pending CN114298081A (en) | 2021-09-29 | 2021-09-29 | Target behavior identity recognition method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114298081A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116343102A (en) * | 2023-05-30 | 2023-06-27 | 深圳酷源数联科技有限公司 | Underground personnel safety early warning method, device, system and storage medium |
CN117708616A (en) * | 2024-02-05 | 2024-03-15 | 四川大学华西医院 | Person similarity calculation method, device, electronic equipment and computer storage medium |
CN117708616B (en) * | 2024-02-05 | 2024-05-24 | 四川大学华西医院 | Person similarity calculation method, device, electronic equipment and computer storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109948408B (en) | Activity test method and apparatus | |
CN108804884B (en) | Identity authentication method, identity authentication device and computer storage medium | |
CN106897658B (en) | Method and device for identifying human face living body | |
US20190012450A1 (en) | Biometric-based authentication method, apparatus and system | |
CN107844748A (en) | Auth method, device, storage medium and computer equipment | |
CN114298081A (en) | Target behavior identity recognition method and device, electronic equipment and storage medium | |
CN109756458B (en) | Identity authentication method and system | |
CN106056083B (en) | A kind of information processing method and terminal | |
CN111898538B (en) | Certificate authentication method and device, electronic equipment and storage medium | |
CN110688878B (en) | Living body identification detection method, living body identification detection device, living body identification detection medium, and electronic device | |
Smith-Creasey et al. | Continuous face authentication scheme for mobile devices with tracking and liveness detection | |
CN108959884B (en) | Human authentication verification device and method | |
US9378406B2 (en) | System for estimating gender from fingerprints | |
US20220164423A1 (en) | Method and apparatus for user recognition | |
CN108875549B (en) | Image recognition method, device, system and computer storage medium | |
CN109543635A (en) | Biopsy method, device, system, unlocking method, terminal and storage medium | |
KR101961462B1 (en) | Object recognition method and the device thereof | |
JP6311237B2 (en) | Collation device and collation method, collation system, and computer program | |
CN112183167A (en) | Attendance checking method, authentication method, living body detection method, device and equipment | |
CN110163164B (en) | Fingerprint detection method and device | |
CN108875467B (en) | Living body detection method, living body detection device and computer storage medium | |
WO2023024473A1 (en) | Living body detection method and apparatus, and electronic device, computer-readable storage medium and computer program product | |
US10867022B2 (en) | Method and apparatus for providing authentication using voice and facial data | |
CN113920590A (en) | Living body detection method, living body detection device, living body detection equipment and readable storage medium | |
CN110348361B (en) | Skin texture image verification method, electronic device, and recording medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||