CN111291623A - Psychophysiological characteristic prediction method and device based on face information - Google Patents

Psychophysiological characteristic prediction method and device based on face information

Info

Publication number
CN111291623A
Authority
CN
China
Prior art keywords
user
face
clustering
face feature
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010045198.0A
Other languages
Chinese (zh)
Inventor
徐涛 (Xu Tao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Lianxin Technology Co ltd
Original Assignee
Zhejiang Lianxin Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Lianxin Technology Co ltd filed Critical Zhejiang Lianxin Technology Co ltd
Priority to CN202010045198.0A
Publication of CN111291623A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168: Feature extraction; Face representation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/23: Clustering techniques
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174: Facial expression recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The application discloses a method and a device for predicting psychophysiological characteristics based on face information, together with an electronic device and a readable storage medium. The method comprises the following steps: acquiring first user face features and forming a first user face feature set from them; matching the first user face feature set with a second user clustering result; and determining the psychological and/or physiological characteristics of the first user according to the matching result of the first user face feature set and the second user clustering result. The method and the device solve the technical problem that face recognition methods in the related art cannot effectively predict users' psychophysiological characteristics, which limits the application of face recognition technology. They achieve the purpose of effectively predicting users' psychophysiological characteristics from face recognition information, thereby expanding the application range of face recognition technology while predicting those characteristics accurately.

Description

Psychophysiological characteristic prediction method and device based on face information
Technical Field
The present application relates to the technical field of psychological and physiological feature prediction, and in particular, to a method and an apparatus for predicting psychological and physiological features based on face information, an electronic device, and a readable storage medium.
Background
Face recognition is a biometric technology that identifies a person based on facial feature information. A series of related technologies, commonly also called facial recognition, capture an image or video stream containing a face with a camera or video camera, automatically detect and track the face in the image, and then perform recognition on the detected face.
The inventor has found that face recognition technology in the related art is mainly applied to judging physical attributes of users, such as living-body detection, face matching, and certificate recognition; it cannot further predict the psychological or physiological characteristics of a user from the recognition result, so its application is very limited.
No effective solution has yet been proposed for the problem that face recognition methods in the related art cannot effectively predict users' psychophysiological characteristics, which limits the application of face recognition technology.
Disclosure of Invention
The present application mainly aims to provide a method and an apparatus for predicting psychophysiological characteristics based on face information, an electronic device, and a readable storage medium, so as to solve the problem that the application of a face recognition technology is limited due to the fact that a face recognition method in the related art cannot effectively predict the psychophysiological characteristics of a user.
In order to achieve the above object, according to a first aspect of the present application, a method for predicting psychophysiological characteristics based on face information is provided.
The psychophysiological characteristic prediction method based on face information comprises the following steps: acquiring first user face features to form a first user face feature set according to the first user face features; matching the first user face feature set with a second user clustering result, wherein the second user clustering result comprises a second user face feature clustering result and a second user psychophysiological feature clustering result corresponding to the second user face feature clustering result; and determining the psychological characteristics and/or the physiological characteristics of the first user according to the matching result of the first user face feature set and the second user clustering result.
Further, matching the first user face feature set with the second user clustering result includes: collecting second user face information; extracting second user face features according to the second user face information; and clustering the second user face features according to a preset face feature clustering threshold to obtain a second user face feature clustering result.
Further, the second user face feature clustering result includes a plurality of face feature user groups, and matching the first user face feature set with the second user face feature clustering result includes: collecting human-computer interaction information of each human face feature user group, wherein the human-computer interaction information comprises any one or more of second user personal information, second user psychological information and second user physiological information; extracting a second user psychophysiological characteristic according to the man-machine interaction information; and clustering the psychophysiological characteristics of the second user according to a preset psychophysiological characteristic clustering threshold value to obtain a second user psychophysiological characteristic clustering result corresponding to each face characteristic user group.
Further, the second user face features include a plurality of face features, and the clustering the second user face features according to a preset face feature clustering threshold to obtain a second user face feature clustering result includes: clustering a plurality of second user face features to obtain a plurality of face feature clustering sub-results; and comparing the plurality of face feature clustering sub-results with the preset face feature clustering threshold respectively to obtain a second user face feature clustering result.
Further, the second user clustering result comprises a plurality of face feature user groups, and determining the psychological and/or physiological features of the first user according to the matching result of the first user face feature set and the second user clustering result comprises: calculating the matching degree between the first user face feature set and each face feature user group to obtain a plurality of matching results; and comparing each matching result with a preset matching threshold to determine, according to the comparison results, the face feature user group to which the first user belongs and the psychophysiological features corresponding to that group.
In order to achieve the above object, according to a second aspect of the present application, there is provided a psychophysiological characteristic prediction apparatus based on face information.
The psychophysiological characteristic prediction device based on face information comprises: the acquisition module, used for acquiring first user face features so as to form a first user face feature set according to the first user face features; the matching module, used for matching the first user face feature set with a second user clustering result, wherein the second user clustering result comprises a second user face feature clustering result and a second user psychophysiological feature clustering result corresponding to the second user face feature clustering result; and the determining module, used for determining the psychological characteristics and/or the physiological characteristics of the first user according to the matching result of the first user face feature set and the second user clustering result.
Further, the matching module comprises: the first acquisition unit is used for acquiring second user face information; the first extraction unit is used for extracting second user face features according to the second user face information; and the first clustering unit is used for clustering the second user face features according to a preset face feature clustering threshold value so as to obtain a second user face feature clustering result.
Further, the matching module further comprises: the second acquisition unit is used for acquiring the human-computer interaction information of each human face feature user group, wherein the human-computer interaction information comprises any one or more of second user personal information, second user psychological information and second user physiological information; the second extraction unit is used for extracting the psychophysiological characteristics of a second user according to the human-computer interaction information; and the second clustering unit is used for clustering the second user psychophysiological characteristics according to a preset psychophysiological characteristic clustering threshold value so as to obtain a second user psychophysiological characteristic clustering result corresponding to each face characteristic user group.
In order to achieve the above object, according to a third aspect of the present application, there is provided an electronic apparatus comprising: one or more processors; storage means for storing one or more programs; the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of the preceding claims.
To achieve the above object, according to a fourth aspect of the present application, there is provided a non-transitory readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method according to any of the preceding claims.
In the embodiments of the application, a first user face feature is obtained and a first user face feature set is formed from it; the first user face feature set is matched with a second user clustering result, where the second user clustering result comprises a second user face feature clustering result and a corresponding second user psychophysiological feature clustering result; and the psychological and/or physiological characteristics of the first user are determined according to the matching result. This achieves the purpose of effectively predicting users' psychophysiological characteristics from face recognition information, thereby expanding the application range of face recognition technology and predicting users' psychophysiological characteristics accurately, and thus solves the technical problem that face recognition methods in the related art cannot effectively predict users' psychophysiological characteristics, which limits the application of face recognition technology.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, serve to provide a further understanding of the application and to enable other features, objects, and advantages of the application to be more apparent. The drawings and their description illustrate the embodiments of the invention and do not limit it. In the drawings:
fig. 1 is a schematic flowchart of a method for predicting psychophysiological characteristics based on face information according to a first embodiment of the present application;
fig. 2 is a flowchart illustrating a method for predicting psychophysiological characteristics based on face information according to a second embodiment of the present application;
fig. 3 is a flowchart illustrating a method for predicting psychophysiological characteristics based on face information according to a third embodiment of the present application;
fig. 4 is a flowchart illustrating a method for predicting psychophysiological characteristics based on face information according to a fourth embodiment of the present application;
fig. 5 is a flowchart illustrating a method for predicting psychophysiological characteristics based on face information according to a fifth embodiment of the present application;
fig. 6 is a schematic structural diagram of a component of a psychophysiological characteristic prediction device based on face information according to an embodiment of the present application; and
fig. 7 is a schematic diagram of a composition structure of an electronic device according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort shall fall within the protection scope of the present application.
It should be noted that the terms "first," "second," and the like in the description, claims, and drawings of this application are used to distinguish similar elements and not necessarily to describe a particular sequential or chronological order. It should be understood that data so labeled may be interchanged where appropriate, so that the embodiments of the application described herein can, for example, be implemented in orders other than those illustrated or described. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion: a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those expressly listed, but may include other steps or elements not expressly listed or inherent to it.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
According to an embodiment of the present invention, there is provided a method for predicting psychophysiological characteristics based on face information, as shown in fig. 1, the method includes the following steps S101 to S103:
step S101, obtaining first user face features, and forming a first user face feature set according to the first user face features.
In a specific implementation, the user's face must first be recognized using face recognition technology. For example, with the user's knowledge and consent, face information can be collected by a camera, and one or more face features in the face image can be recognized and extracted with a face image feature extraction method from the related art to obtain the user's face feature set. Features available in existing face recognition systems are generally classified into visual features, pixel statistical features, face image transform coefficient features, face image algebraic features, and the like. Face feature extraction, also known as face characterization, is the process of building a feature model of a face. Face feature extraction methods fall into two main categories: knowledge-based characterization methods, and characterization methods based on algebraic features or statistical learning. A person skilled in the art can flexibly choose a face feature extraction method according to the actual situation, and it is not specifically limited here.
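As an illustrative sketch only (the patent does not prescribe an extraction method; the grid-mean features, image sizes, and function names below are assumptions), a "pixel statistical feature" vector of the kind mentioned above could be computed like this:

```python
import numpy as np

def extract_face_features(face_image, grid=4):
    """Toy "pixel statistical" features: mean intensity of each cell in a
    grid x grid partition of a grayscale face image. A real system would
    use landmark- or embedding-based features instead."""
    h, w = face_image.shape
    feats = []
    for i in range(grid):
        for j in range(grid):
            cell = face_image[i * h // grid:(i + 1) * h // grid,
                              j * w // grid:(j + 1) * w // grid]
            feats.append(float(cell.mean()))
    return np.array(feats)

# One feature vector per captured frame forms the user's face feature set
frames = [np.random.rand(64, 64) for _ in range(3)]
feature_set = [extract_face_features(f) for f in frames]
```

Any per-frame descriptor that yields fixed-length vectors would serve the same role, since later steps only need vectors that can be compared for similarity.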
Step S102, matching the first user face feature set with a second user clustering result, wherein the second user clustering result comprises a second user face feature clustering result and a second user psychophysiological feature clustering result corresponding to the second user face feature clustering result.
In specific implementation, after the face feature set of the current user is obtained, the face features of the user are combined and matched with feature clustering results of other users obtained in advance, and specifically, the feature clustering results of the other users include face feature clustering results of the other users and psychophysiological feature clustering results corresponding to the face feature clustering results. That is, in the embodiment of the present application, the face feature information and the psychophysiological feature information of a large number of users are obtained in advance, and are subjected to cluster analysis, so as to obtain different user groups corresponding to different face features and psychophysiological feature labels of the user groups, and the result is used as a basis for performing user psychophysiological feature prediction based on the face features.
Step S103, determining the psychological characteristics and/or the physiological characteristics of the first user according to the matching result of the first user face characteristic set and the second user clustering result.
In a specific implementation, a matching threshold may be preset as the standard for determining which user group the current user's face feature set belongs to. Specifically, the current user's face feature set is matched against the face feature sets of the other user groups; when the matching degree reaches the preset threshold, the current user is considered to belong to the matching user group, and the user's corresponding psychophysiological features and prediction probabilities are determined from that group. Through this process, the user group to which a user belongs can be determined from the user's face feature information alone, which greatly expands the application of face recognition technology, allows the user's psychophysiological features to be determined accurately, and can help relevant departments understand an individual more quickly and take corresponding measures.
As a preferred implementation manner of the embodiment of the present application, as shown in fig. 2, the matching the first user facial feature set and the second user clustering result includes steps S201 to S203 as follows:
step S201, second user face information is collected.
In specific implementation, the embodiment of the application needs to collect face information of a certain number of users in advance and perform system analysis, so as to provide a basis for subsequently predicting the psychophysiological characteristics of other users according to an analysis result. Specifically, the face information of the user can be collected through the camera under the condition that the user knows and agrees.
And step S202, extracting second user face features according to the second user face information.
In specific implementation, a face feature extraction method in the related technology is used for extracting face features of the collected face information of the user, and a face feature point set of the user is formed.
And step S203, clustering the second user face features according to a preset face feature clustering threshold value to obtain a second user face feature clustering result.
In a specific implementation, a face feature similarity threshold can be preset; according to this threshold, it can be determined which users' face feature information can be grouped together (i.e., has high similarity), yielding one or more user groups. For example, given users A1, A2, A3, A4, …, An, cluster analysis may find that the facial feature similarity of A2 and A3 exceeds the preset clustering threshold, and likewise for A1 and A4; A2 and A3 are then classified into one user group, and A1 and A4 into another. The clustering method may be a K-means clustering algorithm, a mean-shift clustering algorithm, a DBSCAN clustering algorithm, Expectation-Maximization (EM) clustering with a Gaussian Mixture Model (GMM), a hierarchical clustering algorithm, or the like; a person skilled in the art can flexibly select a specific clustering method according to actual needs, and it is not specifically limited here.
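The A1…A4 example above can be sketched as threshold-based single-linkage grouping. Cosine similarity and the two-dimensional toy vectors are assumptions made for illustration, not the patent's prescribed measure:

```python
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def cluster_by_threshold(features, threshold):
    """Group users whose pairwise face-feature similarity exceeds
    `threshold` (single-linkage grouping via union-find)."""
    users = list(features)
    parent = {u: u for u in users}
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]  # path halving
            u = parent[u]
        return u
    for i, u in enumerate(users):
        for v in users[i + 1:]:
            if cosine_similarity(features[u], features[v]) >= threshold:
                parent[find(v)] = find(u)
    groups = {}
    for u in users:
        groups.setdefault(find(u), []).append(u)
    return list(groups.values())

# Toy vectors: A2/A3 point roughly the same way, as do A1/A4
feats = {
    "A1": np.array([0.0, 1.0]),
    "A2": np.array([1.0, 0.0]),
    "A3": np.array([0.99, 0.1]),
    "A4": np.array([0.1, 0.99]),
}
groups = cluster_by_threshold(feats, threshold=0.95)
```

With the 0.95 threshold, only the A2/A3 and A1/A4 pairs clear it, reproducing the two user groups described in the text; any of the clustering algorithms listed above could replace this simple rule.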
As a preferred implementation manner of the embodiment of the present application, as shown in fig. 3, the second user face feature clustering result includes a plurality of face feature user groups, and the matching between the first user face feature set and the second user face feature clustering result includes the following steps S301 to S303:
step S301, collecting human-computer interaction information of each human face feature user group, wherein the human-computer interaction information comprises any one or more of second user personal information, second user psychological information and second user physiological information.
In a specific implementation, when obtaining the clustering results of other users in advance, human-computer interaction information corresponding to those users must be obtained in addition to their face information. Here, human-computer interaction information refers to multi-dimensional data collected by an artificial-intelligence robot while the user interacts with it. This data mainly includes the user's personal information (age, sex, residence, occupation, education, income level, family situation, marital status, etc.), psychological information (psychological disturbances, psychological state, psychological crises, psychological traits, psychological abilities, psychological tendencies, history of psychological illness, etc.), physiological information (height, weight, blood type, blood pressure, blood sugar, blood lipids, history of physiological illness, etc.), and behavioral information (behavior trajectory, action patterns, operation patterns, chat sentences, etc.).
And step S302, extracting the psychophysiological characteristics of the second user according to the man-machine interaction information.
In specific implementation, the obtained man-machine interaction information of each dimension is subjected to feature extraction, and then a corresponding personal attribute label (such as age, gender and the like) and a psychophysiological feature label of each user are obtained.
Step S303, clustering the psychophysiological characteristics of the second user according to a preset psychophysiological characteristic clustering threshold value to obtain a second user psychophysiological characteristic clustering result corresponding to each face characteristic user group.
In specific implementation, different psychophysiological characteristic clustering threshold values and corresponding prediction probabilities can be preset, and for each human face characteristic user group, statistical clustering of psychological characteristics and physiological characteristics is performed on the group through a data clustering analysis method, so that a significant psychophysiological characteristic set of each human face characteristic group is obtained.
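A minimal way to realize the per-group statistical clustering just described is to keep each psychophysiological label whose within-group frequency clears the preset threshold; the frequency-as-probability rule and the label names below are assumptions for illustration:

```python
from collections import Counter

def group_feature_profile(member_labels, min_prob=0.6):
    """For one face feature user group, keep psychophysiological labels
    whose within-group frequency reaches `min_prob`; the frequency is
    reported as the label's prediction probability."""
    n = len(member_labels)
    counts = Counter(label for labels in member_labels for label in set(labels))
    return {label: c / n for label, c in counts.items() if c / n >= min_prob}

# Hypothetical labels extracted for a three-member face feature group
profile = group_feature_profile(
    [["anxiety", "hypertension"], ["anxiety"], ["anxiety", "hypertension"]])
```

Here "anxiety" appears in all three members (probability 1.0) and "hypertension" in two of three, so both survive the 0.6 threshold and form the group's significant psychophysiological feature set.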
As a preferred implementation manner of the embodiment of the present application, as shown in fig. 4, the clustering of the second user face features according to the preset face feature clustering threshold to obtain a second user face feature clustering result includes steps S401 to S402 as follows:
step S401, clustering a plurality of second user face features to obtain a plurality of face feature clustering sub-results;
in specific implementation, when clustering the face feature sets of a plurality of users, firstly, similarity clustering information between every two face feature sets, namely, a face feature clustering sub-result, is obtained, and whether the users corresponding to any two face feature sets are the same user group or not can be determined based on the face feature clustering sub-result.
Step S402, comparing the plurality of face feature clustering sub-results with the preset face feature clustering threshold respectively to obtain the second user face feature clustering result.
In specific implementation, the obtained similarity of any two face feature sets is compared with a preset face feature clustering threshold, and if the similarity is higher than the preset face feature clustering threshold, users corresponding to the two face feature sets are considered to belong to the same user group, so that face feature clustering results of the users are obtained.
As a preferred implementation manner of the embodiment of the present application, as shown in fig. 5, the second user clustering result includes a plurality of facial feature user groups, and the determining the psychological and/or physiological features of the first user according to the matching result of the first user facial feature set and the second user clustering result includes steps S501 to S502 as follows:
step S501, calculating the matching degree of the first user face feature set and each face feature user group to obtain a plurality of matching results.
In specific implementation, when determining the psychophysiological characteristics of the first user according to the matching result of the face characteristic set of the first user and the clustering result of the second user, firstly, the matching degree between the face characteristic set of the current user and the face characteristic sets of a plurality of user groups obtained by clustering in advance needs to be calculated, and then a plurality of matching results are obtained.
Step S502, comparing each matching result with a preset matching threshold, so as to determine, according to the comparison results, the face feature user group to which the first user belongs and the psychophysiological features corresponding to that group.
In specific implementation, the matching results are respectively compared with preset matching thresholds, if the matching degree of the face feature set of the current user and the face feature set of a certain user group obtained by clustering in advance is higher than a preset threshold, the current user is considered to belong to the user group matched with the current user, and then the psychophysiological feature tag corresponding to the user can be determined according to the user group.
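Steps S501 and S502 can be sketched as follows; the group centroids, the cosine matching degree, and the profile dictionaries are illustrative assumptions rather than the patent's specified computation:

```python
import numpy as np

def predict_psychophysiological_profile(user_feature_set, group_centroids,
                                        group_profiles, match_threshold=0.9):
    """Match a new user's face feature set against each pre-clustered
    group's centroid (cosine similarity as the "matching degree") and
    return (group_id, profile) if the best match clears the preset
    threshold, else None."""
    user_vec = np.mean(user_feature_set, axis=0)
    best_group, best_sim = None, -1.0
    for gid, centroid in group_centroids.items():
        sim = float(np.dot(user_vec, centroid) /
                    (np.linalg.norm(user_vec) * np.linalg.norm(centroid)))
        if sim > best_sim:
            best_group, best_sim = gid, sim
    if best_sim >= match_threshold:
        return best_group, group_profiles[best_group]
    return None

centroids = {"g1": np.array([1.0, 0.0]), "g2": np.array([0.0, 1.0])}
profiles = {"g1": {"anxiety": 0.8}, "g2": {"hypertension": 0.7}}
result = predict_psychophysiological_profile(
    [np.array([0.95, 0.05]), np.array([0.9, 0.1])], centroids, profiles)
```

The returned profile carries the group's psychophysiological feature labels with their prediction probabilities; returning None when no group clears the threshold avoids assigning a user to a poorly matching group.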
From the above description, it can be seen that the present invention achieves the following technical effects: first user face features are acquired and a first user face feature set is formed from them; the first user face feature set is matched with a second user clustering result, where the second user clustering result comprises a second user face feature clustering result and a corresponding second user psychophysiological feature clustering result; and the psychological and/or physiological features of the first user are determined according to the matching result. This fulfills the aim of effectively predicting users' psychophysiological features from face recognition information, expanding the application range of face recognition technology while predicting those features accurately. In addition, by analyzing a user's face information, the user's likely significant psychophysiological features and their prediction probabilities can be obtained, helping relevant departments understand the individual more quickly and take corresponding measures.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flowcharts, in some cases the steps shown or described may be performed in an order different from that presented herein.
According to an embodiment of the present invention, there is also provided an apparatus for implementing the above method for predicting psychophysiological characteristics based on face information. As shown in Fig. 6, the apparatus comprises an acquisition module 1, a matching module 2, and a determination module 3. The acquisition module 1 is used for acquiring first user face features to form a first user face feature set from those features. The matching module 2 is used for matching the first user face feature set with a second user clustering result, where the second user clustering result comprises a second user face feature clustering result and a corresponding second user psychophysiological feature clustering result. The determination module 3 is used for determining the psychological and/or physiological features of the first user according to the matching result of the first user face feature set and the second user clustering result.
As a preferred implementation manner of the embodiment of the present application, the matching module includes: the first acquisition unit is used for acquiring second user face information; the first extraction unit is used for extracting second user face features according to the second user face information; and the first clustering unit is used for clustering the second user face features according to a preset face feature clustering threshold value so as to obtain a second user face feature clustering result.
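The first clustering unit's behavior can be illustrated with a simple sketch. The patent does not specify a clustering algorithm, so the greedy centroid-based scheme below, the Euclidean distance metric, and all feature values and the threshold are assumptions chosen only to show how a "preset face feature clustering threshold" can partition second user face features into face feature user groups.

```python
# Illustrative (not the patent's specified) threshold clustering: each face
# feature vector joins the first cluster whose centroid lies within the
# preset threshold; otherwise it starts a new cluster (user group).
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def threshold_cluster(features, threshold):
    clusters = []  # each cluster: {"members": [...], "centroid": [...]}
    for f in features:
        for c in clusters:
            if euclidean(f, c["centroid"]) <= threshold:
                c["members"].append(f)
                n = len(c["members"])
                # Recompute the centroid as the member-wise mean.
                c["centroid"] = [sum(col) / n for col in zip(*c["members"])]
                break
        else:
            clusters.append({"members": [f], "centroid": list(f)})
    return clusters

# Made-up second-user face feature vectors; two natural groups emerge.
feats = [(0.1, 0.2), (0.12, 0.19), (0.9, 0.8), (0.88, 0.82)]
groups = threshold_cluster(feats, threshold=0.1)
print(len(groups))
```

Each resulting cluster corresponds to one face feature user group in the second user face feature clustering result.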
As a preferred implementation manner of the embodiment of the present application, the matching module further includes: the second acquisition unit is used for acquiring the human-computer interaction information of each human face feature user group, wherein the human-computer interaction information comprises any one or more of second user personal information, second user psychological information and second user physiological information; the second extraction unit is used for extracting the psychophysiological characteristics of a second user according to the human-computer interaction information; and the second clustering unit is used for clustering the second user psychophysiological characteristics according to a preset psychophysiological characteristic clustering threshold value so as to obtain a second user psychophysiological characteristic clustering result corresponding to each face characteristic user group.
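One way the second clustering unit could attach a psychophysiological label and the prediction probability mentioned earlier to each face feature user group is by majority vote over the labels extracted from that group's human-computer interaction information. This is a hedged sketch under that assumption; the label strings are invented examples.

```python
# Hypothetical sketch: the dominant psychophysiological label within one
# face-feature user group, with its relative frequency as the prediction
# probability reported for that group.
from collections import Counter

def group_label(member_labels):
    """Return (label, probability) for the dominant label in one user group."""
    counts = Counter(member_labels)
    label, n = counts.most_common(1)[0]
    return label, n / len(member_labels)

print(group_label(["calm", "calm", "anxious", "calm"]))  # ('calm', 0.75)
```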
As a preferred implementation manner of the embodiment of the present application, the second user facial features include a plurality of features, and the first clustering unit includes: the clustering subunit is used for clustering the second user face features to obtain a plurality of face feature clustering sub-results; and the comparison sub-unit is used for comparing the plurality of face feature clustering sub-results with the preset face feature clustering threshold respectively to obtain a second user face feature clustering result.
As a preferred implementation manner of the embodiment of the present application, the second user clustering result includes a plurality of face feature user groups, and the determining module includes: a calculating unit, used for calculating the matching degree between the first user face feature set and each face feature user group to obtain a plurality of matching results; and a comparison unit, used for comparing each matching result with a preset matching threshold, so as to determine, according to the comparison result, the face feature user group to which the first user belongs and the psychophysiological characteristics corresponding to that face feature user group.
For the specific connection relationship between the modules and the units and the functions performed, please refer to the detailed description of the method, which is not repeated herein.
According to an embodiment of the present invention, there is also provided a computer apparatus including: one or more processors; storage means for storing one or more programs; the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method as previously described.
There is also provided, in accordance with an embodiment of the present invention, a computer-readable storage medium having stored thereon computer instructions, which when executed by a processor, implement the steps of the method as previously described.
As shown in fig. 7, the electronic device includes one or more processors 31 and a memory 32, and one processor 31 is taken as an example in fig. 7.
The electronic device may further include: an input device 33 and an output device 34.
The processor 31, the memory 32, the input device 33 and the output device 34 may be connected by a bus or other means, and fig. 7 illustrates the connection by a bus as an example.
The processor 31 may be a central processing unit (CPU). The processor 31 may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or a combination thereof. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 32, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules. The processor 31 runs the non-transitory software programs, instructions, and modules stored in the memory 32 to execute the various functional applications and data processing of the server, thereby implementing the psychophysiological characteristic prediction method of the above method embodiment.
The memory 32 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to use of a processing device operated by the server, and the like. Further, the memory 32 may include high speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory 32 may optionally include memory located remotely from the processor 31, which may be connected to a network connection device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 33 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the processing device of the server. The output device 34 may include a display device such as a display screen.
One or more modules are stored in the memory 32, which when executed by the one or more processors 31 perform the methods as previously described.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein. The computer instructions are for causing the computer to perform the above-described method of predicting a psychophysiological characteristic.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Finally, the principle and implementation of the present invention have been explained through specific embodiments, and the above description of the embodiments is intended only to help in understanding the method and core idea of the present invention. Meanwhile, those skilled in the art may, following the idea of the present invention, make changes to the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A method for predicting psychophysiological characteristics based on face information is characterized by comprising the following steps:
acquiring first user face features to form a first user face feature set according to the first user face features;
matching the first user face feature set with a second user clustering result, wherein the second user clustering result comprises a second user face feature clustering result and a second user psychophysiological feature clustering result corresponding to the second user face feature clustering result;
and determining the psychological characteristics and/or the physiological characteristics of the first user according to the matching result of the first user face characteristic set and the second user clustering result.
2. The method according to claim 1, wherein the matching the first user face feature set with the second user clustering result comprises:
collecting second user face information;
extracting second user face features according to the second user face information;
and clustering the second user face features according to a preset face feature clustering threshold value to obtain a second user face feature clustering result.
3. The method according to claim 1, wherein the second user facial feature clustering result comprises a plurality of facial feature user groups, and the matching the first user facial feature set with the second user facial feature clustering result comprises:
collecting human-computer interaction information of each human face feature user group, wherein the human-computer interaction information comprises any one or more of second user personal information, second user psychological information and second user physiological information;
extracting a second user psychophysiological characteristic according to the man-machine interaction information;
and clustering the psychophysiological characteristics of the second user according to a preset psychophysiological characteristic clustering threshold value to obtain a second user psychophysiological characteristic clustering result corresponding to each face characteristic user group.
4. The method for predicting psychophysiological characteristics based on face information according to claim 2, wherein the second user face characteristics include a plurality of second user face characteristics, and the clustering the second user face characteristics according to a preset face characteristic clustering threshold to obtain a second user face characteristic clustering result comprises:
clustering a plurality of second user face features to obtain a plurality of face feature clustering sub-results;
and comparing the plurality of face feature clustering sub-results with the preset face feature clustering threshold respectively to obtain a second user face feature clustering result.
5. The method according to claim 1, wherein the second user clustering result comprises a plurality of face feature user groups, and the determining the psychological and/or physiological features of the first user according to the matching result of the first user face feature set and the second user clustering result comprises:
calculating the matching degree of the first user face feature set and each face feature user group to obtain a plurality of matching results;
and comparing each matching result with a preset matching threshold respectively, so as to determine, according to the comparison result, the face feature user group to which the first user belongs and the psychophysiological features corresponding to the face feature user group.
6. An apparatus for predicting psychophysiological characteristics based on face information, comprising:
the acquisition module is used for acquiring first user face features so as to form a first user face feature set according to the first user face features;
the matching module is used for matching the first user face feature set with a second user clustering result, wherein the second user clustering result comprises a second user face feature clustering result and a second user psychophysiological feature clustering result corresponding to the second user face feature clustering result;
and the determining module is used for determining the psychological characteristics and/or the physiological characteristics of the first user according to the matching result of the first user face characteristic set and the second user clustering result.
7. The apparatus of claim 6, wherein the matching module comprises:
the first acquisition unit is used for acquiring second user face information;
the first extraction unit is used for extracting second user face features according to the second user face information;
and the first clustering unit is used for clustering the second user face features according to a preset face feature clustering threshold value so as to obtain a second user face feature clustering result.
8. The apparatus of claim 6, wherein the matching module further comprises:
the second acquisition unit is used for acquiring the human-computer interaction information of each human face feature user group, wherein the human-computer interaction information comprises any one or more of second user personal information, second user psychological information and second user physiological information;
the second extraction unit is used for extracting the psychophysiological characteristics of a second user according to the human-computer interaction information;
and the second clustering unit is used for clustering the second user psychophysiological characteristics according to a preset psychophysiological characteristic clustering threshold value so as to obtain a second user psychophysiological characteristic clustering result corresponding to each face characteristic user group.
9. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs;
the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any of claims 1-5.
10. A non-transitory readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method of any one of claims 1 to 5.
CN202010045198.0A 2020-01-15 2020-01-15 Heart physiological characteristic prediction method and device based on face information Pending CN111291623A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010045198.0A CN111291623A (en) 2020-01-15 2020-01-15 Heart physiological characteristic prediction method and device based on face information

Publications (1)

Publication Number Publication Date
CN111291623A true CN111291623A (en) 2020-06-16

Family

ID=71026644

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010045198.0A Pending CN111291623A (en) 2020-01-15 2020-01-15 Heart physiological characteristic prediction method and device based on face information

Country Status (1)

Country Link
CN (1) CN111291623A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR19980058349A (en) * 1996-12-30 1998-09-25 구자홍 Person Identification Using Image Information
JP2009043039A (en) * 2007-08-09 2009-02-26 Nippon Telegr & Teleph Corp <Ntt> Object detector, object detection method, object detection program, and recording medium recording the program
JP2011013818A (en) * 2009-06-30 2011-01-20 Nippon Hoso Kyokai <Nhk> Facial feature point extracting device and face feature point extraction program
CN103761536A (en) * 2014-01-28 2014-04-30 五邑大学 Human face beautifying method based on non-supervision optimal beauty features and depth evaluation model
CN104765768A (en) * 2015-03-09 2015-07-08 深圳云天励飞技术有限公司 Mass face database rapid and accurate retrieval method
CN105335691A (en) * 2014-08-14 2016-02-17 南京普爱射线影像设备有限公司 Smiling face identification and encouragement system
CN108509041A (en) * 2018-03-29 2018-09-07 百度在线网络技术(北京)有限公司 Method and apparatus for executing operation
CN108733429A (en) * 2018-05-16 2018-11-02 Oppo广东移动通信有限公司 Method of adjustment, device, storage medium and the mobile terminal of system resource configuration
CN109815873A (en) * 2019-01-17 2019-05-28 深圳壹账通智能科技有限公司 Merchandise display method, apparatus, equipment and medium based on image recognition
CN109934097A (en) * 2019-01-23 2019-06-25 深圳市中银科技有限公司 A kind of expression and mental health management system based on artificial intelligence
US20190213393A1 (en) * 2018-01-10 2019-07-11 International Business Machines Corporation Automated facial recognition detection
CN110558997A (en) * 2019-08-30 2019-12-13 深圳智慧林网络科技有限公司 Robot-based accompanying method, robot and computer-readable storage medium
RU2710942C1 (en) * 2018-12-06 2020-01-14 Самсунг Электроникс Ко., Лтд. Simultaneous recognition of person attributes and identification of person in organizing photo albums

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Zhang Duo: "Fundamentals of Biometric Identification Technology", Wuhan University Press, 30 April 2009, pages 104-105 *
Li Tao et al.: "Research on Infrared Dim and Small Target Segmentation Methods in Digital Image Processing", Southwest Jiaotong University Press, 30 June 2016, pages 85-88 *
Chen Min et al.: "OPNET Internet of Things Simulation: Intelligent IoT Applications Based on 5G Communication and Computing", Huazhong University of Science and Technology Press, 30 November 2018, pages 41-43 *

Similar Documents

Publication Publication Date Title
CN109583332B (en) Face recognition method, face recognition system, medium, and electronic device
WO2019119505A1 (en) Face recognition method and device, computer device and storage medium
Chou et al. Every rating matters: Joint learning of subjective labels and individual annotators for speech emotion classification
CN111291678A (en) Face image clustering method and device based on multi-feature fusion
CN109271917B (en) Face recognition method and device, computer equipment and readable storage medium
WO2021063056A1 (en) Facial attribute recognition method and apparatus, and electronic device and storage medium
CN111652331B (en) Image recognition method and device and computer readable storage medium
CN105095415A (en) Method and apparatus for confirming network emotion
JP2018032340A (en) Attribute estimation device, attribute estimation method and attribute estimation program
CN110705428A (en) Facial age recognition system and method based on impulse neural network
CN116956896A (en) Text analysis method, system, electronic equipment and medium based on artificial intelligence
CN115294397A (en) Classification task post-processing method, device, equipment and storage medium
Hossny et al. Enhancing keyword correlation for event detection in social networks using SVD and k-means: Twitter case study
CN113128526A (en) Image recognition method and device, electronic equipment and computer-readable storage medium
CN111259057A (en) Data processing method and device for civil appeal analysis
CN111291623A (en) Heart physiological characteristic prediction method and device based on face information
Yang et al. An academic social network friend recommendation algorithm based on decision tree
WO2021175010A1 (en) User gender identification method and apparatus, electronic device, and storage medium
CN115188031A (en) Fingerprint identification method, computer program product, storage medium and electronic device
CN109614854B (en) Video data processing method and device, computer device and readable storage medium
CN113628735A (en) Online appointment registration method and device based on neural network
CN111292849A (en) Method and device for predicting psychophysiological characteristics based on user behaviors
CN114373212A (en) Face recognition model construction method, face recognition method and related equipment
CN113704623A (en) Data recommendation method, device, equipment and storage medium
CN112463964A (en) Text classification and model training method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination