CN111539471A - Identity information fusion system based on multiple features - Google Patents

Identity information fusion system based on multiple features

Info

Publication number
CN111539471A
Authority
CN
China
Prior art keywords
data
fusion
module
result
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202010314526.2A
Other languages
Chinese (zh)
Inventor
曹显利
韩美林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Deep Knowledge Intelligent Technology Jinhua Co ltd
Original Assignee
Deep Knowledge Intelligent Technology Jinhua Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Deep Knowledge Intelligent Technology Jinhua Co ltd filed Critical Deep Knowledge Intelligent Technology Jinhua Co ltd
Priority to CN202010314526.2A priority Critical patent/CN111539471A/en
Publication of CN111539471A publication Critical patent/CN111539471A/en
Withdrawn legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/40 Information retrieval; Database structures therefor; File system structures therefor of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F16/48 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/483 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/70 Multimodal biometrics, e.g. combining information from different biometric modalities

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Library & Information Science (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Human Computer Interaction (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention discloses an identity information fusion system based on multiple features, which comprises a data acquisition device, a data matching module and a data fusion module. The data acquisition device is used for acquiring different types of feature identity data of a user, obtaining multiple pieces of feature identity information of the user and outputting the acquired multiple pieces of feature identity information to the data matching module; the data matching module is used for retrieving the multiple pieces of feature identity information output by the data acquisition device in a preset database and outputting the multiple feature matching results obtained by the retrieval to the data fusion module; the data fusion module is used for carrying out data fusion on the multiple feature matching results output by the data matching module and outputting a user identity fusion result. The identity information fusion system based on multiple features provided by the invention identifies the user with multiple features and fuses the logical relations among the identified features, so that the identity of the user is effectively determined with high recognition accuracy and a fast response.

Description

Identity information fusion system based on multiple features
Technical Field
The invention relates to the field of user identity recognition, and in particular discloses an identity information fusion system based on multiple features.
Background
Identity recognition is one of the basic technologies in fields such as the Internet of Things and artificial intelligence, and is widely applied on many occasions. Currently, the widely used identification technologies fall into two types: identification methods based on identification objects and identification knowledge, and identification methods based on user biological features. Identification objects include certificates, magnetic cards and the like; identification knowledge includes user names, passwords, verification codes and the like; and user biological features include fingerprint features, facial features, voice features, head features and the like.
Identification objects have the defects of being easily lost and possibly counterfeited, while identification knowledge is easily misremembered or forgotten, makes the identification process inefficient and is inconvenient for the user. Although user biological features overcome the defects of being easily lost, possibly counterfeited, or easily misremembered or forgotten, they are affected by many factors, such as ambient light or the user being ill, so the recognition efficiency is not high and erroneous judgments are easily made. The prior art usually adopts a single feature (such as a facial feature or a fingerprint feature) to identify the user identity and lacks the necessary fusion of multiple kinds of feature information.
Therefore, the reliance on a single feature for user identity recognition in the prior art is a technical problem to be solved urgently.
Disclosure of Invention
The invention provides an identity information fusion system based on multiple features, and aims to solve the technical problem that a user identity is usually identified by a single feature in the prior art.
The invention provides an identity information fusion system based on multiple characteristics, which comprises a data acquisition device, a data matching module and a data fusion module, wherein the data acquisition device is connected with the data matching module and is used for acquiring different types of characteristic identity data of users, acquiring multiple pieces of characteristic identity information of the users and outputting the acquired multiple pieces of characteristic identity information to the data matching module; the data matching module is respectively connected with the data acquisition device and the data fusion module and is used for retrieving the characteristic identity information output by the data acquisition device in a preset database and outputting a plurality of characteristic matching results obtained by retrieval to the data fusion module; and the data fusion module is connected with the data matching module and is used for carrying out data fusion on the plurality of characteristic matching results output by the data matching module and outputting a user identity fusion result.
Furthermore, the data acquisition device comprises one or more of a user identity identification object acquisition module, a user identity identification knowledge acquisition module, a user biological characteristic acquisition module and a user identity characteristic acquisition module.
Further, the feature matching result comprises a main data matching result and an auxiliary data matching result, the data fusion module comprises a first data fusion unit, and the first data fusion unit is used for outputting the user identity fusion result by adopting a calculation method if the main data matching result and the auxiliary data matching result in the feature matching result are both continuous data.
Further, the first data fusion unit comprises a calculation subunit and an output subunit, wherein the calculation subunit is used for respectively assigning weight coefficients to the main data matching result and the auxiliary data matching result and calculating a data fusion result B; and the output subunit is used for presetting one or more numerical values, dividing the value range of the calculated data fusion result B into two or more sections, and defining the user identity fusion result of each section.
Further, the data fusion result B is obtained by the following formula:
B=(a0×A0+a1×A1+a2×A2+…+an×An)/(A0+A1+A2+…+An)
wherein a0 is the matching result of the main data; a1, a2, …, an are the matching results of the respective auxiliary data; and A0, A1, A2, …, An are preset weight coefficients.
Further, the feature matching result comprises discrete type main data, continuous type main data, discrete type auxiliary data and continuous type auxiliary data, the data fusion module comprises a second data fusion unit,
and the second data fusion unit is used for obtaining the comprehensive result of all auxiliary data according to the feature matching result and a preset data weight table.
Further, the data fusion module further comprises a third data output unit, and the third data output unit is used for outputting the user identity fusion result if the plurality of feature matching results are recognized to be discrete data and the plurality of feature matching results point to the same user.
Further, the data fusion module comprises a fourth data output unit, and the fourth data output unit is used for processing the multiple feature matching results by adopting a summation method and/or a single-item method if the multiple feature matching results are identified to be continuous data.
Further, the fourth data output unit includes a sum output subunit,
and the summation output subunit is used for outputting a user identity fusion result if the plurality of feature matching results are identified to point to the same user and the sum of the plurality of feature matching results is greater than a preset identity threshold value by adopting a summation method.
Further, the fourth data output unit further comprises a single output subunit,
and the single-item output subunit is used for outputting the user identity fusion result by adopting a single-item method if the plurality of feature matching results are identified to point to the same user and each feature matching result is greater than a preset identity threshold value.
The beneficial effects obtained by the invention are as follows:
the identity information fusion system based on multiple features provided by the invention adopts the data acquisition device, the data matching module and the data fusion module, acquires the feature identity data of different types of users through the data acquisition device, acquires a plurality of feature identity information of the users, and outputs the acquired feature identity information to the data matching module; the data matching module searches a plurality of characteristic identity information output by the data acquisition device in a preset database and outputs a plurality of characteristic matching results obtained by searching to the data fusion module; and the data fusion module performs data fusion on the plurality of characteristic matching results output by the data matching module and outputs a user identity fusion result. The identity information fusion system based on multiple characteristics provided by the invention adopts multiple characteristics to identify the identity of the user and fuses the logic relations of the multiple identified characteristics, so that the identity of the user is effectively determined, and the identity identification accuracy rate and the response speed of the user are high.
Drawings
FIG. 1 is a functional block diagram of an embodiment of a multi-feature based identity information fusion system provided by the present invention;
FIG. 2 is a functional block diagram of an embodiment of the data fusion module shown in FIG. 1;
FIG. 3 is a functional block diagram of an embodiment of the first data fusion unit shown in FIG. 2;
FIG. 4 is a functional block diagram of a second embodiment of the data fusion module shown in FIG. 1;
FIG. 5 is a functional block diagram of a third embodiment of the data fusion module shown in FIG. 1;
FIG. 6 is a functional block diagram of a fourth embodiment of the data fusion module shown in FIG. 1;
FIG. 7 is a functional block diagram of the first embodiment of the fourth data output unit shown in FIG. 6;
fig. 8 is a functional block diagram of a second embodiment of the fourth data output unit shown in fig. 6.
Description of the reference numbers:
10. a data acquisition device; 20. a data matching module; 30. a data fusion module; 31. a first data fusion unit; 311. a calculation subunit; 312. an output subunit; 32. a second data fusion unit; 33. a third data output unit; 34. a fourth data output unit; 341. a summing output subunit; 342. and a single output subunit.
Detailed Description
In order to better understand the technical solution, the technical solution will be described in detail with reference to the drawings and the specific embodiments.
As shown in fig. 1, a first embodiment of the present invention provides a multi-feature-based identity information fusion system, which includes a data acquisition device 10, a data matching module 20, and a data fusion module 30, where the data acquisition device 10 is connected to the data matching module 20, and is configured to acquire different types of feature identity data of a user, acquire a plurality of pieces of feature identity information of the user, and output the acquired plurality of pieces of feature identity information to the data matching module 20; the data matching module 20 is respectively connected with the data acquisition device 10 and the data fusion module 30, and is configured to retrieve a plurality of feature identity information output by the data acquisition device 10 in a preset database, and output a plurality of feature matching results obtained by the retrieval to the data fusion module 30; the data fusion module 30 is connected to the data matching module 20, and is configured to perform data fusion on the multiple feature matching results output by the data matching module 20, and output a user identity fusion result.
The data acquisition device 10 includes one or more of a user identification object acquisition module, a user identification knowledge acquisition module, a user biological characteristic acquisition module, and a user identity characteristic acquisition module. The user identification object acquisition module is used for acquiring the user identification object, which includes a certificate, a magnetic card and the like. The user identification knowledge acquisition module is used for acquiring user identification knowledge, such as a user name, a password, a verification code and the like. The user biological characteristic acquisition module is used for acquiring user biological characteristics, including fingerprint features, facial features, voice features, head features and the like. The user identity characteristic acquisition module is used for acquiring user identity characteristics, which include local characteristics such as the user's gait, pace, action characteristics, clothing characteristics, accessory characteristics and personal belongings, as well as remote information such as the user's mobile phone positioning, contact information and work-and-rest habits. Fusing this user information with traditional user identification information in different ways can improve the accuracy and speed of user identity recognition. Specifically, the data acquisition device 10 may be an identity card information reading device, a magnetic card information reading device, a password input device, a face feature acquisition device, a voice feature acquisition device, a barcode or two-dimensional code reading device, a human body feature acquisition device, or a remote information acquisition module. The database is used for storing the data of all users, including identity card data, magnetic card data, password data, mobile phone verification code data, face feature data, voice feature data, fingerprint feature data, barcode or two-dimensional code data, human body feature data and the like. The data matching module is used for processing the user data input by the data acquisition device, searching the database according to the input data and outputting the search result. The output of the data matching module has two types: one is discrete output, for example {1, 0}, {yes, no}, {high, medium, low}, {A, B, C, D, E}, etc.; the other is continuous output, for example 0.25, 100, etc. The data fusion module 30 is configured to combine the plurality of feature matching results by logical means, including logical AND, logical OR and the like. The main logic of the logical AND is: when a plurality of feature matching results are fused, the user identity fusion result is output only if every feature datum among the feature matching results agrees, so that any feature datum among the feature matching results has "one-vote veto" power.
For example, the data acquisition device 10 simultaneously acquires three items of feature data of the user: identity card A, face B, and fingerprint C. The user identity information is output only when A = B = C; the three feature comparisons verify one another. The specific process is as follows: A is compared with B, and only when A = B, that is, the recognition result from the identity card information is the same as the recognition result from the face information, is B compared with C; only when B = C, that is, the recognition result from the face information is the same as the recognition result from the fingerprint information, does the process continue, until every pairwise comparison has been completed. As long as one comparison does not match, no result is output. The main logic of the logical OR is: one item of main data has the "one-vote veto" power, and the other auxiliary data may increase or decrease the credibility of the main data, but with a much smaller weight than the main data.
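As an illustration of the logical-AND fusion described above, a minimal Python sketch follows; it is only a sketch under the stated mutual-verification rule, and the function name and input structure are assumptions made for illustration, not part of the claimed system.

```python
def fuse_logical_and(results):
    """Logical-AND fusion: every feature matching result must point to the
    same user, and each result holds "one-vote veto" power.

    `results` is assumed to be a dict such as
    {"id_card": "A", "face": "B", "fingerprint": "C"} mapping each feature
    to the user identity it resolved to (or None if no match was found).
    """
    identities = list(results.values())
    # Any missing match vetoes the whole fusion.
    if any(identity is None for identity in identities):
        return None
    # Pairwise comparison: A == B, B == C, ... until all comparisons agree.
    first = identities[0]
    if all(identity == first for identity in identities[1:]):
        return first   # user identity fusion result
    return None        # at least one comparison failed, so nothing is output


# Example: ID card, face and fingerprint all resolve to the same user.
print(fuse_logical_and({"id_card": "Zhang San",
                        "face": "Zhang San",
                        "fingerprint": "Zhang San"}))   # -> "Zhang San"
```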
The identity information fusion system based on multiple features provided by this embodiment adopts the data acquisition device, the data matching module and the data fusion module: the data acquisition device acquires different types of feature identity data of the user, obtains a plurality of pieces of feature identity information of the user, and outputs the acquired plurality of pieces of feature identity information to the data matching module; the data matching module retrieves the plurality of pieces of feature identity information output by the data acquisition device in a preset database and outputs the plurality of feature matching results obtained by the retrieval to the data fusion module; and the data fusion module performs data fusion on the plurality of feature matching results output by the data matching module and outputs a user identity fusion result. The multi-feature-based identity information fusion system provided by this embodiment identifies the user with multiple features and fuses the logical relations among the identified features, so that the identity of the user is effectively determined with high recognition accuracy and a fast response.
Preferably, referring to fig. 2, fig. 2 is a functional block diagram of an embodiment of the data fusion module shown in fig. 1. In this embodiment, the data acquisition device 10 includes a main data acquisition device and a plurality of auxiliary data acquisition devices. The data matching module 20 includes a main data matching module and an auxiliary data matching module. The database includes a main database and an auxiliary database. The main data matching module is connected with the main data acquisition device and the data fusion module 30, respectively. The auxiliary data matching module is connected with the auxiliary data acquisition device and the data fusion module 30, respectively. During operation, the main data acquisition device and the auxiliary data acquisition devices respectively acquire the identity data of a user, obtain the main data information and the auxiliary data information of the user identity, and output them to the main data matching module and the auxiliary data matching module. The main data matching module and the auxiliary data matching module respectively search the main database and the auxiliary database according to the collected main data information and auxiliary data information, and output the search results to the data fusion module 30. The data fusion module 30 performs data fusion on the plurality of search results and outputs a data fusion result. The main data acquisition device is used for acquiring the main data information of the user, and includes an identity card information reading device, a magnetic card information reading device, a password input device, a barcode or two-dimensional code reading device and the like. The auxiliary data acquisition device is used for acquiring the auxiliary data information of the user; the acquired data types include face features, voice features, fingerprint features, head and body features, visual features such as clothing and personal articles, behavioral features such as gait, pace and actions, as well as work-and-rest times, mobile phone positioning, travel information and other remote information. The main database is used for storing the main data of the users, and the auxiliary database is used for storing the auxiliary feature data of the users. The main data matching module and the auxiliary data matching module are used for processing the user data input by the data acquisition device, searching the databases according to the data input by the data acquisition device 10, and outputting the search results. The output of the data matching module 20 has two types: one is discrete output, for example {1, 0}, {yes, no}, {high, medium, low}, {A, B, C, D, E}, etc.; the other is continuous output, for example 0.25, 100, etc. The data fusion module 30 is configured to perform data fusion on the output results of the plurality of data matching modules and output the result. The basic logic of the data fusion module 30 is: the main data carries a large weight, and the matching result of the main data is generally taken as the result, unless the auxiliary data are inconsistent with the main data, are consistent with one another, and have high values, in which case the matching result of the main data may be overridden.
In this embodiment, the feature matching result includes a main data matching result and an auxiliary data matching result, and the data fusion module 30 includes a first data fusion unit 31, where the first data fusion unit 31 is configured to output the user identity fusion result by using a calculation method if it is recognized that the main data matching result and the auxiliary data matching result in the feature matching result are both continuous data.
In the identity information fusion system based on multiple features provided by this embodiment, if it is recognized that the main data matching result and the auxiliary data matching result in the feature matching result are both continuous data, the user identity fusion result is output by using a calculation method, so that the degree of automation is high, the accuracy of user identity recognition is high, and the response speed is high.
Preferably, referring to fig. 3, fig. 3 is a schematic diagram of functional modules of an embodiment of the first data fusion unit shown in fig. 2, and in this embodiment, the user identity fusion result is output by a calculation method. The first data fusion unit 31 includes a calculation subunit 311 and an output subunit 312, where the calculation subunit 311 is configured to assign weight coefficients to the main data matching result and the auxiliary data matching result, respectively, and calculate a data fusion result B; the output subunit 312 is configured to preset one or more values, divide the value range of the calculated data fusion result B into two or more sections, and define a user identity fusion result for each section.
In this embodiment, the data fusion result B is obtained by the following formula:
B=(a0×A0+a1×A1+a2×A2+…+an×An)/(A0+A1+A2+…+An)
wherein a0 is the matching result of the main data; a1, a2, …, an are the matching results of the respective auxiliary data; and A0, A1, A2, …, An are preset weight coefficients.
For a data fusion result B of data fusion, one or more numerical values are preset, the value range of the data fusion result B is divided into two or more sections, and then the data fusion output result of each section is defined. For example, a value 0.55 is preset, a value range [0, 1] of the data fusion result B is divided into a first section [0, 0.55] and a second section [0.55, 1], and it is further set that "no" is output when the data fusion result B falls in the first section and "yes" is output when the data fusion result B falls in the second section.
Further, the weight coefficients A0, A1, A2 … An can be adjusted according to the final result, and the basic logic is: if the output result of a certain data matching module is consistent with the final data fusion result, the weight coefficient of that data matching module is increased; otherwise, the weight coefficient is decreased. One example of weight coefficient adjustment is: if a data matching module is consistent with the final data fusion result three times, the weight coefficient of the module is increased by 1%.
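A minimal sketch of this weight-adjustment logic is given below. The 1% step and the three-hit rule come from the example above; the class name, the assumption that "three times" means three consecutive times, and the use of a symmetric 1% decrease are illustrative assumptions, since the text does not specify them.

```python
class WeightAdjuster:
    """Adjust a matching module's weight coefficient based on whether its
    output agrees with the final data fusion result."""

    def __init__(self, weight, step=0.01, hits_needed=3):
        self.weight = weight              # current weight coefficient (A_i)
        self.step = step                  # 1% adjustment per trigger (assumed symmetric)
        self.hits_needed = hits_needed    # raise weight after this many agreements
        self.consecutive_hits = 0

    def update(self, module_result, fusion_result):
        if module_result == fusion_result:
            self.consecutive_hits += 1
            if self.consecutive_hits >= self.hits_needed:
                self.weight *= 1 + self.step   # three agreements: raise weight by 1%
                self.consecutive_hits = 0
        else:
            self.consecutive_hits = 0
            self.weight *= 1 - self.step       # disagreement: lower the weight (assumed 1%)
        return self.weight
```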
The following is a specific embodiment of the present solution:
A fingerprint recognition device is adopted as the main data acquisition device, and a face recognition device and a voice recognition device are adopted as the auxiliary data acquisition devices. For the user fingerprint information read by the fingerprint recognition device, the corresponding main data matching module calls the local fingerprint database to match the fingerprint information and outputs the fingerprint matching result: the user is Zhang San with the value a0 = 0.7. Meanwhile, the face recognition device and the voice recognition device respectively collect the face information and the voice information of the user, and the corresponding auxiliary data matching modules search and match in their respective databases; the matching result of the face recognition module is output: the user is Zhang San with the value a1 = 0.4, and the matching result of the voice recognition module is output: the user is Zhang San with the value a2 = 0.5. The weight coefficients A0, A1 and A2 are set to 5, 3 and 2, respectively. According to the calculation method,
B=(a0×A0+a1×A1+a2×A2)/(A0+A1+A2)
the result B of data fusion is:
B=(0.7×5+0.4×3+0.5×2)/(5+3+2)=5.7/10=0.57
A value 0.55 is preset, the value range [0, 1] of B is divided into a first section [0, 0.55] and a second section [0.55, 1], and it is set that when B falls in the first section the output is "no", and when B falls in the second section the output is "yes".
Since 0.57 falls in the second section [0.55, 1], the data fusion result is output: the user is Zhang San.
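The weighted-average fusion of this embodiment can be reproduced with the short Python sketch below; the variable names mirror the formula, and the results, weights and the 0.55 threshold are taken from the example above. The function name is an assumption made for illustration.

```python
def weighted_fusion(results, weights):
    """Compute B = (a0*A0 + a1*A1 + ... + an*An) / (A0 + A1 + ... + An),
    where `results` are the matching results a0..an (main data first) and
    `weights` are the preset weight coefficients A0..An."""
    numerator = sum(a * A for a, A in zip(results, weights))
    return numerator / sum(weights)


# Fingerprint (main), face and voice matching results for user "Zhang San".
B = weighted_fusion(results=[0.7, 0.4, 0.5], weights=[5, 3, 2])
print(round(B, 2))                     # 0.57
# Value range [0, 1] split at the preset value 0.55:
print("yes" if B >= 0.55 else "no")    # 0.57 falls in [0.55, 1] -> "yes"
```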
Preferably, please refer to fig. 4, where fig. 4 is a functional module diagram of a second embodiment of the data fusion module shown in fig. 1, on the basis of the first embodiment, the feature matching result includes discrete type main data, continuous type main data, discrete type auxiliary data, and continuous type auxiliary data, and the data fusion module 30 includes a second data fusion unit 32, and the second data fusion unit 32 is configured to obtain a comprehensive result of all auxiliary data according to the feature matching result and a preset data weight table.
For the case that the data matching result includes discrete data and continuous data, a table look-up method is adopted, as shown in table 1:
[Table 1: Data weight table (provided as an image in the original)]
In the above table, the auxiliary data refers to the combination of all auxiliary data. One combination method is: all the auxiliary data are given equal weight and counted by majority to determine the comprehensive result of all the auxiliary data; for example, if three of five auxiliary data are "yes" and two are "no", the comprehensive auxiliary data result is "yes". Another combination method is: the auxiliary data are accumulated according to different weights to obtain the comprehensive result of all the auxiliary data; for example, among three auxiliary data, if the first result is "yes" with a weight of 0.4, the second result is "no" with a weight of 0.3 and the third result is "no" with a weight of 0.3, the comprehensive auxiliary data result is "no".
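Both ways of combining the auxiliary data can be sketched as follows; the weighted variant reproduces the 0.4/0.3/0.3 example above, and the function names are illustrative assumptions.

```python
def combine_equal_weight(aux_results):
    """Majority vote over auxiliary results of equal weight, e.g.
    three "yes" out of five auxiliary data -> "yes"."""
    yes = sum(1 for r in aux_results if r == "yes")
    return "yes" if yes > len(aux_results) - yes else "no"


def combine_weighted(aux_results, weights):
    """Weighted combination of auxiliary results, e.g.
    ("yes", 0.4), ("no", 0.3), ("no", 0.3) -> "no"."""
    yes_weight = sum(w for r, w in zip(aux_results, weights) if r == "yes")
    no_weight = sum(w for r, w in zip(aux_results, weights) if r == "no")
    return "yes" if yes_weight > no_weight else "no"


print(combine_equal_weight(["yes", "yes", "yes", "no", "no"]))    # yes
print(combine_weighted(["yes", "no", "no"], [0.4, 0.3, 0.3]))     # no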
In the multi-feature-based identity information fusion system provided by this embodiment, the main data matching module and the auxiliary data matching module respectively search in the main database and the auxiliary database according to the acquired main data information and auxiliary data information, and output the search result to the data fusion module, and the data fusion module outputs the user identity fusion result through a calculation method or a table lookup method, so that the degree of automation is high, the accuracy of user identity recognition is high, and the response speed is high.
Further, as shown in fig. 5, fig. 5 is a functional block diagram of a third embodiment of the data fusion module shown in fig. 1. On the basis of the first embodiment, the data fusion module 30 further includes a third data output unit 33, and the third data output unit 33 is configured to output the user identity fusion result if it is recognized that the plurality of feature matching results are all discrete data and all point to the same user. For example, the data fusion module outputs 1 only when both data matching modules output 1, and outputs 0 otherwise. When the input data do not all point to the same result, the data fusion module outputs prompt information and alarm information, or outputs nothing.
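A brief sketch of this discrete-only case follows, assuming the 1/0 encoding used in the example; the alarm behaviour is represented by a simple flag, and the function name and input structure are assumptions made for illustration.

```python
def fuse_discrete(results):
    """All feature matching results are discrete: output 1 only when every
    module outputs 1 for the same user; otherwise output 0 and, if the data
    do not point to the same user, raise an alarm flag (the embodiment may
    also output nothing instead)."""
    same_user = len({r["user"] for r in results}) == 1
    all_matched = all(r["value"] == 1 for r in results)
    if same_user and all_matched:
        return {"output": 1, "alarm": False}
    return {"output": 0, "alarm": not same_user}


print(fuse_discrete([{"user": "Zhang San", "value": 1},
                     {"user": "Zhang San", "value": 1}]))   # output 1, no alarm
```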
In the multi-feature-based identity information fusion system provided by the embodiment, if the data fusion module recognizes that the plurality of feature matching results are all discrete data and the plurality of feature matching results all point to the same user, the user identity fusion result is output, so that the degree of automation is high, the accuracy rate of user identity recognition is high, and the response speed is high.
Preferably, referring to fig. 6 to 8, fig. 6 is a functional block diagram of a fourth embodiment of the data fusion module shown in fig. 1. On the basis of the first embodiment, the data fusion module 30 includes a fourth data output unit 34, and the fourth data output unit 34 is configured to process the plurality of feature matching results by using a summation method and/or a single-item method if it is identified that the plurality of feature matching results are all continuous data. Specifically, the fourth data output unit 34 includes a summation output subunit 341, where the summation output subunit 341 is configured to output the user identity fusion result if it is identified that the plurality of feature matching results point to the same user and the sum of the plurality of feature matching results is greater than a preset identity threshold. The fourth data output unit 34 further includes a single-item output subunit 342, where the single-item output subunit 342 is configured to output the user identity fusion result by the single-item method if it is identified that the plurality of feature matching results point to the same user and each feature matching result is greater than a preset identity threshold.
The summation method means that when the plurality of data input to the data fusion module all point to the same user and their sum is greater than a certain preset value, the data fusion module outputs "yes"; otherwise it outputs prompt information, or outputs nothing. For example, three data matching modules respectively output: "Zhang San" with 0.8, "Zhang San" with 0.9, and "Zhang San" with 0.87; the three input data point to the same user "Zhang San", and for a preset threshold value of 2.5, the sum of the three data is 2.57 > 2.5, so the output of the data fusion module is "yes". If the plurality of data input to the data fusion module do not point to the same user and their sum is greater than the preset value, the system outputs alarm information; for example, if two data matching modules output results for different users and the sum of the two data is greater than the preset value 1.5, the system outputs alarm information.
The single-item method means that when the plurality of data input to the data fusion module all point to the same user and each is greater than its corresponding preset value, the data fusion module outputs "yes"; otherwise it outputs prompt information, or outputs nothing. For example, three data matching modules output 0.7, 0.8 and 0.7 for the same user; if each of the three data is greater than its corresponding preset value, the output of the data fusion module is "yes". If the plurality of data input to the data fusion module do not point to the same user but are each greater than the corresponding preset value, the system outputs alarm information; for example, two data matching modules respectively output "Zhang San" with 0.8 and "Li Si" with 0.9, the two data do not point to the same user, and the two data are greater than the preset values 0.7 and 0.8, respectively, so the system outputs alarm information.
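The summation method and the single-item method can be sketched as below; the thresholds and results are taken from the examples above, and the helper names and return conventions are assumptions made for illustration.

```python
def fuse_by_summation(scores, threshold):
    """Summation method: `scores` is a list of (user, value) pairs; output
    "yes" if all values point to the same user and their sum exceeds the
    preset threshold; alarm if different users still exceed the threshold."""
    users = {user for user, _ in scores}
    total = sum(value for _, value in scores)
    if len(users) == 1 and total > threshold:
        return "yes"
    return "alarm" if len(users) > 1 and total > threshold else None


def fuse_by_single_item(scores, thresholds):
    """Single-item method: every value must exceed its own preset value and
    all values must point to the same user; alarm if different users each
    exceed their preset values."""
    users = {user for user, _ in scores}
    each_passes = all(v > t for (_, v), t in zip(scores, thresholds))
    if len(users) == 1 and each_passes:
        return "yes"
    return "alarm" if len(users) > 1 and each_passes else None


# Summation example from the text: 0.8 + 0.9 + 0.87 = 2.57 > 2.5 -> "yes".
print(fuse_by_summation([("Zhang San", 0.8), ("Zhang San", 0.9),
                         ("Zhang San", 0.87)], threshold=2.5))
# Single-item alarm example: different users, each above its preset value.
print(fuse_by_single_item([("Zhang San", 0.8), ("Li Si", 0.9)],
                          thresholds=[0.7, 0.8]))            # "alarm"
```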
When the plurality of data input to the data fusion module include both discrete data and continuous data, all the data are required to point to the same user, all the discrete data must be "yes", and all the continuous data must satisfy the summation method or the single-item method before the data fusion module outputs "yes"; otherwise it outputs prompt information, or outputs nothing. For example, when three data matching modules respectively output "yes", 0.9 and 0.87, and the preset values corresponding to the two continuous data are 0.8 and 0.7 respectively, the discrete datum among the three is "yes" and the continuous data are both greater than their preset values, so the data fusion module outputs "yes". When the plurality of data input to the data fusion module include discrete data and continuous data but do not all point to the same user, the data fusion module outputs prompt information or alarm information, or outputs nothing.
The following is a specific embodiment of the present solution:
An ID card reader and a face video collector are adopted as the data acquisition devices. For the user ID card information collected by the ID card reader, the corresponding data matching module 1 calls a networked ID card database to match the identity information and outputs the user identity information 1: the discrete output for the name "Zhang San" is "yes". For the user face feature information acquired by the face video collector, the corresponding data matching module 2 calls a local user face information database to perform feature matching and outputs the user information 2 corresponding to the user face feature: the continuous datum for the name "Zhang San" is 0.87. The data fusion module fuses the two pieces of input information, where the preset value for datum 2 is 0.85. According to the preset information fusion rule, all discrete data are required to be "yes" and all continuous data must be greater than their preset values, so the output result of the data fusion module is: the user is "Zhang San", and the output is "yes".
If the identity card information 1 gives the discrete output "yes" for the name "Zhang San", while the face identification information 2 gives a continuous datum of only 0.07 for the name "Zhang San", that is, a continuous datum of 0.93 for another user, then the data output by the data matching modules do not point to the same user and the value 0.93 is greater than the preset value 0.85, so the data fusion module outputs alarm information.
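The mixed discrete/continuous case of this embodiment can be sketched as follows; the 0.85 preset value and the "Zhang San" results come from the example above, while the function name and input structure are assumptions made for illustration.

```python
def fuse_mixed(discrete, continuous, presets):
    """`discrete`: list of (user, "yes"/"no"); `continuous`: list of
    (user, value); `presets`: per-item preset values for the continuous data
    (single-item method). Output "yes" only if all data point to the same
    user, every discrete datum is "yes" and every continuous datum exceeds
    its preset value; data pointing to different users trigger an alarm."""
    users = {u for u, _ in discrete} | {u for u, _ in continuous}
    discrete_ok = all(v == "yes" for _, v in discrete)
    continuous_ok = all(v > p for (_, v), p in zip(continuous, presets))
    if len(users) == 1 and discrete_ok and continuous_ok:
        return "yes"
    return "alarm" if len(users) > 1 else None


# ID card says "yes" for Zhang San; face matching gives 0.87 > preset 0.85.
print(fuse_mixed([("Zhang San", "yes")], [("Zhang San", 0.87)], [0.85]))  # yes
```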
In the multi-feature-based identity information fusion system provided by this embodiment, if it is recognized that the plurality of feature matching results are all continuous data, the plurality of feature matching results are processed by the summation method and/or the single-item method, so that the degree of automation is high, the accuracy of user identity recognition is high, and the response speed is fast.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention. It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (8)

1. The identity information fusion system based on multiple characteristics is characterized by comprising a data acquisition device (10), a data matching module (20) and a data fusion module (30), wherein,
the data acquisition device (10) is connected with the data matching module (20) and is used for acquiring different types of characteristic identity data of users, acquiring a plurality of pieces of characteristic identity information of the users and outputting the acquired plurality of pieces of characteristic identity information to the data matching module (20);
the data matching module (20) is respectively connected with the data acquisition device (10) and the data fusion module (30), and is used for retrieving the plurality of feature identity information output by the data acquisition device (10) in a preset database and outputting a plurality of feature matching results obtained by retrieval to the data fusion module (30);
the data fusion module (30) is connected with the data matching module (20) and is used for performing data fusion on the plurality of feature matching results output by the data matching module (20) and outputting a user identity fusion result.
2. The multi-feature based identity information fusion system of claim 1,
the data acquisition device (10) comprises one or more of a user identity identification object acquisition module, a user identity identification knowledge acquisition module, a user biological characteristic acquisition module and a user identity characteristic acquisition module.
3. The multi-feature based identity information fusion system of claim 1,
the feature matching result comprises a primary data matching result and a secondary data matching result, the data fusion module (30) comprises a first data fusion unit (31),
the first data fusion unit (31) is configured to output the user identity fusion result by using a calculation method if the main data matching result and the auxiliary data matching result in the feature matching result are both identified as continuous data.
4. The multi-feature based identity information fusion system of claim 3,
the first data fusion unit (31) comprises a calculation subunit (311) and an output subunit (312),
the calculating subunit (311) is configured to assign weight coefficients to the main data matching result and the auxiliary data matching result, respectively, and calculate a data fusion result B;
the output subunit (312) is configured to preset one or more values, divide the calculated value range of the data fusion result B into two or more sections, and define a user identity fusion result for each section.
5. The multi-feature based identity information fusion system of claim 4,
the data fusion result B is obtained by the following formula:
B=(a0×A0+a1×A1+a2×A2+…+an×An)/(A0+A1+A2+…+An)
wherein a0 is the matching result of the main data; a1, a2, …, an are the matching results of the respective auxiliary data; and A0, A1, A2, …, An are preset weight coefficients.
6. The multi-feature based identity information fusion system of claim 1,
the feature matching result comprises discrete type main data, continuous type main data, discrete type auxiliary data and continuous type auxiliary data, the data fusion module (30) comprises a second data fusion unit (32),
and the second data fusion unit (32) is used for obtaining a comprehensive result of all auxiliary data according to the feature matching result and a preset data weight table.
7. The multi-feature based identity information fusion system of claim 1,
the data fusion module (30) further comprises a third data output unit (33),
and the third data output unit (33) is used for outputting the user identity fusion result if the plurality of feature matching results are recognized to be discrete data and all the feature matching results point to the same user.
8. The multi-feature based identity information fusion system of claim 1,
the data fusion module (30) comprises a fourth data output unit (34),
and the fourth data output unit (34) is used for processing the plurality of feature matching results by adopting a summation method and/or a single-item method if the plurality of feature matching results are identified to be continuous data.
CN202010314526.2A 2020-04-20 2020-04-20 Identity information fusion system based on multiple features Withdrawn CN111539471A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010314526.2A CN111539471A (en) 2020-04-20 2020-04-20 Identity information fusion system based on multiple features

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010314526.2A CN111539471A (en) 2020-04-20 2020-04-20 Identity information fusion system based on multiple features

Publications (1)

Publication Number Publication Date
CN111539471A true CN111539471A (en) 2020-08-14

Family

ID=71975161

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010314526.2A Withdrawn CN111539471A (en) 2020-04-20 2020-04-20 Identity information fusion system based on multiple features

Country Status (1)

Country Link
CN (1) CN111539471A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113706728A (en) * 2021-08-19 2021-11-26 南京邮电大学 Lightweight authorization method and parking method based on multi-back-end fusion

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113706728A (en) * 2021-08-19 2021-11-26 南京邮电大学 Lightweight authorization method and parking method based on multi-back-end fusion

Similar Documents

Publication Publication Date Title
CN109800648B (en) Face detection and recognition method and device based on face key point correction
CN107577990B (en) Large-scale face recognition method based on GPU (graphics processing Unit) accelerated retrieval
US8055667B2 (en) Integrating and enhancing searching of media content and biometric databases
Soltane et al. Face and speech based multi-modal biometric authentication
CN109919093A (en) A kind of face identification method, device, equipment and readable storage medium storing program for executing
CN105718960A (en) Image ordering model based on convolutional neural network and spatial pyramid matching
CN109190514B (en) Face attribute recognition method and system based on bidirectional long-short term memory network
CN110414550B (en) Training method, device and system of face recognition model and computer readable medium
JP2010511938A (en) Image identification using face recognition
KR20000052498A (en) Personal identification method, personal identification apparatus, and recording medium
CN112016623B (en) Face clustering method, device, equipment and storage medium
CN110990498A (en) Data fusion method based on FCM algorithm
Johal et al. A novel method for fingerprint core point detection
US11403875B2 (en) Processing method of learning face recognition by artificial intelligence module
CN108614894A (en) A kind of face recognition database's constructive method based on maximum spanning tree
CN112735437A (en) Voiceprint comparison method, system and device and storage mechanism
CN111539471A (en) Identity information fusion system based on multiple features
CN113869398B (en) Unbalanced text classification method, device, equipment and storage medium
TWI325568B (en) A method for face varification
CN111583938A (en) Electronic device and voice recognition method
CN110135253A (en) A kind of finger vena identification method based on long-term recursive convolution neural network
CN112765521B (en) Website user classification method based on improved K neighbor
CN114241588A (en) Self-adaptive face comparison method and system
CN114707174A (en) Data processing method and device, electronic equipment and storage medium
CN117113321B (en) Image searching method and system for searching face by face

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200814
