CN115631525A - Insurance instant matching method based on face edge point recognition - Google Patents

Insurance instant matching method based on face edge point recognition

Info

Publication number
CN115631525A
Authority
CN
China
Prior art keywords
insurance
worker
point
face edge
identity
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202211316442.8A
Other languages
Chinese (zh)
Other versions
CN115631525B (en)
Inventor
田明祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wancai Technology Hangzhou Co ltd
Original Assignee
Wancai Technology Hangzhou Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wancai Technology Hangzhou Co ltd filed Critical Wancai Technology Hangzhou Co ltd
Priority to CN202211316442.8A priority Critical patent/CN115631525B/en
Publication of CN115631525A publication Critical patent/CN115631525A/en
Application granted granted Critical
Publication of CN115631525B publication Critical patent/CN115631525B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172: Classification, e.g. identification
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06K: GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K7/00: Methods or arrangements for sensing record carriers, e.g. for reading patterns
    • G06K7/10: Methods or arrangements for sensing record carriers, e.g. for reading patterns by electromagnetic radiation, e.g. optical sensing; by corpuscular radiation
    • G06K7/10009: Methods or arrangements for sensing record carriers by radiation using wavelengths larger than 0.1 mm, e.g. radio-waves or microwaves
    • G06K7/10297: Methods or arrangements for sensing record carriers by radiation using wavelengths larger than 0.1 mm; arrangements for handling protocols designed for non-contact record carriers such as RFIDs and NFCs, e.g. ISO/IEC 14443 and 18092
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q40/00: Finance; Insurance; Tax strategies; Processing of corporate or income taxes
    • G06Q40/08: Insurance
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74: Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761: Proximity, similarity or dissimilarity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/161: Detection; Localisation; Normalisation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/168: Feature extraction; Face representation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Accounting & Taxation (AREA)
  • Toxicology (AREA)
  • Finance (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Electromagnetism (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • Marketing (AREA)
  • Strategic Management (AREA)
  • Technology Law (AREA)
  • General Business, Economics & Management (AREA)
  • Financial Or Insurance-Related Operations Such As Payment And Settlement (AREA)

Abstract

The invention discloses an insurance instant matching method based on face edge point recognition, belonging to the technical field of insurance recommendation. Face edge point recognition improves the accuracy of identity recognition while adding little overhead to the identity-recognition throughput of workers entering the gate, which in turn improves the accuracy of instant insurance application. In addition, when the insurance corresponding to the work type associated with the identity of the entering worker is determined, the insurance-participation element similarity calculation method based on intersection of insured-amount factors takes into account both the insurance categories historically taken out by other workers in the same working area and the insured-amount factors applied by the insurance company to those policies, and thereby resolves the situation in which a single work type corresponds to two or more insurable insurance categories and a machine could not otherwise decide which one to apply.

Description

Insurance instant matching method based on face edge point recognition
Technical Field
The invention relates to the technical field of insurance recommendation, and in particular to an insurance instant matching method based on face edge point recognition.
Background
In some special scenarios, such as construction sites, insurance must be taken out immediately for a workforce that changes frequently, for example insurance matched to each worker's work type. In existing practice each worker is insured through manual matching, but when the number of workers to be insured is large, or the set of insured persons is updated every month because of high turnover, the tedium and repetitiveness of manual insurance become obvious. Contractors therefore want an automatic method that insures workers of different work types in real time. The expected real-time automatic insurance method is as follows:
the immediate insurance application instruction is issued to each gate or after a worker enters a gate and reaches a designated operation area, the work type of the worker can be automatically identified, and then the insurance corresponding to the work type is immediately applied to the worker, the immediate insurance application process does not need human intervention, and the insurance application accuracy is high. However, the precondition that the work type is immediately recognized and the immediate insurance application is completed when the operator enters the gate is that accurate recognition of the identity of the worker is required, and the work type corresponding to the worker can be accurately matched based on the relation between the identity of the worker and the work type bound at the background after the identity of the worker is accurately recognized.
In existing gate-entry identity recognition schemes, recognition accuracy and recognition speed are in conflict: guaranteeing accuracy sacrifices speed, and guaranteeing speed sacrifices accuracy. For instant insurance, the worker's identity must first be recognized accurately, while recognition speed must also be maintained so that no congestion occurs when many workers enter the gate at the same time. How to recognize the identity of a worker quickly at the gate while guaranteeing the accuracy of that recognition is therefore the first technical problem that must be solved to realize instant insurance.
In addition, one work type may correspond to more than one insurable insurance category. For example, the work type "power system installation" covers construction types such as work at height, heavy-current engineering and light-current engineering; if each construction type corresponds to one insurable insurance category, the work type "power system installation" corresponds to several insurable categories. Because the same worker may be qualified for work at height as well as for heavy-current and light-current engineering, or may be assigned to work at height today, heavy-current construction tomorrow and light-current construction the day after tomorrow, how to take out the appropriate insurance for such a worker on a daily, instant basis becomes the second technical problem the invention must solve.
Disclosure of Invention
The invention provides an insurance instant matching method based on face edge point recognition, aiming to improve the accuracy of worker identity recognition at gate entry and the accuracy of work-type recognition, and thereby to improve both the accuracy and the instantaneity of insurance application.
In order to achieve the purpose, the invention adopts the following technical scheme:
the insurance instant matching method based on face edge point recognition is provided, and comprises the following steps:
s1, identifying the identity of a gate entry worker and acquiring a gate entry historical data set corresponding to an identity identification result,
if the acquisition is successful, the step S2 is switched to;
if the acquisition fails, jumping to the step S5;
s2, identifying the face edge points of the workers on site and calculating the confidence coefficient of any face edge point i to face identification
Figure BDA0003908900920000021
S3, obtaining confidence of the human face edge points i on the human face identification from the gate-entering historical data set
Figure BDA0003908900920000022
If the acquisition is successful, the step S4 is carried out;
if the acquisition is not successful, the method will
Figure BDA0003908900920000023
As a function of the association of the workers
Figure BDA0003908900920000024
Adding the historical data into the entrance gate data set and then jumping to the step S5;
s4, judging
Figure BDA0003908900920000025
And with
Figure BDA0003908900920000026
Is less than a preset distance threshold,
if so, judging that the identity authentication is successful and skipping to the step S5;
if not, judging that the identity authentication fails and prompting to alarm;
s5, identifying the work type associated with the identity of the worker, judging whether the insurable insurance type corresponding to the work type is '1',
if so, immediately insuring the worker with the insurance corresponding to the work type with the binding relationship with the identity information of the worker;
if not, the step S6 is carried out;
and S6, matching the insurance type to be insured for the worker and instantly insureing the worker by the aid of the reference insurance element similarity calculation method based on the intersection operation of the reference insurance consideration factors.
Preferably, in step S1, the method for recognizing the identity of the worker at gate entry includes any one or more of face recognition, fingerprint recognition, iris recognition and RFID tag identification;
in step S2, the face edge points recognized for the worker include a first cheekbone key point on the left side of the face, a second cheekbone key point on the right side of the face, a key point at the lowest position of the chin, a key point at the centre of the left cheek curve between the first cheekbone key point and the chin key point, and a key point at the centre of the right cheek curve between the second cheekbone key point and the chin key point.
Preferably, in step S2, p_i is calculated by the following steps:
S21, placing the face image of the worker, captured on site in step S2 at a specified shooting angle and shooting distance and having a specified size, centred in a checkerboard grid map of preset size; the checkerboard grid map is divided into several grid cells, each carrying a unique ordinal number;
S22, identifying the ordinal number of the grid cell into which each face edge point i of the face image placed in the checkerboard grid map falls, and matching the grid positioning coordinate corresponding to that ordinal number, based on the preset association between ordinal numbers and grid positioning coordinates, as the coordinate position of face edge point i;
S23, calculating the distance d_i between each face edge point i of the worker and the reference anchor point;
S24, summing all d_i associated with the worker and averaging to obtain the mean face-edge-point distance d_avg;
S25, calculating the quotient of each d_i and d_avg as p_i.
Preferably, in step S3, P_i is calculated by the following steps:
S31, assigning a weight w_ij to P_ij, the confidence of face edge point i of the worker for face recognition calculated at the j-th historical gate entry;
S32, computing the weighted sum of the P_ij and averaging it to obtain the confidence P_i corresponding to face edge point i of the worker, and adding it to the gate-entry history data set.
Preferably, the weight w_ij is assigned by the following steps:
S311, acquiring the time point at which the face edge points of the worker were collected at each historical gate entry, denoted t_j, where t_j is the time of the j-th historical collection of the worker's face edge points;
S312, calculating the time interval Δt_j between t_j and t_(j-1), and calculating the time interval T between t_1 and t_n, where t_1 and t_n are the collection times of the first and the most recent gate entries respectively;
S313, calculating the ratio of Δt_j to T as the weight w_ij.
Preferably, the reference anchor point is determined by the following steps:
S231, judging whether the face database stores a face image corresponding to the worker whose identity was successfully recognized in step S1;
if so, go to step S232;
if not, taking the positioning coordinate of the grid cell, identified by its ordinal number, into which the nose-tip point of the face image collected in step S21 falls in the checkerboard grid map as the coordinate of the reference anchor point;
S232, acquiring, from a reference anchor point database, the positioning coordinate in the checkerboard grid map of the reference anchor point corresponding to the worker.
Preferably, the reference anchor point associated with the worker and stored in the reference anchor point database is calculated by the following steps:
S2321, placing each face image of the specified size associated with the worker, captured at the specified shooting angle and shooting distance at historical gate entries, centred in the checkerboard grid map, and identifying the grid positioning coordinate of the cell into which the nose-tip position of each face image falls as the nose-tip position point of that face image;
S2322, calculating a first mean of the horizontal coordinates and a second mean of the vertical coordinates of the nose-tip position points of the face images associated with the worker as the horizontal and vertical coordinates of the reference anchor point associated with the worker.
Preferably, in step S6, the insurance-participation element similarity calculation method based on intersection of insured-amount factors matches the insurance category to be taken out for the worker by the following steps:
S61, acquiring the operator data set corresponding to the working area entered by the worker to be insured instantly, wherein every operator working in that area has the same work type and each record stored in the operator data set contains the insurance category and the insured-amount factors of the corresponding operator's historical policy;
S62, extracting from the operator data set the records of all operators whose historical policies are of the same insurance category, and adding them to the intersection data set corresponding to that insurance category;
S63, performing an intersection operation on the insured-amount factors recorded in each record of each intersection data set to obtain the insured-amount factor intersection result corresponding to each intersection data set;
S64, acquiring the insurance-participation elements of the worker and calculating the element similarity between them and each insured-amount factor intersection result;
S65, taking the insurance category of the historical policy of the operators corresponding to the greatest element similarity as the insurance category to be taken out instantly for the worker.
Preferably, the insured-amount factors or the insurance-participation elements include the construction type of the worker in the working area.
The invention has the following beneficial effects:
1. After the identity of the worker entering the gate has been recognized by a conventional method such as face recognition or RFID tag identification, the confidence p_i of each face edge point of the worker for face recognition is further computed, and the distance between p_i and the stored historical confidence P_i is used to verify the worker's identity. This avoids the situation where an inaccurate result from the conventional identity-recognition method prevents the work type corresponding to the worker's real identity from being matched, so that insurance corresponding to a wrongly matched work type is taken out for the worker by mistake.
2. P_i is computed in advance and stored in the gate-entry database, so only p_i has to be computed when verifying a worker's identity. Because the amount of computation is small, verification is very fast and the extra verification time added on top of the conventional identity-recognition algorithm is negligible. Identity is therefore recognized quickly at the gate, and recognition efficiency is preserved while recognition accuracy is guaranteed.
3. p_i can be computed for any single face edge point of the entering worker: recognizing any one of the first cheekbone key point, the second cheekbone key point, the chin key point, the left-cheek-curve centre key point and the right-cheek-curve centre key point is enough to start the p_i computation. This avoids the situation in which some face edge points are hard to detect immediately because of lighting, capture angle and similar factors, so that the p_i computation cannot start promptly, the verification process becomes slow or even fails, and rapid passage through the gate and instant insurance are affected.
4. When identifying the insurance corresponding to the work type associated with the identity of the entering worker, the insurance-participation element similarity calculation method based on intersection of insured-amount factors takes into account both the insurance categories historically taken out by other workers operating in the same working area as the worker and the insured-amount factors applied by the insurance company to those policies when deciding which insurance the worker finally takes out. By intersecting the insured-amount factors of the policies of other operators in the same working area who historically took out the same insurance category, comparing the result with the worker's insurance-participation elements, and taking the insurance category corresponding to the greatest element similarity as the category to be taken out instantly for the worker, the method solves the problem that a machine cannot decide which insurance to apply when one work type corresponds to two or more insurable insurance categories.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required to be used in the embodiments of the present invention will be briefly described below. It is obvious that the drawings described below are only some embodiments of the invention, and that for a person skilled in the art, other drawings can be derived from them without inventive effort.
Fig. 1 is a diagram illustrating steps of implementing an insurance instant matching method based on face edge point recognition according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a face image placed in the checkerboard grid map;
FIG. 3 is a schematic diagram of a checkerboard grid map of preset size.
Detailed Description
The technical scheme of the invention is further explained by the specific implementation mode in combination with the attached drawings.
The drawings are for illustration only and are not intended to be limiting; for better explanation of the embodiments of the invention, some parts of the drawings may be omitted, enlarged or reduced and do not represent the size of an actual product; it will be understood by those skilled in the art that certain well-known structures and their descriptions may be omitted from the drawings.
The same or similar reference numerals in the drawings of the embodiments of the present invention correspond to the same or similar components; in the description of the present invention, it should be understood that if the terms "upper", "lower", "left", "right", "inner", "outer", etc. are used to indicate an orientation or a positional relationship based on that shown in the drawings, it is only for convenience of description and simplification of description, but not to indicate or imply that the device or element referred to must have a specific orientation, be constructed and operated in a specific orientation, and therefore, the terms describing the positional relationship in the drawings are only used for illustrative purposes and are not to be construed as limitations on the present patent, and specific meanings of the terms may be understood according to specific situations by those of ordinary skill in the art.
In the description of the present invention, unless otherwise explicitly specified or limited, the term "connected" or the like, if appearing to indicate a connection relationship between the components, is to be understood broadly, for example, as being fixed or detachable or integral; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be connected through any combination of two or more members or structures. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
The instant automatic insurance method the invention aims to realize is as follows:
An instant-insurance instruction is issued at each gate, or after a worker passes the gate and reaches the designated working area; the worker's work type is identified automatically and the insurance corresponding to that work type is taken out immediately, without human intervention and with high accuracy. The precondition for recognizing the work type and completing the instant insurance at gate entry is accurate recognition of the worker's identity: only after the identity is recognized accurately can the corresponding work type be matched from the identity-to-work-type binding maintained in the background. But how should the more appropriate insurance category be taken out for different workers when a work type corresponds to two or more insurable insurance categories?
To guarantee the instantaneity and accuracy of insurance at gate entry, an embodiment of the invention provides an insurance instant matching method based on face edge point recognition which, as shown in fig. 1, comprises the following steps:
S1, recognizing the identity of a worker entering the gate and acquiring the gate-entry history data set corresponding to the identity recognition result;
if the acquisition succeeds (indicating that the worker is an existing employee), go to step S2;
if the acquisition fails (indicating that the worker is a new employee), jump to step S5;
It should be noted that the method used in step S1 to recognize the identity of the entering worker is a conventional one, for example the face recognition or RFID tag identification of a conventional access-control system. These conventional methods let workers pass the gate quickly but sacrifice recognition accuracy and are prone to identification errors. For example, if worker A wears the RFID tag of worker B, the system verifies the passage as worker B although the person passing is actually worker A.
Using a conventional method to recognize the identity in step S1 is not the technical contribution of the invention; the contribution of this step is to acquire the gate-entry history data set corresponding to the conventional recognition result. That data set stores the confidence P_i of each face edge point of the recognized (passing) worker (in the example above, worker A wears worker B's RFID tag and enters the gate; the passing identity is worker B, so the gate-entry history data set corresponding to worker B is acquired), where i denotes the i-th face edge point, i = 1, ..., n, and n is the number of face edge points. In this embodiment n = 5, i.e. there are 5 face edge points: the first cheekbone key point Q1 on the left side of the face, the second cheekbone key point Q2 on the right side of the face, the key point Q3 at the lowest position of the chin, the key point Q4 at the centre of the left cheek curve between the first cheekbone key point and the chin key point, and the key point Q5 at the centre of the right cheek curve between the second cheekbone key point and the chin key point, as shown in fig. 2.
S2, recognizing the face edge points of the worker on site and calculating the confidence p_i of each face edge point i for face recognition;
p_i is calculated by the following steps:
S21, placing the face image of the worker, captured on site in step S2 at the specified angle and shooting distance and having the specified size, centred in the checkerboard grid map of preset size shown in fig. 3 (fig. 2 shows the checkerboard grid map after the face image has been placed in it); the checkerboard grid map is divided into several grid cells, each carrying a unique ordinal number;
S22, identifying the ordinal number of the grid cell into which each face edge point i of the face image placed in the checkerboard grid map falls (for example, the second cheekbone key point Q2 on the right side of the face falls into the cell numbered 13 in fig. 3), and matching the grid positioning coordinate corresponding to that ordinal number, based on the preset association between ordinal numbers and grid positioning coordinates, as the coordinate position of face edge point i;
S23, calculating the distance d_i between each face edge point i of the worker and the reference anchor point;
S24, summing all d_i associated with the worker and averaging to obtain the mean face-edge-point distance d_avg;
S25, calculating the quotient of each d_i and d_avg as p_i.
An illustrative sketch of steps S21 to S25 is given below.
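A minimal sketch of steps S21 to S25, assuming a checkerboard grid numbered row by row and a reference anchor point supplied by the caller (the nose tip, determined as described further below). The grid dimensions, cell size and function names are illustrative assumptions and are not specified by the patent.

```python
import math

def ordinal_to_coordinate(ordinal, cols, cell_size):
    """Map a 1-based grid ordinal to the centre coordinate of that cell.
    Assumes cells are numbered row by row, left to right."""
    row, col = divmod(ordinal - 1, cols)
    return ((col + 0.5) * cell_size, (row + 0.5) * cell_size)

def edge_point_confidences(edge_point_ordinals, anchor_xy, cols=20, cell_size=1.0):
    """Steps S21-S25: confidence p_i = d_i / d_avg for each face edge point i.

    edge_point_ordinals: dict mapping an edge point id (e.g. 'Q1'..'Q5') to the
        ordinal of the grid cell the point falls into (step S22).
    anchor_xy: (x, y) coordinate of the reference anchor point (nose tip).
    """
    # S22: ordinal -> grid positioning coordinate
    coords = {i: ordinal_to_coordinate(o, cols, cell_size)
              for i, o in edge_point_ordinals.items()}
    # S23: distance d_i from each edge point to the reference anchor point
    d = {i: math.dist(xy, anchor_xy) for i, xy in coords.items()}
    # S24: mean distance d_avg
    d_avg = sum(d.values()) / len(d)
    # S25: p_i = d_i / d_avg
    return {i: d_i / d_avg for i, d_i in d.items()}
```

With the five key points Q1 to Q5 of fig. 2 this returns five ratios whose mean is 1 by construction, so p_i expresses how far each edge point sits from the nose tip relative to the average.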
Preferably, the reference anchor point used to calculate d_i in step S23 is the nose-tip position of the face. The nose tip is a prominent, easily identified feature, and its distances to the first cheekbone key point, the second cheekbone key point, the chin key point, the left-cheek-curve centre key point and the right-cheek-curve centre key point are comparable, so no individual d_i comes out far too small or too large, which would make the corresponding p_i far too small or too large and lead the system to a wrong decision based on p_i.
The key to instant insurance at gate entry is to guarantee instantaneity while guaranteeing accuracy. The computation of p_i comprises the five steps S21 to S25, and if the reference anchor point used in step S23 were the nose-tip position of each entering worker's face image as placed in the checkerboard grid map, determining the anchor point and obtaining its coordinate would take a certain amount of time, which is unfavourable for instantaneity. Therefore, to further increase the speed of computing p_i, the reference anchor point used in step S23 is preferably one whose coordinate has been determined in advance, so that the anchor point does not have to be recomputed for every face when p_i is calculated. The embodiment of the invention provides the following steps for determining the reference anchor point in advance:
s2321, each human face image of an associated worker with a specified size, which is acquired at a specified shooting angle and a specified shooting distance during historical gate entry, is placed in a chess and card grid image in a centered manner, and grid positioning coordinates corresponding to a grid in which the nose tip position in each human face image falls are identified and serve as the position point of the nose tip in the corresponding human face image;
s2322, calculating a first mean value of a horizontal axis coordinate and a second mean value of a vertical axis coordinate of a position point of the nose tip corresponding to each face image related to the worker as a horizontal axis coordinate and a vertical axis coordinate of a reference positioning point related to the worker.
The premise of determining the reference positioning point related to the worker in advance is that the worker is an old worker and has face image data acquired when the worker has a history of entering and exiting the barrier gate. If the worker is a new employee who does not store the historical collected face image data, the reference positioning point cannot be determined in advance through the steps S2321-S2322. In order to solve this problem, for a new employee, the invention provides the following method steps to determine its corresponding reference localization point:
s231, judging whether the face database stores the face image corresponding to the worker successfully identified in the step S1,
if yes, go to step S232;
if not, identifying the coordinate position corresponding to the grid with the corresponding arrangement serial number in the chess and card grid map, in which the nose tip point in the face image collected in the step S21 falls, as the coordinate position of the reference positioning point.
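A minimal sketch of the pre-computed reference anchor point (steps S2321 to S2322) with the on-site fallback for new employees (steps S231 to S232), reusing the ordinal_to_coordinate helper from the sketch above. The anchor database structure and function names are illustrative assumptions.

```python
def mean_nose_tip_anchor(historical_nose_ordinals, cols=20, cell_size=1.0):
    """S2321-S2322: average the nose-tip grid coordinates over the worker's
    historical face images to obtain the pre-computed reference anchor point."""
    coords = [ordinal_to_coordinate(o, cols, cell_size)
              for o in historical_nose_ordinals]
    xs, ys = zip(*coords)
    return (sum(xs) / len(xs), sum(ys) / len(ys))  # (first mean, second mean)

def reference_anchor(worker_id, anchor_db, onsite_nose_ordinal,
                     cols=20, cell_size=1.0):
    """S231-S232: use the stored anchor point if the worker is known, otherwise
    fall back to the nose-tip cell of the face image captured on site in S21."""
    if worker_id in anchor_db:                      # existing employee
        return anchor_db[worker_id]
    return ordinal_to_coordinate(onsite_nose_ordinal, cols, cell_size)
```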
After p_i has been computed by the method above, the insurance instant matching method based on face edge point recognition of this embodiment proceeds to the following steps:
S3, acquiring the historical confidence P_i of face edge point i for face recognition from the gate-entry history data set;
if the acquisition succeeds, go to step S4;
if the acquisition fails, add p_i to the gate-entry history data set as the worker's associated P_i and then jump to step S5;
P_i is calculated by the following steps:
S31, assigning a weight w_ij to P_ij, the confidence of face edge point i of the worker for face recognition calculated at the j-th historical gate entry; P_ij differs from P_i in that P_ij is the on-site confidence p_i calculated for the worker at the j-th historical gate entry;
S32, computing the weighted sum of the P_ij and averaging it to obtain the confidence P_i corresponding to face edge point i of the worker, and adding it to the gate-entry history data set of the worker.
In this embodiment, the weight w_ij is assigned by the following steps:
S311, acquiring the time point at which the face edge points of the worker were collected at each historical gate entry, denoted t_j, where t_j is the time of the j-th historical collection of the worker's face edge points;
S312, calculating the time interval Δt_j between t_j and t_(j-1), and calculating the time interval T between t_1 and t_n, where t_1 and t_n are the collection times of the first and the most recent gate entries respectively;
S313, calculating the ratio of Δt_j to T as the weight w_ij.
An illustrative sketch of the weight assignment and of the computation of P_i is given below.
Having obtained P_i, the insurance instant matching method based on face edge point recognition of this embodiment proceeds to the following steps:
S4, judging whether the distance between p_i and P_i is less than a preset distance threshold;
if so, judging that identity verification has succeeded and jumping to step S5;
if not, judging that identity verification has failed and raising an alarm;
The distance between p_i and P_i may be the absolute value of the difference between p_i and P_i. An illustrative sketch of this check is given below.
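A minimal sketch of the verification in step S4, assuming the distance is the absolute difference mentioned above and that every recognized edge point must pass the threshold test; the patent leaves the aggregation over edge points and the threshold value open, so both are assumptions.

```python
def verify_identity(p_onsite, p_stored, threshold=0.15):
    """S4: compare the on-site confidences p_i with the stored confidences P_i.

    p_onsite, p_stored: dicts keyed by edge point id ('Q1'..'Q5').
    Returns True when every common edge point satisfies |p_i - P_i| < threshold.
    The threshold value 0.15 is purely illustrative.
    """
    common = p_onsite.keys() & p_stored.keys()
    if not common:
        return False  # nothing to compare against
    return all(abs(p_onsite[i] - p_stored[i]) < threshold for i in common)
```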
After the worker's identity has been verified successfully, the insurance instant matching method based on face edge point recognition of this embodiment proceeds to the following steps:
S5, identifying the work type associated with the identity of the worker (the binding between the worker's identity and the work type is recorded in the system in advance) and judging whether the number of insurable insurance categories corresponding to that work type is 1;
if so, immediately taking out, for the worker, the insurance corresponding to the work type bound to the worker's identity information;
if not, go to step S6;
S6, matching the insurance category to be taken out for the worker by the insurance-participation element similarity calculation method based on intersection of insured-amount factors, and insuring the worker immediately.
In step S6, the insurance-participation element similarity calculation method based on intersection of insured-amount factors matches the insurance category to be taken out for the worker by the following steps:
S61, acquiring the operator data set corresponding to the working area the worker to be insured instantly enters after passing the gate. Every operator working in that area has the same work type but possibly different construction types (for example, operators A, B and C all have the work type "power system installation", but A's construction type is "work at height", B's is "heavy-current engineering" and C's is "light-current engineering"). Each record stored in the operator data set contains the insurance category and the insured-amount factors of the corresponding operator's historical policy (for example, the categories of the historical policies of A, B and C are S1, S2 and S3 respectively, and S1, S2 and S3 have completely or partly different insured-amount factors; for A, who works at height, one factor is a required work-at-height qualification, and other factors may include the safety measures required in the working area, and so on);
S62, extracting from the operator data set the records of all operators whose historical policies are of the same insurance category, and adding them to the intersection data set corresponding to that category (for example, if the policies of operators A, D, E and F in the working area are all of category S1, the records of A, D, E and F are added to the intersection data set corresponding to S1);
S63, performing an intersection operation on the insured-amount factors recorded in each record of each intersection data set to obtain the insured-amount factor intersection result corresponding to that data set (for example, if A's insured-amount factors are k1, k2 and k3, D's are k1, k2, k3 and k4, E's are k1, k2, k3, k4 and k5, and F's are k1, k3, k4 and k5, the intersection result for A, D, E and F is k1 and k3);
S64, acquiring the insurance-participation elements of the worker and calculating the element similarity between them and each insured-amount factor intersection result (suppose the worker's insurance-participation elements are k1, k3 and k4 and two intersection results were obtained in step S63, namely k1, k2, k3 and k1, k3, k4; if the preset element-similarity measure is the proportion of shared elements among all elements, the similarity of k1, k3, k4 to k1, k2, k3 is 2/3);
S65, taking the insurance category of the historical policy of the operators corresponding to the greatest element similarity as the insurance category to be taken out instantly for the worker (when two or more categories share the greatest element similarity, any one of them may be taken out, for example the category S1 corresponding to the intersection result k1, k2, k3 illustrated in step S64).
An illustrative sketch of steps S61 to S65 is given below.
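A minimal sketch of steps S61 to S65, assuming the element similarity is the share of common elements over the union of the two sets (one possible reading of "the proportion of shared elements among all elements"); the data layout and names are illustrative assumptions.

```python
from collections import defaultdict

def match_insurance(worker_elements, operator_records):
    """S61-S65: pick the insurance category whose insured-amount factor
    intersection is most similar to the worker's insurance-participation elements.

    worker_elements: set of the worker's insurance-participation elements,
        e.g. {'k1', 'k3', 'k4'}.
    operator_records: list of (insurance_category, insured_amount_factors)
        tuples for the other operators in the same working area.
    """
    # S62: group records by insurance category
    grouped = defaultdict(list)
    for category, factors in operator_records:
        grouped[category].append(set(factors))

    best_category, best_similarity = None, -1.0
    for category, factor_sets in grouped.items():
        # S63: intersection of the insured-amount factors within one category
        intersection = set.intersection(*factor_sets)
        # S64: assumed similarity = |common elements| / |union of elements|
        union = worker_elements | intersection
        similarity = len(worker_elements & intersection) / len(union) if union else 0.0
        # S65: keep the category with the greatest similarity
        if similarity > best_similarity:
            best_category, best_similarity = category, similarity
    return best_category
```

For a worker with elements {k1, k3, k4} and the single category S1 whose intersection result is {k1, k3}, as in the example of steps S62 and S63, the sketch returns S1.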
In conclusion, the invention improves the accuracy of identity recognition through face edge point recognition while adding little overhead to the identity-recognition throughput of workers entering the gate, which helps to improve the accuracy of real-time insurance application. In addition, when identifying the insurance corresponding to the work type associated with the identity of the entering worker, the insurance-participation element similarity calculation method takes into account both the insurance categories historically taken out by other workers operating in the same working area and the insured-amount factors applied by the insurance company to those policies when deciding which insurance the worker finally takes out, and thereby solves the problem that a machine cannot decide which insurance to apply when one work type corresponds to two or more insurable insurance categories.
It should be understood that the above-described embodiments are merely preferred embodiments of the invention and the technical principles applied thereto. Various modifications, equivalent substitutions, changes, etc., will also be apparent to those skilled in the art. However, such variations are within the scope of the invention as long as they do not depart from the spirit of the invention. In addition, certain terms used in the specification and claims of the present application are not limiting, but are used merely for convenience of description.

Claims (9)

1. An insurance instant matching method based on face edge point recognition, characterized by comprising the following steps:
S1, recognizing the identity of a worker entering the gate and acquiring the gate-entry history data set corresponding to the identity recognition result;
if the acquisition succeeds, go to step S2;
if the acquisition fails, jump to step S5;
S2, recognizing the face edge points of the worker on site and calculating the confidence p_i of each face edge point i for face recognition;
S3, acquiring the historical confidence P_i of face edge point i for face recognition from the gate-entry history data set;
if the acquisition succeeds, go to step S4;
if the acquisition fails, add p_i to the gate-entry history data set as the worker's associated P_i and then jump to step S5;
S4, judging whether the distance between p_i and P_i is less than a preset distance threshold;
if so, judging that identity verification has succeeded and jumping to step S5;
if not, judging that identity verification has failed and raising an alarm;
S5, identifying the work type associated with the identity of the worker and judging whether the number of insurable insurance categories corresponding to that work type is 1;
if so, immediately taking out the insurance corresponding to the work type bound to the worker's identity information;
if not, go to step S6;
S6, matching the insurance category to be taken out for the worker by the insurance-participation element similarity calculation method based on intersection of insured-amount factors, and insuring the worker immediately.
2. The insurance instant matching method based on face edge point recognition according to claim 1, wherein, in step S1, the method for recognizing the identity of the worker at gate entry comprises any one or more of face recognition, fingerprint recognition, iris recognition and RFID tag identification, and
in step S2, the face edge points recognized for the worker comprise a first cheekbone key point on the left side of the face, a second cheekbone key point on the right side of the face, a key point at the lowest position of the chin, a key point at the centre of the left cheek curve between the first cheekbone key point and the chin key point, and a key point at the centre of the right cheek curve between the second cheekbone key point and the chin key point.
3. The insurance instant matching method based on face edge point recognition according to claim 1, characterized in that, in step S2, p_i is calculated by the following steps:
S21, placing the face image of the worker, captured on site in step S2 at a specified shooting angle and shooting distance and having a specified size, centred in a checkerboard grid map of preset size, wherein the checkerboard grid map is divided into several grid cells, each carrying a unique ordinal number;
S22, identifying the ordinal number of the grid cell into which each face edge point i of the face image placed in the checkerboard grid map falls, and matching the grid positioning coordinate corresponding to that ordinal number, based on the preset association between ordinal numbers and grid positioning coordinates, as the coordinate position of face edge point i;
S23, calculating the distance d_i between each face edge point i of the worker and the reference anchor point;
S24, summing all d_i associated with the worker and averaging to obtain the mean face-edge-point distance d_avg;
S25, calculating the quotient of each d_i and d_avg as p_i.
4. The insurance instant matching method based on face edge point recognition according to claim 1, characterized in that, in step S3, P_i is calculated by the following steps:
S31, assigning a weight w_ij to P_ij, the confidence of face edge point i of the worker for face recognition calculated at the j-th historical gate entry;
S32, computing the weighted sum of the P_ij and averaging it to obtain the confidence P_i corresponding to face edge point i of the worker, and adding it to the gate-entry history data set.
5. The insurance instant matching method based on face edge point recognition according to claim 4, wherein the weight w_ij is assigned by the following steps:
S311, acquiring the time point at which the face edge points of the worker were collected at each historical gate entry, denoted t_j, where t_j is the time of the j-th historical collection of the worker's face edge points;
S312, calculating the time interval Δt_j between t_j and t_(j-1), and calculating the time interval T between t_1 and t_n, where t_1 and t_n are the collection times of the first and the most recent gate entries respectively;
S313, calculating the ratio of Δt_j to T as the weight w_ij.
6. The insurance instant matching method based on face edge point recognition according to claim 3, wherein the reference anchor point is determined by the following steps:
S231, judging whether the face database stores a face image corresponding to the worker whose identity was successfully recognized in step S1;
if so, go to step S232;
if not, taking the positioning coordinate of the grid cell, identified by its ordinal number, into which the nose-tip point of the face image collected in step S21 falls in the checkerboard grid map as the coordinate of the reference anchor point;
S232, acquiring, from a reference anchor point database, the positioning coordinate in the checkerboard grid map of the reference anchor point corresponding to the worker.
7. The insurance instant matching method based on face edge point recognition according to claim 6, characterized in that the reference anchor point associated with the worker and stored in the reference anchor point database is calculated by the following steps:
S2321, placing each face image of the specified size associated with the worker, captured at the specified shooting angle and shooting distance at historical gate entries, centred in the checkerboard grid map, and identifying the grid positioning coordinate of the cell into which the nose-tip position of each face image falls as the nose-tip position point of that face image;
S2322, calculating a first mean of the horizontal coordinates and a second mean of the vertical coordinates of the nose-tip position points of the face images associated with the worker as the horizontal and vertical coordinates of the reference anchor point associated with the worker.
8. The insurance instant matching method based on face edge point recognition according to any one of claims 1 to 7, wherein, in step S6, the insurance-participation element similarity calculation method based on intersection of insured-amount factors matches the insurance category to be taken out for the worker by the following steps:
S61, acquiring the operator data set corresponding to the working area entered by the worker to be insured instantly, wherein every operator working in that area has the same work type and each record stored in the operator data set contains the insurance category and the insured-amount factors of the corresponding operator's historical policy;
S62, extracting from the operator data set the records of all operators whose historical policies are of the same insurance category, and adding them to the intersection data set corresponding to that insurance category;
S63, performing an intersection operation on the insured-amount factors recorded in each record of each intersection data set to obtain the insured-amount factor intersection result corresponding to each intersection data set;
S64, acquiring the insurance-participation elements of the worker and calculating the element similarity between them and each insured-amount factor intersection result;
S65, taking the insurance category of the historical policy of the operators corresponding to the greatest element similarity as the insurance category to be taken out instantly for the worker.
9. The insurance instant matching method based on face edge point recognition according to claim 8, wherein the insured-amount factors or the insurance-participation elements comprise the construction type of the operator in the working area.
CN202211316442.8A 2022-10-26 2022-10-26 Face edge point identification-based insurance instant matching method Active CN115631525B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211316442.8A CN115631525B (en) 2022-10-26 2022-10-26 Face edge point identification-based insurance instant matching method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211316442.8A CN115631525B (en) 2022-10-26 2022-10-26 Face edge point identification-based insurance instant matching method

Publications (2)

Publication Number Publication Date
CN115631525A true CN115631525A (en) 2023-01-20
CN115631525B CN115631525B (en) 2023-06-23

Family

ID=84906264

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211316442.8A Active CN115631525B (en) 2022-10-26 2022-10-26 Face edge point identification-based insurance instant matching method

Country Status (1)

Country Link
CN (1) CN115631525B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118430054A (en) * 2024-07-05 2024-08-02 深圳市宏辉智通科技有限公司 Human face recognition method and system based on AI intelligence

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110647809A (en) * 2019-08-15 2020-01-03 中国平安人寿保险股份有限公司 AI (Artificial Intelligence) underwriting system and method based on image analysis and computer-readable storage medium
US20200143146A1 (en) * 2017-11-23 2020-05-07 Beijing Sensetime Technology Development Co., Ltd. Target object recognition method and apparatus, storage medium, and electronic device
CN111639537A (en) * 2020-04-29 2020-09-08 深圳壹账通智能科技有限公司 Face action unit identification method and device, electronic equipment and storage medium
CN112183165A (en) * 2019-07-04 2021-01-05 北京航天长峰科技工业集团有限公司 Face recognition method based on accumulated monitoring video
WO2021083133A1 (en) * 2019-10-29 2021-05-06 广州虎牙科技有限公司 Image processing method and device, equipment and storage medium
CN113505717A (en) * 2021-07-17 2021-10-15 桂林理工大学 Online passing system based on face and facial feature recognition technology
CN113920564A (en) * 2021-10-29 2022-01-11 中国平安财产保险股份有限公司 Client mining method based on artificial intelligence and related equipment
CN114898475A (en) * 2022-05-13 2022-08-12 精英数智科技股份有限公司 Underground personnel identity identification method and device, electronic equipment and readable storage medium

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200143146A1 (en) * 2017-11-23 2020-05-07 Beijing Sensetime Technology Development Co., Ltd. Target object recognition method and apparatus, storage medium, and electronic device
CN112183165A (en) * 2019-07-04 2021-01-05 北京航天长峰科技工业集团有限公司 Face recognition method based on accumulated monitoring video
CN110647809A (en) * 2019-08-15 2020-01-03 中国平安人寿保险股份有限公司 AI (Artificial Intelligence) underwriting system and method based on image analysis and computer-readable storage medium
WO2021083133A1 (en) * 2019-10-29 2021-05-06 广州虎牙科技有限公司 Image processing method and device, equipment and storage medium
CN111639537A (en) * 2020-04-29 2020-09-08 深圳壹账通智能科技有限公司 Face action unit identification method and device, electronic equipment and storage medium
CN113505717A (en) * 2021-07-17 2021-10-15 桂林理工大学 Online passing system based on face and facial feature recognition technology
CN113920564A (en) * 2021-10-29 2022-01-11 中国平安财产保险股份有限公司 Client mining method based on artificial intelligence and related equipment
CN114898475A (en) * 2022-05-13 2022-08-12 精英数智科技股份有限公司 Underground personnel identity identification method and device, electronic equipment and readable storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
PAVEL KRÁL ET AL.: "Confidence Measure for Experimental Automatic Face Recognition System", ICAART 2014: AGENTS AND ARTIFICIAL INTELLIGENCE, pages 362 - 378 *
李云红 et al.: "Face recognition algorithm based on region-partitioned feature extraction", Journal of Northwest University (Natural Science Edition)
杨显 et al.: "Design and implementation of face recognition mobile attendance based on real-name registration and geolocation in engineering construction", Construction Technology (Chinese and English), vol. 51, no. 8, pages 127-130 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118430054A (en) * 2024-07-05 2024-08-02 深圳市宏辉智通科技有限公司 Human face recognition method and system based on AI intelligence

Also Published As

Publication number Publication date
CN115631525B (en) 2023-06-23

Similar Documents

Publication Publication Date Title
CN113837030B (en) Personnel intelligent management and control method and system for epidemic situation prevention and control and computer equipment
CN103902970A (en) Automatic fingerprint gesture estimation method and system
CN112132041A (en) Community patrol analysis method and system based on computer vision
CN110852714A (en) Salary improvement data management system applied to decoration service platform
CN113408683A (en) Construction site safety supervision method and system
CN113807240A (en) Intelligent transformer substation personnel dressing monitoring method based on uncooperative face recognition
CN109118081B (en) Operation safety supervision system and method based on image processing mode
CN110189111A (en) Work attendance method and device
CN110751675A (en) Urban pet activity track monitoring method based on image recognition and related equipment
CN115082861A (en) Personnel identity and safety violation identification method and system
CN105868693A (en) Identity authentication method and system
CN111460985A (en) On-site worker track statistical method and system based on cross-camera human body matching
CN110543866A (en) Safety management system and method for capital construction engineering constructors
CN115631525A (en) Insurance instant matching method based on face edge point recognition
CN112966638A (en) Transformer station operator identification and positioning method based on multiple characteristics
CN111064935A (en) Intelligent construction site personnel posture detection method and system
CN114022944A (en) Intelligent monitoring system
KR20230017454A (en) Method, Device and Computer Program For Preventing Cheating In Non-face-to-face Evaluation
CN112489241A (en) Electronic patrol system capable of supervising patrol personnel
CN117171694A (en) Distribution scene safety identification system based on AI technology
CN116978152A (en) Noninductive safety monitoring method and system based on radio frequency identification technology
CN113746888B (en) Intelligent assistant decision-making method for work ticket monitoring link
CN112990881A (en) Related party attendance system and method
CN114783097A (en) Hospital epidemic prevention management system and method
CN118470814B (en) Real-time multi-mode attendance collection platform and method supporting elastic assessment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant