CN111159159B - Public traffic passing method, device, equipment and system based on history passing record - Google Patents

Public traffic passing method, device, equipment and system based on history passing record

Info

Publication number
CN111159159B
CN111159159B
Authority
CN
China
Prior art keywords
similarity
image
record information
current
registered
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911410193.7A
Other languages
Chinese (zh)
Other versions
CN111159159A (en)
Inventor
高原
林峰
蒋鹏
康燕斌
张志齐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Yitu Technology Co ltd
Original Assignee
Shanghai Yitu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Yitu Technology Co ltd filed Critical Shanghai Yitu Technology Co ltd
Priority to CN201911410193.7A priority Critical patent/CN111159159B/en
Publication of CN111159159A publication Critical patent/CN111159159A/en
Application granted granted Critical
Publication of CN111159159B publication Critical patent/CN111159159B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/21Design, administration or maintenance of databases
    • G06F16/219Managing data history or versioning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2458Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2462Approximate or statistical queries
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Tourism & Hospitality (AREA)
  • Probability & Statistics with Applications (AREA)
  • General Health & Medical Sciences (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Fuzzy Systems (AREA)
  • Evolutionary Biology (AREA)
  • Software Systems (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Evolutionary Computation (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Collating Specific Patterns (AREA)
  • Time Recorders, Drive Recorders, Access Control (AREA)

Abstract

The invention provides a public transportation passing method, device, equipment and system based on historical passing records. The public transportation passing method based on historical passing records comprises the following steps: step S1, acquiring a current image of a person to be passed and acquiring current pass record information at the same time; step S2, comparing the similarity between the current image and a registered image, wherein the registered image is an image of a person registered in the public transportation system; and step S3, determining whether to pass based on the similarity, the current pass record information and the historical pass record information. Because whether a passenger passes is determined based on the similarity, the current pass record information and the historical pass record information, i.e., the current and historical pass records are combined with the similarity judgment, the method can improve the travel experience of passengers who frequently take public transportation and have stable travel patterns.

Description

Public traffic passing method, device, equipment and system based on history passing record
Technical Field
The invention relates to the field of public transportation, in particular to a public transportation passing method, device, equipment and system based on a history passing record.
Background
In existing payment systems (such as face-scan payment, fingerprint payment and the like), most systems still use only the current image and the registered image as the sole basis for identification. If the current image quality is poor, or the appearance of the registered person has changed to some extent, recognition performance may suffer and the user's travel experience is degraded.
Disclosure of Invention
In view of the above, the present invention provides a public transportation passing method, device, equipment and system based on historical pass records, in order to improve the travel experience of users.
In order to solve the above technical problems, in one aspect, a public transportation passing method based on history passing records is provided, including:
step S1, acquiring a current image of a person to be passed and acquiring current pass record information at the same time;
s2, comparing the similarity of the current image and a registered image, wherein the registered image is an image of a person registered in a public transportation system;
and step S3, determining whether to pass or not based on the similarity, the current pass record information and the historical pass record information.
Further, comparing the similarity between the current image and the registered image specifically includes:
step S21, extracting the characteristics of the current image to obtain a current characteristic vector;
step S22, comparing the similarity of the present feature vector and the registration feature vector corresponding to the registration image.
Further, the current image and the registered image are face images, and the similarity is the face similarity.
Further, the present feature vector and the registration feature vector are respectively extracted through a face detection model and a face recognition model, and the similarity is determined by calculating the cosine distance or the euclidean distance of the present feature vector and the registration feature vector.
Further, an MTCNN model is used as the face detection model to perform face detection and key point positioning on the current image and the registered image respectively, and then a FaceNet model is used as the face recognition model to extract the current feature vector and the registered feature vector respectively.
Further, the historical pass record information is the historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition, and the current pass record information and the historical pass record information each comprise one or more of date, time, day of week, station and entry/exit information.
Further, in the step S3,
when the highest similarity value is above a preset threshold, the person to be passed is allowed to pass,
and when the similarity is below the preset threshold, whether to pass is determined based on the similarity, the current pass record information and the historical pass record information.
Further, whether to release is determined by a release confirmation model based on the similarity, the current pass record information and the historical pass record information, the release confirmation model being the following function:
P = POST([s1, …, sn], th, M, SET{H1}),
wherein [s1, …, sn] are the face similarities of the n registered images having the highest face similarity with the person to be passed,
th is the expected threshold score,
M is the current pass record information,
SET{H1} is the set of historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition.
Further, SET{H1} is a set of historical pass record information within a predetermined time range from the current pass.
Further, the release confirmation model is trained with face images from historical passes in which registered persons were correctly released as positive samples and face images of unregistered users as negative samples, and the model is evaluated during training by a loss function.
Further, the training method comprises the following steps:
converting the positive and negative samples into sample feature vectors;
converting the sample feature vectors into a two-dimensional vector whose components represent the raw scores for release and non-release respectively;
converting the raw scores into a normalized final score through the softmax function.
Further, the loss function is a cross entropy function as shown in the following formula:
L=-[y*log(P)+(1-y)*log(1-P)]
wherein y is the ground-truth label of the positive or negative sample (1 for release, 0 for non-release), and P is the predicted value obtained after the positive or negative sample is input into the release confirmation model.
In a second aspect, the present invention provides a public transportation passing device based on a history of passing, comprising:
the acquisition module is used for acquiring the current image of the personnel to be passed and acquiring the current pass record information at the same time;
a comparison module for comparing the similarity of the current image and the registered image, wherein the registered image is an image of a registered person in the public transportation system,
and the determining module is used for determining whether the passing is permitted or not based on the similarity, the current passing record information and the historical passing record information.
Further, the comparison module comprises:
the feature extraction module is used for extracting features of the current image to obtain a current feature vector;
and the similarity confirming module is used for comparing the similarity of the current feature vector and the registration feature vector corresponding to the registration image.
Further, the feature extraction module includes:
the face detection model is used for carrying out face detection and key point positioning on the current image and the registered image respectively;
and the face recognition model is used for respectively extracting the current feature vector and the registered feature vector.
Further, the determining module includes a release confirmation model for determining whether to release, the release confirmation model being a function of:
P = POST([s1, …, sn], th, M, SET{H1}),
wherein [s1, …, sn] are the face similarities of the n registered images having the highest face similarity with the person to be passed,
th is the expected threshold score,
M is the current pass record information,
SET{H1} is the set of historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition.
In a third aspect, the present invention provides public transportation equipment based on historical pass records, comprising the public transportation passing device based on historical pass records according to any one of the above.
In a fourth aspect, the present invention provides a public transportation passing system based on history passing records, comprising:
the registration device is used for registering information and inputting registered images by a user;
the gate is used for acquiring the current image and the current traffic record information of the personnel to be passed;
the server is connected to the registration device and the gate respectively via a network,
the server includes:
one or more processors;
one or more memories having computer readable code stored therein, which when executed by the one or more processors, causes the processors to perform the steps of:
step S1, acquiring a current image of a person to be passed and acquiring current pass record information at the same time;
s2, comparing the similarity of the current image and a registered image, wherein the registered image is an image of a person registered in a public transportation system;
and step S3, determining whether to pass or not based on the similarity, the current pass record information and the historical pass record information.
Further, comparing the similarity between the current image and the registered image specifically includes:
step S21, extracting the characteristics of the current image to obtain a current characteristic vector;
step S22, comparing the similarity of the present feature vector and the registration feature vector corresponding to the registration image.
Further, the current image and the registered image are face images, and the similarity is the face similarity.
Further, the present feature vector and the registration feature vector are respectively extracted through a face detection model and a face recognition model, and the similarity is determined by calculating the cosine distance or the euclidean distance of the present feature vector and the registration feature vector.
Further, an MTCNN model is used as the face detection model to perform face detection and key point positioning on the current image and the registered image respectively, and then a FaceNet model is used as the face recognition model to extract the current feature vector and the registered feature vector respectively.
Further, the historical pass record information is the historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition, and the current pass record information and the historical pass record information each comprise one or more of date, time, day of week, station and entry/exit information.
Further, in the step S3,
when the highest similarity value is above a preset threshold, the person to be passed is allowed to pass,
and when the similarity is below the preset threshold, whether to pass is determined based on the similarity, the current pass record information and the historical pass record information.
Further, whether to release is determined by a release confirmation model based on the similarity, the current pass record information and the historical pass record information, the release confirmation model being the following function:
P = POST([s1, …, sn], th, M, SET{H1}),
wherein [s1, …, sn] are the face similarities of the n registered images having the highest face similarity with the person to be passed,
th is the expected threshold score,
M is the current pass record information,
SET{H1} is the set of historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition.
Further, SET{H1} is a set of historical pass record information within a predetermined time range from the current pass.
Further, the release confirmation model is trained with face images from historical passes in which registered persons were correctly released as positive samples and face images of unregistered users as negative samples, and the model is evaluated during training by a loss function.
Further, the training method comprises the following steps:
converting the positive and negative samples into sample feature vectors;
converting the sample feature vectors into a two-dimensional vector whose components represent the raw scores for release and non-release respectively;
converting the raw scores into a normalized final score through the softmax function.
Further, the loss function is a cross entropy function as shown in the following formula:
L=-[y*log(P)+(1-y)*log(1-P)]
wherein y is the ground-truth label of the positive or negative sample (1 for release, 0 for non-release), and P is the predicted value obtained after the positive or negative sample is input into the release confirmation model.
The technical scheme of the invention has at least one of the following beneficial effects:
according to the public transportation passing method based on the history passing record, on the basis of similarity judgment, the current passing record and the history passing record information are combined to judge, so that the rapid passing of passengers who often take public transportation means and have a stable passing rule can be improved, the passing experience is improved, for example, the similarity is insufficient due to factors such as poor image quality (influence of ambient light, photographing angle and the like) or certain change of appearance, the passengers can still pass rapidly, and the efficiency is improved;
further, by combining the current pass record and the historical pass record information in the judgment, the similarity threshold can be lowered, which reduces the time spent comparing the similarity between the current image and the registered images, so that passengers who frequently travel by public transportation and have stable travel patterns can pass faster.
Drawings
FIG. 1 is a flow chart of a historic traffic record-based public transportation method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a historic traffic record-based public transportation passing device according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a historic traffic record-based public transportation system according to an embodiment of the present invention.
Detailed Description
The following describes in further detail the embodiments of the present invention with reference to the drawings and examples. The following examples are illustrative of the invention and are not intended to limit the scope of the invention.
In the payment scenarios of urban public transportation systems (e.g., face-scan payment, fingerprint payment, etc.), the payment process generates information other than images, but the prior art does not mine and use such information. In particular, for passengers who often travel by public transportation, travel behavior is strongly regular; if this regularity can be learned and utilized, the recognition performance and the user experience of these passengers can be further improved.
The following mainly addresses how, while keeping the original detection and recognition models unchanged in the payment scenario of an urban public transportation system, to further improve the recognition performance and user experience of payment services (such as face-scan payment, fingerprint payment and the like) by analyzing and utilizing the user's current pass record information and historical pass record information.
First, a public transportation passing method based on a history passing record according to an embodiment of the present invention is described with reference to fig. 1.
As shown in fig. 1, a public transportation passing method based on a history passing record according to an embodiment of the present invention includes:
step S1, acquiring a current image of a person to be passed and acquiring current pass record information.
The image may include one or more of the following: face images, fingerprint images, iris images and other images capable of identifying a person's identity.
The current pass record information may include one or more of date, time, day of week, station and entry/exit information.
That is, the present image of the person to be passed is acquired through the device (for example, a camera, etc.), and the present pass record information is acquired at the same time.
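For illustration only, the current pass record could be represented by a small data structure along the lines of the following Python sketch; the field names and types are assumptions made for this example and are not prescribed by the method.

from dataclasses import dataclass
from datetime import date, time

@dataclass
class PassRecord:
    """One pass record captured at the gate (illustrative field names)."""
    record_date: date   # calendar date of the pass
    record_time: time   # time of day of the pass
    weekday: int        # day of week, 0 = Monday ... 6 = Sunday
    station: str        # station identifier
    direction: str      # "entry" or "exit"

# Example: a record captured when a passenger enters a station
current_record = PassRecord(date(2019, 12, 31), time(8, 12), 1, "Station-07", "entry")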
And S2, comparing the similarity of the current image and a registered image, wherein the registered image is an image of a registered person in the public transportation system.
The registered image corresponds to the current image: for example, when the current image is a face image, the registered image is also a face image; when the current image is a fingerprint image, the registered image is also a fingerprint image.
That is, all persons who use payment services (e.g., face-scan payment, fingerprint payment, etc.) need to register on a designated registration device. A person registers in the public transportation system and a registered image is generated. When traveling by public transportation, the similarity between the current image (the image of the person to be passed) and the registered images (the registered images of all registered persons) is compared.
According to some embodiments of the present invention, comparing the similarity between the present image and the registered image specifically includes:
and S21, extracting the characteristics of the current image to obtain a current characteristic vector.
That is, the current feature vector is obtained by extracting features from the current image. The specific feature extraction method depends on the type of image: for example, when comparing the similarity of face images, features are extracted through a face detection model and a face recognition model; when comparing the similarity of irises, they are extracted through an iris detection model and an iris recognition model; and so on.
Step S22, comparing the similarity of the present feature vector and the registration feature vector corresponding to the registration image.
The registration feature vector corresponding to the registered image may be extracted after registration and stored in a registration feature vector database.
The following takes face images as an example and describes comparing the face similarity between the current face image and the registered face image; other images such as fingerprint images and iris images can be handled by analogy.
The comparing the similarity of the face image and the registered face image specifically comprises:
and S21', extracting features of the face image to obtain a face feature vector.
The specific method for feature extraction is not particularly limited as long as the facial features can be extracted. Optionally, extracting features of the current image through a face recognition model and a face detection model to obtain a current feature vector.
The method for extracting the features of the present image should be the same as the method for extracting the features of the registered image. In other words, in the above case, the present face feature vector and the registered face feature vector are obtained by feature extraction by the face recognition model and the face detection model.
In addition, when a person registers, the registered feature vector of the corresponding registered image is extracted, and the extracted registered feature vector is stored in the registered image feature library as a comparison target for later recognition.
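For illustration only, such a registered-image feature library could be a simple in-memory store like the sketch below; the class name and methods are assumptions made for this example.

import numpy as np

class RegisteredFeatureLibrary:
    """In-memory store of registered feature vectors (illustrative only)."""

    def __init__(self):
        self._ids = []        # person identifiers
        self._vectors = []    # unit-normalized feature vectors

    def register(self, person_id: str, feature: np.ndarray) -> None:
        """Store the feature vector extracted from a person's registered image."""
        self._ids.append(person_id)
        self._vectors.append(feature / np.linalg.norm(feature))

    @property
    def vectors(self) -> np.ndarray:
        """All registered feature vectors as an (n, d) matrix."""
        return np.stack(self._vectors)

    def person_at(self, index: int) -> str:
        """Map a row index of the matrix back to a person identifier."""
        return self._ids[index]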
Further, the face detection model and the face recognition model may be written as a function f=rec (P).
Wherein P is an image for feature extraction, and f is a feature vector obtained by feature extraction.
For example, an MTCNN model is used as the face detection model to perform face detection and key point localization on the current face image and the registered face image respectively, and then a FaceNet model is used as the face recognition model to extract the current face feature vector and the registered face feature vector respectively.
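As a concrete illustration of this step, the sketch below uses the open-source facenet-pytorch package, which provides an MTCNN detector and a FaceNet-style InceptionResnetV1 embedder; this is only one possible implementation assumed for the example, not the specific models required by the invention.

import torch
from PIL import Image
from facenet_pytorch import MTCNN, InceptionResnetV1  # assumed to be installed

mtcnn = MTCNN(image_size=160)                               # face detection and key point alignment
embedder = InceptionResnetV1(pretrained='vggface2').eval()  # FaceNet-style feature extractor

def extract_feature(image_path: str) -> torch.Tensor:
    """f = rec(P): detect and align the face in image P, then return its 512-dimensional feature vector f."""
    img = Image.open(image_path).convert('RGB')
    face = mtcnn(img)                 # aligned face tensor, or None if no face was found
    if face is None:
        raise ValueError("no face detected")
    with torch.no_grad():
        return embedder(face.unsqueeze(0)).squeeze(0)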
Step S22', comparing the similarity between the current face feature vector and the registered face feature vector corresponding to the registered face image.
After the current face feature vector is obtained, it is compared for similarity with the registered face feature vector corresponding to each registered face image. Optionally, the face similarity is determined by calculating the cosine distance or Euclidean distance between the current face feature vector and the registered face feature vector.
For example, the face similarity is determined by calculating the cosine distance between the current face feature vector and the registered face feature vector:
similarity = cos(A, B) = (A · B) / (‖A‖ ‖B‖)
where A and B are the current face feature vector and the registered face feature vector respectively, and similarity is the face similarity.
The current face feature vector can be compared with the registered face feature vector of each face in the registered image feature library. For example, if there are n registered face feature vectors in the registered image feature library, the vector comparison is performed n times in total. During each comparison, the similarity between the current face image and the registered face image can be quantified by calculating the cosine distance of the two feature vectors.
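A minimal numpy sketch of this comparison step, computing the cosine similarity between the current feature vector and every registered feature vector and keeping the n best matches, is shown below; the function names are assumptions made for this example.

import numpy as np

def cosine_similarities(current: np.ndarray, registered: np.ndarray) -> np.ndarray:
    """similarity = (A . B) / (||A|| ||B||) for the current vector against each registered vector.

    current:    shape (d,)   -- feature vector of the person to be passed
    registered: shape (n, d) -- registered feature vector library
    returns:    shape (n,)   -- one cosine similarity per registered image
    """
    current = current / np.linalg.norm(current)
    registered = registered / np.linalg.norm(registered, axis=1, keepdims=True)
    return registered @ current

def top_n_similarities(sims: np.ndarray, n: int = 10) -> np.ndarray:
    """Keep the n highest similarity values, i.e. [s1, ..., sn]."""
    return np.sort(sims)[::-1][:n]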
And step S3, determining whether to pass or not based on the similarity, the current pass record information and the historical pass record information.
After the similarity is obtained, combining the current pass record information and the historical pass record information, and finally determining whether to pass or not.
Therefore, while keeping the original face detection and recognition models unchanged, the historical pass record information of the person is analyzed and utilized, further improving the recognition performance and user experience when using payment services (such as face-scan payment, fingerprint payment and the like).
In the following, a face image is taken as an example to describe determining whether to pass based on the similarity, the current pass record information and the historical pass record information; other images can be handled by analogy.
The historical pass record information may be all historical pass record information. Preferably, in order to reduce the amount of computation, the historical pass record information may be that of registered persons whose face similarity with the person to be passed is within a predetermined threshold range, where the current pass record information and the historical pass record information each include one or more of date, time, day of week, station and entry/exit information.
The similarity satisfying the predetermined condition may be understood as the face similarity with the person to be passed falling within a predetermined range (for example, a cosine similarity between 0.8 and 1), or as the ranking of the face similarity satisfying a predetermined condition (for example, the top 10 values, or the top 1% of cosine similarity values). That is, apart from the case where a cosine similarity of 1 is obtained directly, for example when the cosine similarity cannot be determined to be 1 because of the capture angle or other problems, the historical pass record information of registered persons whose face similarity with the person to be passed satisfies the predetermined condition (i.e., is within the predetermined threshold range) is retrieved, and whether to allow passage is then judged based on that historical pass record information and the current pass record information.
According to some embodiments of the invention, determining whether to pass based on the similarity, the current pass record information, and the historical pass record information includes:
s31, when the highest similarity value of the current image and the registered image is more than a preset threshold value, the personnel to be passed are passed.
That is, it can be determined by the similarity comparison that the person to be passed is a registered person and the person to be passed is released, that is, a case in which the person to be passed is normally released with a single similarity and a fixed predetermined threshold value.
And S32, when the similarity is smaller than a preset threshold value, determining whether to pass or not based on the face similarity, the current pass record information and the historical pass record information.
That is, when the face similarity does not reach the condition for directly judging the person to be a registered person, whether to release is determined based on the face similarity, the current pass record information and the historical pass record information. In this way, outcomes that hurt the person's experience (for example, being prompted to scan the face again, or being refused release because of a poor-quality face snapshot or a certain change in the registered person's appearance) can be avoided.
Further, whether to release is determined by a release confirmation model, which is the following function:
P = POST([s1, …, sn], th, M, SET{H1}),
wherein [s1, …, sn] are the face similarities of the n registered images having the highest face similarity with the person to be passed; for example, n may be 10, giving [s1, …, s10];
th is the expected threshold score;
M is the current pass record information;
SET{H1} is the set of historical pass record information of registered persons satisfying the predetermined condition (for example, only the person with the highest face similarity, or the top several, may be selected).
Further, SET{H1} may be restricted to historical pass record information within a predetermined time range from the current pass, for example half a month or one month. This avoids the situation where a person who has just begun to form a new regular travel pattern must wait a long time before getting a better experience.
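As a sketch of how step S3 might combine the plain threshold check (S31) with the release confirmation model (S32), consider the following; post_model stands for a trained release confirmation model as described in the training section below, and all names and the 0.5 decision cut-off are assumptions made for this example.

from datetime import datetime, timedelta

def decide_passage(sims, th, current_record, history_records, post_model,
                   window_days=30, n=10):
    """Step S3 sketch: release directly on a confident match, otherwise consult POST.

    sims            -- similarities of the current image against every registered image
    th              -- preset similarity threshold
    current_record  -- M, the current pass record information
    history_records -- candidate registrants' historical pass records (with timestamps)
    post_model      -- callable implementing P = POST([s1, ..., sn], th, M, SET{H1})
    """
    top = sorted(sims, reverse=True)[:n]          # [s1, ..., sn]
    if top[0] >= th:                              # S31: confident match, release directly
        return True
    # S32: fall back to the release confirmation model, using recent history only (SET{H1})
    cutoff = datetime.now() - timedelta(days=window_days)
    recent_history = [h for h in history_records if h.timestamp >= cutoff]
    return post_model(top, th, current_record, recent_history) >= 0.5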
The training method of the release confirmation model can be as follows:
the release confirmation model takes face images of historic passers correctly released by registered personnel (historic final release personnel and on-site terrible images) as positive samples, takes face images of unregistered users as negative samples for training, and evaluates the training model through a loss function.
Further, positive samples are mainly cases of "low recognition similarity with the registrant due to blurring, partial occlusion, makeup and the like, but relatively stable historical pass record information", while negative samples are mainly cases of "high similarity with the registrant, but inconsistency with the registrant's historical pass record information".
The positive and negative samples may consist of the current pass record information and the historical pass record information, or of a registered face image, a current pass image, the current pass record information and the historical pass record information; all of these are understood to fall within the scope of the invention.
Preferably, the loss function is a cross entropy function of the formula:
L=-[y*log(P)+(1-y)*log(1-P)]
wherein y is the ground-truth label of the positive or negative sample (1 for release, 0 for non-release), and P is the predicted value obtained after the positive or negative sample is input into the release confirmation model.
Optionally, the release confirmation model mainly comprises an embedding layer, fully connected layers and a softmax layer, and the training method comprises the following steps:
1) The positive and negative samples are converted into sample feature vectors.
First, positive and negative samples are encoded by an embedding layer and converted into feature vectors of a fixed length.
2) The sample feature vector is converted into a two-dimensional vector whose components represent the raw scores for release and non-release respectively.
Then the feature vector is processed through several fully connected layers to obtain a two-dimensional vector whose components represent the raw scores for allowing and disallowing passage respectively.
3) The raw scores are converted into a normalized final score through the softmax function.
Finally, the raw score is converted to a normalized final score by the softmax layer.
Of course, the above is only one optional example; the softmax function of the model may be replaced with a sigmoid function, and the cross-entropy loss function may be replaced with a mean-squared-error function. That is, training the release confirmation model with any model structure and loss function is understood to fall within the scope of the present invention.
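A minimal PyTorch sketch of the optional structure and training step described above (embedding layer, fully connected layers, softmax output and cross-entropy loss) follows; the input encoding, layer sizes and label convention are assumptions made only for this example.

import torch
import torch.nn as nn

class ReleaseConfirmationModel(nn.Module):
    def __init__(self, num_tokens=1000, embed_dim=64, hidden_dim=128, seq_len=16):
        super().__init__()
        self.embedding = nn.Embedding(num_tokens, embed_dim)   # encodes discretized record fields
        self.fc = nn.Sequential(
            nn.Linear(seq_len * embed_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, 2),                           # raw scores: [non-release, release]
        )

    def forward(self, x):               # x: (batch, seq_len) integer-coded sample
        e = self.embedding(x).flatten(1)
        return self.fc(e)               # softmax is applied inside the loss / at inference

model = ReleaseConfirmationModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()         # equals L = -[y*log(P) + (1-y)*log(1-P)] for two classes

def train_step(batch_x, batch_y):       # batch_y: 1 = released, 0 = not released (LongTensor)
    optimizer.zero_grad()
    loss = loss_fn(model(batch_x), batch_y)
    loss.backward()
    optimizer.step()
    return loss.item()

# At inference time, P = torch.softmax(model(x), dim=1)[:, 1] is the normalized release score.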
According to some embodiments of the invention, determining whether to pass based on the similarity, the current pass record information, and the historical pass record information includes:
s31', when the similarity between the current image and the registered image reaches a preset threshold value range, the history passing record information is called.
S32', determining travel similarity based on the history traffic record information and the current traffic record information.
And S33', determining whether to pass or not based on the travel similarity and the similarity (the similarity between the current image and the registered image).
Therefore, even when face recognition alone does not succeed, passage can be granted based on the historical pass record information and the current pass record information, which improves the user experience.
Next, a public transportation passing apparatus 1000 based on a history of passing records according to an embodiment of the present invention is described with reference to fig. 2, including:
an obtaining module 1001, configured to obtain a current image of a person to be passed, and obtain current pass record information at the same time;
a comparison module 1002, configured to compare the similarity between the current image and a registered image, where the registered image is an image of a registrant in the public transportation system,
a determining module 1003, configured to determine whether to pass based on the similarity, the current pass record information, and the historical pass record information.
Further, the comparison module 1002 includes:
the feature extraction module is used for extracting features of the current image to obtain a current feature vector;
and the similarity confirming module is used for comparing the similarity of the current feature vector and the registration feature vector corresponding to the registration image.
Further, the feature extraction module includes:
the face detection model is used for carrying out face detection and key point positioning on the current image and the registered image respectively;
and the face recognition model is used for respectively extracting the current feature vector and the registered feature vector.
Further, the determining module 1003 includes a release confirmation model for determining whether to release, where the release confirmation model is a function of:
P = POST([s1, …, sn], th, M, SET{H1}),
wherein [s1, …, sn] are the face similarities of the n registered images having the highest face similarity with the person to be passed,
th is the expected threshold score,
M is the current pass record information,
SET{H1} is the set of historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition.
In addition, the embodiment of the invention provides public transportation equipment based on the history traffic record, which comprises the public transportation passing device based on the history traffic record.
In addition, the embodiment of the invention also provides a public traffic system based on the history traffic record, as shown in fig. 3, which comprises:
the registration device is used for registering information and inputting registered images by a user;
the gate is used for acquiring the current image and the current traffic record information of the personnel to be passed;
the server is connected to the registration device and the gate respectively via a network,
the server includes:
one or more processors;
one or more memories having computer readable code stored therein, which when executed by the one or more processors, causes the processors to perform the steps of:
step S1, acquiring a current image of a person to be passed and acquiring current pass record information at the same time;
s2, comparing the similarity of the current image and a registered image, wherein the registered image is an image of a person registered in a public transportation system;
and step S3, determining whether to pass or not based on the similarity, the current pass record information and the historical pass record information.
All the steps involved in the public transportation method can be executed by the processor, and will not be described in detail herein.
According to some embodiments of the present invention, as shown in fig. 3, the public transportation passing system flow based on the history passing record is:
1) The registration device performs the following procedure: the user registers; the user information and a registered image are obtained; features of the registered image are extracted; and the registered image features are entered into the registered image library.
2) When the user enters or exits through the gate, the following procedure is carried out: the current image and the current pass record information of the person to be passed are acquired.
3) The processor performs the following procedure: the features of the current image are extracted; the similarity between the current image and the registered images is calculated; whether to release is determined by the confirmation model based on the similarity, the current pass record information and the historical pass record information; and the judgment result is sent to the gate to control the opening and/or closing of the gate.
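Putting the three stages together, a highly simplified server-side sketch of this flow might look as follows; extract_feature, cosine_similarities, decide_passage and the registry/history objects refer to the illustrative helpers sketched earlier in this description and are not part of the claimed system.

def handle_gate_event(image_path, current_record, registry, history_db, post_model, th=0.8):
    """Server-side flow: extract features, compare with the registry, decide, return the result to the gate."""
    feature = extract_feature(image_path)                          # extract features of the current image
    sims = cosine_similarities(feature.numpy(), registry.vectors)  # similarity against the registered library
    candidates = registry.top_matches(sims, n=10)                  # registrants behind [s1, ..., sn] (assumed helper)
    history = history_db.records_for(candidates)                   # their recent pass records, i.e. SET{H1}
    allowed = decide_passage(sims, th, current_record, history, post_model)
    if allowed:
        history_db.append(candidates[0], current_record)           # keep the history up to date
    return allowed                                                 # gate opens or stays closed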
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that various modifications and adaptations can be made without departing from the principles of the present invention, and such modifications and adaptations are intended to be comprehended within the scope of the present invention.

Claims (22)

1. A public transportation passing method based on a history passing record, comprising:
step S1, acquiring a current image of a person to be passed and acquiring current pass record information at the same time;
s2, comparing the similarity of the current image and a registered image, wherein the registered image is an image of a person registered in a public transportation system;
step S3, determining whether to pass or not based on the similarity, the current pass record information and the historical pass record information;
the comparing the similarity between the current image and the registered image specifically comprises:
step S21, extracting the characteristics of the current image to obtain a current characteristic vector;
step S22, comparing the similarity of the present feature vector and the registration feature vector corresponding to the registration image;
the current image and the registered image are face images, and the similarity is the face similarity;
the historical pass record information is the historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition, and the current pass record information and the historical pass record information each comprise one or more of date, time, day of week, station and entry/exit information.
2. The history traffic method according to claim 1, wherein the present feature vector and the registered feature vector are extracted by a face detection model and a face recognition model, respectively, and the similarity is determined by calculating a cosine distance or a euclidean distance of the present feature vector and the registered feature vector.
3. The history traffic log-based public transportation method according to claim 2, wherein the face detection and the key point localization are performed on the present image and the registered image, respectively, using an MTCNN model as the face detection model, and thereafter using a FaceNet model as the face recognition model to extract the present feature vector and the registered feature vector, respectively.
4. The history-based public transportation passing method according to claim 1, wherein in the step S3,
when the highest similarity value is above a preset threshold, the person to be passed is allowed to pass,
and when the similarity is below the preset threshold, whether to pass is determined based on the similarity, the current pass record information and the historical pass record information.
5. The history-based public transportation passing method according to claim 4, wherein whether to pass is determined by a pass confirmation model based on the similarity, the current passing record information, and the history passing record information, the pass confirmation model being a function of:
P = POST([s1, …, sn], th, M, SET{H1}),
wherein [s1, …, sn] are the face similarities of the n registered images having the highest face similarity with the person to be passed,
th is the expected threshold score,
M is the current pass record information,
SET{H1} is the set of historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition.
6. The history-based public transportation passing method according to claim 5, wherein SET{H1} is a set of historical pass record information within a predetermined time range from the current pass.
7. The public transportation passing method based on the history passing record according to claim 5, wherein the pass confirmation model is trained with face images from historical passes in which registered persons were correctly released as positive samples and face images of unregistered users as negative samples, and the model is evaluated during training by a loss function.
8. The history traffic log-based public transportation passing method according to claim 7, wherein the training method is:
converting the positive and negative samples into sample feature vectors;
converting the sample feature vectors into a two-dimensional vector whose components represent the raw scores for release and non-release respectively;
converting the raw scores into a normalized final score through the softmax function.
9. The history-based public transportation passing method according to claim 7, wherein the loss function is a cross entropy function represented by the following formula:
L=-[y*log(P)+(1-y)*log(1-P)]
wherein y is the ground-truth label of the positive or negative sample (1 for release, 0 for non-release), and P is the predicted value obtained after the positive or negative sample is input into the release confirmation model.
10. A public transportation passing device based on a history passing record, comprising:
the acquisition module is used for acquiring the current image of the personnel to be passed and acquiring the current pass record information at the same time;
a comparison module for comparing the similarity of the current image and the registered image, wherein the registered image is an image of a registered person in the public transportation system,
the determining module is used for determining whether the passing is allowed or not based on the similarity, the current passing record information and the historical passing record information;
the comparison module comprises:
the feature extraction module is used for extracting features of the current image to obtain a current feature vector;
the similarity confirming module is used for comparing the similarity of the current feature vector and the registration feature vector corresponding to the registration image;
the current image and the registered image are face images, and the similarity is the face similarity;
the historical pass record information is the historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition, and the current pass record information and the historical pass record information each comprise one or more of date, time, day of week, station and entry/exit information.
11. The history-based public transportation passing apparatus according to claim 10, wherein the feature extraction module comprises:
the face detection model is used for carrying out face detection and key point positioning on the current image and the registered image respectively;
and the face recognition model is used for respectively extracting the current feature vector and the registered feature vector.
12. The historic traffic-based public transportation device of claim 11, wherein the determination module comprises a release confirmation model for determining whether to release, the release confirmation model being a function of:
P = POST([s1, …, sn], th, M, SET{H1}),
wherein [s1, …, sn] are the face similarities of the n registered images having the highest face similarity with the person to be passed,
th is the expected threshold score,
M is the current pass record information,
SET{H1} is the set of historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition.
13. A history-based public transportation apparatus comprising a history-based public transportation passing device according to any one of claims 10 to 12.
14. A transit system based on a history of transit records, comprising:
the registration device is used for registering information and inputting registered images by a user;
the gate is used for acquiring the current image and the current traffic record information of the personnel to be passed;
the server is connected to the registration device and the gate respectively via a network,
the server includes:
one or more processors;
one or more memories having computer readable code stored therein, which when executed by the one or more processors, causes the processors to perform the steps of:
step S1, acquiring a current image of a person to be passed and acquiring current pass record information at the same time;
s2, comparing the similarity of the current image and a registered image, wherein the registered image is an image of a person registered in a public transportation system;
step S3, determining whether to pass or not based on the similarity, the current pass record information and the historical pass record information;
the comparing the similarity between the current image and the registered image specifically comprises:
step S21, extracting the characteristics of the current image to obtain a current characteristic vector;
step S22, comparing the similarity of the present feature vector and the registration feature vector corresponding to the registration image;
the current image and the registered image are face images, and the similarity is the face similarity;
the historical pass record information is the historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition, and the current pass record information and the historical pass record information each comprise one or more of date, time, day of week, station and entry/exit information.
15. The history traffic system according to claim 14, wherein the present feature vector and the registered feature vector are extracted by a face detection model and a face recognition model, respectively, and the similarity is determined by calculating a cosine distance or a euclidean distance of the present feature vector and the registered feature vector.
16. The history-based public transportation passing system according to claim 15, wherein the face detection and key point localization are performed on the present image and the registered image, respectively, using an MTCNN model as the face detection model, and thereafter using a FaceNet model as the face recognition model to extract the present feature vector and the registered feature vector, respectively.
17. The history-based public transportation passing system according to claim 14, wherein in step S3,
when the highest similarity value is above a preset threshold, the person to be passed is allowed to pass,
and when the similarity is below the preset threshold, whether to pass is determined based on the similarity, the current pass record information and the historical pass record information.
18. The historic traffic record-based public transportation system according to claim 14, wherein whether to pass is determined by a pass confirmation model based on the similarity, the current traffic record information, and the historic traffic record information, the pass confirmation model being a function of:
P = POST([s1, …, sn], th, M, SET{H1}),
wherein [s1, …, sn] are the face similarities of the n registered images having the highest face similarity with the person to be passed,
th is the expected threshold score,
M is the current pass record information,
SET{H1} is the set of historical pass record information of registered persons whose face similarity with the person to be passed satisfies a predetermined condition.
19. The history-based public transportation system according to claim 18, wherein SET{H1} is a set of historical pass record information within a predetermined time range from the current pass.
20. The historic traffic record-based public transportation system according to claim 18, wherein the release confirmation model is trained with face images from historical passes in which registered persons were correctly released as positive samples and face images of unregistered users as negative samples, and the model is evaluated during training by a loss function.
21. The historic traffic record-based public transportation system of claim 20, wherein the training method is as follows:
converting the positive and negative samples into sample feature vectors;
converting the sample feature vectors into a two-dimensional vector whose components represent the raw scores for release and non-release respectively;
converting the raw scores into a normalized final score through the softmax function.
22. The historic traffic-based public transportation system of claim 20, wherein the loss function is a cross entropy function represented by the following formula:
L=-[y*log(P)+(1-y)*log(1-P)]
wherein y is the ground-truth label of the positive or negative sample (1 for release, 0 for non-release), and P is the predicted value obtained after the positive or negative sample is input into the release confirmation model.
CN201911410193.7A 2019-12-31 2019-12-31 Public traffic passing method, device, equipment and system based on history passing record Active CN111159159B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911410193.7A CN111159159B (en) 2019-12-31 2019-12-31 Public traffic passing method, device, equipment and system based on history passing record

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911410193.7A CN111159159B (en) 2019-12-31 2019-12-31 Public traffic passing method, device, equipment and system based on history passing record

Publications (2)

Publication Number Publication Date
CN111159159A CN111159159A (en) 2020-05-15
CN111159159B true CN111159159B (en) 2023-11-14

Family

ID=70559895

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911410193.7A Active CN111159159B (en) 2019-12-31 2019-12-31 Public traffic passing method, device, equipment and system based on history passing record

Country Status (1)

Country Link
CN (1) CN111159159B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104851140A (en) * 2014-12-12 2015-08-19 重庆凯泽科技有限公司 Face recognition-based attendance access control system
CN109636397A (en) * 2018-11-13 2019-04-16 平安科技(深圳)有限公司 Transit trip control method, device, computer equipment and storage medium
CN110111470A (en) * 2019-05-16 2019-08-09 郑州博雅讯科技有限公司 A kind of current data linkage control method and control device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019185384A (en) * 2018-04-10 2019-10-24 キヤノン株式会社 Image authentication device, image authentication method, computer program and storage medium

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104851140A (en) * 2014-12-12 2015-08-19 重庆凯泽科技有限公司 Face recognition-based attendance access control system
CN109636397A (en) * 2018-11-13 2019-04-16 平安科技(深圳)有限公司 Transit trip control method, device, computer equipment and storage medium
CN110111470A (en) * 2019-05-16 2019-08-09 郑州博雅讯科技有限公司 A kind of current data linkage control method and control device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
赵来元; 高鸿彬; 李媛. Research on face recognition algorithms based on principal component analysis (PCA). 电子世界 (Electronics World), 2017(02), full text. *

Also Published As

Publication number Publication date
CN111159159A (en) 2020-05-15

Similar Documents

Publication Publication Date Title
Sawhney et al. Real-time smart attendance system using face recognition techniques
CN111881726B (en) Living body detection method and device and storage medium
Kukharev et al. Visitor identification-elaborating real time face recognition system
US11074330B2 (en) Biometric recognition method
Schmid et al. Performance analysis of iris-based identification system at the matching score level
CN107833328B (en) Access control verification method and device based on face recognition and computing equipment
CN110874878B (en) Pedestrian analysis method, device, terminal and storage medium
CN111104852B (en) Face recognition technology based on heuristic Gaussian cloud transformation
KR102145132B1 (en) Surrogate Interview Prevention Method Using Deep Learning
WO2020093303A1 (en) Processing method and apparatus based on facial recognition, and device and readable storage medium
CN105740675B (en) A kind of method and system triggering empowerment management based on dynamic person recognition
CN111507320A (en) Detection method, device, equipment and storage medium for kitchen violation behaviors
Tistarelli et al. Biometrics in forensic science: challenges, lessons and new technologies
CN113033404A (en) Face attack event detection method, device, equipment and storage medium
CN105574519A (en) Method and system for opening intelligent door by identifying dynamic figure characteristics
CN109359689B (en) Data identification method and device
CN111582195B (en) Construction method of Chinese lip language monosyllabic recognition classifier
US11062008B2 (en) Biometric recognition method
CN111159159B (en) Public traffic passing method, device, equipment and system based on history passing record
CN112258707A (en) Intelligent access control system based on face recognition
Priyanka et al. Smart ticket system for metro train
CN113591619A (en) Face recognition verification device based on video and verification method thereof
CN111145413A (en) Intelligent access control system and face recognition method thereof
CN114613058B (en) Access control system with attendance function, attendance method and related device
Lateef et al. Face Recognition-Based Automatic Attendance System in a Smart Classroom

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant