CN113077574A - Passing method, system, equipment and medium based on face recognition - Google Patents
- Publication number
- CN113077574A (application number CN202110386864.1A)
- Authority
- CN
- China
- Prior art keywords
- face
- target object
- account information
- library
- face data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/30—Individual registration on entry or exit not involving the use of a pass
- G07C9/32—Individual registration on entry or exit not involving the use of a pass in combination with an identity check
- G07C9/37—Individual registration on entry or exit not involving the use of a pass in combination with an identity check using biometric data, e.g. fingerprints, iris scans or voice recognition
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07B—TICKET-ISSUING APPARATUS; FARE-REGISTERING APPARATUS; FRANKING APPARATUS
- G07B11/00—Apparatus for validating or cancelling issued tickets
- G07B11/02—Apparatus for validating or cancelling issued tickets for validating inserted tickets
Abstract
The invention provides a passing method, system, device and medium based on face recognition. A face image of a target object is acquired through a face recognition terminal; living-body detection is performed on the acquired face image, and the image that passes living-body detection is compared with all face data in a face base library. According to the comparison result, the face data whose similarity with the face image exceeds a preset threshold, together with the account information corresponding to that face data, is retrieved from the face base library; the account information includes at least pass account information and/or payment account information. Passage information is then generated from the retrieved face data and corresponding account information, and whether the target object may pass is confirmed according to the passage information. By exploiting the uniqueness of face information, the invention improves the efficiency with which passengers enter and leave stations, reduces crowding when passengers enter or leave stations or board and alight from public transport, and improves the quality of transit operation service.
Description
Technical Field
The invention relates to the technical field of payment, and in particular to a passing method, system, device and medium based on face recognition.
Background
Urban public transport (such as rail transit and city buses) is a pillar of urban traffic, offering high speed, large carrying capacity, energy savings and efficient land use. It plays a vital role in the future development of cities, and Chinese cities are currently developing public transport vigorously. However, today passengers usually enter and exit stations, or board and alight, by swiping a transit card, scanning a transit-app QR code, or tapping a financial IC card (UnionPay QuickPass), which requires carrying a pre-charged transit card or enabling mobile data to display the transit-app QR code. These existing modes have the following problems:
(1) when a transit card is in arrears, it can be used again only after being recharged;
(2) the transit card must be taken out, or the phone taken out and the transit-app QR code opened, every time the passenger passes an entry or exit gate or boards a bus, giving a poor user experience;
(3) passengers carrying large luggage find it inconvenient to take out a transit card or operate a phone, so entering and leaving stations is inefficient and crowding easily results;
(4) travelers from other cities must purchase temporary tickets;
(5) transit stations must be equipped with self-service ticket machines so that passengers can buy temporary tickets; this equipment occupies precious space and requires operation and maintenance staff, increasing operating costs;
(6) when members of specific groups pass free of charge through the side door, they need staff assistance, and no record is stored automatically by the system;
(7) transit cards are generally not bound to identity cards, so real-name travel cannot be enforced;
(8) the travel track and records of suspicious persons cannot be traced.
Disclosure of Invention
In view of the above shortcomings in the prior art, an object of the present invention is to provide a passing method, system, device and medium based on face recognition that solve the above technical problems in the prior art.
In order to achieve the above and other related objects, the present invention provides a passing method based on face recognition, comprising the following steps:
acquiring a face image of a target object to be passed;
performing living-body detection on the acquired face image of the target object, and comparing the face image that passes living-body detection with all face data in a face base library;
according to the comparison result, acquiring from the face base library the face data whose similarity with the face image exceeds a preset threshold and the account information corresponding to that face data;
and generating passage information based on the acquired face data and the corresponding account information, and confirming whether the target object may pass according to the passage information.
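In outline, the four claimed steps form a match-then-decide pipeline. The sketch below is a minimal illustration, not the patented implementation: `FACE_LIB`, the 3-element feature vectors, cosine similarity and the 0.95 threshold are assumptions made for the example (the claims require only some similarity measure and a preset threshold).

```python
import math

# Hypothetical face base library: face_id -> (feature vector, bound account info).
FACE_LIB = {
    "f001": ([0.9, 0.1, 0.4], {"pass_account": "p001", "payment_account": "alipay:001"}),
    "f002": ([0.1, 0.8, 0.5], {"pass_account": "p002", "payment_account": "wechat:002"}),
}
THRESHOLD = 0.95  # preset similarity threshold (assumed value)

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def decide_passage(feature, is_live):
    """Steps in order: liveness gate, 1:N comparison, threshold check, pass decision."""
    if not is_live:  # reject printed photos / head models before any comparison
        return {"can_pass": False, "reason": "liveness failed"}
    matches = [(fid, cosine(feature, vec), acct) for fid, (vec, acct) in FACE_LIB.items()]
    fid, sim, acct = max(matches, key=lambda m: m[1])
    if sim < THRESHOLD:
        return {"can_pass": False, "reason": "no match above threshold"}
    # Passage information combines the matched face data with its bound accounts.
    return {"can_pass": True, "face_id": fid, "similarity": round(sim, 3), **acct}
```

A registered face passes only when it is live and its best match clears the threshold; an unknown face is refused even if it is live.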
Optionally, the account information includes at least pass account information and/or payment account information. Before the face image that passes living-body detection is compared with all face data in a face base library, the face base library is established as follows:
collecting personal registration information of the target object through a transit application and/or a transit interaction platform, the personal registration information including at least face data, mobile phone number information and mobile phone Bluetooth MAC address information;
creating corresponding pass account information based on the personal registration information of the target object;
and binding the created pass account information with the payment account information of the target object, and after binding is completed, storing the personal registration information, pass account information and payment account information of the target object into one or more databases to establish the face base library.
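The establishment steps above can be sketched as a minimal registration routine. All names here (`FACE_BASE_DB`, `register`, the record fields) are hypothetical; the in-memory dict stands in for the one or more databases, and the MAC-format check stands in for whatever validation a real transit platform would perform.

```python
import itertools
import re

_seq = itertools.count(1)
FACE_BASE_DB = {}  # stand-in for the database(s) backing the face base library

def register(face_data, phone, bt_mac, payment_account):
    """Create a pass account from registration info and bind a payment account.
    Field names are illustrative; the patent lists only the data items."""
    if not re.fullmatch(r"(?:[0-9A-F]{2}:){5}[0-9A-F]{2}", bt_mac):
        raise ValueError("malformed Bluetooth MAC address")
    pass_id = f"pass-{next(_seq):06d}"      # newly created pass account id
    FACE_BASE_DB[pass_id] = {
        "face_data": face_data,              # extracted face features
        "phone": phone,                      # kept for secondary verification
        "bt_mac": bt_mac,                    # kept for secondary verification
        "payment_account": payment_account,  # bound for fare deduction at exit
    }
    return pass_id
```

Binding here is simply storing the payment account in the same record as the pass account, so an exit-gate lookup by face yields both.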
Optionally, the process of acquiring the face data of the target object through the transit application and/or the transit interaction platform comprises:
acquiring a face image of the target object through the transit application and/or the transit interaction platform;
performing living-body detection on the acquired face image, and if the image fails living-body detection, acquiring a face image of the target object again;
scoring the face images that pass living-body detection with a quality-score model, and keeping the face images whose quality score exceeds a preset score;
and performing feature recognition on the face images whose quality score exceeds the preset score, taking the obtained face features of the target object as the face data of the target object.
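The capture-retry loop above can be sketched as follows. The helper names, the 0.8 score cut-off and the three-attempt limit are assumptions for illustration; the claim specifies only re-capture on liveness failure and a preset quality-score threshold.

```python
def enroll_face(capture, liveness_check, quality_score, extract_features,
                min_score=0.8, max_attempts=3):
    """Registration-side capture loop: re-capture until an image passes both
    liveness detection and the quality-score gate, then extract features."""
    for _ in range(max_attempts):
        image = capture()
        if not liveness_check(image):
            continue                       # not a live face: capture again
        if quality_score(image) <= min_score:
            continue                       # blurry / badly lit: capture again
        return extract_features(image)     # face features become the stored face data
    return None                            # give up after max_attempts
```

The callbacks would be supplied by the app's camera and model stack; only images that clear both gates ever reach feature extraction.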
Optionally, when a target object enters or leaves a station, if face data of a plurality of target objects is obtained from the face base library according to the comparison result for a given target object, the method further comprises:
performing secondary verification on that target object using mobile phone number information and/or mobile phone Bluetooth MAC address information;
and acquiring the final face data from the face base library based on the secondary verification result, together with the account information corresponding to that final face data; or determining the final face data among the face data of the plurality of target objects obtained previously based on the secondary verification result, and determining the account information corresponding to that final face data.
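The secondary verification can be sketched as progressive filtering of the ambiguous candidates. The function shape and field names are hypothetical; the patent states only that phone number and/or Bluetooth MAC information is used to disambiguate.

```python
def secondary_verify(candidates, seen_macs=None, seen_phone=None):
    """When 1:N comparison returns several faces above the threshold, narrow
    them down with the phone number and/or Bluetooth MAC collected at signup.
    `candidates` maps face_id -> {"phone": ..., "bt_mac": ...} (assumed shape)."""
    remaining = dict(candidates)
    if seen_macs is not None:   # e.g. MACs observed by a gate-side Bluetooth probe
        remaining = {fid: r for fid, r in remaining.items() if r["bt_mac"] in seen_macs}
    if seen_phone is not None:  # e.g. phone number confirmed interactively
        remaining = {fid: r for fid, r in remaining.items() if r["phone"] == seen_phone}
    if len(remaining) == 1:     # a unique survivor is the final face data
        return next(iter(remaining))
    return None                 # still ambiguous: fall back to other handling
```

Returning `None` models the "still ambiguous" case, which a real system would escalate rather than guess.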
Optionally, the face base library is deployed as a master library, station libraries and/or gate-side libraries, the station libraries and/or gate-side libraries being derived from the master library;
if the face data stored in a station library and/or gate-side library exceeds its preset limited space, the top K target objects with the largest number of trips within a preset time period, or with the highest trip frequency within the preset time period, are designated frequent target objects;
and the face data other than that of the frequent target objects is evicted from the station library and/or gate-side library using an LRU algorithm.
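The bounded station/gate-side library can be sketched as an LRU cache that refuses to evict the pinned frequent riders. This is an illustrative sketch only: the class name, the `OrderedDict` representation and the pinning rule are assumptions; the claim specifies only LRU eviction of non-frequent entries.

```python
from collections import OrderedDict

class GateFaceCache:
    """Station/gate-side face library with bounded space: the top-K frequent
    riders are pinned, everyone else is evicted LRU-first."""
    def __init__(self, capacity, frequent_ids):
        self.capacity = capacity
        self.pinned = set(frequent_ids)  # frequent target objects, never evicted
        self.entries = OrderedDict()     # face_id -> face data, in LRU order

    def put(self, face_id, face_data):
        if face_id in self.entries:
            self.entries.move_to_end(face_id)
        self.entries[face_id] = face_data
        while len(self.entries) > self.capacity:
            # evict the least recently used entry that is not pinned
            victim = next((fid for fid in self.entries if fid not in self.pinned), None)
            if victim is None:           # everything left is pinned
                break
            del self.entries[victim]

    def get(self, face_id):
        if face_id in self.entries:
            self.entries.move_to_end(face_id)  # refresh recency on a hit
            return self.entries[face_id]
        return None                            # miss: fall back to the master library
```

A miss at the gate-side cache would fall back to the station library or master library in the layered scheme described above.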
Optionally, when the target object is confirmed to be passable at a transit entrance, the ticket checker sends the current station, the current time, the current gate number, the face data of the target object and the corresponding pass account information to the ticketing and ticket-checking system to generate entry passage order information for the target object;
the entry passage order information of the target object is transmitted to the ticket checker for identification, and the current gate is controlled to change or maintain its current state according to the identification result;
and/or, when the target object is at a transit exit, the ticket checker sends the current station, the current time, the current gate number, the face data of the target object, the corresponding pass account information and the payment account information to the ticketing and ticket-checking system to generate exit passage order information for the target object;
and the exit passage order information of the target object is transmitted to the ticket checker for identification, and the current gate is controlled to change or maintain its current state according to the identification result.
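One way to picture the order information and the gate-control exchange is the sketch below. The `PassageOrder` class, field names and command strings are invented for illustration; the patent lists the data items sent to the ticketing and ticket-checking system but does not define a message format.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class PassageOrder:
    """Data the ticket checker sends to the AFC system (illustrative shape)."""
    station: str
    gate_no: str
    face_id: str
    pass_account: str
    payment_account: Optional[str] = None  # included on exit, for fare deduction
    direction: str = "entry"               # "entry" or "exit"
    timestamp: str = field(default_factory=lambda: datetime.now().isoformat())

def gate_command(order, recognized):
    """AFC's answer back to the gate: open for a valid order, stay closed otherwise."""
    if order.direction == "exit" and order.payment_account is None:
        return "KEEP_CLOSED"               # cannot deduct fare without a payment account
    return "OPEN" if recognized else "KEEP_CLOSED"
```

The asymmetry matches the claims: entry orders carry only the pass account, while exit orders also carry the payment account so the fare can be deducted before the gate opens.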
The invention also provides a passing system based on face recognition, comprising:
an acquisition module for acquiring a face image of a target object to be passed;
a comparison module for performing living-body detection on the acquired face image of the target object and comparing the face image that passes living-body detection with all face data in a face base library;
an account module for acquiring, from the face base library according to the comparison result, the face data whose similarity with the face image exceeds a preset threshold and the account information corresponding to that face data;
and a passing module for generating passage information from the acquired face data and the corresponding account information and confirming whether the target object may pass according to the passage information.
Optionally, the account information includes at least pass account information and/or payment account information. Before the comparison module compares the face image that passes living-body detection with all face data in a face base library, the face base library is established as follows:
collecting personal registration information of the target object through a transit application and/or a transit interaction platform, the personal registration information including at least face data, mobile phone number information and mobile phone Bluetooth MAC address information;
creating corresponding pass account information based on the personal registration information of the target object;
and binding the created pass account information with the payment account information of the target object; after binding is completed, the personal registration information, pass account information and payment account information of the target object are stored in one or more databases to establish the face base library.
Optionally, the process of acquiring the face data of the target object through the transit application and/or the transit interaction platform comprises:
acquiring a face image of the target object through the transit application and/or the transit interaction platform;
performing living-body detection on the acquired face image, and if the image fails living-body detection, acquiring a face image of the target object again;
scoring the face images that pass living-body detection with a quality-score model, and keeping the face images whose quality score exceeds a preset score;
and performing feature recognition on the face images whose quality score exceeds the preset score, taking the obtained face features of the target object as the face data of the target object.
Optionally, when a target object enters or leaves a station, if face data of a plurality of target objects is obtained from the face base library according to the comparison result for a given target object, the system further:
performs secondary verification on that target object using mobile phone number information and/or mobile phone Bluetooth MAC address information;
and acquires the final face data from the face base library based on the secondary verification result, together with the account information corresponding to that final face data; or determines the final face data among the face data of the plurality of target objects obtained previously based on the secondary verification result, and determines the account information corresponding to that final face data.
Optionally, the face base library comprises at least a master library, station libraries and gate-side libraries, the station libraries and/or gate-side libraries being derived from the master library;
if the face data stored in a station library and/or gate-side library exceeds its preset limited space, the top K target objects with the largest number of trips within a preset time period, or with the highest trip frequency within the preset time period, are designated frequent target objects;
and the face data other than that of the frequent target objects is evicted from the station library and/or gate-side library using an LRU algorithm.
The present invention also provides a computer device, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the device to perform any of the methods described above.
The invention also provides one or more machine-readable media having instructions stored thereon that, when executed by one or more processors, cause a device to perform any of the methods described above.
As described above, the present invention provides a passing method, system, device and medium based on face recognition, with the following beneficial effects. A face image of a target object to be passed is acquired through one or more face recognition terminals; living-body detection is performed on the acquired face image, the image that passes living-body detection is compared with all face data in a face base library, and, according to the comparison result, the face data whose similarity with the face image exceeds a preset threshold and the account information corresponding to that face data (at least pass account information and/or payment account information) are retrieved from the face base library; passage information is then generated from the retrieved face data and account information, and whether the target object may pass is determined accordingly. On the basis of the existing ticketing and ticket-checking AFC (automatic fare collection) system, the invention takes face recognition technology as its core to complete the association and binding of face information with a pass account and a payment account. When a passenger enters or leaves a station, a face recognition terminal PAD installed at the entrance and/or exit gates captures the passenger's face; the captured image is sent to a back-end face-swipe passing system for living-body detection, face feature extraction and face comparison, and the pass account and payment account corresponding to the passenger are found in the AFC system according to the comparison result.
Meanwhile, the AFC system evaluates the passenger's pass account and payment account information and generates the entry or exit passage order information together with a gate control instruction; the gate control instruction is sent to the automatic gate machine AGM at the gate end, which interprets it to open the gate at the entrance or exit, or keep it closed. When the passenger leaves the station, the AFC system deducts the fare from the payment account corresponding to the recognized face data, realizing face-swipe passage and payment for the passenger; when deducting the fare, the AFC system also sends the deduction result to the payment management system for payment statistics.
By exploiting the uniqueness of face information, the invention binds a passenger's face information to the passenger's pass account and payment account. When the passenger passes a gate, the face recognition terminal on the gate recognizes the passenger's face; after the recognition result is transmitted to the AFC system, the AFC system automatically determines the passenger's pass account and payment account, so no transit card or phone needs to be taken out. Media such as bus cards and QR codes are thereby replaced, the efficiency of entering and leaving stations is improved, crowding when entering and leaving stations or boarding and alighting is reduced, and the quality of transit operation service is improved. Public transport in the present invention includes but is not limited to rail transit and city buses. The invention can also collect the passenger's mobile phone number and mobile phone Bluetooth MAC address, enabling real-name travel and making it easy to trace the passenger's travel records.
Drawings
Fig. 1 is a schematic flow chart of a passing method based on face recognition according to an embodiment;
fig. 2 is a schematic flow chart of a passing method based on face recognition according to another embodiment;
FIG. 3 is a flow diagram illustrating a process of registering a target object with a traffic application and/or a traffic interaction platform according to an embodiment;
FIG. 4 is a schematic flow chart illustrating the process of entering and exiting a target object according to an embodiment;
fig. 5 is a schematic hardware structure diagram of a passing system based on face recognition according to an embodiment;
fig. 6 is a schematic hardware structure diagram of a terminal device according to an embodiment;
fig. 7 is a schematic diagram of a hardware structure of a terminal device according to another embodiment.
Description of the element reference numerals
M10 acquisition module
M20 alignment module
M30 account module
M40 passing module
1100 input device
1101 first processor
1102 output device
1103 first memory
1104 communication bus
1200 processing assembly
1201 second processor
1202 second memory
1203 communication assembly
1204 Power supply Assembly
1205 multimedia assembly
1206 Audio component
1207 input/output interface
1208 sensor assembly
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
When passengers take public transport, ticket purchase, ticket checking, entering and leaving stations, and boarding and alighting are all involved; throughout this process the transit operator's AFC system handles ticketing, ticket checking, charging, statistics, clearing and settlement, management and so on. Buying tickets with cash on site is the original AFC ticketing mode; with the continuing development of Internet and electronic payment technology, AFC systems have also introduced non-cash electronic payment, such as QR ride codes from commercial banks, WeChat and Alipay.
Whether a single-journey ticket, a transit card or a QR code is used to enter and leave stations, a corresponding virtual account and payment account sit behind it. Therefore, a contact-free medium that requires no active display, advance preparation, carrying or special operation by the passenger, yet can be bound to a pass account and a payment account just like a single-journey ticket, transit card or QR code, would solve the existing problems.
Biological characteristics are natural credentials for personal identity confirmation, and identification technology based on biometric recognition provides a highly reliable and stable means of identification. The human face is an intrinsic attribute of a person, with strong stability and individual distinctiveness, and face recognition is direct, friendly and convenient, so users have no psychological barrier to it and accept it easily. Among the various biometric technologies, face recognition is therefore one of the most promising, and it has been widely applied in fields such as information security, public security and finance.
Referring to fig. 1 to 4, the present invention provides a passing method based on face recognition, including the following steps:
s100, acquiring a face image of a target object to be passed through by one or more face recognition terminals; the one or more face recognition terminals can be arranged on gates at traffic entrance and/or exit, and can also be arranged at the getting-on position and/or the getting-off position of the bus. A face recognition terminal PAD in the method is provided with an infrared binocular camera module, a real person judgment algorithm and a high-performance main controller; the operation of the living body detection algorithm can be completed quickly; and various abundant interfaces are provided, and the communication can be carried out with gates of different models. The face recognition terminal PAD in the method can prompt the current passing state of the passenger in the PAD screen.
S200, performing living body detection on the acquired face image of the target object, and comparing the face image subjected to the living body detection with all face data in the face base. The method can eliminate the printed face image or the human head model by performing living body detection on the face image, and ensures that the acquired face image is the face image of a real person.
S300, acquiring face data with the similarity exceeding a preset threshold value with a face image and account information corresponding to the face data from a face bottom library according to a comparison result; the account information at least includes: transit account information and/or payment account information. The payment account information comprises payment treasure information, WeChat payment information, silver union flash payment information and the like.
S400, generating traffic information based on the acquired face data and the corresponding account information, and confirming whether the target object can pass according to the traffic information.
This method completes the association and binding of face information with a pass account and a payment account by taking face recognition technology as its core, on the basis of the existing ticketing and ticket-checking AFC system. When a passenger enters or leaves a station, a face recognition terminal PAD installed at the entrance and/or exit gates captures the passenger's face; the captured image is sent to the back-end face-swipe passing system for living-body detection, face feature extraction and face comparison, and the pass account and payment account corresponding to the passenger are found in the AFC (automatic fare collection) system according to the comparison result. Meanwhile, the AFC system evaluates the passenger's pass account and payment account information and generates the entry or exit passage order information together with a gate control instruction; the gate control instruction is sent to the automatic gate machine AGM at the gate end, which interprets it to open the gate at the entrance or exit, or keep it closed. When the passenger leaves the station, the AFC system deducts the fare from the payment account corresponding to the recognized face data, realizing face-swipe passage and payment for the passenger; when deducting the fare, the AFC system also sends the deduction result to the payment management system for payment statistics.
By exploiting the uniqueness of face information, this method binds a passenger's face information to the passenger's pass account and payment account. When the passenger passes a gate, the face recognition terminal on the gate recognizes the passenger's face; after the recognition result is transmitted to the AFC system, the AFC system automatically determines the passenger's pass account and payment account, so no transit card or phone needs to be taken out. Media such as bus cards and QR codes are thereby replaced, the efficiency of entering and leaving stations is improved, crowding when entering and leaving stations or boarding and alighting is reduced, and the quality of transit operation service is improved. The target object in this method may be an ordinary passenger or a specific passenger.
In an exemplary embodiment, a face base library is established before the face image that has passed living body detection is compared with all face data in the face base library. The establishment process is as follows: collect the personal registration information of the target object through a traffic application program and/or a traffic interaction platform, where the personal registration information includes at least face data, mobile phone number information and mobile phone Bluetooth MAC address information; create corresponding pass account information based on the personal registration information of the target object; bind the created pass account information with the payment account information of the target object; and, after binding is completed, store the personal registration information, pass account information and payment account information of the target object in one or more databases to establish the face base library. By collecting information such as the passenger's face data, mobile phone number, mobile phone Bluetooth MAC address and identity card information, this embodiment enables real-name riding and makes the passenger's riding records traceable. Traffic in this embodiment includes, but is not limited to, rail transit and urban buses; accordingly, the traffic application program includes, but is not limited to, a rail transit application and an urban bus application, and the traffic interaction platform includes, but is not limited to, a rail transit interaction platform and an urban bus interaction platform. By way of example, the traffic application program may be a traffic APP and the traffic interaction platform may be a traffic WeChat official account.
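The registration and binding flow above can be sketched as follows. This is a minimal in-memory model; the `Registration` fields, the account-ID format and the record layout are illustrative assumptions, not the patent's actual data structures.

```python
from dataclasses import dataclass


@dataclass
class Registration:
    # Personal registration information collected via the traffic APP /
    # interaction platform (field names are illustrative).
    face_data: bytes
    phone_number: str
    bluetooth_mac: str


class FaceBaseLibrary:
    """A minimal in-memory face base library keyed by pass account ID."""

    def __init__(self):
        self.records = {}
        self._next_id = 1

    def enroll(self, reg: Registration, payment_account: str) -> str:
        # Create pass account information, bind it to the payment account,
        # and store everything as one record in the base library.
        account_id = f"T{self._next_id:06d}"
        self._next_id += 1
        self.records[account_id] = {
            "registration": reg,
            "pass_account": account_id,
            "payment_account": payment_account,
        }
        return account_id
```

In this sketch the bound record is what the later comparison and deduction steps would look up by face match.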
In accordance with the above description, in an exemplary embodiment, as shown in fig. 3, the process of acquiring the face data of the target object through the traffic application program and/or the traffic interaction platform includes: acquiring a face image of the target object through the traffic application program and/or the traffic interaction platform; performing living body detection on the acquired face image, and re-acquiring the face image of the target object if the living body detection is not passed; scoring the face images that pass living body detection with a quality score model, and keeping the face images whose quality scores exceed a preset score value; and performing feature recognition on the face images whose quality scores exceed the preset score value, so as to obtain the face features of the target object as the face data of the target object. By way of example, the traffic application program in the embodiments of the present application may be a rail transit application, and the traffic interaction platform may be a rail transit interaction platform.
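The acquisition loop above (liveness check, quality gating, feature extraction) can be sketched as below. The four callables are hypothetical stand-ins for the capture device, the living body detection model, the quality score model and the feature extractor; they are not APIs from the patent.

```python
def acquire_face_data(capture, is_live, quality_score, extract_features,
                      min_quality=0.8, max_attempts=5):
    # Re-collect the image whenever living body detection fails; only images
    # whose quality score exceeds the preset value reach feature extraction.
    for _ in range(max_attempts):
        image = capture()
        if not is_live(image):
            continue  # living body detection failed: collect the image again
        if quality_score(image) > min_quality:
            return extract_features(image)  # features become the face data
    return None  # give up after too many attempts
```

A usage example with stub models: a spoofed frame is rejected by the liveness check, a blurry frame by the quality gate, and only the third frame yields face data.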
According to the above description, in an exemplary embodiment, as shown in figs. 2 to 4, when a target object enters or leaves the station, if face data of a plurality of target objects are obtained from the face base library according to the comparison result of a certain target object, the method further includes: performing secondary verification on the target object using mobile phone number information and/or mobile phone Bluetooth MAC address information; and acquiring final face data from the face base library based on the secondary verification result and acquiring the account information corresponding to the final face data, or determining the final face data from the face data of the plurality of target objects acquired last time based on the secondary verification result and determining the account information corresponding to the final face data. As an example, if the face data of passengers A, B and C are obtained from the face base library according to the comparison result of passenger A, the method further includes: performing secondary verification on passenger A using mobile phone number information and/or mobile phone Bluetooth MAC address information; for example, passenger A may enter the last four digits of the mobile phone number on the rail transit APP and/or the rail transit WeChat official account for secondary verification. The face data of passenger A is then acquired from the face base library based on the secondary verification result, together with the account information corresponding to the face data of passenger A.
As another example, if the face data of passengers X, Y and Z are obtained from the face base library according to the comparison result of passenger X, the method further includes: performing secondary verification on passenger X using mobile phone number information and/or mobile phone Bluetooth MAC address information; here, passenger X may enter the mobile phone Bluetooth MAC address information on the rail transit APP and/or the rail transit WeChat official account for secondary verification. The face data of passenger X is determined from the face data of X, Y and Z based on the secondary verification result, and the account information corresponding to the face data of passenger X is determined.
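The secondary verification step can be sketched as follows; the candidate mapping and field names are illustrative assumptions. Each candidate from the face base library carries its registered phone number and Bluetooth MAC, and only a unique survivor of the filters counts as a successful verification.

```python
def secondary_verify(candidates, entered_last4=None, device_mac=None):
    # `candidates` maps each matched face-data ID to the (phone_number,
    # bluetooth_mac) pair registered with it.
    survivors = []
    for face_id, (phone, mac) in candidates.items():
        if entered_last4 is not None and phone[-4:] != entered_last4:
            continue  # last four digits of the phone number do not match
        if device_mac is not None and mac.lower() != device_mac.lower():
            continue  # Bluetooth MAC address does not match
        survivors.append(face_id)
    # Only a unique survivor counts as successful secondary verification.
    return survivors[0] if len(survivors) == 1 else None
```

With no filter supplied, ambiguity remains and the function returns `None`, mirroring the case where the passenger must still be prompted for input.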
According to the above description, in an exemplary embodiment, as shown in figs. 2 to 4, the face base library includes at least a master library, a station library and a gate-side library. The process of comparing the face image that has passed living body detection with all face data in the face base library to obtain a comparison result includes: comparing the face image with all face data in the gate-side library to obtain a passed or failed result; and/or comparing the face image with all face data in the station library to obtain a passed or failed result; and/or comparing the face image with all face data in the master library to obtain a passed or failed result. The station library and/or the gate-side library are derived from the master library; for example, the corresponding face data are copied from the master library to the station library and/or the gate-side library according to actual conditions. As an example, the embodiment of the application adopts a multilevel face comparison architecture: the face image of a passenger D that has passed living body detection is first compared with all face data in the gate-side library; if the comparison with the gate-side library passes, the face data whose similarity to the face image of passenger D exceeds a preset threshold, and the account information corresponding to that face data, are acquired from the gate-side library according to the comparison result; if the comparison with the gate-side library fails, that is, the gate-side library contains no face data whose similarity to the face image of passenger D exceeds the preset threshold, the face image is compared with all face data in the station library.
If the comparison with the station library passes, the face data whose similarity to the face image of passenger D exceeds the preset threshold, and the account information corresponding to that face data, are acquired from the station library according to the comparison result. If the comparison with the station library fails, that is, the station library contains no face data whose similarity to the face image of passenger D exceeds the preset threshold, the face image is compared with all face data in the master library. If the comparison with the master library passes, the face data whose similarity to the face image of passenger D exceeds the preset threshold, and the corresponding account information, are acquired from the master library according to the comparison result. If the comparison with the master library also fails, that is, the master library contains no face data whose similarity to the face image of passenger D exceeds the preset threshold, a comparison-failed result is displayed on the face recognition terminal PAD, and passenger D is prompted to register on the rail transit APP and/or the rail transit WeChat official account. In the embodiment of the application, the face data of specific persons or specific passengers may also be stored in the station library and/or the gate-side library.
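The three-tier comparison cascade (gate-side library, then station library, then master library) can be sketched as below. The `similarity` function and the threshold are placeholders for the actual face comparison model, and the tier names are illustrative.

```python
def cascade_compare(probe, gate_library, station_library, master_library,
                    similarity, threshold=0.9):
    # Try the gate-side library first, then the station library, then the
    # master library; return the first face whose similarity to the probe
    # exceeds the preset threshold, or None so the passenger can be
    # prompted to register.
    tiers = (("gate", gate_library), ("station", station_library),
             ("master", master_library))
    for tier_name, library in tiers:
        best_id, best_sim = None, threshold
        for face_id, face_data in library.items():
            sim = similarity(probe, face_data)
            if sim > best_sim:  # keep the best match above the threshold
                best_id, best_sim = face_id, sim
        if best_id is not None:
            return tier_name, best_id
    return None  # comparison failed at all three tiers
```

The usage example below stands in face data with scalars and similarity with `1 - |a - b|`, purely to exercise the tier ordering.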
In an exemplary embodiment, if the face data stored in the station library and/or the gate-side library exceed a preset limited space, the first K target objects with the largest number of rides within a preset time period, or the first K target objects with the highest riding frequency within the preset time period, are determined as frequently-appearing target objects, and the face data outside the frequently-appearing target objects are eliminated from the station library and/or the gate-side library using an LRU algorithm. Specifically, when the data stored in a database exceed a certain scale, the false recognition rate of face recognition rises and the passing rate falls, because a face recognition algorithm cannot maintain its original high accuracy once the face base library reaches the million level. For a domestic city, however, the number of registered passengers can easily reach the million level. For this situation, the method acquires the hot-spot data of each rail transit station through the LRU (Least Recently Used) algorithm: an elimination mechanism is established that discards data according to their historical access records, replacing the data unused for the longest time with hot data, so that a frequent-traveler face library is built for each station. The core idea of LRU is that if a piece of data has been accessed recently, its probability of being accessed in the future is high; if a piece of data has not been accessed for a period of time, it is unlikely to be accessed in the future. That is, when the limited space is full, the data that has not been accessed for the longest time is eliminated.
In other words, the LRU algorithm eliminates the face data outside the first K target objects with the largest number of rides, or the highest riding frequency, within the preset time period. For a single station, the number of frequent passengers generally does not exceed 100,000, which is well within the high-accuracy range of the algorithm. In the embodiment of the application, a station library may be deployed on the station side for rapidly identifying passengers who frequently enter and leave the station, or a gate-side library may be deployed on the gate side for rapidly identifying passengers who frequently pass through the gate. Passengers who have never appeared at the station are identified against the master library, and when multiple persons meeting the similarity threshold are identified, the passenger may be asked to enter the last four digits of the mobile phone number for unique confirmation.
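The LRU-based frequent-traveler library can be sketched with Python's `OrderedDict`; the class and method names are illustrative, but the eviction behavior is exactly the policy the text describes.

```python
from collections import OrderedDict


class FrequentTravelerLibrary:
    # A station/gate-side library with limited space: each ride refreshes the
    # passenger's position, and when the space is full the face data unused
    # for the longest time is evicted (LRU policy).
    def __init__(self, capacity):
        self.capacity = capacity
        self._store = OrderedDict()

    def record_ride(self, face_id, face_data):
        if face_id in self._store:
            self._store.move_to_end(face_id)  # recently used: keep it hot
        else:
            self._store[face_id] = face_data
            if len(self._store) > self.capacity:
                self._store.popitem(last=False)  # evict least recently used

    def __contains__(self, face_id):
        return face_id in self._store
```

With capacity 2, riding as A, B, A, C leaves A and C in the library: B was the least recently used entry and is the one evicted.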
According to the above description, in an exemplary embodiment, as shown in figs. 2 to 4, when one or more passengers are located at a rail transit entrance, the ticket checking machine AGM sends the current station, the current time, the current gate number, the face data of the one or more passengers and the corresponding pass account information to the ticket selling and checking system AFC, which generates the station-entering passing order information of the one or more passengers; the station-entering passing order information is then transmitted to the ticket checking machine AGM for recognition, and the current gate is controlled to change or maintain its current state according to the recognition result. When one or more passengers are located at a rail transit exit, the ticket checking machine AGM sends the current station, the current time, the current gate number, the face data of the one or more passengers, the corresponding pass account information and the payment account information to the ticket selling and checking system AFC, which generates the station-leaving passing order information of the one or more passengers; the station-leaving passing order information is then transmitted to the ticket checking machine AGM for recognition, and the current gate is controlled to change or maintain its current state according to the recognition result. Generating the station-entering passing order information when a passenger enters, and the station-leaving passing order information when a passenger leaves, is realized on the basis of the conventional ticket selling and checking system AFC and ticket checking machine AGM, and is not described again here.
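The AFC-side generation of a station-entering passing order and gate control instruction can be sketched as below. The order fields and instruction names are illustrative assumptions, not the patent's actual message format.

```python
from datetime import datetime, timezone


def build_entry_order(station, gate_number, face_data_id, pass_account):
    # Assemble the station-entering passing order from what the ticket
    # checking machine AGM reports, and derive the gate control instruction.
    order = {
        "station": station,
        "time": datetime.now(timezone.utc).isoformat(timespec="seconds"),
        "gate": gate_number,
        "face_data_id": face_data_id,
        "pass_account": pass_account,
    }
    # The gate opens only when a pass account was resolved for the face;
    # otherwise it keeps its current (closed) state.
    instruction = "OPEN" if pass_account else "KEEP_CLOSED"
    return order, instruction
```

In the flow described above, the returned instruction is what the AGM would recognize to open the gate or keep it closed.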
According to the above description, in an exemplary embodiment, when a passenger K is located at a rail transit entrance, the face recognition terminal PAD arranged at the gate end of the entrance takes a face snapshot of passenger K and sends the snapshot to the background face-brushing passing system for living body detection, face feature extraction and face recognition. After face recognition, the recognition result is fed back to the face recognition terminal PAD; if an abnormal condition occurs during feedback, the service desk is notified to handle it manually. The face recognition terminal PAD sends a station-entering request to the ticket checking machine AGM, which sends the current station, the current time, the current gate number and the face data of passenger K to the ticket selling and checking system AFC; the AFC generates the station-entering passing order information and the gate control instruction of passenger K and transmits them to the ticket checking machine AGM, which recognizes the gate control instruction and opens the gate. After the gate opens, whether payment is required is judged; if not, no automatic fee deduction is performed.
The face recognition process is as follows: the face image of passenger K that has passed living body detection is first compared with all face data in the gate-side library; if the comparison with the gate-side library passes, the face data whose similarity to the face image of passenger K exceeds a preset threshold, and the account information corresponding to that face data, are acquired from the gate-side library according to the comparison result. If the comparison with the gate-side library fails, that is, the gate-side library contains no face data whose similarity to the face image of passenger K exceeds the preset threshold, the face image is compared with all face data in the station library. If the comparison with the station library passes, the face data whose similarity to the face image of passenger K exceeds the preset threshold, and the corresponding account information, are acquired from the station library according to the comparison result. If the comparison with the station library fails, that is, the station library contains no face data whose similarity to the face image of passenger K exceeds the preset threshold, the face image is compared with all face data in the master library. If the comparison with the master library passes, the face data whose similarity to the face image of passenger K exceeds the preset threshold, and the corresponding account information, are acquired from the master library according to the comparison result.
If the comparison with the master library also fails, that is, the master library contains no face data whose similarity to the face image of passenger K exceeds the preset threshold, a comparison-failed result is displayed on the face recognition terminal PAD, and passenger K is prompted to register on the rail transit APP and/or the rail transit WeChat official account.
According to the above description, in an exemplary embodiment, when passenger K is located at a rail transit exit, the face recognition terminal PAD arranged at the gate end of the exit takes a face snapshot of passenger K and sends the snapshot to the background face-brushing passing system for living body detection, face feature extraction and face recognition. After face recognition, the recognition result is fed back to the face recognition terminal PAD; if an abnormal condition occurs during feedback, the service desk is notified to handle it manually. The face recognition terminal PAD sends a station-leaving request to the ticket checking machine AGM, which sends the current station, the current time, the current gate number and the face data of passenger K to the ticket selling and checking system AFC; the AFC generates the station-leaving passing order information and the gate control instruction of passenger K and transmits them to the ticket checking machine AGM, which recognizes the gate control instruction and opens the gate. After the gate opens, whether payment is required is judged; if fee deduction is required, the ticket selling and checking system AFC deducts the fare from the payment account corresponding to the face data of passenger K, thereby realizing face-brushing passing and payment. Upon deduction, the ticket selling and checking system AFC also sends the deduction result to the payment management system for payment service statistics, and sends a fee deduction notice to passenger K.
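The station-leaving deduction step can be sketched as follows; the fare table, balance store and result fields are illustrative assumptions standing in for the AFC's actual fare and payment interfaces.

```python
def settle_exit(entry_station, exit_station, fare_table, balances, payment_account):
    # Look up the fare for the entry/exit station pair and deduct it from the
    # bound payment account; the result would be forwarded to the payment
    # management system for statistics.
    fare = fare_table[(entry_station, exit_station)]
    if balances.get(payment_account, 0) >= fare:
        balances[payment_account] -= fare
        return {"fare": fare, "status": "deducted"}
    # Payment abnormality: the service desk handles it manually.
    return {"fare": fare, "status": "abnormal"}
```

A successful settlement reduces the account balance by the fare; an insufficient balance produces the abnormal status that triggers manual handling in the text.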
Upon deduction, the ticket selling and checking system AFC also judges whether the fee was deducted successfully; in the event of a payment abnormality, the AFC notifies the service desk to handle the abnormal condition manually.
In summary, the invention provides a passing method based on face recognition, which takes face recognition technology as its core and, on the basis of the existing ticket selling and checking system AFC, completes the association and binding of face information with a traffic pass account and a payment account. When a passenger enters or leaves the station, a face recognition terminal PAD arranged at the gate end of the station entrance and/or exit captures the passenger's face and sends the captured face to a background face-brushing passing system for living body detection, face feature extraction and face comparison, and the pass account information and payment account information corresponding to the passenger are found from the ticket selling and checking system AFC according to the comparison result. The ticket selling and checking system AFC then checks the pass account information and payment account information corresponding to the passenger, and generates station-entering or station-leaving passing order information together with a gate control instruction; the generated gate control instruction is sent to the ticket checking machine AGM at the gate end, which recognizes the instruction and controls the gate at the station entrance or exit to open or remain closed. When the passenger leaves the station, the ticket selling and checking system AFC deducts the fare from the payment account corresponding to the recognized face data, thereby realizing face-brushing passing and payment for the passenger. Upon deduction, the ticket selling and checking system AFC also sends the deduction result to the payment management system for payment service statistics.
By virtue of the uniqueness and exclusivity of face information, the method associates and binds a passenger's face information with the passenger's pass account information and payment account information. When the passenger passes through a gate, the face recognition terminal on the gate recognizes the passenger's face, and once the recognition result is transmitted to the ticket selling and checking system AFC, the AFC automatically determines the passenger's pass account information and payment account information. The passenger therefore does not need to take out a traffic card or mobile phone; media such as bus cards and two-dimensional codes in the prior art are replaced, the efficiency of entering and leaving the station is improved, congestion when passengers enter and leave stations or board and alight in public transport is reduced, and the quality of traffic operation service is improved. The target object in the method may be an ordinary passenger or a specific passenger. The method requires no media such as a traffic card or mobile phone two-dimensional code, realizes non-inductive passing and non-inductive payment, and provides a good user experience. Because passengers register in advance on the traffic APP and/or the traffic WeChat official account, their face images can replace media such as traffic cards and mobile phone two-dimensional codes when entering and leaving the station, so that ticket selling can be reduced, the workload of manual maintenance is lowered, and the operation and maintenance cost is reduced.
As shown in fig. 2 to 5, a passing system based on face recognition is characterized by comprising:
The acquisition module M10 is used for acquiring the face image of the target object through one or more face recognition terminals. The one or more face recognition terminals may be arranged on gates at traffic entrances and/or exits, or at the boarding and/or alighting positions of a bus. The face recognition terminal PAD in the system is provided with an infrared binocular camera module, a real-person judgment algorithm and a high-performance main controller; it can run the living body detection algorithm quickly and provides a variety of interfaces for communicating with gates of different models. The face recognition terminal PAD can also display the passenger's current passing state on its screen.
The comparison module M20 is used for performing living body detection on the acquired face image of the target object and comparing the face image that has passed living body detection with all face data in the face base library. By performing living body detection on the face image, the system can reject printed face images or head models, ensuring that the acquired face image is that of a real person.
The account module M30 is configured to obtain, according to the comparison result, the face data whose similarity to the face image exceeds a preset threshold from the face base library, together with the account information corresponding to that face data. The account information includes at least pass account information and/or payment account information; the payment account information includes Alipay information, WeChat Pay information, UnionPay QuickPass information and the like.
The passing module M40 is used for generating passing information according to the acquired face data and the corresponding account information, and determining whether the target object can pass according to the passing information.
The system takes face recognition technology as its core and, on the basis of the existing ticket selling and checking system AFC, completes the association and binding of face information with a traffic pass account and a payment account. When a passenger enters or leaves the station, a face recognition terminal PAD arranged at the gate end of the station entrance and/or exit captures the passenger's face and sends the captured face to a background face-brushing passing system for living body detection, face feature extraction and face comparison, and the pass account information and payment account information corresponding to the passenger are found from the ticket selling and checking system AFC according to the comparison result. The ticket selling and checking system AFC then checks the pass account information and payment account information corresponding to the passenger, and generates station-entering or station-leaving passing order information together with a gate control instruction; the generated gate control instruction is sent to the ticket checking machine AGM at the gate end, which recognizes the instruction and controls the gate at the station entrance or exit to open or remain closed. When the passenger leaves the station, the ticket selling and checking system AFC deducts the fare from the payment account corresponding to the recognized face data, thereby realizing face-brushing passing and payment for the passenger. Upon deduction, the ticket selling and checking system AFC also sends the deduction result to the payment management system for payment service statistics.
By virtue of the uniqueness and exclusivity of face information, the system associates and binds a passenger's face information with the passenger's pass account information and payment account information. When the passenger passes through a gate, the face recognition terminal on the gate recognizes the passenger's face, and once the recognition result is transmitted to the ticket selling and checking system AFC, the AFC automatically determines the passenger's pass account information and payment account information. The passenger therefore does not need to take out a traffic card or mobile phone; media such as bus cards and two-dimensional codes in the prior art are replaced, the efficiency of entering and leaving the station is improved, congestion when passengers enter and leave stations or board and alight in public transport is reduced, and the quality of traffic operation service is improved. The target object in the system may be an ordinary passenger or a specific passenger.
In an exemplary embodiment, the comparison module M20 also establishes a face base library before comparing the face image that has passed living body detection with all face data in the face base library. The establishment process is as follows: collect the personal registration information of the target object through a traffic application program and/or a traffic interaction platform, where the personal registration information includes at least face data, mobile phone number information and mobile phone Bluetooth MAC address information; create corresponding pass account information based on the personal registration information of the target object; bind the created pass account information with the payment account information of the target object; and, after binding is completed, store the personal registration information, pass account information and payment account information of the target object in one or more databases to establish the face base library. By collecting information such as the passenger's face data, mobile phone number, mobile phone Bluetooth MAC address and identity card information, this embodiment enables real-name riding and makes the passenger's riding records traceable. Traffic in this embodiment includes, but is not limited to, rail transit and urban buses; accordingly, the traffic application program includes, but is not limited to, a rail transit application and an urban bus application, and the traffic interaction platform includes, but is not limited to, a rail transit interaction platform and an urban bus interaction platform. By way of example, the traffic application program may be a traffic APP and the traffic interaction platform may be a traffic WeChat official account.
In accordance with the above description, in an exemplary embodiment, as shown in fig. 3, the process of acquiring the face data of the target object through the traffic application program and/or the traffic interaction platform includes: acquiring a face image of the target object through the traffic application program and/or the traffic interaction platform; performing living body detection on the acquired face image, and re-acquiring the face image of the target object if the living body detection is not passed; scoring the face images that pass living body detection with a quality score model, and keeping the face images whose quality scores exceed a preset score value; and performing feature recognition on the face images whose quality scores exceed the preset score value, so as to obtain the face features of the target object as the face data of the target object. By way of example, the traffic application program in the embodiments of the present application may be a rail transit application, and the traffic interaction platform may be a rail transit interaction platform.
According to the above description, in an exemplary embodiment, as shown in figs. 2 to 4, when a target object enters or leaves the station, if face data of a plurality of target objects are obtained from the face base library according to the comparison result of a certain target object, the system further performs: secondary verification on the target object using mobile phone number information and/or mobile phone Bluetooth MAC address information; and acquiring final face data from the face base library based on the secondary verification result and acquiring the account information corresponding to the final face data, or determining the final face data from the face data of the plurality of target objects acquired last time based on the secondary verification result and determining the account information corresponding to the final face data. As an example, if the face data of passengers A, B and C are obtained from the face base library according to the comparison result of passenger A, the system further performs secondary verification on passenger A using mobile phone number information and/or mobile phone Bluetooth MAC address information; for example, passenger A may enter the last four digits of the mobile phone number on the rail transit APP and/or the rail transit WeChat official account for secondary verification. The face data of passenger A is then acquired from the face base library based on the secondary verification result, together with the account information corresponding to the face data of passenger A.
As another example, if the face data of passengers X, Y and Z are obtained from the face base library according to the comparison result of passenger X, the method further includes: performing secondary verification on passenger X using the mobile phone number information and/or the mobile phone Bluetooth MAC address information; for example, passenger X may enter the mobile phone Bluetooth MAC address information in the rail transit APP and/or the rail transit WeChat official account for secondary verification. The face data of passenger X are determined from the face data of X, Y and Z based on the secondary verification result, and the account information corresponding to the face data of passenger X is determined.
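The secondary verification step above can be sketched as a filter over the candidate records: the last four phone digits and/or the Bluetooth MAC address must single out exactly one record, whose account information is then returned. The record fields (`phone`, `bt_mac`, `account`) are illustrative assumptions.

```python
def secondary_verify(candidates, last4=None, bt_mac=None):
    """Narrow several candidate face records down to one using the last four
    digits of the phone number and/or the phone's Bluetooth MAC address.
    Returns the matching account info, or None if still ambiguous."""
    matches = [c for c in candidates
               if (last4 is None or c["phone"].endswith(last4))
               and (bt_mac is None or c["bt_mac"] == bt_mac)]
    if len(matches) == 1:
        return matches[0]["account"]   # account info of the final face data
    return None                        # zero or several matches: verification fails
```

Returning `None` on ambiguity mirrors the described flow, where an unresolved comparison falls back to manual handling or re-verification.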
According to the above description, in an exemplary embodiment, as shown in figs. 2 to 4, the face base library includes at least a master library, a site library and a gate side library. Comparing the face image that passes the living body detection with all face data in the face base library to obtain a comparison result includes: comparing the face image that passes the living body detection with all face data in the gate side library to obtain a pass or fail result; and/or comparing it with all face data in the site library to obtain a pass or fail result; and/or comparing it with all face data in the master library to obtain a pass or fail result. The site library and/or the gate side library are derived from the master library; for example, the corresponding face data are copied from the master library to the site library and/or the gate side library according to actual conditions. As an example, the embodiment of the application adopts a multilevel face comparison architecture: the face image of passenger D that passes the living body detection is first compared with all face data in the gate side library. If the comparison with the gate side library passes, the face data whose similarity to the face image of passenger D exceeds a preset threshold, together with the account information corresponding to the face data of passenger D, are acquired from the gate side library according to the comparison result. If the comparison with the gate side library fails, that is, the gate side library contains no face data whose similarity to the face image of passenger D exceeds the preset threshold, the face image that passes the living body detection is compared with all face data in the site library.
If the comparison with the site library passes, the face data whose similarity to the face image of passenger D exceeds the preset threshold, together with the account information corresponding to the face data of passenger D, are acquired from the site library according to the comparison result. If the comparison with the site library fails, that is, the site library contains no face data whose similarity to the face image of passenger D exceeds the preset threshold, the face image that passes the living body detection is compared with all face data in the master library. If the comparison with the master library passes, the face data whose similarity to the face image of passenger D exceeds the preset threshold, together with the account information corresponding to the face data of passenger D, are acquired from the master library according to the comparison result. If the comparison with the master library fails, that is, the master library contains no face data whose similarity to the face image of passenger D exceeds the preset threshold, a comparison-failed result is displayed on the face recognition terminal PAD, and passenger D is prompted to register in the rail transit APP and/or the rail transit WeChat official account. In the embodiment of the application, the face data of a specific person or a specific passenger can be stored in the site library and/or the gate side library.
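The multilevel comparison above (gate side library first, then site library, then master library, returning the first hit above the similarity threshold) can be sketched as follows. The threshold value, the record shape, and the `similarity` function are assumptions for illustration; the text does not specify them.

```python
SIM_THRESHOLD = 0.9  # assumed preset similarity threshold

def tiered_match(probe, gate_lib, station_lib, master_lib, similarity):
    """Compare a probe face image against the three-level library hierarchy,
    smallest and closest library first. Returns the account info of the best
    match above the threshold, or None (passenger is prompted to register)."""
    for library in (gate_lib, station_lib, master_lib):
        best = max(library,
                   key=lambda rec: similarity(probe, rec["face"]),
                   default=None)                      # library may be empty
        if best and similarity(probe, best["face"]) > SIM_THRESHOLD:
            return best["account"]                    # comparison passed here
    return None                                       # no library matched
```

Probing the small gate side library first is what makes the frequent-traveller case fast; only unknown faces pay the cost of the full master library search.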
In an exemplary embodiment, if the face data stored in the site library and/or the gate side library exceed a preset limited space, the first K target objects with the largest number of rides within a preset time period, or the first K target objects with the highest riding frequency within the preset time period, are determined as frequently-visited target objects; the face data of target objects other than the frequently-visited target objects are then eliminated from the site library and/or the gate side library using an LRU algorithm. Specifically, when the data stored in a database exceed a certain scale, the false recognition rate of face recognition increases and the passing rate decreases, because the face recognition algorithm cannot maintain its original high accuracy once the face base library reaches the million level. However, the registered passengers of a typical domestic city do not normally reach the million level. Therefore, for this situation, the system acquires the hot-spot data of each rail transit station through the LRU (Least Recently Used) algorithm. Data are eliminated according to their historical access records; that is, an elimination mechanism is established in which the data unused for the longest time are replaced by hot data, so that a frequent-traveller face library is built for the station. The core idea of LRU is: if data have been accessed recently, the probability of future access is higher; if data have not been accessed in the recent period, they are less likely to be accessed in the future. That is, when the limited space is full, the data that have not been accessed for the longest time should be eliminated.
In other words, the LRU algorithm eliminates the face data of target objects other than the first K target objects with the largest number of rides, or the first K target objects with the highest riding frequency, within the preset time period. For a single station, the number of frequent passengers generally does not exceed 100,000, which is well within the high-accuracy range of the algorithm. In the embodiment of the application, a site library can be deployed on the station side for rapidly identifying passengers who frequently enter and exit the station, and/or a gate side library can be deployed on the gate side for rapidly identifying passengers who frequently pass through the gate. Passengers who have never appeared before are identified in the master library, and when multiple persons meeting the similarity threshold are identified, the passenger can be asked to enter the last four digits of the mobile phone number for unique confirmation.
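The station-library eviction policy described above can be sketched with an LRU ordering: the least recently seen face record sits at the front and is evicted when the limited space is full, so the library converges to the station's frequent travellers. The capacity value and record shape are illustrative assumptions.

```python
from collections import OrderedDict

class StationFaceLibrary:
    """A fixed-capacity face library with LRU eviction, as used for the
    site library and/or gate side library."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.records = OrderedDict()   # passenger_id -> face data, LRU first

    def access(self, passenger_id, face_data):
        """Record that this passenger's face was just used at the station."""
        if passenger_id in self.records:
            self.records.move_to_end(passenger_id)    # mark as recently used
        else:
            if len(self.records) >= self.capacity:
                self.records.popitem(last=False)      # evict the LRU record
            self.records[passenger_id] = face_data
```

Each successful gate passage calls `access`, so infrequent riders naturally age out while frequent riders stay resident.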
According to the above description, in an exemplary embodiment, as shown in figs. 2 to 4, when one or more passengers are located at a rail transit entrance, the ticket checking machine AGM sends the current station, the current time, the current gate number, the face data of the one or more passengers and the corresponding traffic account information to the ticket selling and checking system AFC, which generates the station-entering passing order information of the one or more passengers; the station-entering passing order information is then transmitted to the AGM (automatic gate machine) for identification, and the current gate is controlled to change or maintain its current state according to the identification result. When one or more passengers are located at a rail transit exit, the AGM sends the current station, the current time, the current gate number, the face data of the one or more passengers, the corresponding traffic account information and the payment account information to the AFC (automatic fare collection) system, which generates the station-leaving passing order information of the one or more passengers; the station-leaving passing order information is then transmitted to the AGM for identification, and the current gate is controlled to change or maintain its current state according to the identification result. In this system, the generation of station-entering passing order information when a passenger enters and/or station-leaving passing order information when a passenger leaves is realized on the basis of the existing ticket selling and checking system AFC and ticket checking machine AGM, and is not described again here.
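The order message the AGM sends to the AFC system in the paragraph above can be sketched as a simple record. The field names are illustrative; the actual AFC message format is not given in the text, and only the exit-side order carries the payment account, since the fare is settled on exit.

```python
from datetime import datetime

def build_pass_order(station, gate_no, face_id, traffic_account,
                     direction, payment_account=None):
    """Assemble a station-entering or station-leaving passing order
    (hypothetical field names) from the data the AGM reports to the AFC."""
    order = {
        "station": station,
        "time": datetime.now().isoformat(timespec="seconds"),
        "gate_no": gate_no,
        "face_id": face_id,
        "traffic_account": traffic_account,
        "direction": direction,            # "entry" or "exit"
    }
    if direction == "exit":                # fare deduction happens on exit
        order["payment_account"] = payment_account
    return order
```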
According to the above description, in an exemplary embodiment, when a passenger K is located at a rail transit entrance, the face recognition terminal PAD arranged at the gate end of the entrance takes a face snapshot of passenger K and sends the snapshot to the background face-brushing passing system for living body detection, face feature extraction and face recognition. After face recognition, the recognition result is fed back to the face recognition terminal PAD; if an abnormal condition occurs during this process, the service desk is notified so that the abnormal condition can be handled manually. The face recognition terminal PAD sends a station-entering request to the ticket checking machine AGM; the AGM sends the current station, the current time, the current gate number and the face data of passenger K to the ticket selling and checking system AFC; the AFC generates the station-entering passing order information and a gate control instruction for passenger K and transmits them to the AGM; and the AGM identifies the gate control instruction and opens the gate. After the gate is opened, whether payment is needed is judged; if not, no automatic fee deduction is performed.
The face recognition process is as follows: first, the face image of passenger K that passes the living body detection is compared with all face data in the gate side library. If the comparison with the gate side library passes, the face data whose similarity to the face image of passenger K exceeds a preset threshold, together with the account information corresponding to the face data of passenger K, are acquired from the gate side library according to the comparison result. If the comparison with the gate side library fails, that is, the gate side library contains no face data whose similarity to the face image of passenger K exceeds the preset threshold, the face image that passes the living body detection is compared with all face data in the site library. If the comparison with the site library passes, the face data whose similarity to the face image of passenger K exceeds the preset threshold, together with the account information corresponding to the face data of passenger K, are acquired from the site library according to the comparison result. If the comparison with the site library fails, that is, the site library contains no face data whose similarity to the face image of passenger K exceeds the preset threshold, the face image that passes the living body detection is compared with all face data in the master library. If the comparison with the master library passes, the face data whose similarity to the face image of passenger K exceeds the preset threshold, together with the account information corresponding to the face data of passenger K, are acquired from the master library according to the comparison result.
If the comparison with the master library fails, that is, the master library contains no face data whose similarity to the face image of passenger K exceeds the preset threshold, a comparison-failed result is displayed on the face recognition terminal PAD, and passenger K is prompted to register in the rail transit APP and/or the rail transit WeChat official account.
According to the above description, in an exemplary embodiment, when a passenger K is located at a rail transit exit, the face recognition terminal PAD arranged at the gate end of the exit takes a face snapshot of passenger K and sends the snapshot to the background face-brushing passing system for living body detection, face feature extraction and face recognition. After face recognition, the recognition result is fed back to the face recognition terminal PAD; if an abnormal condition occurs during this process, the service desk is notified so that the abnormal condition can be handled manually. The face recognition terminal PAD sends a station-leaving request to the ticket checking machine AGM; the AGM sends the current station, the current time, the current gate number and the face data of passenger K to the ticket selling and checking system AFC; the AFC generates the station-leaving passing order information and a gate control instruction for passenger K and transmits them to the AGM; and the AGM identifies the gate control instruction and opens the gate. After the gate is opened, whether payment is needed is judged; if fee deduction is needed, the AFC deducts the fare from the payment account corresponding to the face data of passenger K, thereby realizing face-brushing passage and payment. When deducting the fare, the AFC also sends the deduction result to the payment management system for payment service statistics and sends a fee deduction prompt notice to passenger K.
When deducting the fare, the ticket selling and checking system AFC also judges whether the deduction succeeds; if the payment is abnormal, the AFC notifies the service desk so that the abnormal condition can be handled manually.
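The exit-side settlement flow described in the preceding paragraphs can be sketched as follows: deduct the fare from the bound payment account, report the result for payment service statistics, notify the passenger, and escalate failures to the service desk. All collaborator callables here are assumptions, not a real AFC API.

```python
def settle_exit_fare(order, deduct, report_stats, notify_passenger, alert_desk):
    """Settle the fare for a station-leaving passing order.
    Returns True on successful deduction, False on abnormal payment."""
    ok = deduct(order["payment_account"], order["fare"])
    if ok:
        report_stats(order)          # deduction result -> payment statistics
        notify_passenger(order)      # fee deduction prompt notice
    else:
        alert_desk(order)            # abnormal payment -> manual handling
    return ok
```

Passing the collaborators in as callables keeps the sketch independent of any particular AFC or payment-management interface.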
In summary, the invention provides a passing system based on face recognition which, on the basis of the existing ticket selling and checking system AFC and with face recognition technology as its core, completes the association and binding of face information with a traffic passing account and a payment account. When a passenger enters or leaves a station, the face recognition terminal PAD arranged at the gate end of the entrance and/or exit takes a face snapshot of the passenger and sends it to the background face-brushing passing system for living body detection, face feature extraction and face comparison, and the passing account information and payment account information corresponding to the passenger are found from the AFC system according to the comparison result. The AFC system then evaluates the passenger's passing account information and payment account information and generates the station-entering or station-leaving passing order information and a gate control instruction; the generated gate control instruction is sent to the ticket checking machine AGM at the gate end, which identifies it and controls the gate at the entrance or exit to open or remain closed. When the passenger leaves the station, the AFC deducts the fare from the payment account corresponding to the recognized face data, thereby realizing face-brushing passage and payment. When deducting the fare, the AFC also sends the deduction result to the payment management system for payment service statistics.
By relying on the uniqueness of face information, the system binds a passenger's face information with the traffic account information and the payment account information. When the passenger passes through the gate, the face recognition terminal on the gate performs face recognition; once the recognition result is transmitted to the ticket selling and checking system AFC, the AFC can automatically determine the passenger's traffic account information and payment account information, so that no traffic card or mobile phone needs to be taken out. This replaces media such as bus cards and two-dimensional codes in the prior art, improves the efficiency of entering and leaving the station, reduces congestion when passengers enter and leave the station or board and alight in public transit, and improves the quality of traffic operation services. The target object in the system may be an ordinary passenger or a specific passenger. The system requires no media such as a traffic card or a mobile phone two-dimensional code; it realizes contactless passage and contactless payment with a good user experience. By having passengers register in advance in the traffic APP and/or the traffic WeChat official account, the system can use passengers' face images in place of media such as traffic cards and mobile phone two-dimensional codes when entering and leaving the station, which reduces ticket selling, the workload of manual maintenance, and operation and maintenance costs.
The embodiment of the application further provides a traffic passing device based on face recognition, which is configured to perform the following:
acquiring a face image of a target object to be passed through by one or more face recognition terminals;
performing living body detection on the acquired face image of the target object, and comparing the face image subjected to the living body detection with all face data in a face bottom library;
acquiring face data with the similarity of the face image exceeding a preset threshold value and account information corresponding to the face data from a face bottom library according to the comparison result; the account information at least includes: pass account information and/or payment account information;
and generating traffic information based on the acquired face data and the corresponding account information, and determining whether the target object can pass according to the traffic information.
In this embodiment, the device executes the above system or method; for specific functions and technical effects, reference is made to the above embodiments, which are not described herein again.
An embodiment of the present application further provides a computer device, which may include: one or more processors; and one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the device to perform the method of fig. 1. In practical applications, the device may serve as a terminal device or as a server; examples of the terminal device include a smart phone, a tablet computer, an electronic book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop, a vehicle-mounted computer, a desktop computer, a set-top box, a smart television, a wearable device, and the like.
The present embodiment also provides a non-volatile readable storage medium in which one or more modules (programs) are stored; when the one or more modules are applied to a device, the device can execute the instructions included in the data processing method of fig. 1 according to the present embodiment.
Fig. 6 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application. As shown, the terminal device may include: an input device 1100, a first processor 1101, an output device 1102, a first memory 1103, and at least one communication bus 1104. The communication bus 1104 is used to implement communication connections between the elements. The first memory 1103 may include a high-speed RAM memory, and may also include a non-volatile storage NVM, such as at least one disk memory, and the first memory 1103 may store various programs for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the first processor 1101 may be, for example, a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the processor 1101 is coupled to the input device 1100 and the output device 1102 through a wired or wireless connection.
Optionally, the input device 1100 may include a variety of input devices, such as at least one of a user-oriented user interface, a device-oriented device interface, a software programmable interface, a camera, and a sensor. Optionally, the device interface facing the device may be a wired interface for data transmission between devices, or may be a hardware plug-in interface (e.g., a USB interface, a serial port, etc.) for data transmission between devices; optionally, the user-facing user interface may be, for example, a user-facing control key, a voice input device for receiving voice input, and a touch sensing device (e.g., a touch screen with a touch sensing function, a touch pad, etc.) for receiving user touch input; optionally, the programmable interface of the software may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or an input interface of a chip; the output devices 1102 may include output devices such as a display, audio, and the like.
In this embodiment, the processor of the terminal device includes functions for executing each module of the face recognition apparatus in each of the above devices; for specific functions and technical effects, reference is made to the above embodiments, which are not described herein again.
Fig. 7 is a schematic hardware structure diagram of a terminal device according to another embodiment of the present application. FIG. 7 is a specific embodiment of the implementation of FIG. 6. As shown, the terminal device of the present embodiment may include a second processor 1201 and a second memory 1202.
The second processor 1201 executes the computer program code stored in the second memory 1202 to implement the method described in fig. 1 in the above embodiment.
The second memory 1202 is configured to store various types of data to support operations at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, videos, and so forth. The second memory 1202 may include a Random Access Memory (RAM) and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory.
Optionally, a second processor 1201 is provided in the processing assembly 1200. The terminal device may further include: communication components 1203, power components 1204, multimedia components 1205, audio components 1206, input/output interfaces 1207, and/or sensor components 1208. The specific components included in the terminal device are set according to actual requirements, which is not limited in this embodiment.
The processing component 1200 generally controls the overall operation of the terminal device. The processing assembly 1200 may include one or more second processors 1201 to execute instructions to perform all or part of the steps of the method illustrated in fig. 1 described above. Further, the processing component 1200 can include one or more modules that facilitate interaction between the processing component 1200 and other components. For example, the processing component 1200 can include a multimedia module to facilitate interaction between the multimedia component 1205 and the processing component 1200.
The power supply component 1204 provides power to the various components of the terminal device. The power components 1204 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia components 1205 include a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The audio component 1206 is configured to output and/or input speech signals. For example, the audio component 1206 includes a Microphone (MIC) configured to receive external voice signals when the terminal device is in an operational mode, such as a voice recognition mode. The received speech signal may further be stored in the second memory 1202 or transmitted via the communication component 1203. In some embodiments, audio component 1206 also includes a speaker for outputting voice signals.
The input/output interface 1207 provides an interface between the processing component 1200 and peripheral interface modules, which may be click wheels, buttons, etc. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor component 1208 includes one or more sensors for providing various aspects of status assessment for the terminal device. For example, the sensor component 1208 may detect an open/closed state of the terminal device, relative positioning of the components, presence or absence of user contact with the terminal device. The sensor assembly 1208 may include a proximity sensor configured to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 1208 may also include a camera or the like.
The communication component 1203 is configured to facilitate communications between the terminal device and other devices in a wired or wireless manner. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot therein for inserting a SIM card therein, so that the terminal device may log onto a GPRS network to establish communication with the server via the internet.
As can be seen from the above, the communication component 1203, the audio component 1206, the input/output interface 1207 and the sensor component 1208 in the embodiment of fig. 7 may be implemented as the input device in the embodiment of fig. 6.
The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit the invention. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those skilled in the art without departing from the spirit and technical ideas disclosed by the present invention shall still be covered by the claims of the present invention.
Claims (13)
1. A passing method based on face recognition is characterized by comprising the following steps:
acquiring a face image of a target object to be passed;
performing living body detection on the acquired face image of the target object, and comparing the face image subjected to the living body detection with all face data in a face bottom library;
according to the comparison result, acquiring face data with the similarity exceeding a preset threshold value with the face image and account information corresponding to the face data from the face bottom library;
and generating passage information based on the acquired face data and the corresponding account information, and confirming whether the target object can pass according to the passage information.
2. The passing method based on face recognition of claim 1, wherein the account information at least comprises: passage account information and/or payment account information; before the face image that passes the living body detection is compared with all face data in the face bottom library, the face bottom library is established; the establishment process is as follows:
collecting personal registration information of the target object through a traffic application program and/or a traffic interaction platform; the personal registration information includes at least: face data, mobile phone number information and mobile phone Bluetooth MAC address information;
creating corresponding traffic account information based on the personal registration information of the target object;
and binding the created passage account information with the payment account information of the target object, storing the personal registration information, the passage account information and the payment account information of the target object into one or more databases after the binding is finished, and establishing the human face bottom library.
3. The method for passing based on face recognition according to claim 2, wherein the process of acquiring the face data of the target object through a traffic application and/or a traffic interaction platform comprises:
acquiring a face image of the target object through a traffic application program and/or a traffic interaction platform;
performing living body detection on the acquired face image, and if the face image does not pass the living body detection, acquiring the face image of the target object again;
the method comprises the steps of utilizing a quality score model to score face images detected through living bodies, and obtaining the face images with the quality scores exceeding a preset score value;
and performing feature recognition on the face image with the quality score exceeding a preset score value to acquire the face features of the target object as face data of the target object.
4. The passing method based on face recognition according to claim 2, wherein when a target object enters or leaves, if face data of a plurality of target objects are obtained from the face base according to a comparison result of a certain target object, the passing method further comprises:
performing secondary verification on the certain target object by using mobile phone number information and/or mobile phone Bluetooth MAC address information;
acquiring final face data from the face base based on a secondary verification result, and acquiring account information corresponding to the final face data; or determining final face data from the face data of the plurality of target objects acquired last time based on the secondary verification result, and determining account information corresponding to the final face data.
5. The passing method based on face recognition according to any one of claims 1 to 4, wherein the face base library is deployed as a master library, a station library and/or a gate-side library, the station library and/or the gate-side library being derived from the master library;
if the face data stored in the station library and/or the gate-side library exceed a preset storage limit, determining the first K target objects with the largest number of rides in a preset time period, or the first K target objects with the highest riding frequency in the preset time period, as frequent target objects;
and evicting the face data other than that of the frequent target objects from the station library and/or the gate-side library using an LRU algorithm.
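The claim-5 eviction policy combines two ideas: the K most frequent riders are pinned in the station or gate-side library, and everyone else is evicted in least-recently-used order once capacity is exceeded. A minimal sketch, in which the capacity, K, ride counts and record shape are all illustrative assumptions:

```python
from collections import OrderedDict

# Sketch of the claim-5 eviction policy for a station/gate-side face
# library: the K riders with the highest ride count are pinned, and
# other entries are evicted least-recently-used first when the library
# exceeds its capacity. All parameters are illustrative assumptions.

class StationFaceCache:
    def __init__(self, capacity, ride_counts, k):
        self.capacity = capacity
        # The K target objects with the highest ride count are "frequent".
        self.frequent = set(
            sorted(ride_counts, key=ride_counts.get, reverse=True)[:k])
        self.entries = OrderedDict()  # uid -> face features, LRU order

    def access(self, uid, features):
        # Re-insert so the entry moves to most-recently-used position.
        self.entries.pop(uid, None)
        self.entries[uid] = features
        while len(self.entries) > self.capacity:
            # Evict the least-recently-used non-frequent entry.
            victim = next(
                (u for u in self.entries if u not in self.frequent), None)
            if victim is None:
                break  # only pinned frequent riders remain
            del self.entries[victim]
```

An `OrderedDict` keeps insertion order, so iterating from the front yields the least-recently-used entries, and skipping the pinned set implements "evict the face data other than that of the frequent target objects".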
6. The passing method based on face recognition, wherein, when the target object is confirmed to be allowed to pass, the ticket checking machine sends the current station, the current time, the current gate number, the face data of the target object and the corresponding transit account information to a ticket selling and checking system to generate entry passing order information for the target object;
transmitting the entry passing order information of the target object to the ticket checking machine for verification, and controlling the current gate to change or maintain its current state according to the verification result;
and/or, when the target object is located at a transit exit, the ticket checking machine sends the current station, the current time, the current gate number, the face data of the target object, the corresponding transit account information and the payment account information to the ticket selling and checking system to generate exit passing order information for the target object;
and transmitting the exit passing order information of the target object to the ticket checking machine for verification, and controlling the current gate to change or maintain its current state according to the verification result.
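The claim-6 round trip (gate reports context, the ticketing system generates an order, the gate verifies the order and then changes or holds its state) can be sketched as two small functions. The order fields and the verification rule below are illustrative assumptions; the claim does not define the order schema.

```python
# Sketch of the claim-6 gate flow: the ticket checking machine reports
# station, time, gate number and the matched account; the ticketing
# system returns an order, and the gate changes state only if the order
# verifies. Field names and the check are illustrative assumptions.

def build_order(station, timestamp, gate_no, account):
    """Assemble the passing-order record sent to the ticketing system."""
    return {"station": station, "time": timestamp,
            "gate": gate_no, "account": account}

def gate_decision(order, valid_accounts):
    """Return 'open' to change the gate's current state, or 'hold'
    to keep it, according to the verification result."""
    return "open" if order["account"] in valid_accounts else "hold"
```

The same pair covers both branches of the claim: at entry the order carries the transit account, at exit it additionally carries the payment account for fare deduction.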
7. A passing system based on face recognition, characterized by comprising:
an acquisition module, configured to acquire a face image of a target object to pass;
a comparison module, configured to perform liveness detection on the acquired face image of the target object and to compare the face image that passes the liveness detection with all face data in the face base library;
an account module, configured to acquire, from the face base library according to the comparison result, the face data whose similarity with the face image exceeds a preset threshold, together with the account information corresponding to that face data;
and a passing module, configured to generate passing information from the acquired face data and the corresponding account information, and to confirm from the passing information whether the target object may pass.
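The comparison and account modules together perform a 1:N search: a probe feature vector is scored against every record in the face base library and the records whose similarity exceeds the preset threshold are returned with their account information. A minimal sketch, in which cosine similarity and the threshold value are illustrative choices not specified by the claims:

```python
import math

# Sketch of the claim-7 comparison/account step: 1:N search of the face
# base library, returning (features, account) pairs whose similarity
# with the probe exceeds a preset threshold. Cosine similarity and the
# 0.9 default threshold are illustrative assumptions.

def cosine_similarity(a, b):
    """Cosine similarity of two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match(probe, face_base, threshold=0.9):
    """Return (features, account) for every record exceeding threshold."""
    return [
        (rec["features"], rec["account"])
        for rec in face_base
        if cosine_similarity(probe, rec["features"]) > threshold
    ]
```

Note that `match` can return several records, which is exactly the ambiguity that claims 4 and 10 resolve with secondary verification by phone number or Bluetooth MAC address.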
8. The passing system based on face recognition according to claim 7, wherein the account information comprises at least transit account information and/or payment account information; before the comparison module compares the face image that passes the liveness detection with all face data in the face base library, the face base library is established as follows:
collecting personal registration information of the target object through a transit application and/or a transit interaction platform, the personal registration information comprising at least face data, mobile phone number information and mobile phone Bluetooth MAC address information;
creating corresponding transit account information based on the personal registration information of the target object;
binding the created transit account information with the payment account information of the target object; after binding is completed, storing the personal registration information, the transit account information and the payment account information of the target object in one or more databases to establish the face base library.
9. The system according to claim 8, wherein the process of acquiring the face data of the target object through a transit application and/or a transit interaction platform comprises:
acquiring a face image of the target object through the transit application and/or the transit interaction platform;
performing liveness detection on the acquired face image, and re-acquiring a face image of the target object if the image fails the liveness detection;
scoring the face images that pass the liveness detection with a quality-score model, and retaining the face images whose quality scores exceed a preset score value;
and performing feature extraction on the face images whose quality scores exceed the preset score value to obtain the face features of the target object as the face data of the target object.
10. The passing system based on face recognition according to claim 8, wherein, when a target object enters or leaves a station, if face data of a plurality of target objects are obtained from the face base library according to the comparison result for a certain target object, the passing system is further configured for:
performing secondary verification on the target object using mobile phone number information and/or mobile phone Bluetooth MAC address information;
acquiring final face data from the face base library based on the secondary verification result, and acquiring the account information corresponding to the final face data; or determining final face data, based on the secondary verification result, from the face data of the plurality of target objects obtained previously, and determining the account information corresponding to the final face data.
11. The passing system based on face recognition according to any one of claims 7 to 10, wherein the face base library comprises at least a master library, a station library and a gate-side library, the station library and/or the gate-side library being derived from the master library;
if the face data stored in the station library and/or the gate-side library exceed a preset storage limit, determining the first K target objects with the largest number of rides in a preset time period, or the first K target objects with the highest riding frequency in the preset time period, as frequent target objects;
and evicting the face data other than that of the frequent target objects from the station library and/or the gate-side library using an LRU algorithm.
12. A computer device, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the method of any of claims 1-6.
13. One or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform the method of any of claims 1-6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110386864.1A CN113077574A (en) | 2021-04-09 | 2021-04-09 | Passing method, system, equipment and medium based on face recognition |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113077574A true CN113077574A (en) | 2021-07-06 |
Family
ID=76617241
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110386864.1A Pending CN113077574A (en) | 2021-04-09 | 2021-04-09 | Passing method, system, equipment and medium based on face recognition |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113077574A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109636397A (en) * | 2018-11-13 | 2019-04-16 | 平安科技(深圳)有限公司 | Transit trip control method, device, computer equipment and storage medium |
CN109859364A (en) * | 2019-03-13 | 2019-06-07 | 厦门路桥信息股份有限公司 | Fast passing method, medium, terminal device and system based on face recognition technology |
CN110647823A (en) * | 2019-09-02 | 2020-01-03 | 中国建设银行股份有限公司 | Method and device for optimizing human face base |
CN111178139A (en) * | 2019-12-04 | 2020-05-19 | 北京沃东天骏信息技术有限公司 | Identity authentication method, payment method and payment equipment |
CN111223222A (en) * | 2020-01-06 | 2020-06-02 | 广州新科佳都科技有限公司 | Non-inductive brake passing method, device, equipment and storage medium based on MAC address information |
CN111292460A (en) * | 2020-02-27 | 2020-06-16 | 广州羊城通有限公司 | Control method and device based on subway face brushing authentication |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114255537A (en) * | 2021-09-24 | 2022-03-29 | 国科众合创新集团有限公司 | Intelligent security check gate integrated system and method based on biological recognition |
CN114495294A (en) * | 2021-12-03 | 2022-05-13 | 华中科技大学鄂州工业技术研究院 | Non-inductive payment method and device for metro gate machine and storage medium |
CN115223253A (en) * | 2022-07-29 | 2022-10-21 | 深圳市八百通机电科技有限公司 | Subway ticketing system |
CN115223253B (en) * | 2022-07-29 | 2024-06-25 | 深圳市八百通机电科技有限公司 | Subway ticketing system |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113077574A (en) | Passing method, system, equipment and medium based on face recognition | |
CN201965675U (en) | Subway payment system without staying | |
CN106022758A (en) | Wireless router smart home managing method and wireless router | |
CA2862847C (en) | Identification system | |
CN118781720A (en) | Cash register processing terminal and cash register processing method | |
CN111696241A (en) | Scenic spot ticket checking and selling system and method based on face recognition | |
US10963880B2 (en) | System and method for realizing identity identification on the basis of radio frequency identification technology | |
CN108022181A (en) | Hotel quickly moves in method and device and electronic equipment | |
JP2013191173A (en) | Ticket examination system, automatic ticket examination apparatus and ticket medium | |
CN103632401A (en) | Transportation non-contact chip card toll equipment with identity identification function | |
JP2018018481A (en) | Server system for electronic authentication, program, electronic authentication method and electronic authentication system | |
CN109801056A (en) | A kind of method and device of traffic charging | |
CN106127904B (en) | A kind of monitoring system and method for preventing from stealing a ride | |
CN113052602A (en) | Method, device, machine readable medium and equipment for bus payment | |
CN111260843B (en) | Implementation method of multifunctional integrated teller machine and teller machine | |
CN110826442B (en) | Bill data processing method, system, equipment and medium based on in-vivo detection | |
TWI782926B (en) | Hands-free and ticketless fare collection system | |
KR100999525B1 (en) | Parking-control Method and System by employing mobile | |
KR100694853B1 (en) | Parking System using fingerprint detection | |
JP2021140798A (en) | Railroad use management system and management device | |
CN105513214A (en) | Bicycle storage/taking management method, system and device | |
CN101847232A (en) | Remote controller for TV data interactive device capable of deducting deposit in smart card | |
CN110827096A (en) | Bill data processing method, system, equipment and medium based on payment system | |
KR100857584B1 (en) | Parking control method and system using network | |
CN219658147U (en) | Ticket business processing device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 2021-07-06