CN111539018A - Target object identification processing method, system, equipment and medium - Google Patents


Info

Publication number
CN111539018A
CN111539018A
Authority
CN
China
Prior art keywords
target object
characteristic data
target
related information
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010315785.7A
Other languages
Chinese (zh)
Inventor
周曦 (Zhou Xi)
姚志强 (Yao Zhiqiang)
谢科 (Xie Ke)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Zhongke Yuncong Technology Co ltd
Original Assignee
Chongqing Zhongke Yuncong Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Zhongke Yuncong Technology Co ltd filed Critical Chongqing Zhongke Yuncong Technology Co ltd
Priority to CN202010315785.7A priority Critical patent/CN111539018A/en
Publication of CN111539018A publication Critical patent/CN111539018A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/60 Protecting data
    • G06F 21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F 21/6218 Protecting access to data via a platform, to a system of files or objects, e.g. local or distributed file system or database
    • G06F 21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G06F 21/6254 Protecting personal data by anonymising data, e.g. decorrelating personal data from the owner's identification

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Bioethics (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Navigation (AREA)
  • Devices For Checking Fares Or Tickets At Control Points (AREA)

Abstract

The invention provides a target object identification processing method, system, device and medium, comprising the following steps: acquiring one or more items of feature data of a target object at the current moment; comparing the feature data of the target object acquired at the current moment with feature data of the target object acquired in advance, and matching one or more items of travel-related information to the target object according to the comparison result; and displaying or playing the travel-related information. The invention can be applied to airports, enabling passengers without a boarding pass to conveniently and intuitively check their name, flight, gate, boarding path and other information. In particular, feature data of a target passenger at the current moment can be acquired at locations such as an airport security checkpoint or the airport isolation area, compared with feature data acquired in advance at such locations, and travel-related information matched to the target passenger according to the comparison result, after which the travel-related information is displayed or played.

Description

Target object identification processing method, system, equipment and medium
Technical Field
The present invention relates to the field of identification technologies, and in particular, to a target object identification processing method, system, device, and medium.
Background
At present, with the increasing pace of life and work, more and more people choose air travel as their preferred mode of transport in order to save time. While people enjoy the convenience of air travel, ever-increasing passenger volumes also present challenges to airport management. Most passengers check their flights on an airport flight information display screen and find their gates by following the boarding paths indicated at the airport.
However, under the Civil Aviation Administration of China's plan to promote "paperless" convenient travel at airports handling more than ten million passengers annually, all large airports are to fully implement "paperless" boarding. Under "paperless" boarding, passengers do not print a paper boarding pass at any point in the airport boarding process. Without a paper boarding pass, a traveler may forget information such as his or her flight, gate, or boarding path. There is therefore a need to solve the problem that a traveler cannot readily determine information such as his or her flight, gate, and boarding path without a paper boarding pass.
Disclosure of Invention
In view of the above-mentioned shortcomings of the prior art, it is an object of the present invention to provide a target object recognition processing method, system, device and medium, which are used to solve the problems existing in the prior art.
In order to achieve the above and other related objects, the present invention provides a target object identification processing method, including:
acquiring one or more characteristic data of a target object at the current moment;
comparing the characteristic data of the target object acquired at the current moment with the characteristic data of the target object acquired in advance, and matching one or more corresponding trip associated information for the target object according to the comparison result;
and displaying or playing the one or more trip related information.
Optionally, the method further includes acquiring feature data of the target object in one or more target areas in advance.
Optionally, the characteristic data includes at least one of: biometric data, physical characteristic data.
Optionally, the biometric data comprises at least one of: face, body, voice.
Optionally, the physical characteristic data comprises at least one of: identity card, household register, passport, air ticket, transaction order.
Optionally, the travel associated information includes at least one of: name, flight, gate, boarding path guide map, destination weather, and personalized recommendation information.
Optionally, before displaying or playing the travel related information, data desensitization is further performed on the travel related information.
Optionally, the target region comprises at least one of: airport security inspection port, airport isolation area.
The invention also provides a target object identification processing device, which comprises:
the acquisition module is used for acquiring one or more characteristic data of the target object at the current moment;
the comparison module is used for comparing the feature data of the target object acquired at the current moment with the feature data of the target object acquired in advance, and matching one or more corresponding trip related information for the target object according to the comparison result;
and the output module is used for displaying or playing the one or more trip related information.
Optionally, the method further includes acquiring feature data of the target object in one or more target areas in advance.
Optionally, the characteristic data includes at least one of: biometric data, physical characteristic data.
Optionally, the biometric data comprises at least one of: face, body, voice.
Optionally, the physical characteristic data comprises at least one of: identity card, household register, passport, air ticket, transaction order.
Optionally, the travel associated information includes at least one of: name, flight, gate, boarding path guide map, destination weather, and personalized recommendation information.
Optionally, before displaying or playing the travel related information, data desensitization is further performed on the travel related information.
Optionally, the target region comprises at least one of: airport security inspection port, airport isolation area.
The invention also provides a target object identification processing system, which comprises:
one or more acquisition devices for acquiring one or more characteristic data of a target object at the current moment;
the system comprises one or more processing devices, a processing device and a processing device, wherein the one or more processing devices are used for comparing the characteristic data of a target object acquired at the current moment with the characteristic data of the target object acquired in advance, and matching one or more corresponding trip related information for the target object according to the comparison result;
one or more output devices for displaying or playing the one or more travel associated information.
Optionally, the collecting device, the processing device, and the output device are integrated in the same device or different devices, respectively.
Optionally, the acquisition device comprises at least one of: mobile phone, computer, wearable device, security device, intelligent navigation display terminal.
Optionally, the acquisition device is further configured to acquire feature data of the target object in one or more target regions in advance, and synchronize the acquired target feature data into the one or more processing devices.
Optionally, the processing device includes at least one of: mobile phone, computer, remote server, intelligent navigation display data processing platform.
Optionally, the output device comprises at least one of: mobile phone, computer, intelligent navigation display terminal.
Optionally, the characteristic data includes at least one of: biometric data, physical characteristic data.
Optionally, the biometric data comprises at least one of: face, body, voice.
Optionally, the physical characteristic data comprises at least one of: identity card, household register, passport, air ticket, transaction order.
Optionally, the travel associated information includes at least one of: name, flight, gate, boarding path guide map, destination weather, and personalized recommendation information.
Optionally, before displaying or playing the travel related information, data desensitization is further performed on the travel related information.
Optionally, the target region comprises at least one of: airport security inspection port, airport isolation area.
The present invention also provides a target object recognition processing apparatus, including:
acquiring one or more characteristic data of a target object at the current moment;
comparing the characteristic data of the target object acquired at the current moment with the characteristic data of the target object acquired in advance, and matching one or more corresponding trip associated information for the target object according to the comparison result;
and displaying or playing the one or more trip related information.
The present invention also provides an apparatus comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform a method as described in one or more of the above.
The present invention also provides one or more machine-readable media having instructions stored thereon, which when executed by one or more processors, cause an apparatus to perform the methods as described in one or more of the above.
As described above, the target object identification processing method, system, device and medium provided by the present invention have the following beneficial effects: first, one or more items of feature data of a target object are acquired at the current moment; the feature data acquired at the current moment are then compared with feature data of the target object acquired in advance, and one or more corresponding items of travel-related information are matched to the target object according to the comparison result; finally, the one or more items of travel-related information are displayed or played. The invention can be applied to airports, enabling passengers without a boarding pass to conveniently and intuitively check their name, flight, gate, boarding path and other information. In particular, feature data such as the face, body, voice, identity card, household register, passport, air ticket and transaction order of a target passenger at the current moment can be acquired at locations such as an airport security checkpoint or the airport isolation area, compared with feature data acquired in advance at such locations, and travel-related information such as name, flight, gate, boarding path guide map, destination weather and personalized recommendation information matched to the target passenger according to the comparison result; after the travel-related information is matched, all or part of it is displayed or played.
Drawings
Fig. 1 is a schematic flowchart of a target object identification processing method according to an embodiment;
fig. 2 is a schematic hardware structure diagram of a target object identification processing device according to an embodiment;
fig. 3 is a schematic hardware structure diagram of a target object recognition processing system according to an embodiment;
fig. 4 is a schematic hardware configuration diagram of a target object recognition processing system according to another embodiment;
fig. 5 is a schematic hardware configuration diagram of a target object recognition processing system according to another embodiment;
fig. 6 is a schematic hardware configuration diagram of a target object recognition processing system according to another embodiment;
fig. 7 is a schematic hardware structure diagram of a terminal device according to an embodiment;
fig. 8 is a schematic diagram of a hardware structure of a terminal device according to another embodiment.
Description of the element reference numerals
M10 acquisition module
M20 alignment module
M30 output module
M40 collection system
M50 processing device
M60 output device
40 first monitoring camera
41 airport platform/navigation platform
42 intelligent navigation display data processing platform
43 Intelligent navigation display terminal
50 first computer
51 first remote server
52 second computer
60 second monitoring camera
61 second remote server
62 Mobile phone
1100 input device
1101 first processor
1102 output device
1103 first memory
1104 communication bus
1200 processing assembly
1201 second processor
1202 second memory
1203 communication assembly
1204 Power supply Assembly
1205 multimedia assembly
1206 voice assembly
1207 input/output interface
1208 sensor assembly
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments are only for illustrating the basic idea of the present invention, and the components related to the present invention are only shown in the drawings rather than drawn according to the number, shape and size of the components in actual implementation, and the type, quantity and proportion of the components in actual implementation may be changed freely, and the layout of the components may be more complicated.
Referring to fig. 1, the present invention provides a target object identification processing method, including:
s100, acquiring one or more characteristic data of a target object at the current moment;
s200, comparing the feature data of the target object acquired at the current moment with the feature data of the target object acquired in advance, and matching one or more corresponding trip related information for the target object according to the comparison result;
s300, displaying or playing the one or more trip related information.
In this method, one or more items of feature data of a target object are first acquired at the current moment; the feature data acquired at the current moment are then compared with feature data of the target object acquired in advance, and one or more corresponding items of travel-related information are matched to the target object according to the comparison result; finally, the one or more items of travel-related information are displayed or played. The method can be applied to airports, enabling passengers without a boarding pass to conveniently and intuitively check their name, flight, gate, boarding path and other information. In particular, feature data such as the face, body, voice, identity card, household register, passport, air ticket and transaction order of a target passenger at the current moment can be acquired at locations such as an airport security checkpoint or the airport isolation area, compared with feature data acquired in advance at such locations, and travel-related information such as name, flight, gate, boarding path guide map, destination weather and personalized recommendation information matched to the target passenger according to the comparison result, after which all or part of the travel-related information is displayed or played.
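The three steps (S100–S300) can be sketched as a minimal matching pipeline. All class names, feature keys and sample values below are hypothetical illustrations; the patent does not prescribe any particular data structures or implementation:

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass
class TravelInfo:
    # Illustrative subset of the travel-related information fields
    name: str
    flight: str
    gate: str

# Feature data acquired in advance (e.g., at the security checkpoint),
# mapped to each passenger's travel-related information (sample data).
enrolled: Dict[str, TravelInfo] = {
    "face-template-001": TravelInfo("Wang Xiaoming", "CA1234", "B12"),
    "face-template-002": TravelInfo("Li Hua", "MU5678", "A03"),
}

def match_travel_info(current_feature: str) -> Optional[TravelInfo]:
    """S200: compare the feature acquired at the current moment with
    the pre-acquired features and return the matched travel info."""
    return enrolled.get(current_feature)

def display(info: Optional[TravelInfo]) -> str:
    """S300: render the matched travel-related information for
    display or playback."""
    if info is None:
        return "No matching passenger record found."
    return f"{info.name}: flight {info.flight}, gate {info.gate}"
```

In practice the comparison in S200 would be a similarity search over biometric templates rather than an exact dictionary lookup; the sketch only fixes the overall acquire/compare/output flow.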
In an exemplary embodiment, the feature data includes at least one of: biometric data, physical characteristic data. Biometric data are features belonging to the living individual itself, usable to distinguish it from other similar individuals, such as the face, body, or voice. Physical characteristic data are other features that originate from the external environment and do not belong to biometric data, such as an identity card, household register, passport, air ticket, or transaction order.
In some exemplary embodiments, the travel-related information comprises at least one of: name, flight, gate, boarding path guide map, destination weather, and personalized recommendation information. Before the travel-related information is displayed, data desensitization is performed on it. In particular, when personal identity information is displayed, only a desensitized name is shown; for example, a name such as "Wang Xiaoming" may be displayed as "Wang *ming" after data desensitization. The personalized recommendation information includes airport public information (such as airport bookstores, airport toilets, airport shops, and the like) and commercial advertisement information (such as tourist attractions and local cuisine at the destination), and may differ from one target object to another.
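The name desensitization described above can be sketched as follows. The exact masking rule (keep the first and last characters, mask the middle) is an assumption for illustration; the patent only requires that displayed identity information be desensitized:

```python
def desensitize_name(name: str) -> str:
    """Mask the middle characters of a name with asterisks, keeping
    only the first and last characters visible (assumed rule)."""
    if len(name) <= 1:
        return name
    if len(name) == 2:
        # Two-character names: mask the second character.
        return name[0] + "*"
    return name[0] + "*" * (len(name) - 2) + name[-1]
```

For example, "王小明" would be rendered as "王*明" before being shown on the display terminal.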
In some exemplary embodiments, the method further comprises acquiring feature data of the target object in one or more target areas in advance. The target area includes at least one of: airport security checkpoint, airport isolation area. Here, the airport isolation area refers to the airside area beyond the airport security checkpoint.
In a specific embodiment, feature data such as the face, body, voice, identity card, household register, passport, air ticket and transaction order of each passenger are recorded in advance at the airport security checkpoint by an acquisition device such as a mobile phone, computer, wearable device, security device or intelligent navigation display terminal, and the recorded feature data such as the face and body are synchronized to a processing device (for example, to the intelligent navigation display data processing platform) through the airport platform and/or an airline platform. If a passenger without a boarding pass (i.e., the target passenger) needs to know travel-related information such as his or her flight and gate, the target passenger can perform face recognition at an intelligent navigation display terminal: the terminal captures the target passenger's face at the current moment and sends it to the intelligent navigation display data processing platform, which compares it with the faces of all passengers synchronized in advance; travel-related information such as the flight, gate, boarding path guide map and destination weather matched to the target passenger is then displayed or played on the intelligent navigation display terminal according to the comparison result.
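The face comparison performed by the data processing platform is, in essence, a nearest-neighbor search over face feature vectors. The patent does not specify the similarity metric or threshold; the following sketch assumes cosine similarity and an illustrative threshold:

```python
import math
from typing import Dict, List, Optional

def cosine_similarity(a: List[float], b: List[float]) -> float:
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def match_passenger(query: List[float],
                    gallery: Dict[str, List[float]],
                    threshold: float = 0.8) -> Optional[str]:
    """Compare the face feature captured at the terminal against the
    pre-synchronized features of all passengers; return the ID of the
    best match above the threshold, or None if no passenger matches
    (the 0.8 threshold is an assumption, not from the patent)."""
    best_id: Optional[str] = None
    best_score = threshold
    for passenger_id, feature in gallery.items():
        score = cosine_similarity(query, feature)
        if score > best_score:
            best_id, best_score = passenger_id, score
    return best_id
```

The matched passenger ID would then be used to look up and display the corresponding travel-related information, with desensitization applied to identity fields before output.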
In this embodiment of the application, a passenger can directly perform face recognition at the intelligent navigation display terminal; the terminal captures the passenger's face during recognition, the intelligent navigation display data processing platform compares the captured face, and travel-related information such as the flight, gate, boarding path guide map and destination weather matched to the target passenger is displayed or played on the terminal according to the comparison result. A passenger without a boarding pass can thus obtain such travel-related information simply by performing face recognition at the intelligent navigation display terminal.
In another embodiment, feature data such as the face, body, voice, identity card, household register, passport, air ticket and transaction order of each passenger are recorded in advance in the airport isolation area by an acquisition device such as a mobile phone, computer, wearable device, security device or intelligent navigation display terminal, and the recorded feature data such as the identity card and transaction order are synchronized to a processing device (for example, to the intelligent navigation display data processing platform) through the airport platform and/or an airline platform. If a passenger without a boarding pass (i.e., the target passenger) needs to know travel-related information such as his or her flight and gate, the target passenger can enter an identity card and transaction order at an intelligent navigation display terminal: the terminal captures the identity card and transaction order entered at the current moment and sends them to the intelligent navigation display data processing platform, which compares them with the identity cards and transaction orders of all passengers synchronized in advance; travel-related information such as the flight, gate, boarding path guide map and destination weather is matched to the target passenger according to the comparison result and displayed or played on the intelligent navigation display terminal. In this embodiment of the application, a passenger can directly enter an identity card and transaction order at the intelligent navigation display terminal; the terminal captures the entered identity card and transaction order, the intelligent navigation display data processing platform compares them, and travel-related information such as the flight, gate, boarding path guide map and destination weather matched to the target passenger is displayed or played on the terminal according to the comparison result. A passenger without a boarding pass can thus obtain such travel-related information simply by entering his or her own identity card and transaction order at the intelligent navigation display terminal.
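Unlike the biometric variant, the identity-card / transaction-order variant reduces to an exact-key lookup against the pre-synchronized records. The field names and sample values below are illustrative, not taken from the patent:

```python
from typing import Dict, Optional, Tuple

# Records synchronized in advance from the airport and/or airline
# platform, keyed by (ID-card number, transaction-order number).
# All keys and field values here are made-up sample data.
records: Dict[Tuple[str, str], Dict[str, str]] = {
    ("110101199001011234", "ORDER-001"): {
        "flight": "CZ3456",
        "gate": "C07",
        "destination_weather": "sunny",
    },
}

def lookup_travel_info(id_number: str,
                       order_number: str) -> Optional[Dict[str, str]]:
    """Match the identity card and transaction order entered at the
    terminal against all pre-synchronized passenger records; return
    the travel-related information, or None if no record matches."""
    return records.get((id_number, order_number))
```

Requiring both keys to match is one plausible reading of "comparing the identity card and transaction order"; matching on either key alone would be an equally valid design under the patent's wording.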
As shown in fig. 2, the present invention further provides a target object recognition processing device, which includes:
the acquisition module M10 is configured to acquire one or more feature data of the target object at the current time;
a comparison module M20, configured to compare the feature data of the target object obtained at the current time with the feature data of the target object obtained in advance, and match one or more corresponding trip related information for the target object according to the comparison result;
and the output module M30 is configured to display or play the one or more trip related information.
The device first acquires one or more items of feature data of a target object at the current moment; it then compares the feature data acquired at the current moment with feature data of the target object acquired in advance, and matches one or more corresponding items of travel-related information to the target object according to the comparison result; finally, it displays or plays the one or more items of travel-related information. The device can be applied to airports, enabling passengers without a boarding pass to conveniently and intuitively check their name, flight, gate, boarding path and other information. In particular, feature data such as the face, body, voice, identity card, household register, passport, air ticket and transaction order of a target passenger at the current moment can be acquired at locations such as an airport security checkpoint or the airport isolation area, compared with feature data acquired in advance at such locations, and travel-related information such as name, flight, gate, boarding path guide map, destination weather and personalized recommendation information matched to the target passenger according to the comparison result, after which all or part of the travel-related information is displayed or played.
In an exemplary embodiment, the feature data includes at least one of: biometric data, physical characteristic data. Biometric data are features belonging to the living individual itself, usable to distinguish it from other similar individuals, such as the face, body, or voice. Physical characteristic data are other features that originate from the external environment and do not belong to biometric data, such as an identity card, household register, passport, air ticket, or transaction order.
In some exemplary embodiments, the travel-related information comprises at least one of: name, flight, gate, boarding path guide map, destination weather, and personalized recommendation information. Before the travel-related information is displayed, data desensitization is performed on it; in particular, when personal identity information is displayed, only a desensitized name is shown, for example "Wang Xiaoming" displayed as "Wang *ming" after data desensitization. The personalized recommendation information includes airport public information (such as airport bookstores, airport toilets, airport shops, and the like) and commercial advertisement information (such as tourist attractions and local cuisine at the destination), and may differ from one target object to another.
In some exemplary embodiments, feature data of the target object are further acquired in one or more target areas in advance. The target area includes at least one of: airport security checkpoint, airport isolation area. Here, the airport isolation area refers to the airside area beyond the airport security checkpoint.
In a specific embodiment, feature data such as the face, body, voice, identity card, household register, passport, air ticket and transaction order of each passenger are recorded in advance at the airport security checkpoint by an acquisition device such as a mobile phone, computer, wearable device, security device or intelligent navigation display terminal, and the recorded feature data such as the face and body are synchronized directly to a processing device (for example, to the intelligent navigation display data processing platform). If a passenger without a boarding pass (i.e., the target passenger) needs to know travel-related information such as his or her flight and gate, the target passenger can perform body recognition at an intelligent navigation display terminal: the terminal captures the target passenger's body at the current moment and sends it to the intelligent navigation display data processing platform, which compares it with the bodies of all passengers synchronized in advance; travel-related information such as the flight, gate, boarding path guide map and destination weather matched to the target passenger is then displayed or played on the intelligent navigation display terminal according to the comparison result.
In the embodiment of the application, a passenger can perform body recognition directly through the smart navigation display terminal: the terminal collects the passenger's body data during recognition, the smart navigation display data processing platform compares the collected data, and according to the comparison result, travel-related information such as the flight, gate, boarding path guide map, and destination weather matched for the target passenger is displayed or played on the terminal. A passenger without a boarding pass can thus obtain travel-related information such as the flight, gate, boarding path guide map, and destination weather simply through body recognition at the smart navigation display terminal.
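The comparison between a body feature collected at the current moment and the features synchronized in advance can be sketched as a nearest-neighbor search over feature vectors. The cosine-similarity metric and the 0.8 threshold below are illustrative assumptions; the embodiment does not specify the feature representation or the matching algorithm.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_passenger(query, enrolled, threshold=0.8):
    """Return the id of the enrolled passenger whose feature vector best
    matches `query`, or None if no similarity exceeds the threshold."""
    best_id, best_score = None, threshold
    for passenger_id, feature in enrolled.items():
        score = cosine_similarity(query, feature)
        if score > best_score:
            best_id, best_score = passenger_id, score
    return best_id

# Hypothetical pre-synchronized features keyed by passenger id:
enrolled = {"P001": [0.9, 0.1, 0.0], "P002": [0.0, 0.2, 0.9]}
print(match_passenger([0.8, 0.2, 0.1], enrolled))  # P001
```

The matched passenger id would then index into the travel-related information (flight, gate, and so on) to be displayed.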
In another embodiment, feature data such as the face, body, voice, identity card, household registration book, passport, air ticket, and transaction order of each passenger is pre-recorded in the airport isolation area through a collection device such as a mobile phone, a computer, a wearable device, a security device, or a smart navigation display terminal, and the recorded feature data such as the identity card and transaction order is directly synchronized into a processing device (for example, into a smart navigation display data processing platform). If a passenger (namely, a target passenger) who does not have a boarding pass needs to know travel-related information such as the flight and gate, the target passenger can enter a passport and transaction order through the smart navigation display terminal: the terminal collects the passport and transaction order entered by the target passenger at the current moment and sends them to the smart navigation display data processing platform, which compares them with the passports and transaction orders of all passengers synchronized in advance; according to the comparison result, travel-related information such as the flight, gate, boarding path guide map, and destination weather is matched for the target passenger and displayed or played on the smart navigation display terminal.
In the embodiment of the application, a passenger can enter a passport and transaction order directly through the smart navigation display terminal: the terminal collects the entered passport and transaction order, the smart navigation display data processing platform compares them, and according to the comparison result, travel-related information such as the flight, gate, boarding path guide map, and destination weather matched for the target passenger is displayed or played on the terminal. A passenger without a boarding pass can thus obtain such travel-related information by entering his or her passport and transaction order at the smart navigation display terminal.
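Unlike the body-recognition case, matching by passport and transaction order amounts to an exact lookup against the pre-synchronized records. A minimal sketch, assuming the processing platform keys its records by the (passport number, order number) pair; the record fields and sample values are illustrative:

```python
def lookup_travel_info(passport_no, order_no, synced_records):
    """Exact-match a passport/transaction-order pair against records
    synchronized in advance; return the travel-related information
    for that passenger, or None if there is no match."""
    return synced_records.get((passport_no, order_no))

# Hypothetical records synchronized from the isolation-area enrollment:
synced_records = {
    ("E12345678", "ORD-001"): {"flight": "CA1234", "gate": "B12",
                               "destination_weather": "sunny"},
}
print(lookup_travel_info("E12345678", "ORD-001", synced_records)["gate"])  # B12
```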
As shown in fig. 3, the present invention further provides a target object recognition processing system, including:
one or more acquisition devices M40 for acquiring one or more characteristic data of the target object at the current time;
one or more processing devices M50, configured to compare the feature data of the target object acquired at the current moment with the feature data of the target object acquired in advance, and match one or more pieces of corresponding travel-related information to the target object according to the comparison result;
one or more output devices M60 for displaying or playing the one or more pieces of travel-related information.
The system first acquires one or more pieces of feature data of a target object at the current moment through the acquisition device M40; the processing device M50 then compares the feature data acquired at the current moment with the feature data acquired in advance, and matches one or more pieces of corresponding travel-related information to the target object according to the comparison result; finally, the output device M60 displays or plays the travel-related information. The system can be applied to airports so that passengers without boarding passes can conveniently and intuitively view information such as their name, flight, gate, and boarding path. In particular, feature data such as the face, body, voice, identity card, household registration book, passport, air ticket, and transaction order of a target passenger at the current moment can be acquired at places such as the airport security checkpoint and the airport isolation area and compared with feature data acquired in advance at those places; travel-related information such as the name, flight, gate, boarding path guide map, destination weather, and personalized recommendation information is matched for the target passenger according to the comparison result, and all or part of the matched travel-related information is then displayed or played.
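The three-stage flow of the system (acquire, then compare and match, then display or play) can be sketched as follows. The class and method names are hypothetical; they merely mirror the roles of the acquisition device M40, the processing device M50, and the output device M60, and the equality comparison stands in for a real feature-matching algorithm.

```python
class ProcessingDevice:
    """Role of M50: hold pre-acquired features and match travel info."""
    def __init__(self, enrolled_features, travel_info):
        self.enrolled_features = enrolled_features  # passenger id -> feature
        self.travel_info = travel_info              # passenger id -> info dict

    def match(self, feature):
        for pid, enrolled in self.enrolled_features.items():
            if enrolled == feature:  # stand-in for a real comparison
                return self.travel_info.get(pid)
        return None

class OutputDevice:
    """Role of M60: display or play the matched information."""
    def show(self, info):
        if info is None:
            return "no match"
        return f"Flight {info['flight']}, gate {info['gate']}"

processing = ProcessingDevice({"P001": "feat-A"},
                              {"P001": {"flight": "CA1234", "gate": "B12"}})
output = OutputDevice()
current_feature = "feat-A"  # role of M40: feature acquired at the current moment
print(output.show(processing.match(current_feature)))  # Flight CA1234, gate B12
```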
In an exemplary embodiment, the acquisition device M40, the processing device M50, and the output device M60 are integrated in the same device or in different devices. For example, in the embodiment of the present application, the acquisition device M40, the processing device M50, and the output device M60 may all be integrated in one smart navigation display device, such as a smart navigation display terminal.
In an exemplary embodiment, the feature data includes at least one of: biometric data, physical feature data. Biometric data is feature data unique to a living body, which can be used to distinguish it from other similar living bodies; examples include the face, body, and voice. Physical feature data is other feature data that originates from the external environment and does not belong to biometric data; examples include the identity card, household registration book, passport, air ticket, and transaction order.
In some exemplary embodiments, the travel-related information comprises at least one of: name, flight, gate, boarding path guide map, destination weather, and personalized recommendation information. Before the travel-related information is displayed, data desensitization is performed on it; in particular, when personal identity information is displayed, only the desensitized name is shown. For example, a name such as Wang Xiaoming may be displayed as Wang *ming after data desensitization. The personalized recommendation information includes airport public information (such as airport bookstores, airport toilets, airport shops, and the like) and commercial advertisement information (such as tourist attractions of the destination, cuisine of the destination, and the like), and the personalized recommendation information may differ from one object to another.
In some exemplary embodiments, the acquisition device may be further configured to acquire feature data of the target object in one or more target areas in advance and synchronize the acquired feature data into the one or more processing devices. The target area includes at least one of: an airport security checkpoint, an airport isolation area. Here, the airport isolation area refers to the area past the airport security checkpoint.
In some exemplary embodiments, the acquisition device may include at least one of: a mobile phone, a computer, a wearable device, a security device, a smart navigation display terminal. The processing device may include at least one of: a mobile phone, a computer, a remote server, a smart navigation display data processing platform, a smart navigation display terminal. The output device may include at least one of: a mobile phone, a computer, a smart navigation display terminal.
As shown in fig. 4, in a specific embodiment, feature data such as the face, body, and voice of each passenger is pre-recorded at the airport security checkpoint by an acquisition device (e.g., the first monitoring camera 40 in the security equipment), and the recorded face, body, and voice data is synchronized to a processing device (e.g., the smart navigation display data processing platform 42) through the airport platform and/or airline platform 41. If a passenger (namely, a target passenger) who does not have a boarding pass needs to know travel-related information such as the flight and gate, the target passenger can perform face, body, and voice recognition through an acquisition device (such as the smart navigation display terminal 43): the terminal 43 collects the target passenger's face, body, and voice at the current moment and sends them to the smart navigation display data processing platform 42, which compares them with the faces, bodies, and voices of all passengers synchronized in advance; according to the comparison result, travel-related information such as the flight, gate, boarding path guide map, and destination weather matched for the target passenger is displayed or played on the smart navigation display terminal 43.
In this embodiment of the application, a passenger can perform face, body, and voice recognition directly through the smart navigation display terminal 43: the terminal 43 collects the passenger's face, body, and voice at the current moment during recognition, the smart navigation display data processing platform 42 compares the collected data, and according to the comparison result, travel-related information such as the flight, gate, boarding path guide map, and destination weather matched for the target passenger is displayed or played on the terminal 43. A passenger without a boarding pass can thus obtain such travel-related information through face recognition at the smart navigation display terminal 43. When matching travel-related information, airport public information (e.g., airport bookstores, airport shops, airport toilets) and personalized recommendation information (e.g., tourist attractions and cuisine of the destination) may also be matched for the passenger and displayed.
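The destination-dependent part of the personalized recommendation can be sketched as a simple lookup that combines always-relevant airport public information with commercial information selected by the passenger's destination. The tables below are illustrative placeholders, not data from the embodiment.

```python
def personalized_recommendations(destination, public_info, ads_by_destination):
    """Airport public information applies to every passenger; commercial
    information is selected by the passenger's destination, so the combined
    result differs from one passenger to another."""
    return public_info + ads_by_destination.get(destination, [])

public_info = ["airport bookstore", "airport toilets"]
ads = {"Chengdu": ["panda base", "hotpot restaurants"]}
print(personalized_recommendations("Chengdu", public_info, ads))
```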
As shown in fig. 5, in one embodiment, feature data such as the identity card and transaction order of each passenger is pre-entered at the airport security checkpoint through an acquisition device (e.g., the first computer 50), and the entered identity card and transaction order data is directly synchronized into a processing device (e.g., the first remote server 51). If a passenger without a boarding pass needs to know travel-related information such as the flight and gate, the target passenger can enter an identity card and transaction order through an acquisition device (such as the second computer 52): the second computer 52 collects the identity card and transaction order entered by the target passenger at the current moment and sends them to the first remote server 51, which compares them with the identity cards and transaction orders of all passengers synchronized in advance; travel-related information such as the flight, gate, boarding path guide map, and destination weather is matched for the target passenger according to the comparison result and displayed or played on the second computer 52.
In this embodiment of the application, a passenger can enter an identity card and transaction order directly through the second computer 52: the second computer 52 collects the entered data at the current moment, the first remote server 51 compares it, and according to the comparison result, travel-related information such as the flight, gate, boarding path guide map, and destination weather matched for the target passenger is displayed or played on the second computer 52. A passenger without a boarding pass can thus obtain such travel-related information by entering his or her identity card and transaction order at the second computer 52. The first computer 50 and the second computer 52 are located at different positions in the airport; for example, the first computer 50 is located at the security checkpoint and the second computer 52 in the airport isolation area. When matching travel-related information, airport public information (e.g., airport bookstores, airport shops, airport toilets) and personalized recommendation information (e.g., tourist attractions and cuisine of the destination) may also be matched for the passenger and displayed.
As shown in fig. 6, in a specific embodiment, feature data such as the face, body, and voice of each passenger is pre-recorded in the airport isolation area through an acquisition device (e.g., the second monitoring camera 60 in the security equipment), and the recorded face, body, and voice data is directly synchronized into a processing device (e.g., the second remote server 61). If a passenger (namely, a target passenger) who does not have a boarding pass needs to know travel-related information such as the flight and gate, the target passenger can perform face, body, and voice recognition through an acquisition device (such as the mobile phone 62): the mobile phone 62 collects the target passenger's face, body, and voice at the current moment and sends them to the second remote server 61, which compares them with the faces, bodies, and voices of all passengers synchronized in advance; according to the comparison result, travel-related information such as the flight, gate, boarding path guide map, and destination weather matched for the target passenger is displayed or played on the mobile phone 62.
In the embodiment of the application, a passenger can perform face, body, and voice recognition directly through the mobile phone 62: the mobile phone 62 collects the passenger's face, body, and voice during recognition, the second remote server 61 compares the collected data, and according to the comparison result, travel-related information such as the flight, gate, boarding path guide map, and destination weather matched for the target passenger is displayed or played on the mobile phone 62. A passenger without a boarding pass can thus obtain such travel-related information through face recognition on the mobile phone 62. In this embodiment, airport public information (e.g., airport bookstores, airport shops, airport toilets) and personalized recommendation information (e.g., tourist attractions and cuisine of the destination) may also be matched for the passenger and displayed.
An embodiment of the present application further provides a target object identification processing apparatus configured to:
acquire one or more pieces of feature data of a target object at the current moment;
compare the feature data of the target object acquired at the current moment with the feature data of the target object acquired in advance, and match one or more pieces of corresponding travel-related information to the target object according to the comparison result;
and display or play the one or more pieces of travel-related information.
In this embodiment, the target object identification processing apparatus implements the method or cooperates with the system described above; for its specific functions and technical effects, reference may be made to the foregoing embodiments, which are not repeated here.
An embodiment of the present application further provides an apparatus, which may include: one or more processors; and one or more machine readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the method of fig. 1. In practical applications, the device may be used as a terminal device, and may also be used as a server, where examples of the terminal device may include: the mobile terminal includes a smart phone, a tablet computer, an electronic book reader, an MP3 (Moving Picture Experts Group Audio Layer III) player, an MP4 (Moving Picture Experts Group Audio Layer IV) player, a laptop, a vehicle-mounted computer, a desktop computer, a set-top box, an intelligent television, a wearable device, and the like.
Embodiments of the present application also provide a non-transitory readable storage medium, where one or more modules (programs) are stored in the storage medium, and when the one or more modules are applied to a device, the device may execute instructions (instructions) included in the method in fig. 1 according to the embodiments of the present application.
Fig. 7 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application. As shown, the terminal device may include: an input device 1100, a first processor 1101, an output device 1102, a first memory 1103, and at least one communication bus 1104. The communication bus 1104 is used to implement communication connections between the elements. The first memory 1103 may include a high-speed RAM memory, and may also include a non-volatile storage NVM, such as at least one disk memory, and the first memory 1103 may store various programs for performing various processing functions and implementing the method steps of the present embodiment.
Alternatively, the first processor 1101 may be, for example, a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a controller, a microcontroller, a microprocessor, or other electronic components, and the first processor 1101 is coupled to the input device 1100 and the output device 1102 through a wired or wireless connection.
Optionally, the input device 1100 may include a variety of input devices, such as at least one of a user-oriented user interface, a device-oriented device interface, a software programmable interface, a camera, and a sensor. Optionally, the device interface facing the device may be a wired interface for data transmission between devices, or may be a hardware plug-in interface (e.g., a USB interface, a serial port, etc.) for data transmission between devices; optionally, the user-facing user interface may be, for example, a user-facing control key, a voice input device for receiving voice input, and a touch sensing device (e.g., a touch screen with a touch sensing function, a touch pad, etc.) for receiving user touch input; optionally, the programmable interface of the software may be, for example, an entry for a user to edit or modify a program, such as an input pin interface or an input interface of a chip; the output devices 1102 may include output devices such as a display, audio, and the like.
In this embodiment, the processor of the terminal device includes functions for executing each module of the target object identification processing apparatus in each device; for specific functions and technical effects, reference may be made to the foregoing embodiments, which are not repeated here.
Fig. 8 is a schematic hardware structure diagram of a terminal device according to an embodiment of the present application. FIG. 8 is a specific embodiment of FIG. 7 in an implementation. As shown, the terminal device of the present embodiment may include a second processor 1201 and a second memory 1202.
The second processor 1201 executes the computer program code stored in the second memory 1202 to implement the method described in fig. 1 in the above embodiment.
The second memory 1202 is adapted to store various types of data to support operation at the terminal device. Examples of such data include instructions for any application or method operating on the terminal device, such as messages, pictures, videos, and so forth. The second memory 1202 may include a Random Access Memory (RAM) and may also include a non-volatile memory (non-volatile memory), such as at least one disk memory.
Optionally, a second processor 1201 is provided in the processing assembly 1200. The terminal device may further include: communication component 1203, power component 1204, multimedia component 1205, speech component 1206, input/output interfaces 1207, and/or sensor component 1208. The specific components included in the terminal device are set according to actual requirements, which is not limited in this embodiment.
The processing component 1200 generally controls the overall operation of the terminal device. The processing assembly 1200 may include one or more second processors 1201 to execute instructions to perform all or part of the steps of the data processing method described above. Further, the processing component 1200 can include one or more modules that facilitate interaction between the processing component 1200 and other components. For example, the processing component 1200 can include a multimedia module to facilitate interaction between the multimedia component 1205 and the processing component 1200.
The power supply component 1204 provides power to the various components of the terminal device. The power components 1204 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for the terminal device.
The multimedia components 1205 include a display screen that provides an output interface between the terminal device and the user. In some embodiments, the display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the display screen includes a touch panel, the display screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
The speech component 1206 is configured to output and/or input speech signals. For example, the speech component 1206 includes a microphone (MIC) configured to receive external speech signals when the terminal device is in an operational mode, such as a speech recognition mode. The received speech signal may further be stored in the second memory 1202 or transmitted via the communication component 1203. In some embodiments, the speech component 1206 further comprises a speaker for outputting speech signals.
The input/output interface 1207 provides an interface between the processing component 1200 and peripheral interface modules, which may be click wheels, buttons, etc. These buttons may include, but are not limited to: a volume button, a start button, and a lock button.
The sensor component 1208 includes one or more sensors for providing various aspects of status assessment for the terminal device. For example, the sensor component 1208 may detect an open/closed state of the terminal device, relative positioning of the components, presence or absence of user contact with the terminal device. The sensor assembly 1208 may include a proximity sensor adapted to detect the presence of nearby objects without any physical contact, including detecting the distance between the user and the terminal device. In some embodiments, the sensor assembly 1208 may also include a camera or the like.
The communication component 1203 is adapted to facilitate communication between the terminal device and other devices in a wired or wireless manner. The terminal device may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In one embodiment, the terminal device may include a SIM card slot therein for inserting a SIM card therein, so that the terminal device may log onto a GPRS network to establish communication with the server via the internet.
As can be seen from the above, the communication component 1203, the voice component 1206, the input/output interface 1207 and the sensor component 1208 involved in the embodiment of fig. 8 can be implemented as the input device in the embodiment of fig. 7.
The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit the invention. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those skilled in the art without departing from the spirit and technical idea of the present invention shall be covered by the claims of the present invention.

Claims (31)

1. A target object identification processing method is characterized by comprising the following steps:
acquiring one or more characteristic data of a target object at the current moment;
comparing the characteristic data of the target object acquired at the current moment with the characteristic data of the target object acquired in advance, and matching one or more pieces of corresponding travel-related information to the target object according to the comparison result;
and displaying or playing the one or more pieces of travel-related information.
2. The target object identification processing method according to claim 1, further comprising acquiring feature data of the target object in one or more target areas in advance.
3. The target object recognition processing method according to claim 1 or 2, wherein the feature data includes at least one of: biometric data, physical characteristic data.
4. The target object recognition processing method of claim 3, wherein the biometric data includes at least one of: face, body, voice.
5. The target object recognition processing method of claim 3, wherein the physical characteristic data comprises at least one of: identity card, household registration book, passport, air ticket, transaction order.
6. The target object identification processing method according to claim 1, wherein the travel-related information includes at least one of: name, flight, gate, boarding path guide map, destination weather, and personalized recommendation information.
7. The target object identification processing method of claim 6, further comprising performing data desensitization on the travel-related information before displaying or playing the travel-related information.
8. The target object recognition processing method of claim 2, wherein the target area comprises at least one of: airport security checkpoint, airport isolation area.
9. A target object recognition processing device is characterized by comprising:
the acquisition module is used for acquiring one or more characteristic data of the target object at the current moment;
the comparison module is used for comparing the feature data of the target object acquired at the current moment with the feature data of the target object acquired in advance, and matching one or more pieces of corresponding travel-related information to the target object according to the comparison result;
and the output module is used for displaying or playing the one or more pieces of travel-related information.
10. The apparatus according to claim 9, further comprising acquiring feature data of the target object in one or more target areas in advance.
11. The apparatus according to claim 9 or 10, wherein the feature data includes at least one of: biometric data, physical characteristic data.
12. The target object recognition processing device of claim 11, wherein the biometric data comprises at least one of: face, body, voice.
13. The target object recognition processing apparatus of claim 11, wherein the physical characteristic data includes at least one of: identity card, household registration book, passport, air ticket, transaction order.
14. The target object recognition processing device of claim 9, wherein the travel-related information includes at least one of: name, flight, gate, boarding path guide map, destination weather, and personalized recommendation information.
15. The target object recognition processing device of claim 14, further comprising performing data desensitization on the travel-related information before displaying or playing the travel-related information.
16. The apparatus according to claim 10, wherein the target area includes at least one of: airport security checkpoint, airport isolation area.
17. A target object recognition processing system, comprising:
one or more acquisition devices for acquiring one or more characteristic data of a target object at the current moment;
one or more processing devices for comparing the characteristic data of a target object acquired at the current moment with the characteristic data of the target object acquired in advance, and matching one or more pieces of corresponding travel-related information to the target object according to the comparison result;
one or more output devices for displaying or playing the one or more pieces of travel-related information.
18. The system of claim 17, wherein the acquisition device, the processing device, and the output device are integrated in the same device or different devices, respectively.
19. The target object recognition processing system of claim 17, wherein the acquisition device comprises at least one of: mobile phone, computer, wearable device, security system, smart flight information display terminal.
20. The system of claim 17 or 19, wherein the acquisition device is further configured to acquire the characteristic data of the target object in one or more target areas in advance, and to synchronize the acquired characteristic data to the one or more processing devices.
21. The target object recognition processing system of claim 17, wherein the processing device comprises at least one of: mobile phone, computer, remote server, smart flight information display data processing platform, smart flight information display terminal.
22. The target object recognition processing system of claim 17, wherein the output device comprises at least one of: mobile phone, computer, smart flight information display terminal.
23. The target object recognition processing system of claim 17, wherein the characteristic data includes at least one of: biometric data, physical characteristic data.
24. The target object recognition processing system of claim 23, wherein the biometric data includes at least one of: face, body, voice.
25. The target object recognition processing system of claim 23, wherein the physical characteristic data includes at least one of: identity card, household registration booklet, passport, air ticket, transaction order.
26. The target object recognition processing system of claim 17, wherein the travel-related information includes at least one of: name, flight, gate, boarding path guide map, destination weather, and personalized recommendation information.
27. The target object recognition processing system of claim 26, further comprising performing data desensitization on the travel-related information before displaying or playing the travel-related information.
28. The target object recognition processing system of claim 20, wherein the target area includes at least one of: airport security checkpoint, airport restricted area.
29. A target object recognition processing apparatus, characterized by being configured to perform:
acquiring one or more pieces of characteristic data of a target object at the current moment;
comparing the characteristic data of the target object acquired at the current moment with the characteristic data of the target object acquired in advance, and matching one or more pieces of corresponding travel-related information to the target object according to the comparison result;
and displaying or playing the one or more pieces of travel-related information.
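As a minimal sketch of the acquire–compare–match flow recited in claim 29, the snippet below matches a currently acquired feature vector against pre-acquired ones by cosine similarity and looks up the corresponding travel-related information. The gallery contents, feature dimensionality, and the 0.8 threshold are illustrative assumptions, not part of the claims.

```python
import math

# Hypothetical pre-acquired feature vectors (e.g. captured at an airport
# security checkpoint), keyed by passenger ID. All values are illustrative.
GALLERY = {
    "P001": [0.1, 0.9, 0.2],
    "P002": [0.8, 0.1, 0.5],
}

# Hypothetical travel-related information associated with each passenger.
TRAVEL_INFO = {
    "P001": {"name": "Passenger One", "flight": "CA1234", "gate": "B12"},
    "P002": {"name": "Passenger Two", "flight": "MU5678", "gate": "C03"},
}

def cosine_similarity(a, b):
    # Standard cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def match_travel_info(current_feature, threshold=0.8):
    """Compare the currently acquired feature with the pre-acquired ones and
    return the best-matching passenger's travel-related information, or None
    when no similarity reaches the (assumed) threshold."""
    best_id, best_score = None, threshold
    for pid, feature in GALLERY.items():
        score = cosine_similarity(current_feature, feature)
        if score >= best_score:
            best_id, best_score = pid, score
    return TRAVEL_INFO.get(best_id)
```

A vector close to a gallery entry, such as `[0.12, 0.88, 0.21]`, resolves to P001's record, while a dissimilar vector yields None and no travel-related information is displayed.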
30. An apparatus, comprising:
one or more processors; and
one or more machine-readable media having instructions stored thereon that, when executed by the one or more processors, cause the apparatus to perform the method recited in one or more of claims 1 to 8.
31. One or more machine-readable media having instructions stored thereon that, when executed by one or more processors, cause an apparatus to perform the method recited in one or more of claims 1 to 8.
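Claims 15 and 27 recite data desensitization of the travel-related information before it is displayed or played. Below is a minimal masking sketch; the specific rules (keep the first character of the name and the airline prefix of the flight number) are illustrative assumptions, since the claims do not fix a desensitization scheme.

```python
def desensitize(travel_info):
    """Mask personally identifiable fields of the travel-related information
    before it is displayed or played. The masking rules are illustrative
    assumptions; a deployment would follow the operator's privacy policy."""
    masked = dict(travel_info)
    name = masked.get("name", "")
    if len(name) > 1:
        # Keep only the first character of the name.
        masked["name"] = name[0] + "*" * (len(name) - 1)
    flight = masked.get("flight", "")
    if len(flight) > 2:
        # Keep only the two-letter airline prefix of the flight number.
        masked["flight"] = flight[:2] + "*" * (len(flight) - 2)
    return masked
```

For example, `desensitize({"name": "Alice", "flight": "CA1234", "gate": "B12"})` yields a record whose name reads "A****" and flight "CA****", while non-sensitive fields such as the gate pass through unchanged.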
CN202010315785.7A 2020-04-21 2020-04-21 Target object identification processing method, system, equipment and medium Pending CN111539018A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010315785.7A CN111539018A (en) 2020-04-21 2020-04-21 Target object identification processing method, system, equipment and medium


Publications (1)

Publication Number Publication Date
CN111539018A true CN111539018A (en) 2020-08-14

Family

ID=71973063

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010315785.7A Pending CN111539018A (en) 2020-04-21 2020-04-21 Target object identification processing method, system, equipment and medium

Country Status (1)

Country Link
CN (1) CN111539018A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112396008A (en) * 2020-11-24 2021-02-23 国家电网有限公司客户服务中心 Sensitive information desensitization method and device, storage medium and electronic equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101661585A (en) * 2009-09-22 2010-03-03 王舰 Paperless boarding system based on two-dimension code technology and biometric identification technology
CN104463761A (en) * 2014-11-17 2015-03-25 无锡知谷网络科技有限公司 Method for providing mobile self-service in air port and system




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200814