CN110992397A - Personnel entrance and exit trajectory tracking method and system, computer equipment and storage medium - Google Patents
- Publication number
- CN110992397A (application CN201910998596.1A)
- Authority
- CN
- China
- Prior art keywords
- person
- shoe print
- time
- shoe
- attribute
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30241—Trajectory
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Image Analysis (AREA)
Abstract
The application relates to a method, a system, a computer device and a storage medium for tracking personnel entrance and exit trajectories. The method comprises the following steps: acquiring a first shoe print, acquiring the first time at which the first shoe print was collected, and acquiring a first person image at the first time; acquiring a first attribute through the first shoe print, acquiring a second attribute according to the first person image, and associating the first person image with the first shoe print according to the first attribute and the second attribute; after the association succeeds, obtaining a first event performed by the first person at the first time. Because the shoe print information and the person image information collected in the same time and space are associated to the same person according to the person attribute derived from the shoe print and the person attribute derived from the person image, the event performed by the person at the entrance or exit is judged more accurately, a more accurate personnel entrance and exit trajectory is obtained, and accurate passenger flow statistics and analysis are achieved.
Description
Technical Field
The application relates to the technical field of entrance and exit people counting, and in particular to a personnel entrance and exit trajectory tracking method, a personnel entrance and exit trajectory tracking system, computer equipment, and a storage medium.
Background
Many public environments require accurate passenger flow statistics. For example, on public transport or in elevators, big data analysis relies on figures such as the number of passengers boarding and alighting at each stop or floor to obtain accurate passenger flow data. Entrances and exits of public transport vehicles share several characteristics: passing persons carry no identity mark; the number of passing persons is large and crowding occurs; and once a passenger has boarded and paid by swiping a card, inserting coins, or electronic payment, there is no way to know at which stop the passenger alights, so accurate passenger flow analysis is impossible. In the related art, when person images are collected by a camera for passenger flow analysis, there are many scenes in which a complete facial image cannot be captured: a camera installed inside a vehicle cannot collect face images because of crowding or because persons face outward; some vehicles have no fixed stopping points, so face images cannot be collected by a station camera; and at the entrance of a shopping mall or office building, when the flow of people is large, the camera cannot accurately capture every person entering and exiting, making the passenger flow statistics inaccurate.
No effective solution has yet been proposed for the problem of inaccurate passenger flow statistics in the related art.
Disclosure of Invention
In view of the above, it is necessary to provide a method, a system, a computer device and a storage medium for tracking a person entering and exiting trajectory.
To achieve the above object, according to one aspect of the present invention, there is provided a method for tracking a person entry and exit trajectory, the method including:
acquiring a first shoe print, acquiring a first time of the first shoe print, and acquiring a first person image at the first time;
obtaining a first attribute through the first shoe print, obtaining a second attribute according to the first person image, and associating the first person image with the first shoe print according to the first attribute and the second attribute;
and after the association is successful, obtaining a first event executed by a first person at the first time, wherein the first person corresponds to the first person image.
In one embodiment, said obtaining a first attribute from said first shoe print, obtaining a second attribute from said first person image, and associating said first person image with said first shoe print based on said first attribute and said second attribute comprises:
acquiring a first attribute through the first shoe print, and acquiring a second attribute through the first person image;
the first attribute includes: a first spatial distribution characteristic, a first gender and a first estimated height;
the second attribute includes: a second spatial distribution characteristic, a second gender and a second estimated height;
associating the first shoe print and the first person image to a first person in a case where the first spatial distribution characteristic matches the second spatial distribution characteristic, the first gender matches the second gender, and the first estimated height matches the second estimated height.
In one embodiment, obtaining a first event performed by a first person at the first time after the association is successful comprises:
acquiring a first face direction of the first person in the first person image, and acquiring a first shoe print direction of the first shoe print;
and under the condition that the first face direction and the first shoe print direction both face a first direction, judging that the first person executes a first event at the first time.
In one embodiment, after the association is successful and after a first event performed by a first person at the first time is obtained, the method includes:
acquiring a second shoe print and second time when the second shoe print is acquired, and acquiring a second person image of the first person at the second time under the condition that the second shoe print is consistent with the first shoe print;
acquiring a second face direction of the first person and a second shoe print direction of the second shoe print in the second person image;
and under the condition that the second face direction and the second shoe print direction face a second direction, judging that the first person executes a second event at the second time.
In one embodiment, after the association is successful and after a first event performed by a first person at the first time is obtained, the method includes:
acquiring a second shoe print and the second time at which the second shoe print was collected; in a case where the second shoe print is consistent with the first shoe print, recording the second time as the last collection time; and in a case where no shoe print consistent with the first shoe print is collected within a preset time threshold after the second time, determining that the first person performed a second event at the second time.
According to another aspect of the present invention, there is provided a method for tracking a person entry and exit trajectory, the method comprising:
collecting a first shoe print at an entrance and an exit of a vehicle and a first time when the first shoe print is collected, and acquiring a first person image at the first time;
obtaining a first attribute through the first shoe print, obtaining a second attribute according to the first person image, and associating the first person image with the first shoe print according to the first attribute and the second attribute;
and after the association is successful, acquiring a first site according to the first time, and obtaining that a first person executes a first event at the first site at the first time, wherein the first person corresponds to the first person image.
In one embodiment, said obtaining a first attribute from said first shoe print, obtaining a second attribute from said first person image, and associating said first person image with said first shoe print based on said first attribute and said second attribute comprises:
acquiring a first attribute through the first shoe print, and acquiring a second attribute through the first person image;
the first attribute includes: a first spatial distribution characteristic, a first gender and a first estimated height;
the second attribute includes: a second spatial distribution characteristic, a second gender and a second estimated height;
associating the shoe print and the person image to the same person in a case where the first spatial distribution characteristic is consistent with the second spatial distribution characteristic, the first gender is consistent with the second gender, and the first estimated height is consistent with the second estimated height.
In one embodiment, after the association is successful, acquiring a first site according to the first time, and obtaining a first event executed by a first person at the first site at the first time includes:
acquiring a first face direction of the first person in the first person image, and acquiring a first shoe print direction of the first shoe print;
and under the condition that the first face direction and the first shoe print direction both face towards the interior of the carriage of the public transport vehicle, judging that the first person boards at a first station at the first time.
In one embodiment, after the association is successful, the first site is acquired according to the first time, and the first event performed by the first person at the first site at the first time is obtained, the method includes:
acquiring a second shoe print and second time when the second shoe print is acquired, and acquiring a second person image of the first person at the second time under the condition that the second shoe print is consistent with the first shoe print;
acquiring a second face direction of the first person and a second shoe print direction of the second shoe print in the second person image;
and under the condition that the second face direction and the second shoe print direction both face the outside of the carriage of the public transport vehicle, acquiring a second station according to the second time, and judging that the first person alights at the second station at the second time.
In one embodiment, after the association is successful, the first site is acquired according to the first time, and the first event performed by the first person at the first site at the first time is obtained, the method includes:
acquiring a second shoe print and the second time at which the second shoe print was collected; in a case where the second shoe print is consistent with the first shoe print, recording the second time as the last collection time; and in a case where no shoe print consistent with the first shoe print is collected within a preset time threshold after the second time, acquiring a second station according to the second time and determining that the first person performed a second event at the second station at the second time.
In one embodiment, the capturing the first shoe print and the obtaining the second shoe print at the vehicle entrance comprises:
collecting the first shoe print or acquiring the second shoe print at the entrance and the exit of the vehicle when the door of the vehicle is opened;
in a door-closed state of the vehicle, the first shoe print at the vehicle entrance is not captured and the second shoe print is not acquired.
In one embodiment, the collecting the first shoe print at the vehicle entrance comprises:
collecting the first shoe print at the entrance and the exit of the vehicle when the door of the vehicle is in an open state;
and under the state that the vehicle door of the vehicle is closed, the first shoe print at the vehicle entrance is not collected.
According to another aspect of the present invention, there is provided a personnel entrance and exit trajectory tracking system, the system comprising a shoe print collection device, an image collection device and an analysis processor, wherein:
the shoe print acquisition equipment acquires a first shoe print and first time for acquiring the first shoe print, and sends the first shoe print to the analysis processor;
the image acquisition equipment acquires a first person image at the first time and sends the first person image to the analysis processor;
the analysis processor acquires a first attribute through the first shoe print, acquires a second attribute according to the first person image, and associates the first person image with the first shoe print according to the first attribute and the second attribute; after the association is successful, the analysis processor acquires a first position at the first time, and obtains a first event executed by the first person at the first position, wherein the first person corresponds to the first person image.
According to another aspect of the present invention, there is also provided a computer device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the above-mentioned person entry and exit trajectory tracking method when executing the computer program.
According to another aspect of the present invention, there is also provided a computer readable storage medium, on which a computer program is stored, which when executed by a processor, implements the steps of a person entry and exit trajectory tracking method as described above.
According to the method, system, computer device and storage medium for tracking personnel entrance and exit trajectories, the person image and the shoe print of the same person are associated according to the shoe print information and the person image information collected in the same time and space, using the person attribute derived from the shoe print information and the person attribute derived from the person image, so that the event performed by the person at the entrance or exit is judged more accurately, a more accurate personnel entrance and exit trajectory is obtained, and accurate passenger flow statistics and analysis are achieved.
Drawings
Fig. 1 is a view illustrating an application scenario of a method for tracking a person's entry and exit trajectory according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method for tracking a person's entry and exit trajectory according to an embodiment of the present invention;
fig. 3 is a diagram illustrating an application scenario of a method for tracking a person's entry and exit trajectory according to another embodiment of the present invention;
fig. 4 is a first flowchart of a method for tracking a person's entry and exit trajectory according to another embodiment of the present invention;
FIG. 5 is a flowchart illustrating a second method for tracking a person's entry and exit trajectory according to another embodiment of the present invention;
fig. 6 is a third flowchart of a method for tracking a person's entry and exit trajectory according to another embodiment of the present invention;
fig. 7 is a fourth flowchart illustrating a method for tracking a person's entry and exit trajectory according to another embodiment of the present invention;
fig. 8 is a first flowchart of a method for tracking a person's entry and exit trajectory according to another embodiment of the present invention;
FIG. 9 is a schematic view of shoe prints collected by a tracking method for a person's entry and exit trajectory according to another embodiment of the present invention;
FIG. 10 is a flowchart illustrating a second method for tracking a person's entry and exit trajectory according to another embodiment of the present invention;
fig. 11 is a schematic diagram of a person entry and exit trajectory tracking system in accordance with an embodiment of the present invention;
fig. 12 is a schematic diagram of a person entry and exit trajectory tracking system according to another embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Fig. 1 is a view of an application scenario of a method for tracking a person entry and exit trajectory according to an embodiment of the present invention, as shown in fig. 1, wherein a shoe print collecting device 12 and an image collecting device 14 may be connected to an analysis processor through a network. The shoe print collecting device 12 is positioned at the entrance A and can collect shoe prints of persons entering and exiting the entrance A, and the image collecting device 14 is positioned near the entrance A and can collect images of persons entering and exiting the entrance A.
In an embodiment, fig. 2 is a flowchart of a method for tracking a person entry and exit trajectory according to an embodiment of the present invention. As shown in fig. 2, a method for tracking a person entry and exit trajectory is provided, exemplified by applying the method to the shoe print collecting device 12 and the image collecting device 14 in fig. 1, and includes the following steps:
step S210, acquiring a first shoe print, acquiring the first time of the first shoe print, and acquiring a first person image at the first time;
In step S210, the shoe print collecting device 12 collects a first shoe print and records its timestamp, obtains the first time of collection from the timestamp, and obtains the first person image captured by the image collecting device 14 at the first time. Through step S210, shoe print information collected by the shoe print collecting device 12 and person image information collected by the image collecting device 14 in the same space at the same time are obtained.
Step S220, acquiring a first attribute through the first shoe print, acquiring a second attribute according to the first person image, and associating the first person image with the first shoe print according to the first attribute and the second attribute;
In step S220, a first attribute of the person corresponding to the first shoe print is obtained through the first shoe print, and a second attribute of the person corresponding to the first person image is obtained through the first person image. The first attribute and the second attribute are obtained mainly by means of preset judgment rules based on experience. For example, the gender of the person corresponding to the first shoe print can be judged from the size and shape of the first shoe print, and even the weight of that person can be judged if the pressure generated by the first shoe print can be collected; the face and body shape characteristics of the person corresponding to the first person image can be collected through the first person image. When the person attributes in the first attribute are consistent with the person attributes in the second attribute, the first shoe print and the first person image can be judged to belong to the same person, and the first person image and the first shoe print are associated with that person. In step S220, shoe print information and person image information corresponding to the same person are thus obtained from the first shoe print and the first person image acquired by the shoe print collecting device 12 and the image collecting device 14 at the same time in the same space. This overcomes two problems: with only a shoe print collecting device, the identity of the person entering or exiting cannot be recognized; with only a person image collecting device, the specific action of a person entering or exiting cannot be accurately counted and judged under crowded conditions.
Step S230, after the association is successful, obtaining a first event executed by a first person at a first time, wherein the first person corresponds to the first person image;
In step S230, after the first person image and the first shoe print are associated, a first event performed by the first person at the doorway A at the first time may be judged from the first shoe print and from the facial and body movements presented by the first person in the first person image. The first event may be a description of the first person's movement or a judgment of the movement trajectory based on it, for example, the first person answers a call at the doorway A, or the first person leaves the building through the doorway A. Combined with the first time and the identification of the first person image, more specific information can be obtained: for example, at 9:20 a.m., it is judged from the first shoe print that a first person enters the doorway A, and it is judged from the first person image that the first person is the staff member Wang, so it can be accurately recorded that the staff member Wang entered the doorway A at 9:20 a.m.
According to the above method for tracking person entry and exit trajectories, shoe prints are collected by the shoe print collecting device 12 and person images are collected by the image collecting device 14; the person attributes obtained from the shoe prints and from the person images collected in the same space at the same time are compared, person images and shoe prints with consistent person attributes are associated, and the events performed by persons at the entrance and exit are judged more accurately.
The first time in the above embodiments may also be linked with other systems. For example, in a mobile application scene, the position of the moving scene at the first time may be obtained according to the first time, so as to further determine the position at which the first event was executed.
In one embodiment, obtaining a first attribute from the first shoe print, obtaining a second attribute from the first person image, and associating the first person image with the first shoe print based on the first attribute and the second attribute comprises:
acquiring a first attribute through the first shoe print, and acquiring a second attribute through the first person image;
the first attribute includes: a first spatial distribution characteristic, a first gender and a first estimated height;
the second attribute includes: a second spatial distribution characteristic, a second gender and a second estimated height;
associating the first shoe print and the first person image to the same person if the first spatially distributed feature and the second spatially distributed feature coincide, the first gender and the second gender coincide, and the first estimated height and the second estimated height coincide.
In this embodiment, the shoe print collecting device 12 can collect a plurality of shoe prints at the same time point, and the first spatial distribution characteristic of a person, such as being located in the southwest corner of the shoe print collecting area, can be determined from the different shoe prints distributed in the shoe print image; the first gender and the first estimated height of the person can further be judged from the length of the shoe print. The image collecting device 14 acquires a plurality of person images at the same time point, and the spatial positions of different persons in the images also differ. For example, if four persons are captured simultaneously in an image, a second spatial distribution characteristic can be obtained from the distribution of the four persons in the image. The shoe print collecting device 12 can be selected as the reference for the second spatial distribution characteristic: the distribution of the first person in the shoe print image is judged from the position of the first person in the image, the second spatial distribution characteristic is compared with the first spatial distribution characteristic, and shoe prints and images whose spatial distribution characteristics are consistent can be associated with the same person. In addition, the second gender of a person can further be judged from the face, hairstyle and the like; meanwhile, the height differences between different persons can be identified from the person image, and the second estimated height of the person calculated.
By combining the first spatial distribution characteristic of the first shoe print, the second spatial distribution characteristic of the first person at the spatial position in the first person image, the first gender, the second gender, the first estimated height and the second estimated height, the association relationship between different shoe prints and different persons can be accurately judged.
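As a minimal illustrative sketch (not the patent's actual implementation), the three-way match described above — spatial distribution, gender, and estimated height — might be expressed as follows; the attribute fields, tolerance values, and function names are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class ShoePrintAttrs:          # first attribute, derived from the shoe print
    position: tuple            # (x, y) in a shared floor coordinate system
    gender: str                # estimated from shoe print size and shape
    height_cm: float           # estimated from shoe print length

@dataclass
class PersonImageAttrs:        # second attribute, derived from the person image
    position: tuple            # person's location mapped into the same coordinates
    gender: str                # estimated from face, hairstyle, etc.
    height_cm: float           # estimated from the person image

def attributes_match(a: ShoePrintAttrs, b: PersonImageAttrs,
                     pos_tol: float = 0.5, height_tol: float = 10.0) -> bool:
    """Associate a shoe print with a person image only when the spatial
    position, gender, and estimated height all agree (within tolerances)."""
    dx = a.position[0] - b.position[0]
    dy = a.position[1] - b.position[1]
    same_place = (dx * dx + dy * dy) ** 0.5 <= pos_tol
    same_gender = a.gender == b.gender
    same_height = abs(a.height_cm - b.height_cm) <= height_tol
    return same_place and same_gender and same_height
```

All three conditions must hold before the shoe print and person image are attributed to the same person, mirroring the conjunction stated in the embodiment.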
In one embodiment, obtaining a first event performed by a first person at a first time after the association is successful comprises:
acquiring a first face direction of a first person in a first person image and acquiring a first shoe print direction of a first shoe print; under the condition that the first face direction and the first shoe print direction face the first direction, it is judged that the first person executes the first event at the first time.
In this embodiment, in a case where the first person image and the first shoe print both belong to the first person, the first face direction of the first person in the first person image is obtained. For example, when the first face direction and the first shoe print direction of the first person both face toward the inside of the building where the doorway A is located, it can be determined that the first person enters through the doorway A. Similarly, when both are directed toward the outside of the building where the doorway A is located, it can be determined that the first person leaves through the doorway A. For the face and the footprint to count as facing the same direction, the angles of the first face direction and the first shoe print direction need not be exactly the same, only within the same range; this covers the case where the first person turns the head while walking to speak or look in another direction. Through this embodiment, the specific event executed by the first person at the first time can be estimated from the first face image and the first shoe print of the first person, further improving the accuracy of passenger flow analysis.
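The same-direction test described above can be sketched as a tolerance check on bearings; the angle representation (degrees), the 45° tolerance, and the function names are assumptions for illustration only:

```python
def angle_diff(a: float, b: float) -> float:
    """Smallest absolute difference between two bearings, in degrees."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

def directions_agree(face_deg: float, shoeprint_deg: float,
                     target_deg: float, tol: float = 45.0) -> bool:
    """Both the face direction and the shoe print direction must fall within
    `tol` degrees of the target direction (e.g. 'into the building'); they
    need not be exactly equal, covering a person glancing sideways while
    walking through the doorway."""
    return (angle_diff(face_deg, target_deg) <= tol and
            angle_diff(shoeprint_deg, target_deg) <= tol)
```

For example, a face bearing of 10° and a shoe print bearing of 350° both agree with a 0° ("into the building") target, so the entry event would be recorded.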
In one embodiment, after the association is successful and after the first event performed by the first person at the first time is obtained, a second shoe print and the second time at which the second shoe print was collected are acquired, and, in a case where the second shoe print is consistent with the first shoe print, a second person image of the first person at the second time is obtained;
acquiring a second face direction of the first person and a second shoe print direction of the second shoe print in the second person image; and under the condition that the second face direction and the second shoe print direction face the second direction, judging that the first person executes the second event at the second time.
In this embodiment, the shoe print collecting device 12 collects the second shoe print and, after comparison, finds that the second shoe print is consistent with the first shoe print, that is, the second shoe print also belongs to the first person. The second time at which the second shoe print was collected is obtained, as well as the second person image captured by the image collecting device 14 at the second time; the second face direction of the first person in the second person image and the second shoe print direction of the first person's feet in the second shoe print are then obtained. When the second face direction is consistent with the second shoe print direction, for example when both face the outside of the building where the doorway A is located, it is determined that the first person leaves through the doorway A. In this embodiment, the shoe print collecting device 12 records the collected shoe prints and compares later-collected shoe prints with earlier ones. An effective time for recording and comparison can be set according to the application scene: for example, with an effective time of 24 hours, the shoe print collecting device 12 empties the database storing the shoe prints every 24 hours, then collects shoe prints again and stores them in the database. The shoe print collecting device 12 can also collect shoe prints and send them to other analysis devices for comparison.
Through this embodiment, behavior during which the face cannot be photographed is attributed to the previously associated first person. For example, when the first person enters the building, identity verification is performed on the captured face and the image of the first person is associated with the shoe print; when the first person later leaves the building and the image collecting device 14 cannot capture the facial features, the shoe print collecting device 12 collects the second shoe print of the first person, and shoe print comparison identifies the first person. It can still be judged, from the second face direction and the second shoe print direction facing the outside of the building, that the first person left through entrance A at the second time. In this way, the event performed by the person corresponding to a person image can be judged from the associated person image and shoe print even when the person's facial features cannot be captured, improving the accuracy of passenger flow statistical analysis.
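The record-and-compare step with a configurable effective time described above can be sketched as follows. This is an illustrative Python sketch only: the `ShoePrintStore` class, the `similarity` callback, and the 0.9 matching threshold are assumptions made for illustration, not details taken from this disclosure.

```python
# Illustrative shoe print store with an effective time (assumed here to
# be 24 hours, as in the example above). The `similarity` callback
# stands in for whatever matching algorithm the collecting device uses.

VALID_SECONDS = 24 * 3600  # example effective time of 24 hours

class ShoePrintStore:
    def __init__(self):
        self.records = []  # (person_id, features, collected_at)

    def purge_expired(self, now):
        # drop records older than the effective time, mirroring the
        # periodic emptying of the shoe print database
        self.records = [r for r in self.records if now - r[2] < VALID_SECONDS]

    def add(self, person_id, features, now):
        self.records.append((person_id, features, now))

    def match(self, features, now, similarity, threshold=0.9):
        """Return the person id of an earlier consistent shoe print, or None."""
        self.purge_expired(now)
        for person_id, stored, _ in self.records:
            if similarity(stored, features) >= threshold:
                return person_id
        return None
```

A later print matched this way inherits the identity established when the face was visible, which is the basis of the exit judgment above.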
In one embodiment, after the association is successful and it is obtained that the first person performed the first event at the first time, the method includes:
acquiring a second shoe print and a second time at which the second shoe print was collected, where the second shoe print is consistent with the first shoe print; recording the second time as the last collection time when no shoe print consistent with the first shoe print is collected again within a preset time threshold after the second time; and acquiring a second site according to the second time, obtaining that the first person performs a second event at the second site at the second time.
In this embodiment, the last collection time of the shoe print is taken as the leaving time of the first person. If, within the preset time threshold after the second time, a shoe print consistent with the first shoe print is collected again, the last collection time corresponding to the first shoe print is updated. The preset time threshold may be set according to the specific application scenario; for example, when the scheme is applied to an elevator, the first person may move near the elevator door without leaving immediately, and if shoe prints belonging to the first person are collected again it can be judged that the first person is staying at the entrance. This implementation overcomes misjudgment of persons lingering at the entrance and also covers the special case of a person leaving the entrance walking backwards: with the time threshold set to the duration of one elevator run, even if the first person's face direction and shoe print direction face the inside of the elevator, as long as no shoe print of the first person is collected after the second time, it can still be judged that the first person left the elevator at the second time, further improving the accuracy of passenger flow statistical analysis data.
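The refresh-and-timeout rule of this embodiment can be sketched as follows; the class and method names are assumptions for illustration, and the threshold would be set per scenario (for example, the duration of one elevator run).

```python
# Minimal sketch of the rule above: every newly collected consistent
# shoe print updates the last collection time, and the person is judged
# to have left only if no consistent print arrives within the preset
# threshold. Names and units (seconds) are assumed for illustration.

class ExitJudge:
    def __init__(self, threshold_s):
        self.threshold_s = threshold_s   # preset time threshold, in seconds
        self.last_seen = {}              # person_id -> last collection time

    def observe(self, person_id, t):
        """A shoe print consistent with this person was collected at time t."""
        self.last_seen[person_id] = t

    def has_left(self, person_id, now):
        """True once the threshold elapses with no further consistent print."""
        return now - self.last_seen[person_id] > self.threshold_s
```

Note that re-observing the print pushes the deadline forward, which is exactly how lingering at the entrance is distinguished from leaving.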
Fig. 3 is a view showing an application scenario of the method for tracking a person entering and exiting trajectory according to another embodiment of the present invention, as shown in fig. 3, when the method for tracking a person entering and exiting trajectory is applied to a vehicle, a shoe print collecting device 12 is disposed at an entrance a, an image collecting device 14 faces the entrance a, a shoe print collecting device 16 is disposed at an entrance B, and an image collecting device 18 faces the entrance B. Similarly, when the vehicle has only one entrance, only one group of collecting devices is arranged, and when the vehicle has more entrances, more groups of collecting devices can be arranged. The shoe print collecting device 12, the shoe print collecting device 16, the image collecting device 14 and the image collecting device 18 can be connected through a network, and can also be connected to the same analysis processor through the network for data interaction.
In an embodiment, fig. 4 is a flowchart of a method for tracking a person entry and exit trajectory according to another embodiment of the present invention. As shown in fig. 4, a method for tracking a person entry and exit trajectory is provided, described by taking as an example its application to the shoe print collecting device 12 and the image collecting device 14, or the shoe print collecting device 16 and the image collecting device 18, in fig. 3, and includes the following steps:
step S410: collecting a first shoe print at an entrance of a vehicle and a first time when the first shoe print is collected, and acquiring a first person image at the first time;
In step S410, the shoe print collecting device 12 or 16 on a public transport vehicle collects the first shoe print and records a timestamp; the first time at which the shoe print was collected is obtained from the timestamp, and the first person image captured at the first time by the image collecting device 14 or 18 belonging to the same entrance as the shoe print collecting device is obtained. Through step S410, shoe print information and person image information collected by the shoe print collecting device 12 or 16 and the image collecting device 14 or 18 in the same space at the same time are obtained.
Step S420: acquiring a first attribute through the first shoe print, acquiring a second attribute according to the first person image, and associating the first person image with the first shoe print according to the first attribute and the second attribute;
In step S420, a first attribute of the person corresponding to the shoe print is obtained from the shoe print, and a second attribute of the person corresponding to the person image is obtained from the first person image. The first attribute and the second attribute are obtained mainly by preset, experience-based judgment rules: for example, the gender of the person corresponding to the first shoe print can be judged from the size and shape of the first shoe print, and even the weight of that person can be estimated if the pressure generated by the first shoe print can be collected; the face and body-shape characteristics of the person corresponding to the first person image can be extracted from the first person image. When the person attributes in the first attribute are consistent with those in the second attribute, the first shoe print and the first person image can be judged to belong to the same person, and the first person image and the first shoe print are associated with that person. In step S420, shoe print information and person image information corresponding to the same person are obtained from the first shoe print and the first person image acquired at the same time in the same space by one group of shoe print collecting and image collecting devices. This overcomes the problems that, with only shoe print collecting devices, the identity of persons entering and exiting cannot be recognized, and that, with only image collecting devices, the flow of people cannot be accurately counted in crowded conditions and the specific actions of persons entering and exiting cannot be judged, thereby improving the accuracy of passenger flow statistical analysis.
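The experience-based judgment rules mentioned above can be sketched as a mapping from shoe print measurements to a first attribute. This is a hedged illustration: the 26 cm gender cut-off, the length-to-height factor of about 6.6, and the pressure cut-off are rough rules of thumb assumed here, not values given in this disclosure.

```python
# Hedged sketch of the preset judgment rules: shoe print length is
# mapped to an estimated gender and height; weight is only hinted at
# when pressure is actually collected. All numeric cut-offs are assumed.

def first_attribute_from_print(length_cm, pressure_kpa=None):
    attr = {
        # larger prints are more often male: a coarse, assumed heuristic
        "gender": "male" if length_cm >= 26.0 else "female",
        # estimated height from foot length (common forensic rule of thumb)
        "height_m": round(length_cm * 6.6 / 100.0, 2),
    }
    if pressure_kpa is not None:
        # weight can only be estimated when pressure data is available
        attr["weight_hint"] = "heavy" if pressure_kpa > 50.0 else "light"
    return attr
```

A real system would replace these thresholds with rules calibrated for its own shoe print sensor.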
Step S430: and after the association is successful, acquiring a first site according to the first time, and obtaining that the first person executes a first event at the first site at the first time, wherein the first person corresponds to the first person image.
In step S430, after the first person image and the first shoe print are associated, the event performed by the first person at entrance A of the vehicle at the first time is determined from the first shoe print and from the facial and body movements of the first person shown in the first person image, where the performed event mainly refers to boarding and alighting behaviors. For example, at 9:20 a.m. on a certain day, it is judged from the first shoe print that the first person entered the vehicle through entrance A, and the identity of the first person is determined from the first person image to be Wang A, so it is accurately counted that the first person entered the vehicle through entrance A at 9:20 a.m. on that day. The geographical position of the vehicle at that time, and thus the first station information, can be obtained from a positioning module on the vehicle or from data in the vehicle's management system, yielding the result that the first person entered the vehicle through entrance A at the first station at 9:20 a.m. on that day.
According to this method for tracking the person entry and exit trajectory, shoe prints are collected by the shoe print collecting device 12 or 16 and person images are collected by the image collecting device 14 or 18; for prints and images collected in the same time and space, the person attributes obtained from the shoe print are compared with the person attributes obtained from the person image, and person images whose attributes are consistent are associated with the shoe prints, so events performed by persons at the entrance are judged more accurately. Meanwhile, the station where the vehicle is located is obtained according to the shoe print collection time: on the one hand, a more accurate person entry and exit trajectory is obtained, achieving accurate passenger flow statistical analysis; on the other hand, a record corresponding to the shoe print and the person image is created, facilitating subsequent retrieval.
In one embodiment, obtaining a first attribute from the first shoe print, obtaining a second attribute from the first person image, and associating the first person image with the first shoe print based on the first attribute and the second attribute comprises:
acquiring a first attribute through a shoe print, and acquiring a second attribute through a personnel image;
the first attribute includes: a first spatial distribution characteristic, a first gender and a first estimated height;
the second attribute includes: a second spatial distribution characteristic, a second gender and a second estimated height;
associating the shoe print and the person image with the same person if the first spatial distribution characteristic is consistent with the second spatial distribution characteristic, the first gender is consistent with the second gender, and the first estimated height is consistent with the second estimated height.
In this embodiment, the shoe print collecting device 12 or 16 can collect a plurality of shoe prints at the same time point, and different shoe prints are distributed differently in the image, so the first spatial distribution characteristic of a person, such as being located in the southwest corner of the shoe print collection area, can be judged; the first gender and first estimated height of the person can further be judged from the length of the shoe print. The image collecting device 14 or 18 likewise captures a plurality of persons at the same time point, and the different spatial distribution of persons in the image yields the second spatial distribution characteristic; the second gender can further be judged from the face, hairstyle, and so on, while the height differences between persons recognizable in the image allow the second estimated height to be calculated. By combining the first spatial distribution characteristic of the shoe print with the second spatial distribution characteristic of the person in the image, the first and second gender, and the first and second estimated height, the association between different shoe prints and different persons can be judged accurately.
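The three-way consistency rule of this embodiment can be sketched as follows. The field names and the tolerance values (0.5 m for position, 0.10 m for height) are assumptions for illustration; a deployed system would tune them to its sensors.

```python
# Illustrative sketch of the association rule: a shoe print and a person
# image are linked to the same person only when their spatial position,
# gender, and estimated height all agree within assumed tolerances.

def attributes_match(shoe_attr, image_attr, pos_tol=0.5, height_tol=0.10):
    same_position = (abs(shoe_attr["x"] - image_attr["x"]) <= pos_tol
                     and abs(shoe_attr["y"] - image_attr["y"]) <= pos_tol)
    same_gender = shoe_attr["gender"] == image_attr["gender"]
    same_height = abs(shoe_attr["height_m"] - image_attr["height_m"]) <= height_tol
    return same_position and same_gender and same_height

def associate(shoe_prints, person_images):
    """Pair each shoe print with the first person image whose attributes agree."""
    pairs, unmatched = [], list(person_images)
    for shoe in shoe_prints:
        for img in unmatched:
            if attributes_match(shoe, img):
                pairs.append((shoe["id"], img["id"]))
                unmatched.remove(img)
                break
    return pairs
```

Greedy first-match pairing is used here only for brevity; with many simultaneous prints, a global assignment would be more robust.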
In an embodiment, fig. 5 is a second flowchart of a method for tracking a person entry and exit trajectory according to another embodiment of the present invention, as shown in fig. 5, after the association is successful, the obtaining a first station according to a first time, and obtaining a first event executed by a first person at the first station at the first time includes:
step S510: acquiring a first face direction of a first person in a first person image and acquiring a first shoe print direction of a first shoe print;
step S520: under the condition that the first face direction and the first shoe mark direction face towards the interior of the carriage of the public transport vehicle, it is judged that the first person gets on the bus at the first station at the first time.
According to this method for tracking the person entry and exit trajectory, when the first person image and the first shoe print both belong to the first person, the first face direction of the first person in the first person image and the first shoe print direction of the first person's foot in the first shoe print are obtained; when both face the interior of the carriage of the vehicle, it can be judged that the first person boards. Likewise, when both the first face direction and the first shoe print direction face the outside of the carriage, it can be judged that the first person alights. Consistency of direction does not require the angles of the first face direction and the first shoe print direction to be exactly equal, only consistent within a certain range, which covers the case of a person who looks in another direction while boarding. In this way, it can be judged from the first face image and the first shoe print that the first person boarded at the first station at the first time, further improving the accuracy of passenger flow analysis.
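The within-a-range direction check described above can be sketched as follows. Headings are in degrees; the 45-degree tolerance is an illustrative value, not one given in this disclosure.

```python
# Sketch of the direction-consistency check: face direction and shoe
# print direction need not be identical, only aligned within a
# tolerance, so a person glancing sideways while boarding still counts.

def directions_consistent(a_deg, b_deg, tolerance_deg=45.0):
    diff = abs(a_deg - b_deg) % 360.0
    diff = min(diff, 360.0 - diff)   # smallest angle between the headings
    return diff <= tolerance_deg

def boarded(face_deg, shoe_deg, interior_deg, tolerance_deg=45.0):
    """Boarding: face and shoe print both face the carriage interior."""
    return (directions_consistent(face_deg, interior_deg, tolerance_deg)
            and directions_consistent(shoe_deg, interior_deg, tolerance_deg))
```

The wrap-around at 360 degrees matters: a face at 350 degrees and an interior at 0 degrees are only 10 degrees apart, not 350.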
In an embodiment, fig. 6 is a flowchart three of a method for tracking a person entry and exit trajectory according to another embodiment of the present invention, as shown in fig. 6, after acquiring a first site according to a first time and obtaining a first event executed by a first person at the first site at the first time, the method includes:
step S610: acquiring a second shoe print and second time when the second shoe print is acquired, and acquiring a second person image of the first person at the second time under the condition that the second shoe print is consistent with the first shoe print;
step S620: acquiring a second face direction of the first person and a second shoe print direction of the second shoe print in the second person image;
step S630: when both the second face direction and the second shoe print direction face the outside of the carriage of the public transport vehicle, acquiring a second station according to the second time and judging that the first person alights at the second station at the second time.
In this embodiment, the shoe print collecting device 16 collects the second shoe print and, after comparison, finds it consistent with the first shoe print, that is, the second shoe print also belongs to the first person. The second time at which the second shoe print was collected is then obtained, together with the second person image captured by the image collecting device 18 at the second time; the second face direction of the first person in the second person image and the second shoe print direction of the first person's foot in the second shoe print are acquired, and when both face the outside of the carriage of the vehicle, it is determined that the first person alights through entrance B. In this embodiment, the shoe print collecting device 16 records the collected shoe prints and shares the record with the other shoe print collecting devices on the vehicle, so that shoe prints collected by all the collecting devices on the vehicle are compared against previously collected prints. An effective time for the record and comparison can be set according to the specific application scenario, for example the running time of the vehicle from its starting point to its terminal; alternatively, all the collecting devices on the vehicle may send their shoe prints to an external analysis device for comparison.
Through this embodiment, behavior during which the face cannot be photographed is attributed to the previously associated first person. For example, when the first person boards, identity verification is performed on the captured face and the image of the first person is associated with the shoe print; when the first person alights and the facial features cannot be captured by the image collecting device 18, the shoe print collecting device 16 collects the second shoe print of the first person, shoe print comparison shows that it belongs to the first person, and it can still be judged from the second face direction and the second shoe print direction facing the outside of the carriage that the first person left through entrance B at the second time. In this way, the event performed at a certain station by the person corresponding to a person image can be judged from the associated person image and shoe print even when the person's facial features cannot be captured, improving the accuracy of passenger flow statistical analysis.
In an embodiment, fig. 7 is a fourth flowchart of a method for tracking a person entry and exit trajectory according to another embodiment of the present invention, as shown in fig. 7, after acquiring a first station according to a first time and obtaining a first event performed by a first person at the first station at the first time, the method includes:
step S710: acquiring a second shoe print and second time for acquiring the second shoe print, and recording the second time as the last acquisition time under the condition that the second shoe print is consistent with the first shoe print;
step S720: when no shoe print consistent with the first shoe print is collected within a preset time threshold after the second time, acquiring a second station according to the second time, and obtaining that the first person performed a second event at the second station at the second time.
In this embodiment, the last collection time of the shoe print is taken as the time at which the first person performs the second event. If, within the preset time threshold after the second time, a shoe print consistent with the first shoe print is collected again, the last collection time corresponding to the first shoe print is updated. The station corresponding to the last collection time may be obtained from the second time after the bus reaches its destination, or the second station information may be recorded as a backup at the second time, so that once the second time is confirmed as the alighting time after the preset time threshold, the backed-up second station information is the alighting station. The preset time threshold may be set according to the specific application scenario, for example as the time from the opening to the closing of the bus doors after the bus arrives at a station, or as the interval from the second time to the terminal. For example, if the shoe print of the first person is collected at the second time but no consistent shoe print is collected while the doors remain open, it is determined that the first person alighted at the second time; if shoe prints belonging to the first person are collected again during that period, it can instead be determined that the first person is staying at the entrance.
Through this implementation, misjudgment of persons lingering at the entrance can be overcome, as can the special case of a person leaving the entrance walking backwards. For example, with the time threshold set to the running time of one bus trip, even if the first person's face direction and shoe print direction face the inside of the bus, as long as no shoe print of the first person is collected after the second time, it can still be judged that the first person left the bus at the second time, further improving the accuracy of the passenger flow statistical analysis data.
In one embodiment, collecting the first shoe print at the vehicle entrance and acquiring the second shoe print comprises:
collecting the first shoe print or acquiring the second shoe print at the entrance of the vehicle when the door of the vehicle is open;
not collecting the first shoe print or acquiring the second shoe print at the vehicle entrance when the door of the vehicle is closed.
In one embodiment, collecting the first shoe print at the vehicle entrance comprises:
collecting the first shoe print at the entrance of the vehicle when the door of the vehicle is open;
not collecting the first shoe print at the entrance of the vehicle when the door of the vehicle is closed.
In this embodiment, considering that persons can board or alight only while the vehicle door is open, collecting only during that period saves the power consumption and storage resources of the collecting devices.
In an embodiment, fig. 8 is a flowchart illustrating a method for tracking a person entry and exit trajectory according to another embodiment of the present invention, and as shown in fig. 8, the method for tracking a person entry and exit trajectory applied to a bus includes the following steps:
step S802: in the state that the door is not closed, the image acquisition device 14 acquires a first person image; the shoe print collecting device 12 collects the first shoe print;
step S804: identifying a first gender, a first height and a first spatial distribution of the person through a first shoe print; identifying a second gender, a second height, and a second spatial distribution from the first person image;
step S806: associating the first shoe print with the first person image according to a preset principle, wherein the first shoe print and the first person image are both associated to the first person;
step S808: judging, from the first shoe print direction and the first face direction of the first person, whether both face the interior of the bus;
step S810: when both the first shoe print direction and the first face direction of the first person face the interior of the bus, judging that the first person boards;
step S812: registering the associated first shoe print, the first person image, the first time for collecting the first shoe print and a first site acquired according to the first time into a shoe print comparison database;
step S814: when the first shoe print direction and the first face direction of the first person do not face the interior of the bus, judging that the first person alights;
step S816: and sending the associated first shoe print, the first person image, the first time of collecting the first shoe print and a first site obtained according to the first time to an analysis system and deleting the first shoe print and the first person image from a shoe print comparison database.
According to this method, when face information cannot be acquired from the video, the person entry and exit trajectory is judged by combining the video with the shoe prints, achieving person entry and exit trajectory tracking on public transport vehicles.
In an embodiment, fig. 9 is a schematic diagram of a shoe mark collected by a method for tracking a person entry and exit trajectory according to another embodiment of the present invention, and as shown in fig. 9, it can be determined that the shoe mark faces north.
In an embodiment, fig. 10 is a flowchart illustrating a second method for tracking a person entry and exit trajectory according to another embodiment of the present invention, and as shown in fig. 10, the method for tracking a person entry and exit trajectory applied to a bus includes:
step S1002: the bus is dispatched at the starting station;
in this step, station information such as the starting station and the terminal station is obtained directly from the bus system, or the bus is provided with a positioning unit;
step S1004: the image acquisition device 14 acquires a first person image; the shoe print collecting device 12 collects the first shoe print;
step S1006: identifying a first gender, a first height and a first spatial distribution of the person through a first shoe print; identifying a second gender, a second height, and a second spatial distribution from the first person image;
step S1008: associating the first shoe print with the first person image according to a preset principle, wherein the first shoe print and the first person image are both associated to the first person;
step S1010: comparing the shoe print comparison database, and judging whether the first shoe print is collected for the first time;
step S1012: the first shoe print is collected for the first time, the associated first shoe print, the first person image, the first time for collecting the first shoe print and a first site obtained according to the first time are registered into a shoe print comparison database;
In step S1012, whether the person and the shoe print appear for the first time is determined from the database comparison result; the shoe print is marked as shoe print A, the collection time as T0, and the associated station information as P0. Preferably, the person image and the shoe print are collected while the vehicle door is open.
Step S1014: when the first shoe print is not collected for the first time, setting a time threshold; when the first shoe print does not appear again within the time threshold, judging that the first person alights;
In step S1014, when the comparison result shows that shoe print B is the same as shoe print A, the collection time T1 of shoe print B and the associated station information P1 are recorded as the last collection time and station of shoe print A. A threshold t is set: if the same shoe print is collected again within time t after shoe print B was collected, the last collection time and station of shoe print A are refreshed; if not, T1 is regarded as the person's alighting time and P1 as the person's alighting station. If no shoe print identical to shoe print A is ever collected again, shoe print A is regarded as invalid data, which may correspond to a person backing off the vehicle right after boarding. The value of the threshold t may be the interval from the collection time T1 to the closing of the doors at the station, or the travel time from T1 to the end of the trip.
Step S1016: recording the associated first shoe print, first person image, getting-off time and getting-off station;
step S1018: and when the bus arrives at the terminal station, the recorded data is sent to an analysis system, and a shoe print comparison database is emptied.
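Steps S1010 through S1016 can be sketched as the following registration-and-refresh database; the class and field names are assumptions for illustration, with the boarding record corresponding to (T0, P0) and the refreshed last record to (T1, P1) above.

```python
# Illustrative sketch of the shoe print comparison database flow: a
# first-time print is registered with its boarding time and station;
# each repeat collection refreshes the last time and station, which
# become the alighting record once threshold t elapses with no further
# collection. All names and units are assumed.

class BusShoePrintDB:
    def __init__(self, threshold_t):
        self.threshold_t = threshold_t
        self.db = {}  # print_id -> record

    def collect(self, print_id, t, station):
        if print_id not in self.db:
            # step S1012: first collection, register boarding (T0, P0)
            self.db[print_id] = {"board_time": t, "board_station": station,
                                 "last_time": t, "last_station": station}
        else:
            # repeat collection: refresh last time and station (T1, P1)
            self.db[print_id]["last_time"] = t
            self.db[print_id]["last_station"] = station

    def alighting(self, print_id, now):
        """Step S1014: return (time, station) once threshold t has elapsed."""
        rec = self.db[print_id]
        if now - rec["last_time"] > self.threshold_t:
            return rec["last_time"], rec["last_station"]
        return None
```

At the terminal (step S1018), the accumulated records would be sent to the analysis system and the database emptied.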
The method overcomes uncertainty about whether a person boards or alights, such as the case of backing off the vehicle and the case of persons crowding at the entrance while others alight.
In one embodiment, a person access trajectory tracking system is provided, fig. 11 is a schematic diagram of a person access trajectory tracking system according to an embodiment of the present invention, as shown in fig. 11, the person access trajectory tracking system includes:
the shoe print collecting device 12 obtains the first shoe print and the first time when the first shoe print is collected, and sends the first shoe print to the analysis processor 116;
the analysis processor 116 acquires a first attribute from the first shoe print transmitted by the shoe print collecting device 12, acquires a second attribute from the first person image transmitted by the image collecting device 14, and associates the first person image with the first shoe print according to the first attribute and the second attribute; after the association is successful, the analysis processor 116 obtains a first location at a first time, resulting in the first person performing a first event at the first location, wherein the first person corresponds to the first person image.
According to this person entry and exit trajectory tracking system, shoe print information and person image information collected in the same time and space are used to associate the person image and shoe print of the same person, according to the person attributes obtained from the shoe print information and from the person image, so that events performed by persons at the entrance are judged more accurately and a more accurate person entry and exit trajectory is obtained, achieving accurate passenger flow statistical analysis.
In an embodiment, fig. 12 is a schematic diagram of a people entrance and exit trajectory tracking system according to another embodiment of the present invention, as shown in fig. 12, the people entrance and exit trajectory tracking system for a bus includes:
The passenger flow collection system 122 contains the shoe print collecting device 12 and the image collecting device 14: the shoe print collecting device 12 collects the first shoe print at the vehicle entrance and obtains the first time at which it was collected, and the first person image captured by the image collecting device 14 at the first time is obtained. The passenger flow collection system 122 may do no more than collect shoe prints and images; in that case, after the bus reaches the terminal, the passenger flow collection system 122 sends all timestamped image and shoe print information to the passenger flow analysis system 124. The passenger flow collection system 122 may further include a shoe print comparison subsystem 123: the shoe print collecting device 12 stores the obtained first shoe print in the shoe print database 125, and when the shoe print collecting device 12 obtains the second shoe print, the shoe print comparison subsystem 123 compares it against the shoe print database 125 to determine whether the second shoe print has appeared before. The passenger flow collection system 122 may further include a person identification system 127 for obtaining the first attribute from the first shoe print, obtaining the second attribute from the first person image, and associating the first person image with the first shoe print according to the first attribute and the second attribute. In addition, the passenger flow collection system 122 may have a positioning unit 128 that positions the vehicle by GPS, so as to obtain the first station at the first time.
The passenger flow analysis system 124 performs data analysis on the images and shoe print information sent by the passenger flow collection system 122, together with the time-stamped door opening/closing information and stop information recorded by the bus system, thereby implementing the functions of the analysis processor 116.
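One part of that analysis, the direction-based event judgment described in claims 3, 9, and 10, can be sketched as follows. The string encoding of directions is an assumption for illustration; the embodiment only requires that the face direction and the shoe print direction agree before an event is judged.

```python
def judge_event(face_direction: str, shoe_print_direction: str) -> str:
    """Judge the entrance/exit event from the face direction in the person
    image and the direction of the associated shoe print. The values
    'inward' (toward the vehicle interior) and 'outward' (toward the
    outside of the compartment) are a simplified assumption."""
    if face_direction == shoe_print_direction == "inward":
        return "board"    # person faces and steps into the compartment
    if face_direction == shoe_print_direction == "outward":
        return "alight"   # person faces and steps out of the compartment
    return "unknown"      # directions disagree, so no event is judged
```

A "board" or "alight" result is then combined with the GPS-derived station at the corresponding timestamp to produce one point of the person's entry-and-exit trajectory.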
The bus system 126 sends the time-stamped door opening/closing information and the stop information to the passenger flow analysis system 124; when the passenger flow collection system 122 itself carries analysis subsystems such as person identification and shoe print comparison, the bus system 126 sends this information to the passenger flow collection system 122 instead.
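The time-stamped door information supplied by the bus system 126 can gate shoe print collection so that prints are collected only while a door is open, as in claims 8 and 12. A minimal sketch, with the door event record format assumed:

```python
def door_open_intervals(events):
    """Build (open_time, close_time) intervals from time-stamped door events.
    Each event is an assumed (timestamp, state) pair, state 'open'/'closed'."""
    intervals, opened_at = [], None
    for ts, state in sorted(events):
        if state == "open" and opened_at is None:
            opened_at = ts
        elif state == "closed" and opened_at is not None:
            intervals.append((opened_at, ts))
            opened_at = None
    return intervals

def should_collect_shoe_print(ts, events):
    """Collect a shoe print only if some door was open at time ts."""
    return any(a <= ts <= b for a, b in door_open_intervals(events))
```

Gating on the door state filters out prints left while the vehicle is moving, so every retained print corresponds to a possible boarding or alighting event at a stop.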
In one embodiment, a computer device is provided, including a memory, a processor, and a computer program stored in the memory and executable on the processor. When executing the computer program, the processor implements the person entry-and-exit trajectory tracking method described above: it acquires shoe print information and person image information in the same time and space, associates the person image and the shoe print of the same person according to the person attribute obtained from the shoe print information and the person attribute obtained from the person image, and judges more accurately the event performed by the person at the entrance/exit, thereby obtaining a more accurate person entry-and-exit trajectory and achieving accurate passenger flow statistical analysis.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored; when executed by a processor, the computer program implements the person entry-and-exit trajectory tracking method described above.
Based on this computer-readable storage medium, shoe print information and person image information collected in the same time and space are used to associate the shoe print with the person image of the same person, according to the person attribute obtained from the shoe print information and the person attribute obtained from the person image. The event performed by the person at the entrance/exit is thereby judged more accurately, so that a more accurate person entry-and-exit trajectory is obtained and accurate passenger flow statistics are achieved.
It should be understood that, although the steps in the flowchart are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, the steps are not strictly limited to the order shown and may be performed in other orders.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware. The program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, a database, or another medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or an external cache. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the above embodiments may be combined arbitrarily. For brevity, not every possible combination of these technical features is described; nevertheless, as long as a combination contains no contradiction, it should be considered within the scope of this specification.
The above embodiments express only several implementations of the present application, and although their description is relatively specific and detailed, it should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the present application, and these all fall within its scope of protection. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (15)
1. A method for tracking a person's entry and exit trajectory, the method comprising:
acquiring a first shoe print, acquiring a first time at which the first shoe print is collected, and acquiring a first person image at the first time;
obtaining a first attribute through the first shoe print, obtaining a second attribute according to the first person image, and associating the first person image with the first shoe print according to the first attribute and the second attribute;
and after the association is successful, obtaining a first event executed by a first person at the first time, wherein the first person corresponds to the first person image.
2. The method of claim 1, wherein said obtaining a first attribute from the first shoe print, obtaining a second attribute from the first person image, and associating the first person image with the first shoe print based on the first attribute and the second attribute comprises:
acquiring a first attribute through the first shoe print, and acquiring a second attribute through the first person image;
the first attribute includes: a first spatial distribution characteristic, a first gender and a first estimated height;
the second attribute includes: a second spatial distribution characteristic, a second gender and a second estimated height;
associating the first shoe print and the first person image with a first person if the first spatial distribution characteristic and the second spatial distribution characteristic match, the first gender and the second gender match, and the first estimated height and the second estimated height match.
3. The method of claim 1, wherein obtaining a first event performed by a first person at the first time after the association is successful comprises:
acquiring a first face direction of the first person in the first person image, and acquiring a first shoe print direction of the first shoe print;
in the case that the first face direction and the first shoe print direction both face a first direction, judging that the first person executes the first event at the first time.
4. The method of claim 3, wherein, after the association is successful and the first event performed by the first person at the first time is obtained, the method further comprises:
acquiring a second shoe print and second time when the second shoe print is acquired, and acquiring a second person image of the first person at the second time under the condition that the second shoe print is consistent with the first shoe print;
acquiring a second face direction of the first person and a second shoe print direction of the second shoe print in the second person image;
and under the condition that the second face direction and the second shoe print direction face a second direction, judging that the first person executes a second event at the second time.
5. The method of claim 3, wherein, after the association is successful and the first event performed by the first person at the first time is obtained, the method further comprises:
acquiring a second shoe print and a second time at which the second shoe print is collected; in the case that the second shoe print is consistent with the first shoe print, recording the second time as the last collection time; and if no shoe print consistent with the first shoe print is collected within a preset time threshold after the second time, judging that the first person executes a second event at the second time.
6. A method for tracking a person's entry and exit trajectory, the method comprising:
collecting a first shoe print at an entrance and an exit of a vehicle and a first time when the first shoe print is collected, and acquiring a first person image at the first time;
obtaining a first attribute through the first shoe print, obtaining a second attribute according to the first person image, and associating the first person image with the first shoe print according to the first attribute and the second attribute;
and after the association is successful, acquiring a first station according to the first time, and obtaining that a first person executes a first event at the first station at the first time, wherein the first person corresponds to the first person image.
7. The method of claim 6, wherein said obtaining a first attribute from the first shoe print, obtaining a second attribute from the first person image, and associating the first person image with the first shoe print based on the first attribute and the second attribute comprises:
acquiring the first attribute through the first shoe print, and acquiring the second attribute through the first person image;
the first attribute includes: a first spatial distribution characteristic, a first gender and a first estimated height;
the second attribute includes: a second spatial distribution characteristic, a second gender and a second estimated height;
associating the shoe print and the person image with the same person if the first spatial distribution characteristic and the second spatial distribution characteristic coincide, the first gender and the second gender coincide, and the first estimated height and the second estimated height coincide.
8. The method of claim 6, wherein collecting the first shoe print at the vehicle entrance comprises:
collecting the first shoe print at the entrance and the exit of the vehicle when the door of the vehicle is in an open state;
and not collecting the first shoe print at the vehicle entrance when the door of the vehicle is in a closed state.
9. The method of claim 6, wherein, after the association is successful, acquiring a first station according to the first time and obtaining a first event executed by a first person at the first station at the first time comprises:
acquiring a first face direction of the first person in the first person image, and acquiring a first shoe print direction of the first shoe print;
and in the case that the first face direction and the first shoe print direction face toward the interior of the compartment of the public transport vehicle, judging that the first person boards at a first station at the first time.
10. The method of claim 9, wherein, after the first station is acquired according to the first time and the first event performed by the first person at the first station at the first time is obtained, the method further comprises:
acquiring a second shoe print and second time when the second shoe print is acquired, and acquiring a second person image of the first person at the second time under the condition that the second shoe print is consistent with the first shoe print;
acquiring a second face direction of the first person and a second shoe print direction of the second shoe print in the second person image;
and in the case that the second face direction and the second shoe print direction face toward the outside of the compartment of the public transport vehicle, acquiring a second station according to the second time, and judging that the first person alights at the second station at the second time.
11. The method of claim 6, wherein, after the first station is acquired according to the first time and the first event performed by the first person at the first station at the first time is obtained, the method further comprises:
acquiring a second shoe print and a second time at which the second shoe print is collected; in the case that the second shoe print is consistent with the first shoe print, recording the second time as the last collection time; and if no shoe print consistent with the first shoe print is collected within a preset time threshold after the second time, acquiring a second station according to the second time and judging that the first person executes a second event at the second station at the second time.
12. The method of claim 10 or 11, wherein collecting the first shoe print at the vehicle entrance and acquiring the second shoe print comprise:
collecting the first shoe print or acquiring the second shoe print at the entrance and the exit of the vehicle when the door of the vehicle is opened;
in a door-closed state of the vehicle, the first shoe print at the vehicle entrance is not captured and the second shoe print is not acquired.
13. A person entry and exit trajectory tracking system, the system comprising a shoe print collection device, an image collection device and an analysis processor, wherein:
the shoe print acquisition equipment acquires a first shoe print and first time for acquiring the first shoe print, and sends the first shoe print to the analysis processor;
the image acquisition equipment acquires a first person image at the first time and sends the first person image to the analysis processor;
the analysis processor acquires a first attribute through the first shoe print, acquires a second attribute according to the first person image, and associates the first person image with the first shoe print according to the first attribute and the second attribute; after the association is successful, the analysis processor acquires a first position at the first time, and obtains a first event executed by a first person at the first position, wherein the first person corresponds to the first person image.
14. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the method of any of claims 1 to 12 are implemented by the processor when executing the computer program.
15. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 12.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910998596.1A CN110992397B (en) | 2019-10-21 | 2019-10-21 | Personnel access track tracking method, system, computer equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110992397A true CN110992397A (en) | 2020-04-10 |
CN110992397B CN110992397B (en) | 2023-07-21 |
Family
ID=70082151
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910998596.1A Active CN110992397B (en) | 2019-10-21 | 2019-10-21 | Personnel access track tracking method, system, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110992397B (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111836015A (en) * | 2020-07-14 | 2020-10-27 | 深圳英龙华通科技发展有限公司 | Subway passenger number statistical method and system |
CN112353387A (en) * | 2020-09-28 | 2021-02-12 | 吴志龙 | Method for calculating body characteristics based on shoe prints, computer device and storage medium |
CN113674309A (en) * | 2020-05-14 | 2021-11-19 | 杭州海康威视系统技术有限公司 | Object tracking method, device, management platform and storage medium |
CN116524441A (en) * | 2023-07-03 | 2023-08-01 | 四川顶圣工程项目管理有限公司 | Construction site supervision method for engineering project management |
Citations (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07160883A (en) * | 1993-12-09 | 1995-06-23 | Nippon Telegr & Teleph Corp <Ntt> | Personal attribute detecting device |
CN101327126A (en) * | 2008-07-23 | 2008-12-24 | 天津大学 | Method for extracting morphologic characteristic of human body bare footprint feature |
US20110102568A1 (en) * | 2009-10-30 | 2011-05-05 | Medical Motion, Llc | Systems and methods for comprehensive human movement analysis |
US20110200226A1 (en) * | 2010-02-17 | 2011-08-18 | Toshiba Tec Kabushiki Kaisha | Customer behavior collection method and customer behavior collection apparatus |
CN102184539A (en) * | 2011-04-29 | 2011-09-14 | 王靖中 | Image-processing-technology-based plane footprint detection and analysis system and method thereof |
US20120179742A1 (en) * | 2011-01-11 | 2012-07-12 | Videonetics Technology Private Limited | Integrated intelligent server based system and method/systems adapted to facilitate fail-safe integration and/or optimized utilization of various sensory inputs |
JP2013037406A (en) * | 2011-08-03 | 2013-02-21 | Sogo Keibi Hosho Co Ltd | Height estimation device, height estimation method and height estimation program |
CN202819781U (en) * | 2012-08-23 | 2013-03-27 | 高华文 | Shoe with step-counting and positioning functions |
WO2013109795A1 (en) * | 2012-01-17 | 2013-07-25 | Blast Motion Inc. | Intelligent motion capture element |
CN103235932A (en) * | 2013-04-09 | 2013-08-07 | 重庆理工大学 | People counting method and device for accesses |
CN203165067U (en) * | 2013-04-25 | 2013-08-28 | 重庆师范大学 | Entrance population counting device oriented to open scene |
US20140348382A1 (en) * | 2013-05-22 | 2014-11-27 | Hitachi, Ltd. | People counting device and people trajectory analysis device |
JP2014238674A (en) * | 2013-06-06 | 2014-12-18 | 日本電気株式会社 | Information processing system, information processing method, and program |
CN104598891A (en) * | 2015-02-03 | 2015-05-06 | 大连恒锐科技股份有限公司 | Method and device for characteristic analysis of human body based on shoes wearing footprint images |
US20150324390A1 (en) * | 2012-01-12 | 2015-11-12 | Kofax, Inc. | Systems and methods for mobile image capture and processing |
WO2016031314A1 (en) * | 2014-08-25 | 2016-03-03 | Nkワークス株式会社 | Individual identification device, individual identification method, and individual identification program |
CN105469349A (en) * | 2015-11-30 | 2016-04-06 | 上海斐讯数据通信技术有限公司 | Target person rescue method, device and server |
CN106055573A (en) * | 2016-05-20 | 2016-10-26 | 西安邮电大学 | Method and system for shoeprint image retrieval under multi-instance learning framework |
CN106777050A (en) * | 2016-12-09 | 2017-05-31 | 大连海事大学 | A shoe print pattern representation method and system based on a bag-of-words model accounting for semantic dependency |
US20170277957A1 (en) * | 2016-03-25 | 2017-09-28 | Fuji Xerox Co., Ltd. | Store-entering person attribute extraction apparatus, store-entering person attribute extraction method, and non-transitory computer readable medium |
CN107358783A (en) * | 2017-07-26 | 2017-11-17 | 深圳市盛路物联通讯技术有限公司 | A kind of long distance monitoring method and device |
CN107390865A (en) * | 2017-06-28 | 2017-11-24 | 国网上海市电力公司 | A kind of intelligent helmet suitable for wearable cruising inspection system |
CN107424276A (en) * | 2017-08-12 | 2017-12-01 | 宋彦震 | Intelligent-induction floor monitoring system |
JP2017218248A (en) * | 2016-06-03 | 2017-12-14 | 東芝エレベータ株式会社 | Attribute acquisition system and attribute acquisition method |
CN107578041A (en) * | 2017-10-27 | 2018-01-12 | 华润电力技术研究院有限公司 | A kind of detecting system |
CN107582062A (en) * | 2017-08-31 | 2018-01-16 | 南京华苏科技有限公司 | A kind of indoor human body movement locus and Posture acquisition rendering method and device |
CN107833330A (en) * | 2017-11-29 | 2018-03-23 | 孙庆玲 | Public security system for track traffic passenger traffic |
CN108095731A (en) * | 2018-01-31 | 2018-06-01 | 成都四海万联科技有限公司 | A kind of pedal Intelligent monitoring device |
JP2018093283A (en) * | 2016-11-30 | 2018-06-14 | マクセル株式会社 | Monitoring information gathering system |
CN207624075U (en) * | 2017-12-13 | 2018-07-17 | 浙江政安信息安全研究中心有限公司 | A kind of multi information acquisition detector gate |
CN108648319A (en) * | 2018-06-20 | 2018-10-12 | 睿力集成电路有限公司 | The personnel entry/exit management system and method in copper wiring region |
CN109359580A (en) * | 2018-10-12 | 2019-02-19 | 中国科学院福建物质结构研究所 | Footprint based on deep learning identifies and gait detection method and its device |
CN109711299A (en) * | 2018-12-17 | 2019-05-03 | 北京百度网讯科技有限公司 | Vehicle passenger flow statistical method, device, equipment and storage medium |
CN109714544A (en) * | 2019-01-23 | 2019-05-03 | 李彦启 | A kind of police 3 D stereo scene inspection system |
WO2019104949A1 (en) * | 2017-11-28 | 2019-06-06 | 特斯联(北京)科技有限公司 | Residential entrance access control system which achieves human big data acquisition and analysis |
CN109858358A (en) * | 2018-12-28 | 2019-06-07 | 深圳供电局有限公司 | Method and system for tracking person track between buildings and computer readable storage medium |
CN109934081A (en) * | 2018-08-29 | 2019-06-25 | 厦门安胜网络科技有限公司 | A kind of pedestrian's attribute recognition approach, device and storage medium based on deep neural network |
WO2019193816A1 (en) * | 2018-04-05 | 2019-10-10 | 矢崎エナジーシステム株式会社 | Guidance system |
CN110458074A (en) * | 2019-08-02 | 2019-11-15 | 浙江天地人科技有限公司 | A kind of automated collection systems of personnel's face and its corresponding characteristic information |
2019-10-21: Application CN201910998596.1A filed in China (CN); patent CN110992397B granted, status Active.
Non-Patent Citations (3)
Title |
---|
ZHU FANG: "Research on multi-information fusion pattern classification methods and their application in a bus passenger flow recognition system" *
LI SHENGGUANG ET AL.: "Checkpoint control equipment for heavy, high-throughput pedestrian flows: a multi-dimensional personnel information sensing gate", pages 67 - 74 *
LIN WANXU: "Investigation and application of crime scene video", Journal of Liaoning Police College, no. 3, pages 58 - 62 *
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113674309A (en) * | 2020-05-14 | 2021-11-19 | 杭州海康威视系统技术有限公司 | Object tracking method, device, management platform and storage medium |
CN113674309B (en) * | 2020-05-14 | 2024-02-20 | 杭州海康威视系统技术有限公司 | Method, device, management platform and storage medium for object tracking |
CN111836015A (en) * | 2020-07-14 | 2020-10-27 | 深圳英龙华通科技发展有限公司 | Subway passenger number statistical method and system |
CN112353387A (en) * | 2020-09-28 | 2021-02-12 | 吴志龙 | Method for calculating body characteristics based on shoe prints, computer device and storage medium |
CN116524441A (en) * | 2023-07-03 | 2023-08-01 | 四川顶圣工程项目管理有限公司 | Construction site supervision method for engineering project management |
CN116524441B (en) * | 2023-07-03 | 2023-09-01 | 四川顶圣工程项目管理有限公司 | Construction site supervision method for engineering project management |
Also Published As
Publication number | Publication date |
---|---|
CN110992397B (en) | 2023-07-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110992397B (en) | Personnel access track tracking method, system, computer equipment and storage medium | |
US9892325B2 (en) | Image management system | |
CN108369645A (en) | Taxi operation monitoring method, device, storage medium and system | |
CN110910550A (en) | Gate passing method, gate, system, computer device and storage medium | |
CN110991390B (en) | Identity information retrieval method and device, service system and electronic equipment | |
CN109102613B (en) | Community monitoring method based on face recognition | |
JP2008152328A (en) | Suspicious person monitoring system | |
CN110598548A (en) | Standing person identification method and device, computer equipment and storage medium | |
CN110852148A (en) | Visitor destination verification method and system based on target tracking | |
CN111583469A (en) | Parking management method and device | |
CN107833328B (en) | Access control verification method and device based on face recognition and computing equipment | |
CN105825350A (en) | Video analysis-based intelligent tourism early warning decision-making system and use method thereof | |
CN113053013A (en) | Access control method and device based on face recognition, computer equipment and medium | |
CN113065448A (en) | Face recognition method, face recognition system, computer equipment and storage medium | |
CN113330491B (en) | Electronic gate opening method and device and server | |
CN108710827A (en) | A kind of micro- police service inspection in community and information automatic analysis system and method | |
CN111583488A (en) | Multi-dimensional data model information processing system and method based on artificial intelligence | |
CN113343913A (en) | Target determination method, target determination device, storage medium and computer equipment | |
CN113936345B (en) | Method and system for processing check ticket information of fast-passing asynchronous video in scenic spot | |
CN108921962B (en) | Parking lot entrance and exit management method and parking lot management server | |
CN116486332A (en) | Passenger flow monitoring method, device, equipment and storage medium | |
KR102099816B1 (en) | Method and apparatus for collecting floating population data on realtime road image | |
CN113205876B (en) | Method, system, electronic device and medium for determining effective clues of target person | |
CN115909617A (en) | Visitor early warning method, system and device based on multi-source heterogeneous data | |
CN113744443B (en) | Gate channel anti-cheating control method, device, equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||