CN113610890A - Real-time personnel track two-dimensional plane display method based on multiple cameras - Google Patents
Real-time personnel track two-dimensional plane display method based on multiple cameras
- Publication number
- CN113610890A (application CN202110749676.0A)
- Authority
- CN
- China
- Prior art keywords: person, face, tracking, list, information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/246 — Analysis of motion using feature-based methods, e.g. the tracking of corners or segments (under G06T7/00 Image analysis, G06T7/20 Analysis of motion)
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration (under G06T7/00 Image analysis)
- G06T2207/30196 — Human being; Person (under G06T2207/00 Indexing scheme for image analysis, G06T2207/30 Subject of image)
- G06T2207/30201 — Face
Abstract
A real-time two-dimensional plane display method for personnel trajectories based on multiple cameras comprises: installing multiple cameras in a park so that their recording ranges cover the whole area of the park, and determining the spatial coordinates of each camera; electrically connecting the cameras to the same server, on which a portrait acquisition program and a person tracking program are installed and a panoramic two-dimensional plan of the park is stored; collecting the facial feature information of people in the park through the portrait acquisition program, identifying the movement track of the same person through consistency comparison of person features across frames by the person tracking program, and superposing each person's track on the two-dimensional plan of the park for display. The invention collects face data through a camera, searches a face database to obtain the person's name, obtains the coordinate at that moment, converts the image coordinate into a coordinate on the two-dimensional plane through an algorithm, and finally connects the coordinates one by one to generate a real-time motion trajectory map of the person.
Description
Technical Field
The invention relates to the field of image data processing, and in particular to a real-time two-dimensional plane display method for personnel trajectories based on multiple cameras.
Background
Security protection and information collection in a park are the basic guarantees of stable order there. People entering and leaving the park are a major hidden risk for order maintenance and property protection, so parks mostly adopt real-time monitoring for personnel management.
Generally, person trajectory tracking either combines technologies such as RFID to track people in real time, or positions people through cameras to form a rough trajectory map.
Disclosure of Invention
In view of the above situation, and to overcome the defects of the prior art, the invention aims to provide a real-time two-dimensional plane display method for personnel trajectories based on multiple cameras. Addressing the two approaches above, it proposes tiling multiple cameras of the same type; after collecting a person's face data and comparing it to obtain the person's identity, an algorithm generates an accurate motion trajectory map of that person in a fixed scene in real time.
The solution is a real-time personnel trajectory two-dimensional plane display method based on multiple cameras, characterized in that the method comprises:
s1, installing a plurality of cameras in a park, the recording ranges of the cameras covering the whole area of the park;
s2, determining the spatial coordinates of all cameras in the park;
s3, electrically connecting the cameras to the same server, on which a portrait acquisition program and a person tracking program are installed and a panoramic two-dimensional plan of the park is stored;
s4, collecting information for a face database: the portrait acquisition program calls the cameras to collect the facial feature information of people in the park and stores new face features to form new identities; after collection is finished, the portrait acquisition program is closed and the person tracking program is started;
s5, tracking person trajectories: the person tracking program identifies the motion track of the same person through consistency comparison of person features across frames, automatically adds the track to the current two-dimensional plan of the park, and superposes each person's track on the plan for display.
Preferably, the method for determining the spatial coordinates of all cameras in the park comprises the following steps:
A1. selecting a corner of the park and defining its spatial coordinate point as the origin a(0, 0, 0);
A2. measuring the coordinates (x, y, z) of each camera relative to point a one by one with a distance-measuring tool, where x and y are the horizontal offsets from point a in centimeters and z is the mounting height of the camera;
A3. drawing a two-dimensional plan of the park at a 1:50 scale;
A4. obtaining the parameter information of each camera by camera calibration.
Preferably, the portrait acquisition program captures face data together with the entered person name and stores both in the face database.
Preferably, person trajectory tracking only tracks people whose identity information is stored in the face database, and the tracking steps of the person tracking program comprise:
B1. initializing the coordinate point of a person at time t0 as previous_point = (0, 0) and at the current time t1 as current_point = (0, 0); setting the tracking list tracking_list to empty and the person-ID-to-name binding list to name_face_list; numbering the cameras camera1, camera2, ..., cameraN;
B2. acquiring a frame from camera1, detecting all faces at the current time t1, and comparing them against the person information base; intersecting the recognized persons with tracking_list, setting the tracking state of persons in the intersection to 1 and of all others to 0;
B3. for every person detected in the frame, checking whether the person ID is already in name_face_list: if so, only updating the coordinate point information for that ID in tracking_list; if not, obtaining the ID of the face, matching it against the database, and on a successful match adding it to name_face_list, initializing its coordinate point information, and storing it in tracking_list;
B4. drawing the path of each person in the current tracking_list by connecting the two coordinate points current_point and previous_point on the two-dimensional plan;
B5. acquiring frame information from camera2, camera3, ..., cameraN;
B6. repeating steps B2, B3, and B4 to form a real-time trajectory map of each tracked person.
The invention has the following beneficial effects:
1. person trajectories are generated as a picture in real time: face data are collected through a camera, attributes such as the person's name are obtained by searching the face database, the person image in the current frame is compared frame by frame with that of the previous frame to judge whether they belong to the same person, and if so, the coordinate at that moment is obtained, converted by an algorithm into a coordinate on the two-dimensional plane, and the coordinates are finally connected one by one to generate a real-time motion trajectory map of the person;
2. the method improves person trajectory tracking without changing the original application scene and achieves a good effect at low cost.
Detailed Description
The embodiments of the present invention are described in further detail below.
A real-time two-dimensional plane display method for personnel trajectories based on multiple cameras comprises the following steps:
s1, installing a plurality of cameras in a park, the recording ranges of the cameras covering the whole area of the park; to depict person trajectories accurately, the cameras are arranged uniformly, the fields of view of adjacent cameras overlap, and extra cameras are added at blind spots that are otherwise hard to cover;
s2, determining the spatial coordinates of all cameras in the park;
s3, electrically connecting the cameras to the same server, on which a portrait acquisition program and a person tracking program are installed and a panoramic two-dimensional plan of the park is stored; the plan is updated periodically so that the plan in the server stays highly consistent with the actual condition of the park;
s4, collecting information for the face database: the portrait acquisition program calls the cameras to collect the facial feature information of people in the park and stores new face features to form new identities; after collection is finished, the portrait acquisition program is closed and the person tracking program is started; the portrait acquisition program is restarted only when the face database needs updating;
s5, tracking person trajectories: the person tracking program identifies the motion track of the same person through consistency comparison of person features across frames, automatically adds the track to the current two-dimensional plan of the park, and superposes each person's track on the plan for display. Information collection for the face database captures several photos of each person to be located, including frontal and side views, so that each tracked subject has feature references at several different angles, which improves recognition accuracy; once the portrait acquisition program is started, it automatically collects the new information and completes the upgrade of the face database.
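The conversion of an image coordinate into a coordinate on the two-dimensional plan, as described in the abstract, can be sketched as a planar homography. The sketch below is illustrative only: it assumes a pre-computed 3x3 matrix H per camera (obtained, for example, from four or more known ground-point correspondences), and the function name is not from the patent.

```python
def image_to_plane(H, u, v):
    """Apply a 3x3 homography H (nested lists) to pixel (u, v),
    returning the corresponding (x, y) on the two-dimensional plan."""
    x = H[0][0] * u + H[0][1] * v + H[0][2]
    y = H[1][0] * u + H[1][1] * v + H[1][2]
    w = H[2][0] * u + H[2][1] * v + H[2][2]
    return x / w, y / w   # perspective divide

# The identity homography maps a pixel to itself; a pure scaling
# homography (e.g. diag(0.02, 0.02, 1)) would model a fixed map scale.
H_identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```

In practice H would be estimated once per camera during installation, so each frame only costs one matrix-vector product per detected person.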
The method for determining the spatial coordinates of all cameras in the park comprises the following steps:
A1. selecting a corner of the park and defining its spatial coordinate point as the origin a(0, 0, 0);
A2. measuring the coordinates (x, y, z) of each camera relative to point a one by one with a distance-measuring tool, where x and y are the horizontal offsets from point a in centimeters and z is the mounting height of the camera;
A3. drawing a two-dimensional plan of the park at a 1:50 scale;
A4. obtaining the parameter information of each camera by camera calibration.
The camera calibration method obtains the parameter information carried by each camera, including its focal length and distortion coefficients, which is used to optimize the testing and adjustment of each camera.
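To illustrate how the distortion coefficient obtained from calibration enters the picture, the sketch below applies a single-coefficient radial distortion model to an ideal pixel. The parameter names (fx, fy, cx, cy, k1) follow the usual pinhole-camera convention; they are assumptions for illustration, not values from the patent.

```python
def apply_radial_distortion(u, v, fx, fy, cx, cy, k1):
    """Map an ideal pixel (u, v) to its distorted position under a
    one-coefficient radial model: x_d = x_n * (1 + k1 * r^2)."""
    xn, yn = (u - cx) / fx, (v - cy) / fy   # normalized image coordinates
    r2 = xn * xn + yn * yn                  # squared distance from center
    factor = 1.0 + k1 * r2
    return xn * factor * fx + cx, yn * factor * fy + cy
```

Undistorting a detection before projecting it onto the plan would invert this model (typically done numerically); with k1 = 0 the mapping is the identity.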
The portrait acquisition program captures face data and the entered person name and stores both in the face database. The name corresponding to each face must be entered so that faces and names match one to one; when the movement track of a person is retrieved, the corresponding name information can then be displayed at a glance. Because the people in the park do not change in real time, the portrait acquisition program does not need to run and update continuously; after everyone has been enrolled, it is closed to reduce the load on the server.
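The one-to-one binding between face data and entered names described above could be held in a small in-memory store; the sketch below matches a query face embedding against enrolled entries by cosine similarity. The embedding representation, the threshold value, and all function names are assumptions for illustration, not part of the patent.

```python
import math

face_db = {}  # name -> reference embedding (list of floats)

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def enroll(name, embedding):
    """Store one (name, face embedding) pair in the database."""
    face_db[name] = embedding

def identify(embedding, threshold=0.9):
    """Return the enrolled name whose embedding is most similar to the
    query, or None if no similarity exceeds the threshold."""
    best_name, best_score = None, threshold
    for name, ref in face_db.items():
        score = cosine(embedding, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

A real deployment would enroll several embeddings per person (frontal and side views, as the description suggests) and keep the best score across them.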
Person trajectory tracking only tracks people whose identity information is stored in the face database, and the tracking steps of the person tracking program comprise:
B1. initializing the coordinate point of a person at time t0 as previous_point = (0, 0) and at the current time t1 as current_point = (0, 0); setting the tracking list tracking_list, i.e. the list of persons currently being tracked, to empty; name_face_list is the binding list between person IDs and person names, where a random number is assigned as a person's ID when that person first appears in the video stream; numbering the cameras camera1, camera2, ..., cameraN;
B2. acquiring a frame from camera1, detecting all faces at the current time t1, and comparing them against the person information base; intersecting the recognized persons with tracking_list, setting the tracking state of persons in the intersection to 1 and of all others to 0;
B3. for every person detected in the frame, checking whether the person ID is already in name_face_list: if so, only updating the coordinate point information for that ID in tracking_list; if not, obtaining the ID of the face, matching it against the database, and on a successful match adding it to name_face_list, initializing its coordinate point information, and storing it in tracking_list;
B4. drawing the path of each person in the current tracking_list by connecting the two coordinate points current_point and previous_point on the two-dimensional plan;
B5. acquiring frame information from camera2, camera3, ..., cameraN;
B6. repeating steps B2, B3, and B4 to form a real-time trajectory map of each tracked person.
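Steps B1 to B6 can be sketched as a per-frame update routine. Everything below is a schematic reconstruction: face detection and plane projection are abstracted into a pre-computed `detections` list, and all names other than tracking_list and name_face_list are invented for illustration.

```python
# B1: global state — tracking_list holds coordinates and state per person ID,
# name_face_list binds each person ID to an enrolled name.
tracking_list = {}    # person ID -> {"previous": (x, y), "current": (x, y), "state": 0 or 1}
name_face_list = {}   # person ID -> name

def process_frame(detections, database):
    """One B2-B4 pass for a single camera frame.
    detections: list of (person_id, matched_name_or_None, plane_xy).
    database: set of enrolled names; only enrolled people are tracked.
    Returns the line segments (previous_point, current_point) to draw."""
    seen = set()
    segments = []
    for pid, name, xy in detections:
        if pid in name_face_list:
            # B3: known ID — only update its coordinate points.
            entry = tracking_list[pid]
            entry["previous"], entry["current"] = entry["current"], xy
        elif name is not None and name in database:
            # B3: new ID matched in the face database — bind and initialize.
            name_face_list[pid] = name
            tracking_list[pid] = {"previous": xy, "current": xy, "state": 1}
        else:
            continue  # unenrolled person: not tracked
        seen.add(pid)
    # B2: intersection with tracking_list -> state 1, all others -> state 0.
    for pid, entry in tracking_list.items():
        entry["state"] = 1 if pid in seen else 0
        if entry["state"] == 1 and entry["previous"] != entry["current"]:
            # B4: connect previous_point and current_point on the plan.
            segments.append((entry["previous"], entry["current"]))
    return segments

# B5/B6 would call process_frame once per camera (camera1 ... cameraN)
# on every time step, accumulating segments into the trajectory map.
```

The first sighting of a person yields no segment (previous and current coincide); from the second frame on, each update contributes one segment of the trajectory.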
In use, the spatial coordinates of the cameras in the park or another scene are first provided for installation; if the park is large, cameras can be added as appropriate. The software is deployed on a server equipped with a graphics card, and once the cameras are verified to work normally, the portrait acquisition program is started. Next, an overall panoramic two-dimensional image of the park is provided for marking subsequent person trajectories; the program automatically upgrades the face database by capturing several photos of each person to be located, including frontal and side views, and the portrait acquisition program is closed after everyone has been enrolled. Finally, the person tracking program is started; it automatically loads the current two-dimensional plan, and once a person's face is captured, track display begins on the plan of the park.
The above-mentioned embodiments do not limit the scope of the present invention, and various modifications and improvements of the technical solutions of the present invention by those skilled in the art should be included in the protection scope defined by the claims of the present invention without departing from the design concept of the present invention.
Claims (4)
1. A real-time two-dimensional plane display method for personnel trajectories based on multiple cameras, characterized by comprising the following steps:
s1, installing a plurality of cameras in a park, the recording ranges of the cameras covering the whole area of the park;
s2, determining the spatial coordinates of all cameras in the park;
s3, electrically connecting the cameras to the same server, on which a portrait acquisition program and a person tracking program are installed and a panoramic two-dimensional plan of the park is stored;
s4, collecting information for a face database: the portrait acquisition program calls the cameras to collect the facial feature information of people in the park and stores new face features to form new identities; after collection is finished, the portrait acquisition program is closed and the person tracking program is started;
s5, tracking person trajectories: the person tracking program identifies the motion track of the same person through consistency comparison of person features across frames, automatically adds the track to the current two-dimensional plan of the park, and superposes each person's track on the plan for display.
2. The real-time two-dimensional plane display method for personnel trajectories based on multiple cameras according to claim 1, wherein the method for determining the spatial coordinates of all cameras in the park comprises the following steps:
A1. selecting a corner of the park and defining its spatial coordinate point as the origin a(0, 0, 0);
A2. measuring the coordinates (x, y, z) of each camera relative to point a one by one with a distance-measuring tool, where x and y are the horizontal offsets from point a in centimeters and z is the mounting height of the camera;
A3. drawing a two-dimensional plan of the park at a 1:50 scale;
A4. obtaining the parameter information of each camera by camera calibration.
3. The real-time two-dimensional plane display method for personnel trajectories based on multiple cameras according to claim 1, wherein the portrait acquisition program captures face data together with the entered person name and stores both in the face database.
4. The real-time two-dimensional plane display method for personnel trajectories based on multiple cameras according to claim 1 or 3, wherein person trajectory tracking only tracks people whose identity information is stored in the face database, and the tracking steps of the person tracking program comprise:
B1. initializing the coordinate point of a person at time t0 as previous_point = (0, 0) and at the current time t1 as current_point = (0, 0); setting the tracking list tracking_list to empty and the person-ID-to-name binding list to name_face_list; numbering the cameras camera1, camera2, ..., cameraN;
B2. acquiring a frame from camera1, detecting all faces at the current time t1, and comparing them against the person information base; intersecting the recognized persons with tracking_list, setting the tracking state of persons in the intersection to 1 and of all others to 0;
B3. for every person detected in the frame, checking whether the person ID is already in name_face_list: if so, only updating the coordinate point information for that ID in tracking_list; if not, obtaining the ID of the face, matching it against the database, and on a successful match adding it to name_face_list, initializing its coordinate point information, and storing it in tracking_list;
B4. drawing the path of each person in the current tracking_list by connecting the two coordinate points current_point and previous_point on the two-dimensional plan;
B5. acquiring frame information from camera2, camera3, ..., cameraN;
B6. repeating steps B2, B3, and B4 to form a real-time trajectory map of each tracked person.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110749676.0A CN113610890A (en) | 2021-07-02 | 2021-07-02 | Real-time personnel track two-dimensional plane display method based on multiple cameras |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113610890A true CN113610890A (en) | 2021-11-05 |
Family
ID=78337204
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110749676.0A Pending CN113610890A (en) | 2021-07-02 | 2021-07-02 | Real-time personnel track two-dimensional plane display method based on multiple cameras |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113610890A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114286059A (en) * | 2021-12-31 | 2022-04-05 | 杭州登虹科技有限公司 | Wireless video monitoring system for kindergarten |
CN116777947A (en) * | 2023-06-21 | 2023-09-19 | 上海汉朔信息科技有限公司 | User track recognition prediction method and device and electronic equipment |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010063001A (en) * | 2008-09-05 | 2010-03-18 | Mitsubishi Electric Corp | Person-tracking device and person-tracking program |
CN109448026A (en) * | 2018-11-16 | 2019-03-08 | 南京甄视智能科技有限公司 | Passenger flow statistical method and system based on head and shoulder detection |
CN110135384A (en) * | 2019-04-03 | 2019-08-16 | 南通大学 | A kind of system and method for face tracking and identification based on video flowing |
KR20190142553A (en) * | 2018-06-18 | 2019-12-27 | 주식회사 쓰임기술 | Tracking method and system using a database of a person's faces |
CN112365522A (en) * | 2020-10-19 | 2021-02-12 | 中标慧安信息技术股份有限公司 | Method for tracking personnel in park across borders |
CN112733719A (en) * | 2021-01-11 | 2021-04-30 | 西南交通大学 | Cross-border pedestrian track detection method integrating human face and human body features |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |