CN111107319B - Target tracking method, device and system based on regional camera - Google Patents
- Publication number
- CN111107319B CN111107319B CN201911361106.3A CN201911361106A CN111107319B CN 111107319 B CN111107319 B CN 111107319B CN 201911361106 A CN201911361106 A CN 201911361106A CN 111107319 B CN111107319 B CN 111107319B
- Authority
- CN
- China
- Prior art keywords
- camera
- value
- local area
- information
- acquiring
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/70—Information retrieval; Database structures therefor; File system structures therefor of video data
- G06F16/78—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/787—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
Abstract
The invention discloses a target tracking method, device and system based on regional cameras, and relates to the technical field of video surveillance. The target tracking method based on regional cameras comprises the steps of: collecting the installation position information of the cameras; dividing local areas according to the camera installation positions, wherein the camera nodes within a local area share image data to form local cluster information; constructing object track monitoring data, and sorting the objects in ascending order of their occurrence probability values; and acquiring the feature information of a target object, performing feature comparison starting from the first object in the sorted order, and, when the comparison passes, taking the track monitoring data of the matched object as the target object's track. The invention provides a target tracking system that is fast and accurate and can handle multiple cameras and multiple scenes.
Description
Technical Field
The invention relates to the technical field of video monitoring.
Background
With the rising requirements of smart cities, more than ten million surveillance cameras are in use for city monitoring and alarm systems in China. Although adding cameras benefits large-scale security prevention and yields massive video data for real-time alarms and after-the-fact queries, how to effectively utilize this massive video data has become a huge challenge. To answer the questions of "who, where and what" about the people of interest in videos, intelligent video surveillance technology is applied. Its core is video content understanding based on computer vision: target behaviors in the videos are analyzed from the raw video images through a series of algorithms such as feature extraction, target detection and recognition, and target tracking. Most algorithms in current video surveillance technology are based on the analysis of a single video source. For example, if a person appears at a certain place at a certain moment, the person is searched for based on features extracted from that video source in order to find all information about the person.
Current research on multi-camera video surveillance mainly constructs a monitoring network structure, so as to obtain when a target object appears in a given camera within the monitored area and the target's movement path in the camera network map, thereby collecting the target's motion parameters. At present, a common search approach treats multiple video sources as one large monitoring system: the target object's features are queried against the video database of the entire monitoring system, the relevant information is searched for in that large database, and the time and place information is output. This approach requires centralized storage and processing of a large amount of video data, which not only places high demands on the hardware but also often makes each search take a long time, resulting in low search efficiency.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a target tracking method, device and system based on regional cameras. By constructing local areas of cameras, a target track is searched for within a local area rather than in the database of the entire monitoring system, forming a target tracking system that is fast and accurate and can handle multiple cameras and multiple scenes.
In order to achieve the above object, the present invention provides the following technical solutions:
a target tracking method based on a regional camera comprises the following steps:
collecting the installation position information of a camera;
dividing local areas according to the installation positions of the cameras, wherein each local area comprises a plurality of cameras, and each camera is a camera node; in each local area, sharing image data acquired by any one camera node to other camera nodes in the local area to form local cluster information;
detecting all objects appearing in the local cluster information, together with their appearance time information and appearance position information, to construct object track monitoring data; in the object track monitoring data, calculating the occurrence probability of each object according to its appearance pattern, and sorting the objects in ascending order of occurrence probability value;
and acquiring the feature information of a target object, performing feature comparison starting from the first object in the sorted order, and, when the comparison passes, taking the track monitoring data of the matched object as the target object's track.
Further, the step of calculating the occurrence probability of an object is:
in a local area, for the position of each camera node, acquiring the time information of all objects that have appeared at that position;
acquiring the number of days M[i] on which each object passes the position within a preset time period of N days, where i denotes the object's number, i = 1, 2, ……, and calculating the occurrence frequency value P[i] = M[i]/N for the object numbered i;
and taking the P[i] value as the object's occurrence probability value at that position, where a smaller P[i] value means a smaller occurrence probability value.
Further, the method also comprises the step of,
for multiple objects with the same P[i], acquiring the differences between the time points at which each object appeared at the corresponding position over the M[i] days;
judging whether the difference is smaller than a preset time threshold, where the preset time threshold corresponds to a preset additional value Y, and Y is a number greater than 0 and smaller than 1;
and when the difference is judged to be smaller than the preset time threshold, taking the sum of the P[i] value and the Y value as the object's occurrence probability value at that position.
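The probability-calculation steps above can be sketched as follows. The function name, input format and default values are illustrative assumptions, not part of the claims; note that the patent applies the additional value Y only to break ties between objects with equal P[i], whereas this sketch folds the time-clustering check in unconditionally for simplicity.

```python
from datetime import datetime

def occurrence_probability(sightings, n_days, time_threshold_h=2.0, bonus=0.1):
    """Occurrence probability value for one object at one position.

    sightings: list of datetime objects (the object's appearance times at
    this position) within the N-day window; an assumed input layout.
    """
    days = {s.date() for s in sightings}   # distinct days -> M[i]
    m = len(days)
    p = m / n_days                          # P[i] = M[i] / N
    # If the appearance times cluster within the preset threshold,
    # add the preset additional value Y (0 < Y < 1).
    hours = sorted(s.hour + s.minute / 60 for s in sightings)
    if hours and (hours[-1] - hours[0]) < time_threshold_h:
        p += bonus
    return p
```

For example, an object seen on 3 of 30 days, always between 8:00 and 9:00, gets P[i] = 0.1 plus the 0.1 bonus.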
Further, the appearance position information of the object is provided by the installation position of the camera, and the track of the object in the local area is obtained according to the installation position of the camera node where the object appears.
Further, each camera is provided with an independent memory to store image data taken by itself.
Further, for each local area, a local shared memory is provided to manage local cluster information, and each camera node in the local area can save data to the local shared memory and read data in the local shared memory.
The invention also provides a target tracking device, which comprises the following structure:
the system comprises a region dividing module, a local cluster information acquiring module and a processing module, wherein the region dividing module is used for acquiring the installation position information of the cameras and dividing local regions according to the installation positions of the cameras, each local region comprises a plurality of cameras, and each camera is a camera node;
the information processing module is used for detecting all objects appearing in the local cluster information, together with their appearance time information and appearance position information, to construct object track monitoring data; in the object track monitoring data, the occurrence probability of each object is calculated according to its appearance pattern, and the objects are sorted in ascending order of occurrence probability value;
and the target object tracking module is used for acquiring the feature information of a target object, performing feature comparison starting from the first object in the sorted order, and, when the comparison passes, taking the track monitoring data of the matched object as the target object's track.
Further, the information processing module includes a probability computation sub-module configured to,
in a local area, for the position of each camera node, acquiring the time information of all objects that have appeared at that position; acquiring the number of days M[i] on which each object passes the position within a preset time period of N days, where i denotes the object's number, i = 1, 2, ……, and calculating the occurrence frequency value P[i] = M[i]/N for the object numbered i; and taking the P[i] value as the object's occurrence probability value at that position, where a smaller P[i] value means a smaller occurrence probability value.
Further, the probability computation sub-module is further configured to,
for multiple objects with the same P[i], acquiring the differences between the time points at which each object appeared at the corresponding position over the M[i] days; judging whether the difference is smaller than a preset time threshold, where the preset time threshold corresponds to a preset additional value Y, and Y is a number greater than 0 and smaller than 1; and when the difference is judged to be smaller than the preset time threshold, taking the sum of the P[i] value and the Y value as the object's occurrence probability value at that position.
The invention also provides a target tracking system based on the area camera, which comprises a plurality of cameras and the target tracking device for acquiring the track of the target object.
Owing to the above technical solutions, compared with the prior art, the invention has, by way of example, the following advantages and positive effects: by constructing local areas of cameras, a target track is searched for within a local area rather than in the database of the entire monitoring system, forming a target tracking system that is fast and accurate and can handle multiple cameras and multiple scenes.
Drawings
Fig. 1 is a flowchart of a target tracking method based on a regional camera according to an embodiment of the present invention.
Fig. 2 is a diagram illustrating an exemplary operation of performing local area division according to an embodiment of the present invention.
Fig. 3 is a block diagram of a target tracking device according to an embodiment of the present invention.
Description of reference numerals:
a monitoring system 100;
the system comprises a region dividing module 210, a position information acquisition sub-module 211 and a region construction sub-module 212;
an information processing module 220, a probability calculation sub-module 221, and an object ordering sub-module 222;
a target object tracking module 230, a feature acquisition interface 231, and a feature comparison sub-module 232.
Detailed Description
The following describes the target tracking method, device and system based on the area camera disclosed in the present invention in further detail with reference to the accompanying drawings and specific embodiments. It should be noted that technical features or combinations of technical features described in the following embodiments should not be considered as being isolated, and they may be combined with each other to achieve better technical effects. In the drawings of the embodiments described below, the same reference numerals appearing in the respective drawings denote the same features or components, and may be applied to different embodiments. Thus, once an item is defined in one drawing, it need not be further discussed in subsequent drawings.
It should be noted that the structures, proportions and sizes shown in the drawings and described in the specification are only intended to aid understanding and reading of the present disclosure, and are not intended to limit the scope of the invention, which is defined by the claims; any modification of the structures, change of the proportions or adjustment of the sizes shall fall within the scope of the invention as long as the function and objectives of the invention are not affected. As would be understood by those reasonably skilled in the art, the scope of the preferred embodiments of the present invention also includes implementations in which functions are executed out of the order described or discussed, including substantially concurrently or in reverse order, depending on the functionality involved.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate. In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
Examples
Referring to fig. 1, a target tracking method based on a regional camera includes the steps of:
and S100, collecting the installation position information of the camera.
A plurality of cameras are arranged in the monitoring area, and each camera has a monitoring range based on the installation position.
In this embodiment, by way of example and not limitation, the installation position of a camera may be reported by the camera's wireless communication device, or a position sensor may be arranged on each camera to acquire its installation position.
S200, local areas are divided according to the installation positions of the cameras, each local area comprises a plurality of cameras, and each camera is a camera node. In each local area, image data acquired by any one camera node is shared with other camera nodes in the local area to form local cluster information.
The area division may be based on a user-defined selection; for example, a user may set the cameras arranged along a certain urban road as one local area as needed. Alternatively, the system may divide areas intelligently based on positional relevance; for example, by collecting the installation positions of the cameras, the cameras corresponding to all entrances and exits of a certain shopping plaza may be set as one local area.
Each local area comprises a plurality of camera nodes, and image data acquired by any one camera node is shared with other camera nodes in the local area to form local cluster information.
By way of example and not limitation, referring to FIG. 2, the monitoring area of the overall monitoring system 100 is divided into 3 local areas: local area 1 (110), local area 2 (120) and local area 3 (130); local area 1 includes 8 cameras, local area 2 includes 8 cameras, and local area 3 includes 8 cameras. It should be noted that the number of camera nodes may also differ between local areas, and this example should not be taken as a limitation of the present invention.
For the 8 camera nodes 111 in the 1st local area, the image data collected by any one camera node is shared with the other 7 camera nodes, and the monitoring data of the 8 camera nodes form the local cluster information of the 1st local area. For the 8 camera nodes 121 in the 2nd local area, the image data collected by any one camera node is shared with the other 7 camera nodes, and the monitoring data of the 8 camera nodes form the local cluster information of the 2nd local area. For the 8 camera nodes 131 in the 3rd local area, the image data collected by any one camera node is shared with the other 7 camera nodes, and the monitoring data of the 8 camera nodes form the local cluster information of the 3rd local area.
S300, detecting all objects appearing in the local cluster information, together with their appearance time information and appearance position information, to construct object track monitoring data; in the object track monitoring data, the occurrence probability of each object is calculated according to its appearance pattern, and the objects are sorted in ascending order of occurrence probability value.
For each local area, object track monitoring data are constructed from the corresponding local cluster information. The object track monitoring data comprise all objects appearing in the local cluster information, together with every object's appearance times and appearance positions. In this embodiment, an object's appearance position information is provided by the installation position of the camera, and the object's track within the local area is obtained from the installation positions of the camera nodes at which the object appears. For any object, its track is obtained by outputting the cameras' location information in chronological order.
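As a minimal sketch of this step (the tuple layout is an assumption for illustration), the track is simply the camera installation positions sorted by sighting time:

```python
def object_track(sightings):
    """Return an object's track through a local area: the sequence of
    camera installation positions ordered by appearance time.
    `sightings` is assumed to be a list of (timestamp, camera_position)
    tuples collected from the local cluster information."""
    return [pos for _, pos in sorted(sightings, key=lambda s: s[0])]
```

For example, sightings reported out of order by three camera nodes yield the positions in time order.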
The object may be a person, or may be other objects such as a pet, a vehicle, etc.
In a specific configuration, each camera may be provided with an independent memory to store the monitoring image data it captures.
Meanwhile, for each local area, a local shared memory may also be provided to manage the local cluster information. Each camera node in the local area can save data to, and read data from, the local shared memory.
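A minimal sketch of a local area with such a shared store follows; the class and method names are hypothetical, since the patent does not prescribe an API:

```python
from collections import defaultdict

class LocalArea:
    """A local area: each camera node writes its detections into a shared
    store (the local cluster information), readable by every node in the
    area. Names and structure are illustrative assumptions."""

    def __init__(self, camera_ids):
        self.camera_ids = set(camera_ids)
        # object_id -> [(timestamp, camera_id)], the shared sightings
        self.shared = defaultdict(list)

    def report(self, camera_id, object_id, timestamp):
        """A camera node saves one detection to the local shared memory."""
        if camera_id not in self.camera_ids:
            raise ValueError("camera does not belong to this local area")
        self.shared[object_id].append((timestamp, camera_id))

    def cluster_info(self):
        """Read back the local cluster information (visible to all nodes)."""
        return dict(self.shared)
```

Any node's detections are immediately visible to the whole area, which is what lets the track be assembled without querying the global database.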
In this embodiment, the step of calculating the occurrence probability of the object is:
in a local area, for the position of each camera node, acquiring the time information of all objects that have appeared at that position;
acquiring the number of days M[i] on which each object passes the position within a preset time period of N days, where i denotes the object's number, i = 1, 2, ……, and calculating the occurrence frequency value P[i] = M[i]/N for the object numbered i;
and taking the P[i] value as the object's occurrence probability value at that position, where a smaller P[i] value means a smaller occurrence probability value.
By way of example and not limitation, the preset time period N may be any length, such as three days, one week, half a month or one month; the user may select the length of track history to acquire as needed, and this should not be taken as a limitation of the present invention.
On the other hand, multiple objects with the same P[i] are further ordered as follows:
acquiring the differences between the time points at which each object appeared at the corresponding position over the M[i] days;
judging whether the difference is smaller than a preset time threshold, where the preset time threshold corresponds to a preset additional value Y, and Y is a number greater than 0 and smaller than 1;
and when the difference is judged to be smaller than the preset time threshold, taking the sum of the P[i] value and the Y value as the object's occurrence probability value at that position.
By way of example and not limitation, for the 1st local area in fig. 2, a database is established to store the local cluster information; the preset time period N is one month, and the preset time threshold is 2 hours. In each local area, the detected whereabouts of all persons are stored. By way of example and not limitation, suppose 4 persons (objects) are detected, namely A1, A2, B and C, with corresponding numbers 1, 2, 3, 4.
The appearance pattern of object A1 is: within one month, it passes through this area every day; M[1] = 30, P[1] = 30/30 = 1.
The appearance pattern of object A2 is: within one month, it passes through this area every day; M[2] = 30, P[2] = 30/30 = 1.
The appearance pattern of object B is: within one month, it passes through this area on 15 days; M[3] = 15, P[3] = 15/30 = 0.5.
The appearance pattern of object C is: within one month, it passes through this area on 2 days; M[4] = 2, P[4] = 2/30 ≈ 0.067.
Since object A1 and object A2 appeared on the same number of days, the differences between their daily appearance times are further compared.
By comparison, the differences between the time points at which object A1 appeared over the 30 days are smaller than the preset time threshold of 2 hours; with the preset additional value Y = 0.1, the occurrence probability value of object A1 is 1 + Y = 1.1. The differences between the time points at which object A2 appeared over the 30 days are greater than the preset time threshold of 2 hours, so the occurrence probability value of object A2 remains P[2] = 1.
The occurrence probability values of the four persons A1, A2, B and C are ranked in ascending order: C first, B second, A2 third and A1 fourth. That is, the smaller the occurrence probability value, the earlier the ranking. This ordering may be updated automatically as the monitoring data are updated.
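The worked example above can be reproduced numerically; N, M[i], Y and the object labels come from the example, while the code itself is an illustrative sketch:

```python
n_days = 30
m = {"A1": 30, "A2": 30, "B": 15, "C": 2}               # days seen, M[i]
p = {name: days / n_days for name, days in m.items()}    # P[i] = M[i]/N

# A1's daily appearance times differ by less than the 2-hour threshold,
# so the preset additional value Y = 0.1 is added; A2's do not cluster.
p["A1"] += 0.1

ranking = sorted(p, key=p.get)  # ascending occurrence probability value
print(ranking)                  # ['C', 'B', 'A2', 'A1']
```

This reproduces the ordering in the text: C is compared first, A1 last.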
S400, acquiring the feature information of a target object, performing feature comparison starting from the first object in the sorted order, and, when the comparison passes, taking the track monitoring data of the matched object as the target object's track.
When the track information of a suspected target needs to be checked, once the suspected target's feature information is input, comparison starts from the object with the lowest occurrence probability value in the database, so the target object to be tracked can be found quickly and accurately. During the data search, this scheme takes into account a characteristic of suspects carrying out certain specific acts: they do not want the act to be discovered, and therefore minimize the number of appearances at the same location to reduce the chance of being noticed. Based on the sorting from S300, the comparison starts from the object with the lowest occurrence probability value, so the track data of the suspected target can be extracted more efficiently.
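A minimal sketch of this lowest-probability-first search; the candidate record layout and the `match` predicate are hypothetical placeholders standing in for the feature-comparison step:

```python
def find_target_track(candidates, target_features, match):
    """Compare the target's features against candidates that are already
    sorted in ascending order of occurrence probability value; return the
    first matching candidate's track monitoring data, or None."""
    for obj in candidates:
        if match(obj["features"], target_features):
            return obj["track"]
    return None
```

Because rarely-seen objects come first, a suspected target that avoids repeated appearances is typically reached after very few comparisons.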
In this embodiment, the track of the target object is preferably output based on a real scene map of the monitored area.
Referring to fig. 3, a target tracking device is further provided according to another embodiment of the present invention.
The device 200 comprises the following structure:
the area division module 210 is configured to collect the camera installation position information and divide local areas according to the camera installation positions, where each local area includes multiple cameras and each camera is a camera node; within each local area, image data collected by any one camera node is shared with the other camera nodes in the area to form local cluster information.
In specific setting, the region dividing module 210 includes a position information collecting sub-module 211 and a region constructing sub-module 212.
And the position information acquisition submodule 211 is configured to acquire mounting position information of each camera.
The area construction submodule 212 is configured to perform local area division on the camera, and in each local area, share image data acquired by any one camera node to other camera nodes in the local area to form local cluster information. One local area corresponds to one local cluster information.
The information processing module 220 is used for detecting all objects appearing in the local cluster information, together with their appearance time information and appearance position information, to construct object track monitoring data; in the object track monitoring data, the occurrence probability of each object is calculated according to its appearance pattern, and the objects are sorted in ascending order of occurrence probability value.
In a specific setting, the information processing module 220 includes a probability calculation sub-module 221 and an object sorting sub-module 222.
A probability calculation sub-module 221, configured to perform the following operations: in a local area, for the position of each camera node, acquiring the time information of all objects that have appeared at that position; acquiring the number of days M[i] on which each object passes the position within a preset time period of N days, where i denotes the object's number, i = 1, 2, ……, and calculating the occurrence frequency value P[i] = M[i]/N for the object numbered i; and taking the P[i] value as the object's occurrence probability value at that position, where a smaller P[i] value means a smaller occurrence probability value.
For multiple objects with the same P[i], acquiring the differences between the time points at which each object appeared at the corresponding position over the M[i] days; judging whether the difference is smaller than a preset time threshold, where the preset time threshold corresponds to a preset additional value Y, and Y is a number greater than 0 and smaller than 1; and when the difference is judged to be smaller than the preset time threshold, taking the sum of the P[i] value and the Y value as the object's occurrence probability value at that position.
An object ordering sub-module 222, configured to perform the following operations: and sequencing the objects according to the occurrence probability value from small to large.
The target object tracking module 230 is configured to obtain the feature information of the target object, perform feature comparison starting from the first object in the aforementioned sorted order, and, when the comparison passes, obtain the track monitoring data of the matched object as the target object's track.
In a specific configuration, the target object tracking module 230 includes a feature acquisition interface 231 and a feature comparison sub-module 232.
The feature collection interface 231 is configured to obtain feature information of the target object, where the feature information may be text content or graphic content, and the feature collection interface may include a text input field and a photo upload field.
The feature comparison sub-module 232 is configured to perform feature comparison starting from the first object in the sorted order and, when the comparison passes, obtain the track monitoring data of the matched object as the target object's track. The track is preferably output based on a live-action map of the monitored area.
Other technical features are referred to the foregoing embodiments and will not be described herein.
In the foregoing description, the disclosure of the present invention is not intended to limit itself to these aspects. Rather, the various components may be selectively and operatively combined in any number within the intended scope of the present disclosure. In addition, terms like "comprising," "including," and "having" should be interpreted as inclusive or open-ended, rather than exclusive or closed-ended, by default, unless explicitly defined to the contrary. All technical, scientific, or other terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs unless defined otherwise. Common terms found in dictionaries should not be interpreted too ideally or too realistically in the context of related art documents unless the present disclosure expressly limits them to that. Any changes and modifications of the present invention based on the above disclosure will be within the scope of the appended claims.
Claims (6)
1. A target tracking method based on a regional camera is characterized by comprising the following steps:
collecting the installation position information of a camera;
dividing local areas according to the installation positions of the cameras, wherein each local area comprises a plurality of cameras, and each camera is a camera node; in each local area, sharing image data acquired by any one camera node to other camera nodes in the local area to form local cluster information;
detecting all objects appearing in the local cluster information, together with their appearance time information and appearance position information, to construct object track monitoring data; in the object track monitoring data, calculating the occurrence probability of each object according to its appearance pattern, and sorting the objects in ascending order of occurrence probability value;
acquiring the feature information of a target object, performing feature comparison starting from the first object in the sorted order, and, when the comparison passes, taking the track monitoring data of the matched object as the target object's track;
wherein the step of calculating the occurrence probability of the object is:
in a local area, acquiring time information of all objects which appear in the position corresponding to the local area aiming at the position of each camera node;
acquiring the number of days M [ i ] that each object passes through the position within N days of a preset time period, wherein i represents the number of the object, i =1,2, … …, and calculating the occurrence frequency value P [ i ] = M [ i ]/N of the object with the number i;
taking the P [ i ] value as the occurrence probability value of the object at the position, wherein the occurrence probability value corresponding to the smaller P [ i ] value is also smaller;
and for a plurality of objects with the same P [ i ], acquiring the difference value of the time points of the objects appearing at the corresponding positions in M [ i ] days; judging whether the difference is smaller than a preset time threshold, wherein the preset time threshold corresponds to a preset additional value Y, and the Y is a number which is larger than 0 and smaller than 1; and when the difference value is judged to be smaller than the preset time threshold value, taking the sum of the P [ i ] value and the Y value as the occurrence probability value of the object at the position.
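For illustration only (not part of the claims), the occurrence-probability calculation described above can be sketched as follows. The function name `compute_probabilities`, the `sightings` mapping, and the parameters `time_threshold` and `extra_value` are assumptions introduced for this example, not terms from the patent:

```python
from collections import defaultdict

def compute_probabilities(sightings, n_days, time_threshold, extra_value):
    """Sketch of the claimed calculation for one camera position.

    sightings: {object_id: [timestamp_in_seconds, ...]} collected over a
    preset period of n_days. Returns {object_id: occurrence probability}.
    """
    assert 0.0 < extra_value < 1.0  # the claim requires 0 < Y < 1

    # M[i]: number of distinct days on which object i passed the position.
    days_seen = {
        obj: len({int(t // 86400) for t in times})
        for obj, times in sightings.items()
    }
    # P[i] = M[i] / N: base occurrence frequency value.
    prob = {obj: m / n_days for obj, m in days_seen.items()}

    # Tie-break: among objects sharing the same P[i], an object whose daily
    # appearance times differ by less than the preset threshold (i.e. it
    # shows up at a regular time) receives P[i] + Y instead.
    by_p = defaultdict(list)
    for obj, p in prob.items():
        by_p[p].append(obj)
    for p, objs in by_p.items():
        if len(objs) < 2:
            continue
        for obj in objs:
            times_of_day = sorted(t % 86400 for t in sightings[obj])
            spread = times_of_day[-1] - times_of_day[0]
            if spread < time_threshold:
                prob[obj] = p + extra_value
    return prob
```

Objects would then be compared against the target's features in ascending order of the returned probability values, so the rarest (most anomalous) objects are checked first.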
2. The method of claim 1, wherein the appearance position information of an object is provided by the installation position of the camera, and the track of the object within the local area is obtained from the installation positions of the camera nodes at which the object appears.
3. The method of claim 1, wherein each camera is configured with a separate memory to store the image data it captures.
4. The method of claim 3, wherein for each local area a local shared memory is provided to manage the local cluster information, and each camera node in the local area can save data to, and read data from, the local shared memory.
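For illustration only (not part of the claims), one way the per-area shared memory described above might be realized is sketched below. The `LocalSharedMemory` and `CameraNode` classes and all method names are assumptions introduced for this example:

```python
import threading

class LocalSharedMemory:
    """Per-local-area store that every camera node in the area can write
    captured frames to and read other nodes' frames from, forming the
    local cluster information."""

    def __init__(self):
        self._lock = threading.Lock()
        self._frames = {}  # camera_id -> list of frame records

    def save(self, camera_id, frame):
        with self._lock:
            self._frames.setdefault(camera_id, []).append(frame)

    def read_cluster(self):
        """Return a snapshot of all nodes' data (the local cluster information)."""
        with self._lock:
            return {cid: list(frames) for cid, frames in self._frames.items()}

class CameraNode:
    def __init__(self, camera_id, shared):
        self.camera_id = camera_id
        self.shared = shared  # the area's local shared memory

    def capture(self, frame):
        # Each captured frame is shared with the area, so every other node
        # in the local area can read it back via read_cluster().
        self.shared.save(self.camera_id, frame)
```

The lock makes concurrent writes from multiple camera nodes safe within one process; a real deployment would presumably use a networked store rather than in-process memory.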
5. An object tracking device, characterized by comprising:
an area division module for acquiring installation position information of cameras and dividing local areas according to the installation positions of the cameras, wherein each local area comprises a plurality of cameras and each camera is a camera node; and, within each local area, sharing image data acquired by any one camera node with the other camera nodes in the local area to form local cluster information;
an information processing module for detecting all objects appearing in the local cluster information, together with their appearance time information and appearance position information, to construct object track monitoring data; and, in the object track monitoring data, calculating the occurrence probability of each object according to the object's occurrence pattern and sorting the objects in ascending order of occurrence probability value;
a target object tracking module for acquiring characteristic information of a target object, performing feature comparison starting from the first object in the sorted order, and, when the feature comparison passes, acquiring the track monitoring data of the corresponding object as the track of the target object;
wherein the information processing module comprises a probability computation submodule configured to: in a local area, for the position of each camera node, acquire time information of all objects that have appeared at the position corresponding to that camera node; acquire the number of days M[i] on which each object passes through the position within a preset time period of N days, where i denotes the number of the object, i = 1, 2, ..., and calculate the occurrence frequency value P[i] = M[i]/N of the object numbered i; take the P[i] value as the occurrence probability value of the object at the position, a smaller P[i] value corresponding to a smaller occurrence probability value;
and, for a plurality of objects with the same P[i], acquire the difference between the time points at which each object appeared at the corresponding position over the M[i] days; judge whether the difference is smaller than a preset time threshold, the preset time threshold corresponding to a preset additional value Y, where Y is a number greater than 0 and less than 1; and, when the difference is judged to be smaller than the preset time threshold, take the sum of the P[i] value and the Y value as the occurrence probability value of the object at the position.
6. A target tracking system based on regional cameras, comprising a plurality of cameras, characterized by further comprising the target tracking device of claim 5 to obtain the track of a target object.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911361106.3A CN111107319B (en) | 2019-12-25 | 2019-12-25 | Target tracking method, device and system based on regional camera |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111107319A CN111107319A (en) | 2020-05-05 |
CN111107319B true CN111107319B (en) | 2021-05-28 |
Family
ID=70425130
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911361106.3A Active CN111107319B (en) | 2019-12-25 | 2019-12-25 | Target tracking method, device and system based on regional camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111107319B (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111899281B (en) * | 2020-07-15 | 2023-10-31 | 北京和利时系统工程有限公司 | Method and system for realizing control strategy of auxiliary monitoring system based on behavior tree |
CN113343795B (en) * | 2021-05-24 | 2024-04-26 | 广州智慧城市发展研究院 | Target associated video tracking processing method |
CN113438450B (en) * | 2021-06-11 | 2022-05-17 | 深圳市大工创新技术有限公司 | Dynamic target tracking and monitoring method |
CN117407480B (en) * | 2023-12-14 | 2024-03-19 | 杭州计算机外部设备研究所(中国电子科技集团公司第五十二研究所) | Map display method and device based on photoelectric holder |
CN117557789B (en) * | 2024-01-12 | 2024-04-09 | 国研软件股份有限公司 | Intelligent detection method and system for offshore targets |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2265023A1 (en) * | 2008-03-14 | 2010-12-22 | Sony Computer Entertainment Inc. | Subject tracking device and subject tracking method |
CN105491327A (en) * | 2015-11-18 | 2016-04-13 | 浙江宇视科技有限公司 | Video tracking method and device based on road network |
CN106295594A (en) * | 2016-08-17 | 2017-01-04 | 北京大学 | Cross-camera target tracking method and device based on dynamic route tree
CN106570147A (en) * | 2016-10-27 | 2017-04-19 | 武汉烽火众智数字技术有限责任公司 | GIS road network analysis-based jump type video tracking method and system |
CN109344267A (en) * | 2018-09-06 | 2019-02-15 | 苏州千视通视觉科技股份有限公司 | Relay tracking method and system based on PGIS map
CN109992726A (en) * | 2018-10-17 | 2019-07-09 | 招商银行股份有限公司 | Position prediction method, device, and readable storage medium
CN110188691A (en) * | 2019-05-30 | 2019-08-30 | 银河水滴科技(北京)有限公司 | Motion track determination method and device
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109598240B (en) * | 2018-12-05 | 2019-11-05 | 深圳市安软慧视科技有限公司 | Fast video object re-identification method and system
CN110519324B (en) * | 2019-06-06 | 2020-08-25 | 特斯联(北京)科技有限公司 | Person tracking method and system based on network track big data |
- 2019-12-25 CN CN201911361106.3A patent/CN111107319B/en active Active
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CP02 | Change in the address of a patent holder |
Address after: 8th floor, Building 1, 298 Xiangke Road, Pudong New Area, Shanghai 201210
Patentee after: MOUXIN TECHNOLOGY (SHANGHAI) Co.,Ltd.
Address before: Room 507, Building 1, No. 800 Naxian Road, Pilot Free Trade Zone, Pudong New Area, Shanghai 201210
Patentee before: MOUXIN TECHNOLOGY (SHANGHAI) Co.,Ltd.