CN113536084B - Space-time event extraction system and method - Google Patents


Info

Publication number
CN113536084B
Authority
CN
China
Prior art keywords
time
target
data
space
characteristic value
Prior art date
Legal status
Active
Application number
CN202110724990.3A
Other languages
Chinese (zh)
Other versions
CN113536084A (en)
Inventor
杨帆
董正宏
吴忠望
Current Assignee
Peoples Liberation Army Strategic Support Force Aerospace Engineering University
Original Assignee
Peoples Liberation Army Strategic Support Force Aerospace Engineering University
Priority date
Filing date
Publication date
Application filed by Peoples Liberation Army Strategic Support Force Aerospace Engineering University
Priority to CN202110724990.3A
Publication of CN113536084A
Application granted
Publication of CN113536084B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 - Details of database functions independent of the retrieved data types
    • G06F16/907 - Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/909 - Retrieval characterised by using metadata, using geographical or spatial information, e.g. location
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G06N3/08 - Learning methods


Abstract

The invention relates to a space-time event extraction method comprising the following steps: 1. acquiring real-time space-time trajectory data of a target; 2. extracting a characteristic value from each piece of target real-time space-time trajectory data according to the space-time meta-model; 3. judging whether the characteristic value of the target real-time space-time trajectory data and the characteristic value of an event type satisfy the corresponding relation, and, if they do, judging that the target real-time space-time trajectory data belongs to that event type. The space-time event extraction system provided by the invention can extract events from a series of factors such as the space-time relationship, stay location, stay duration, visiting order and occurrence time, obtaining richer aerospace information and providing support for extracting the target space-time association relationships.

Description

Space-time event extraction system and method
Technical Field
The invention relates to the technical field of computer data processing, in particular to a system and a method for extracting a spatiotemporal event.
Background
In order to integrate aerospace multi-source heterogeneous information at all times, in all weather and over all regions, in service of national security, command decision-making and the civil economy, it is necessary not only to know when and where a target is, but also to mine what the target is doing there, so that rich target information can be obtained. On the basis of China's existing aerospace electronic-sensing and remote-sensing information resources, research on space-time event extraction from aerospace multi-source heterogeneous information should be carried out as soon as possible, so that the events targets participate in at different places can be effectively mined, the discovery and prediction of sensitive events involving sensitive targets can be addressed, and situations can be predicted in time.
Extracting the events related to a target from raw data is an effective way to reduce data volume and pattern variation and to sharpen the focus of analysis. At present, event detection mainly relies on outliers in time-series data, which presupposes event data produced by a single behavior, for example click counts or visit counts. However, the target-participated events reflected in a space-time data sequence are not outliers, so this patent judges the events a target participates in from the relationships among a series of elements such as the target, the space-time relationship, the stay location, the stay duration, the visiting order and the occurrence time.
Disclosure of Invention
In view of this, there is a need for a space-time event extraction system that can determine the type of event a target participates in from the relationships between the elements of a space-time data sequence, so as to solve the following problem: existing event extraction techniques mainly determine the event type from outliers in the space-time data sequence, and therefore cannot accurately determine the event type of target participation when the data in the sequence are not outliers.
In order to achieve the above purpose, the invention provides the following technical scheme:
the invention provides a space-time event extraction system, which comprises a data acquisition module, a data preparation module, a data training module and a data judgment module;
the data acquisition module is used for acquiring target historical space-time trajectory data and target real-time space-time trajectory data and respectively sending the data to the data preparation module;
the data preparation module is used for receiving the target historical space-time trajectory data and the target real-time space-time trajectory data sent by the data acquisition module, extracting a characteristic value from each piece of target real-time space-time trajectory data according to a built-in event meta-model and sending it to the data judgment module, and extracting a characteristic value from each piece of target historical space-time trajectory data, marking the event type of each piece, and sending the characteristic value and the event type of each piece of target historical space-time trajectory data to the data training module as training data;
the data training module is used for receiving training data sent by the data preparation module, taking the training data as the input of a convolutional neural network, training the corresponding relation between the characteristic value of each piece of target historical space-time trajectory data and the event type by using the conventional convolutional neural network method, obtaining the characteristic value of each type of event and sending the characteristic value to the data judgment module;
the data judgment module is used for receiving the characteristic value of each piece of target real-time space-time trajectory data sent by the data preparation module and the characteristic value of each type of event sent by the data training module, judging whether the characteristic value of each piece of target real-time space-time trajectory data and the characteristic value of the event type meet the corresponding relation or not, and if the characteristic value of the target real-time space-time trajectory data and the characteristic value of a certain event type meet the corresponding relation, judging that the target real-time space-time trajectory data is the event type.
Further, the target historical space-time trajectory data comprises the target type, the time when the target reaches each track node, the dwell time of the target at each track node, the stay position of the target at each track node, and the spatial semantic information of the stay position.
Further, the target real-time space-time trajectory data likewise comprises the target type, the time when the target reaches each track node, the dwell time of the target at each track node, the stay position of the target at each track node, and the spatial semantic information of the stay position.
Further, the characteristic value of each piece of target historical space-time data and of each piece of target real-time space-time data is extracted through an event meta-model; both characteristic values comprise the target type, the time when the target reaches each track node, the dwell time of the target at each track node, and the stay position of the target at each track node. A space-time trajectory may involve a space-time sequence over many sites, but the space-time relationship between any two consecutively visited sites can be summarized by the event meta-model; that is, event meta-model instances can be combined to describe a space-time trajectory over multiple visited sites. The expression of the event meta-model is as follows:
$$\left(c_0,\ L_n,\ T_n,\ T_n^{\Delta},\ L_{n+1},\ T_{n+1},\ T_{n+1}^{\Delta}\right) \rightarrow E$$

wherein $c_0$ denotes the target type, $L_n$ the stay position at the $n$-th track node, $T_n$ the moment at which the target reaches the $n$-th track node, $T_n^{\Delta}$ the dwell time of the target at the $n$-th track node, $L_{n+1}$, $T_{n+1}$ and $T_{n+1}^{\Delta}$ the corresponding stay position, arrival moment and dwell time at the $(n+1)$-th track node, and $E$ the marked event type, $n = 1, 2, 3, \ldots$
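The meta-model is a flat tuple over two consecutively visited track nodes plus the target type and, for labeled data, the event type. As a minimal sketch (all class and field names are illustrative, not taken from the patent), it can be represented and extracted from a trajectory as follows:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass(frozen=True)
class EventMetaModel:
    """One meta-model instance: the target type plus two consecutively
    visited track nodes (stay position, arrival time, dwell time each)."""
    target_type: str                   # c_0
    pos_n: str                         # L_n, stay position at the n-th node
    t_n: float                         # T_n, arrival time at the n-th node
    dwell_n: float                     # dwell time at the n-th node
    pos_n1: str                        # L_{n+1}
    t_n1: float                        # T_{n+1}
    dwell_n1: float                    # dwell time at the (n+1)-th node
    event_type: Optional[str] = None   # E, the marked type (None if unlabeled)

def segments(target_type: str,
             visits: List[Tuple[str, float, float]]) -> List[EventMetaModel]:
    """Split a trajectory, given as (position, arrival_time, dwell_time)
    visits, into one meta-model instance per pair of consecutive nodes."""
    return [EventMetaModel(target_type, *visits[i], *visits[i + 1])
            for i in range(len(visits) - 1)]
```

A three-node trajectory thus yields two meta-model instances, one per consecutively visited pair of sites, matching the text's point that instances can be combined to cover a longer trajectory.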
Further, the data judgment module judges whether the characteristic value of each piece of target real-time spatiotemporal trajectory data and the characteristic value of the event type satisfy the corresponding relationship, and the method comprises the following steps:
if the characteristic value of the target real-time space-time trajectory data $P_j$ is

$$P_j = \left(c_{0j},\ L_{nj},\ T_{nj},\ T_{nj}^{\Delta},\ L_{(n+1)j},\ T_{(n+1)j},\ T_{(n+1)j}^{\Delta}\right)$$

wherein $c_{0j}$ is the target type in the $j$-th track, $L_{nj}$ is the $n$-th node reached in the $j$-th track, $T_{nj}$ is the moment of reaching the $n$-th node in the $j$-th track, $T_{nj}^{\Delta}$ is the dwell time at the $n$-th node in the $j$-th track, and $L_{(n+1)j}$, $T_{(n+1)j}$ and $T_{(n+1)j}^{\Delta}$ are the corresponding quantities at the $(n+1)$-th node;

and the characteristic value of event type $E_i$ is

$$E_i = \left(c_{0i},\ L_{ni},\ T_{ni},\ T_{ni}^{\Delta},\ L_{(n+1)i},\ T_{(n+1)i},\ T_{(n+1)i}^{\Delta}\right)$$

then the similarity $S_i$ of $P_j$ and $E_i$ can be expressed as

$$S_i = \operatorname{sim}\left(P_j,\ E_i\right)$$

and the event type $\mathrm{Type}_j$ of $P_j$ is

$$\mathrm{Type}_j = \arg\max_i S_i$$
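The judgment step compares a real-time feature tuple against each event type's characteristic tuple and picks the most similar type. The concrete similarity formula is not recoverable from the text, so the sketch below is a hypothetical stand-in: exact match on categorical attributes, a tolerance on the time attributes, and an argmax with a rejection threshold (all names and the threshold are assumptions, not the patent's):

```python
def similarity(p, e, time_tol=1.0):
    """Fraction of matching attributes between a real-time feature tuple p
    and an event-type feature tuple e, both of the form
    (target_type, L_n, T_n, dwell_n, L_n1, T_n1, dwell_n1).
    Strings must match exactly; numbers must agree within time_tol."""
    score = 0
    for a, b in zip(p, e):
        if isinstance(a, (int, float)) and isinstance(b, (int, float)):
            score += abs(a - b) <= time_tol
        else:
            score += a == b
    return score / len(p)

def classify(p, event_types, threshold=0.5):
    """Type_j = argmax_i S_i over {event name: feature tuple};
    returns None when no event type is similar enough."""
    best_type, best_s = None, threshold
    for name, e in event_types.items():
        s = similarity(p, e)
        if s > best_s:
            best_type, best_s = name, s
    return best_type
```

A real-time tuple close to one event type's characteristic tuple on every attribute is assigned that type; a tuple far from all of them is rejected rather than forced into a class.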
further, training the training data by using a convolutional neural network, namely classifying the training data according to event types; the output of the convolutional neural network is the classification result, and the classification result can be classified into what types of event types; the form of each class of output is also consistent with the meta-model.
The invention also provides a spatiotemporal event extraction method, which uses the spatiotemporal event extraction system and comprises the following steps:
1. acquiring real-time space-time trajectory data of a target;
2. extracting a characteristic value of each piece of target real-time space-time trajectory data according to the event meta-model;
3. judging whether the characteristic value of the target real-time space-time trajectory data and the characteristic value of an event type satisfy the corresponding relation; if the characteristic value of the target real-time space-time trajectory data and the characteristic value of a certain event type satisfy the corresponding relation, the target real-time space-time trajectory data is judged to be of that event type.
Further, the characteristic value of each type of event is obtained by the following steps:
1. acquiring historical space-time trajectory data of a target from space electronic sensing data, space remote sensing images and/or GIS data;
2. extracting a characteristic value of each piece of target historical space-time trajectory data according to the event meta-model, and marking the event type of each piece of target historical space-time trajectory data;
3. taking the characteristic value and the event type of each piece of target historical space-time trajectory data as the input of a convolutional neural network, and training the correspondence between the characteristic value and the event type of each piece of target historical space-time trajectory data with a conventional convolutional neural network method to obtain the characteristic value of each type of event.
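The training step consumes (feature value, event type) pairs and produces one characteristic feature value per event type. The patent specifies a conventional convolutional neural network for this; the sketch below substitutes a per-class centroid over numeric feature vectors, a deliberately simplified, hypothetical stand-in with the same input/output contract (function and variable names are illustrative):

```python
from collections import defaultdict

def train_event_features(samples):
    """Given labeled samples (numeric feature vector, event type), derive a
    characteristic feature vector per event type as the per-class mean.
    (A stand-in for the patent's CNN training; same inputs and outputs.)"""
    by_type = defaultdict(list)
    for features, event_type in samples:
        by_type[event_type].append(features)
    return {
        event_type: tuple(sum(col) / len(col) for col in zip(*vecs))
        for event_type, vecs in by_type.items()
    }
```

The returned mapping plays the role of "the characteristic value of each type of event" that the data judgment step consumes.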
Further, acquiring a target type, a time when the target reaches each track, a stay time of the target in each track and a stay position of the target in each track from the space electronic perception data; acquiring a target type, the time when the target reaches each track, the stay time of the target in each track and the stay position of the target in each track from the space remote sensing image; and acquiring the staying position of the target in each track, the staying time of the target in each track and the space semantic information of the staying position from the GIS data.
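The three sources contribute overlapping fields, so assembling one track node is a field-level merge. A sketch under assumed record layouts (all dictionary keys are illustrative; the patent only names the sources and which fields each provides):

```python
def fuse_sources(elec, remote, gis):
    """Merge one track node's fields from the three sources named in the text.
    elec/remote: electronic-sensing and remote-sensing records carrying the
    target type, arrival time, dwell time and stay position; gis: a GIS lookup
    from stay position to its spatial semantic information."""
    node = {}
    # electronic sensing and the remote-sensing image both report the
    # kinematic fields; prefer electronic sensing, fall back to remote sensing
    for key in ("target_type", "arrival_time", "dwell_time", "stay_position"):
        node[key] = elec.get(key, remote.get(key))
    # GIS data contributes the spatial semantic information of the stay position
    node["stay_semantics"] = gis.get(node["stay_position"])
    return node
```

The fallback order (electronic sensing first) is a design assumption; the patent does not state which source wins when both report a field.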
The invention has the beneficial technical effects that:
the space-time event extraction system and method provided by the invention can extract events according to a series of factors such as space-time relationship, dwell place, dwell time, access sequence, occurrence time and the like, obtain richer space flight information and provide support for extracting the target space-time association relationship.
Drawings
FIG. 1 is a flow chart of spatiotemporal event extraction;
FIG. 2 is a diagram of an event meta-model.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
The invention provides a space-time event extraction system, which comprises a data acquisition module, a data preparation module, a data training module and a data judgment module;
the data acquisition module is used for acquiring target historical space-time trajectory data and target real-time space-time trajectory data and respectively sending the data to the data preparation module;
the data preparation module is used for receiving the target historical space-time trajectory data and the target real-time space-time trajectory data sent by the data acquisition module, extracting a characteristic value of each piece of target real-time space-time trajectory data according to a built-in event meta-model and sending the characteristic value to the data judgment module;
extracting the characteristic value of each piece of target historical space-time trajectory data, marking the event type of each piece of target historical space-time trajectory data, and sending the characteristic value and the event type of each piece of target historical space-time trajectory data as training data to a data training module;
the data training module is used for receiving training data sent by the data preparation module, taking the training data as the input of a convolutional neural network, training the corresponding relation between the characteristic value of each piece of target historical space-time trajectory data and the event type by using the conventional convolutional neural network method, obtaining the characteristic value of each type of event and sending the characteristic value to the data judgment module;
the data judgment module is used for receiving the characteristic value of each target real-time space-time trajectory data sent by the data preparation module and the characteristic value of each type of event sent by the data training module,
and judging whether the characteristic value of each piece of target real-time space-time trajectory data and the characteristic value of the event type meet the corresponding relationship, and if the characteristic value of the target real-time space-time trajectory data and the characteristic value of a certain event type meet the corresponding relationship, judging that the target real-time space-time trajectory data is the event type.
Further, the target historical space-time trajectory data comprises the target type, the time when the target reaches each track node, the dwell time of the target at each track node, the stay position of the target at each track node, and the spatial semantic information of the stay position.
Further, the target real-time space-time trajectory data likewise comprises the target type, the time when the target reaches each track node, the dwell time of the target at each track node, the stay position of the target at each track node, and the spatial semantic information of the stay position.
Further, the characteristic value of each piece of target historical space-time data and of each piece of target real-time space-time data is extracted through an event meta-model; both comprise the target type, the time when the target reaches each track node, the dwell time of the target at each track node, and the stay position of the target at each track node. A space-time trajectory may involve a space-time sequence over many sites, but the space-time relationship between any two consecutively visited sites can be summarized by the event meta-model; that is, event meta-model instances can be combined to describe a space-time trajectory over multiple visited sites. The expression of the event meta-model is as follows:
$$\left(c_0,\ L_n,\ T_n,\ T_n^{\Delta},\ L_{n+1},\ T_{n+1},\ T_{n+1}^{\Delta}\right) \rightarrow E$$

wherein $c_0$ denotes the target type, $L_n$ the stay position at the $n$-th track node, $T_n$ the moment at which the target reaches the $n$-th track node, $T_n^{\Delta}$ the dwell time of the target at the $n$-th track node, $L_{n+1}$, $T_{n+1}$ and $T_{n+1}^{\Delta}$ the corresponding stay position, arrival moment and dwell time at the $(n+1)$-th track node, and $E$ the marked event type, $n = 1, 2, 3, \ldots$
Further, training on the training data with the convolutional neural network is a process of classifying the training data by event type. The output of the convolutional neural network is the classification result, with as many output classes as there are event types. The form of each output class is consistent with the meta-model: for a certain class of events $E_i$, each attribute takes a specific characteristic value, such as [cruiser, quay 1, 22].
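Before meta-model tuples can be fed to a convolutional neural network, the categorical attributes (target type, stay positions) need a numeric encoding; the time and dwell attributes are already numeric. The patent does not specify an encoding, so the index-lookup scheme below is a minimal, hypothetical choice:

```python
def encode(segment, vocab):
    """Encode one meta-model tuple (type, L_n, T_n, dwell_n, L_n1, T_n1,
    dwell_n1) as a flat numeric vector: string fields are mapped to indices
    via vocab (built up from the training data as new values appear),
    numeric fields are passed through as floats."""
    out = []
    for value in segment:
        if isinstance(value, str):
            out.append(float(vocab.setdefault(value, len(vocab))))
        else:
            out.append(float(value))
    return out
```

Sharing one vocab across the whole training set keeps the encoding consistent between historical and real-time tuples, which is what lets the trained characteristic values be compared against real-time features later.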
Further, the data judgment module judges whether the characteristic value of each piece of target real-time spatiotemporal trajectory data and the characteristic value of the event type satisfy the corresponding relationship, and comprises the following steps:
if the characteristic value of the target real-time space-time trajectory data $P_j$ is

$$P_j = \left(c_{0j},\ L_{nj},\ T_{nj},\ T_{nj}^{\Delta},\ L_{(n+1)j},\ T_{(n+1)j},\ T_{(n+1)j}^{\Delta}\right)$$

wherein $P_j$ is the $j$-th track (it is not specified in advance which type of target the track belongs to), $c_{0j}$ is the target type in the $j$-th track, $L_{nj}$ is the $n$-th node reached in the $j$-th track, $T_{nj}$ is the moment of reaching the $n$-th node in the $j$-th track, $T_{nj}^{\Delta}$ is the dwell time at the $n$-th node in the $j$-th track, and $L_{(n+1)j}$, $T_{(n+1)j}$ and $T_{(n+1)j}^{\Delta}$ are the corresponding quantities at the $(n+1)$-th node;

the characteristic value of event type $E_i$ is

$$E_i = \left(c_{0i},\ L_{ni},\ T_{ni},\ T_{ni}^{\Delta},\ L_{(n+1)i},\ T_{(n+1)i},\ T_{(n+1)i}^{\Delta}\right)$$

wherein $E_i$ is the $i$-th model among the trained space-time event models;

then the similarity $S_i$ of $P_j$ and $E_i$ can be expressed as

$$S_i = \operatorname{sim}\left(P_j,\ E_i\right)$$

and the event type $\mathrm{Type}_j$ of $P_j$ is

$$\mathrm{Type}_j = \arg\max_i S_i$$
The invention also provides a spatiotemporal event extraction method, which uses the spatiotemporal event extraction system and comprises the following steps:
1. data acquisition: acquiring target historical space-time trajectory data and target real-time space-time trajectory data from space electronic sensing data, space remote sensing images and/or GIS data;
2. preparing data: extracting a characteristic value of each piece of target historical space-time trajectory data according to the event meta-model, and marking the event type of each piece of target historical space-time trajectory data; extracting a characteristic value of each piece of target real-time space-time trajectory data according to the event meta-model;
3. training data: taking the characteristic value and the event type of each piece of target historical space-time trajectory data as the input of a convolutional neural network, training the corresponding relation between the characteristic value and the event type of each piece of target historical space-time trajectory data by using the conventional convolutional neural network method, and obtaining the characteristic value of each type of event;
4. data judgment: judging whether the characteristic value of the target real-time space-time trajectory data and the characteristic value of an event type satisfy the corresponding relation; if the characteristic value of the target real-time space-time trajectory data and the characteristic value of a certain event type satisfy the corresponding relation, the target real-time space-time trajectory data is judged to be of that event type.
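The four steps compose into one pipeline: prepare and train on labeled history, then extract and judge each real-time trajectory. The sketch below wires the stages together, with the three stage functions passed in as callables since each was described separately above; the trivial stand-ins in the test are illustrative only, not the patent's implementations:

```python
def extract_events(history, realtime, train, extract, match):
    """Steps 1-4 end to end: extract features from labeled historical
    trajectories, train per-event-type characteristic values, then extract
    features from each real-time trajectory and match it against the
    trained values, returning one event type (or None) per trajectory."""
    event_features = train([(extract(t), label) for t, label in history])
    return [match(extract(t), event_features) for t in realtime]
```

With identity feature extraction, a lookup-table "trainer" and exact matching, a real-time trajectory equal to a labeled one is assigned that label and anything else is rejected.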
Further, acquiring a target type, a time when the target reaches each track, a stay time of the target in each track and a stay position of the target in each track from the space electronic perception data; acquiring a target type, the time when the target reaches each track, the stay time of the target in each track and the stay position of the target in each track from the space remote sensing image; and acquiring the staying position of the target in each track, the staying time of the target in each track and the spatial semantic information of the staying position from the GIS data.
The above embodiments express only several implementations of the present invention, and although their description is specific and detailed, it should not therefore be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and improvements without departing from the concept of the invention, all of which fall within the protection scope of the invention. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (10)

1. A space-time event extraction system is characterized by comprising a data acquisition module, a data preparation module, a data training module and a data judgment module;
the data acquisition module is used for acquiring target historical space-time trajectory data and target real-time space-time trajectory data and respectively sending the data to the data preparation module;
the data preparation module is used for receiving the target historical space-time trajectory data and the target real-time space-time trajectory data sent by the data acquisition module, extracting a characteristic value of each piece of target real-time space-time trajectory data and sending the characteristic value to the data judgment module, extracting a characteristic value of each piece of target historical space-time trajectory data, marking an event type of each piece of target historical space-time trajectory data, and sending the characteristic value and the event type of each piece of target historical space-time trajectory data as training data to the data training module;
the data training module is used for receiving training data sent by the data preparation module, using the training data as input of a convolutional neural network, training the corresponding relation between the characteristic value of each piece of target historical space-time trajectory data and the event type, obtaining the characteristic value of each type of event and sending the characteristic value to the data judgment module;
the data judgment module is used for receiving the characteristic value of each piece of target real-time space-time trajectory data sent by the data preparation module and the characteristic value of each type of event sent by the data training module, judging whether the characteristic value of each piece of target real-time space-time trajectory data and the characteristic value of the event type meet the corresponding relation or not, and if the characteristic value of the target real-time space-time trajectory data and the characteristic value of a certain event type meet the corresponding relation, judging that the target real-time space-time trajectory data is the event type.
2. The spatiotemporal event extraction system according to claim 1, wherein the target historical spatiotemporal trajectory data includes target type, time of arrival of the target at each trajectory, dwell time of the target at each trajectory, dwell position of the target at each trajectory, dwell time of the target at each trajectory, and spatial semantic information of the dwell position.
3. The spatiotemporal event extraction system of claim 1, wherein the target real-time spatiotemporal trajectory data comprises a target type, a time at which the target reaches each trajectory, a dwell time of the target on each trajectory, a dwell position of the target on each trajectory, a dwell time of the target on each trajectory, and spatial semantic information of the dwell position.
4. The spatiotemporal event extraction system according to claim 1, wherein the eigenvalues of each target historical spatiotemporal data and the eigenvalues of target real-time spatiotemporal data are extracted by an event meta-model, the expression of which is as follows:
$$\left(c_0,\ L_n,\ T_n,\ T_n^{\Delta},\ L_{n+1},\ T_{n+1},\ T_{n+1}^{\Delta}\right) \rightarrow E$$

wherein $c_0$ denotes the target type, $L_n$ the stay position at the $n$-th track node, $T_n$ the moment at which the target reaches the $n$-th track node, $T_n^{\Delta}$ the dwell time of the target at the $n$-th track node, $L_{n+1}$, $T_{n+1}$ and $T_{n+1}^{\Delta}$ the corresponding stay position, arrival moment and dwell time at the $(n+1)$-th track node, and $E$ the marked event type, $n = 1, 2, 3, \ldots$
5. The spatiotemporal event extraction system as defined in claim 1, wherein the eigenvalues of the target historical spatiotemporal data and the eigenvalues of the target real-time spatiotemporal data each comprise a target type, a time when the target reaches each trajectory, a dwell time of the target in each trajectory, and an eigenvalue of a dwell position of the target in each trajectory.
6. The spatiotemporal event extraction system according to claim 1, wherein the data determination module determines whether the eigenvalue of each target real-time spatiotemporal trajectory data and the eigenvalue of the event type satisfy a correspondence, comprising the steps of:
if the characteristic value of the target real-time space-time trajectory data $P_j$ is

$$P_j = \left(c_{0j},\ L_{nj},\ T_{nj},\ T_{nj}^{\Delta},\ L_{(n+1)j},\ T_{(n+1)j},\ T_{(n+1)j}^{\Delta}\right)$$

wherein $c_{0j}$ is the target type in the $j$-th track, $L_{nj}$ is the $n$-th node reached in the $j$-th track, $T_{nj}$ is the moment of reaching the $n$-th node in the $j$-th track, $T_{nj}^{\Delta}$ is the dwell time at the $n$-th node in the $j$-th track, and $L_{(n+1)j}$, $T_{(n+1)j}$ and $T_{(n+1)j}^{\Delta}$ are the corresponding quantities at the $(n+1)$-th node;

and the characteristic value of event type $E_i$ is

$$E_i = \left(c_{0i},\ L_{ni},\ T_{ni},\ T_{ni}^{\Delta},\ L_{(n+1)i},\ T_{(n+1)i},\ T_{(n+1)i}^{\Delta}\right)$$

then the similarity $S_i$ of $P_j$ and $E_i$ can be expressed as

$$S_i = \operatorname{sim}\left(P_j,\ E_i\right)$$

and the event type $\mathrm{Type}_j$ of $P_j$ is

$$\mathrm{Type}_j = \arg\max_i S_i$$
7. A spatiotemporal event extraction method using the spatiotemporal event extraction system according to any one of claims 1 to 6, comprising the steps of:
(1) Acquiring real-time space-time trajectory data of the target;
(2) Extracting a characteristic value of each piece of target real-time space-time trajectory data according to the event meta-model;
(3) And judging whether the characteristic value of the target real-time space-time trajectory data and the characteristic value of the event type meet the corresponding relation or not, and if the characteristic value of the target real-time space-time trajectory data and the characteristic value of a certain event type meet the corresponding relation, judging that the target real-time space-time trajectory data is the event type.
8. The spatiotemporal event extraction method according to claim 7, wherein the feature value of each event type is obtained by:
(1) Acquiring historical space-time trajectory data of a target from space electronic sensing data, space remote sensing images and/or GIS data;
(2) Extracting a characteristic value of each piece of target historical space-time trajectory data according to the event meta-model, and marking the event type of each piece of target historical space-time trajectory data;
(3) Taking the characteristic value and the event type of each piece of target historical space-time trajectory data as the input of a convolutional neural network, and training the correspondence between the characteristic values and the event types with a conventional convolutional neural network method, to obtain the characteristic value of each event type.
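Step (3) above trains a conventional convolutional neural network on the labeled historical feature values to obtain one characteristic value per event type. As a dependency-free stand-in for that training step, the sketch below derives each event type's characteristic value by averaging the labeled historical feature vectors; the averaging rule and function name are illustrative assumptions, not the patented CNN procedure:

```python
import numpy as np

def event_type_features(features: np.ndarray, labels: list) -> dict:
    """Derive one characteristic value per event type from labeled
    historical trajectory features (rows of `features`, one label each).

    The patent uses a convolutional neural network for this step; the
    per-class mean used here is a simple stand-in that yields the same
    kind of output: one representative feature vector per event type.
    """
    out = {}
    for lbl in set(labels):
        mask = np.array([l == lbl for l in labels])
        out[lbl] = features[mask].mean(axis=0)
    return out
```

The resulting per-type vectors play the role of the event-type characteristic values that incoming real-time trajectories are compared against in step (3) of claim 7.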
9. The spatiotemporal event extraction method according to claim 7, characterized in that the target type, the time when the target reaches each trajectory node, the dwell time of the target at each node, and the dwell position of the target in each trajectory are obtained from the space electronic sensing data; and the target type, the time when the target reaches each trajectory node, the dwell time of the target at each node, and the dwell position of the target in each trajectory are obtained from the space remote sensing image.
10. The spatiotemporal event extraction method according to claim 7, wherein the dwell position of the target in each trajectory, and the spatial semantic information of the dwell time and dwell position of the target in each trajectory, are obtained from the GIS data.
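Claims 9 and 10 split the trajectory fields across three sources: the space electronic sensing data and the space remote sensing image supply the target type, arrival times, dwell times, and dwell positions, while the GIS data supplies dwell positions plus their spatial semantics. A minimal record layout for the fused result might look like this (all class and field names are illustrative, not from the patent):

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TrajectoryNode:
    """One node of a target trajectory, fused from the sources in claims 9-10."""
    position: str                    # dwell position: electronic sensing / remote sensing / GIS
    arrival_time: float              # from electronic sensing or remote sensing
    dwell_time: float                # from electronic sensing or remote sensing
    semantics: Optional[str] = None  # spatial semantic information, from GIS data

@dataclass
class TargetTrajectory:
    target_type: str                 # from electronic sensing or remote sensing
    nodes: List[TrajectoryNode] = field(default_factory=list)

def attach_gis_semantics(traj: TargetTrajectory, gis: dict) -> None:
    """Fill each node's semantic label via a GIS lookup keyed by position."""
    for node in traj.nodes:
        node.semantics = gis.get(node.position)
```

Feature extraction per the event meta-model would then read c_0j, L_nj, T_nj, and the dwell times directly off such a record.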
CN202110724990.3A 2021-06-29 2021-06-29 Space-time event extraction system and method Active CN113536084B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110724990.3A CN113536084B (en) 2021-06-29 2021-06-29 Space-time event extraction system and method


Publications (2)

Publication Number Publication Date
CN113536084A CN113536084A (en) 2021-10-22
CN113536084B true CN113536084B (en) 2022-10-14

Family

ID=78097060

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110724990.3A Active CN113536084B (en) 2021-06-29 2021-06-29 Space-time event extraction system and method

Country Status (1)

Country Link
CN (1) CN113536084B (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110334111B (en) * 2019-06-13 2023-06-02 武汉市公安局视频侦查支队 Multidimensional track analysis method and device
CN112465866B (en) * 2020-11-27 2024-02-02 杭州海康威视数字技术股份有限公司 Multi-target track acquisition method, device, system and storage medium


Similar Documents

Publication Publication Date Title
CN101778260B (en) Method and system for monitoring and managing videos on basis of structured description
CN102164270A (en) Intelligent video monitoring method and system capable of exploring abnormal events
CN111291587A (en) Pedestrian detection method based on dense crowd, storage medium and processor
CN113808166B (en) Single-target tracking method based on clustering difference and depth twin convolutional neural network
CN115527269B (en) Intelligent human body posture image recognition method and system
Zhao et al. Detection of passenger flow on and off buses based on video images and YOLO algorithm
CN105657372A (en) Method and system for realizing intelligent detection and early warning of on-duty guard posts by videos
CN116311063A (en) Personnel fine granularity tracking method and system based on face recognition under monitoring video
US20180144074A1 (en) Retrieving apparatus, display device, and retrieving method
Dou et al. An improved YOLOv5s fire detection model
Huang et al. Survey of target detection algorithms in SAR images
CN113536084B (en) Space-time event extraction system and method
Ji et al. News videos anchor person detection by shot clustering
CN117636454A (en) Intelligent video behavior analysis method based on computer vision
CN110378384A An image classification method combining privileged information and ranking support vector machines
CN108595469A (en) A kind of semantic-based agricultural machinery monitor video image section band Transmission system
CN116189026A (en) Pedestrian re-recognition method and device and storage medium
CN110427920B (en) Real-time pedestrian analysis method oriented to monitoring environment
Xu et al. Crowd density estimation based on improved Harris & OPTICS Algorithm
Min et al. Vehicle detection method based on deep learning and multi-layer feature fusion
CN111832451A (en) Airworthiness monitoring process supervision system and method based on video data processing
CN112597976B (en) Intelligent prevention and control method and intelligent prevention and control system for target object
Mohandoss et al. Multi-Object Detection using Enhanced YOLOv2 and LuNet Algorithms in Surveillance Videos
Xie et al. Semantic-based traffic video retrieval using activity pattern analysis
CN113468390A (en) Space-time co-occurrence analysis model and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant