CN111724412A - Method and device for determining motion trail and computer storage medium - Google Patents

Method and device for determining motion trail and computer storage medium

Info

Publication number
CN111724412A
Authority
CN
China
Prior art keywords: motion, track, trajectory, overlapping, determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010555795.8A
Other languages
Chinese (zh)
Inventor
简春菲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN202010555795.8A
Publication of CN111724412A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning

Abstract

The embodiments of this application disclose a method and device for determining a motion trajectory and a computer storage medium, belonging to the field of surveillance. The portions of a first motion trajectory and a second motion trajectory that occur within the same time period are determined to obtain a first overlapping trajectory and a second overlapping trajectory, and the similarity between the first overlapping trajectory and the second overlapping trajectory is determined; if that similarity is greater than a similarity threshold, the first motion trajectory and the second motion trajectory are determined to be motion trajectories of the same object. In the embodiments of this application, whether two motion trajectories belong to the same object is judged based on the portions of the two trajectories that overlap in space and time. This avoids having to judge from the image features of the objects corresponding to different trajectories, thereby avoiding misjudgments based on image features and improving monitoring accuracy.

Description

Method and device for determining motion trail and computer storage medium
Technical Field
The embodiment of the application relates to the technical field of monitoring, in particular to a method and a device for determining a motion trail and a computer storage medium.
Background
To monitor a large area accurately, multiple cameras are deployed in the area, each responsible for monitoring part of it, and the cameras' monitoring ranges differ from one another. In such a scenario, the motion trajectories that the individual cameras track for the same object need to be acquired and associated with one another, so that the object can be monitored throughout the area.
In the related art, each camera tracks the motion trajectories of all objects within its monitoring range and simultaneously collects the image features of each object. For two objects respectively tracked by two cameras, whether they are the same object can then be judged from their image features. If they are the same object, the two motion trajectories tracked by the two cameras can be associated to obtain the object's motion trajectory across the monitoring ranges of both cameras.
However, this monitoring method is prone to failing to recognize the same object as the same object, which results in monitoring failures.
Disclosure of Invention
The embodiments of this application provide a method and device for determining motion trajectories and a computer storage medium, which can improve the accuracy of judging which motion trajectories belong to the same object, thereby improving the precision of object monitoring. The technical scheme is as follows:
in one aspect, a method for determining a motion trajectory is provided, the method comprising:
determining, according to a first motion trajectory tracked by a first camera and a second motion trajectory tracked by a second camera, the portions of the first motion trajectory and the second motion trajectory that occur within the same time period, to obtain a first overlapping trajectory and a second overlapping trajectory, wherein the monitoring range of the first camera overlaps the monitoring range of the second camera;
determining a similarity between the first overlapping trajectory and the second overlapping trajectory;
and if the similarity between the first overlapping track and the second overlapping track is greater than a similarity threshold, determining that the first motion track and the second motion track are the motion tracks of the same object.
Optionally, the determining a similarity between the first overlapping trajectory and the second overlapping trajectory comprises:
determining spatiotemporal features of the first overlapping trajectory and the second overlapping trajectory respectively, the spatiotemporal features indicating how the shape of the corresponding motion trajectory changes over time;
determining a similarity between the first overlapping trajectory and the second overlapping trajectory based on the spatiotemporal features of the first overlapping trajectory and the spatiotemporal features of the second overlapping trajectory.
Optionally, the spatiotemporal features include curvatures of respective trajectory points on the corresponding motion trajectories;
the determining the spatiotemporal characteristics of the first overlapping trajectory comprises:
dividing the first motion track into a plurality of sections of tracks;
respectively determining a motion model of each track in the plurality of tracks, wherein the motion model is used for indicating the change situation of the shape of the corresponding track along with time;
determining a dynamic coefficient of each track point on the first motion track according to a motion model of each track in the plurality of tracks, wherein the dynamic coefficient is used for indicating a corresponding relation between the position of the corresponding track point and time;
and determining the curvature of each track point on the first overlapped track according to the dynamic coefficient of each track point on the first motion track.
Optionally, the dividing the first motion trajectory into a plurality of segments of trajectories includes:
and dividing the first motion track into a plurality of sections of tracks in a sliding window mode according to the size of a reference window and a reference sliding step length, wherein the reference sliding step length is smaller than the size of the reference window.
Optionally, the separately determining a motion model of each of the plurality of segments of trajectories includes:
and determining a motion model of each section of track in the plurality of sections of tracks in a ridge regression mode.
Optionally, the spatiotemporal features include one or more of a motion direction, a motion speed, a distance between two adjacent track points on the corresponding motion trajectory, a length of the corresponding motion trajectory, and a duration of the corresponding motion trajectory.
In another aspect, an apparatus for determining a motion trajectory is provided, the apparatus comprising:
the first determining module is used for determining, according to a first motion trajectory tracked by a first camera and a second motion trajectory tracked by a second camera, the portions of the two motion trajectories that occur within the same time period, to obtain a first overlapping trajectory and a second overlapping trajectory, wherein the first camera and the second camera are two cameras whose monitoring ranges are adjacent to each other;
a second determining module for determining a similarity between the first overlapping trajectory and the second overlapping trajectory;
a third determining module, configured to determine that the first motion trajectory and the second motion trajectory are motion trajectories of the same object if a similarity between the first overlapping trajectory and the second overlapping trajectory is greater than a similarity threshold.
Optionally, the second determining module is configured to:
determining spatiotemporal features of the first overlapping trajectory and the second overlapping trajectory respectively, the spatiotemporal features indicating how the shape of the corresponding motion trajectory changes over time;
determining a similarity between the first overlapping trajectory and the second overlapping trajectory based on the spatiotemporal features of the first overlapping trajectory and the spatiotemporal features of the second overlapping trajectory.
Optionally, the spatiotemporal features include curvatures of respective trajectory points on the corresponding motion trajectories;
the second determination module is to:
dividing the first motion trajectory into multiple trajectory segments;
determining a motion model for each of the multiple trajectory segments, the motion model indicating how the shape of the corresponding segment changes over time;
determining a dynamic coefficient for each trajectory point on the first motion trajectory according to the motion models of the segments, the dynamic coefficient indicating the correspondence between the position of the corresponding trajectory point and time;
and determining the curvature of each trajectory point on the first overlapping trajectory according to the dynamic coefficients of the trajectory points on the first motion trajectory.
Optionally, the second determining module is configured to:
and dividing the first motion track into a plurality of sections of tracks in a sliding window mode according to the size of a reference window and a reference sliding step length, wherein the reference sliding step length is smaller than the size of the reference window.
Optionally, the second determining module is configured to:
and determining a motion model of each section of track in the plurality of sections of tracks in a ridge regression mode.
Optionally, the spatiotemporal features include one or more of a motion direction, a motion speed, a distance between two adjacent track points on the corresponding motion trajectory, a length of the corresponding motion trajectory, and a duration of the corresponding motion trajectory.
In another aspect, an apparatus for determining a motion trajectory is provided, the apparatus comprising a processor, a communication interface, a memory, and a communication bus;
the processor, the communication interface, and the memory communicate with one another via the communication bus;
the memory is used for storing computer programs;
the processor is configured to execute the program stored in the memory to implement the method for determining a motion trajectory described above.
In another aspect, a computer-readable storage medium is provided, in which a computer program is stored, which computer program, when being executed by a processor, carries out the steps of the method of determining a motion trajectory as provided in the preceding description.
The beneficial effects brought by the technical scheme provided by the embodiment of the application at least comprise:
in the embodiments of this application, whether two motion trajectories belong to the same object is judged based on the portions of the two trajectories that overlap in space and time. This avoids having to judge whether two motion trajectories belong to the same object from the image features of the objects corresponding to different trajectories, thereby avoiding misjudgments based on image features and improving monitoring accuracy.
Drawings
To illustrate the technical solutions in the embodiments of this application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of this application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic diagram of a monitoring system according to an embodiment of the present application.
Fig. 2 is a flowchart of a method for determining a motion trajectory according to an embodiment of the present disclosure.
Fig. 3 is a schematic diagram of a motion trajectory provided in an embodiment of the present application.
Fig. 4 is a schematic structural diagram of an apparatus for determining a motion trajectory according to an embodiment of the present application.
Fig. 5 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Fig. 6 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present application more clear, the embodiments of the present application will be further described in detail with reference to the accompanying drawings.
Before explaining the embodiments of the present application in detail, an application scenario related to the embodiments of the present application will be explained.
At present, in indoor monitoring scenes, oblique cameras are usually deployed along the peripheral walls of the indoor area. However, an oblique camera's view is easily occluded, which makes the collected image features unusable and easily causes problems such as tracking interruptions. Moreover, the positioning accuracy of an oblique camera's tracked trajectory is poor, so the accuracy of cross-camera association is low.
To achieve higher cross-camera association accuracy, vertical cameras can be used in indoor monitoring, effectively avoiding occlusion between targets and image distortion at the far end of the camera's view. A vertical camera in an indoor monitoring scene specifically means the following: multiple cameras are installed at the top of the monitored area (such as the ceiling of a room), and each camera captures video of one sub-area of the monitored area in the vertical direction; that sub-area is the camera's monitoring range. Each camera in this scenario is also referred to as a vertical surveillance camera.
In the above indoor monitoring scene based on vertical cameras, if the monitored objects are pedestrians, the image features a camera collects for a pedestrian are usually only the head and shoulders, because the camera is deployed at the top of the monitored area. The image features of different pedestrians are therefore poorly discriminative, which makes the error rate of associating motion trajectories across cameras very high.
The method for determining a motion trajectory provided in the embodiments of this application is applied to scenes where motion trajectories are associated across cameras. It judges whether two motion trajectories belong to the same object based on the portions of the two trajectories that overlap in time and space, so that misjudgments caused by image features can be avoided and monitoring accuracy improved.
It should be noted that the method for determining a motion trajectory provided in the embodiments of this application can be applied not only to scenes where motion trajectories are associated based on vertical surveillance cameras, but also to scenes based on oblique surveillance cameras (for example, cameras mounted on the walls of a room), and indeed to any scene that requires associating motion trajectories across cameras; further examples are not given here.
Fig. 1 is a schematic view of a monitoring system according to an embodiment of the present application. As shown in fig. 1, the monitoring system 100 includes a plurality of cameras 101 and a terminal device 102. Any camera 101 and the terminal device 102 are connected in a wireless or wired manner to communicate with each other.
Each camera 101 has a corresponding monitoring range and is used to capture video of that range. The terminal device 102 is configured to associate the motion trajectories of the same object based on the videos captured by the cameras, obtaining the object's motion trajectory across the entire monitored area so as to monitor the object.
The camera 101 may be any camera capable of capturing video. The terminal device 102 may be a server terminal, a user terminal, or the like.
It should be noted that the monitoring system shown in fig. 1 is for illustration only. Optionally, the terminal device 102 in fig. 1 may itself be one of the plurality of cameras. In other words, the method for determining a motion trajectory provided by this application can be applied to a terminal device or to a camera; this is not specifically limited in the embodiments of this application.
The following explains the method for determining a motion trajectory provided in the embodiments of the present application in detail.
Fig. 2 is a flowchart of a method for determining a motion trajectory according to an embodiment of the present disclosure. The method shown in fig. 2 is described by taking the application to a terminal device as an example. As shown in fig. 2, the method includes the following steps.
Step 201: determine, according to a first motion trajectory tracked by a first camera and a second motion trajectory tracked by a second camera, the portions of the two motion trajectories that occur within the same time period, to obtain a first overlapping trajectory and a second overlapping trajectory, where the monitoring range of the first camera overlaps the monitoring range of the second camera.
The first overlapping track is a partial track in the first motion track, and the second overlapping track is a partial track in the second motion track.
Because the monitoring range of the first camera overlaps the monitoring range of the second camera, when the same object passes through the monitoring ranges of both cameras, the two motion trajectories tracked by the two cameras necessarily contain portions that overlap in space and time. This removes the need to judge whether two motion trajectories belong to the same object from the image features of the corresponding objects, so misjudgments based on image features can be avoided and monitoring accuracy improved.
For example, in an indoor monitoring scene, a plurality of vertical monitoring cameras are installed to monitor the entire indoor area. In order to ensure that the monitoring range can cover all areas, the monitoring ranges between adjacent vertical monitoring cameras are overlapped. The region where the monitoring ranges overlap is also referred to as a monitoring overlap region.
When a pedestrian enters a monitoring overlap region covered by two vertical surveillance cameras, both cameras track the pedestrian's motion trajectory within that region. This means that, over the time period during which the object moves within the monitoring overlap region, the trajectories the two cameras trace for that individual overlap (or repeat) each other. These are also referred to as the individual's overlapping trajectories.
The first motion trajectory in step 201 includes a plurality of trajectory points, each of which has two parameters: the position of the trajectory point and a timestamp. Each trajectory point represents the object's location at the corresponding timestamp.
Therefore, in one possible implementation, step 201 is implemented as follows: obtain the timestamp range of the first motion trajectory and the timestamp range of the second motion trajectory, and determine the intersection of the two ranges to obtain an overlapping time period. The portion of the first motion trajectory corresponding to the overlapping time period is the first overlapping trajectory, and the portion of the second motion trajectory corresponding to that period is the second overlapping trajectory.
This implementation effectively aligns the points of the two motion trajectories whose timestamps are the same or similar; the trajectory portions whose timestamps match are the overlapping trajectories. Fig. 3 is a schematic diagram of motion trajectories provided in an embodiment of the present application. The coordinate system shown in fig. 3 contains three motion trajectories, labeled id1, id2, and id3; the horizontal axis corresponds to the timestamp and the vertical axis to the position. The overlapping trajectories of id1 and id2, and of id1 and id3, are marked in fig. 3.
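The timestamp alignment described above can be sketched minimally as follows, assuming each trajectory is stored as a list of (timestamp, x, y) tuples sorted by timestamp; the function name and data layout are illustrative, not taken from the patent:

```python
def overlapping_tracks(track_a, track_b):
    """Return the portions of two trajectories whose timestamps fall inside
    the intersection of the two timestamp ranges (the overlapping time period)."""
    start = max(track_a[0][0], track_b[0][0])  # later of the two first timestamps
    end = min(track_a[-1][0], track_b[-1][0])  # earlier of the two last timestamps
    if start > end:                            # no common time period
        return [], []
    overlap_a = [p for p in track_a if start <= p[0] <= end]
    overlap_b = [p for p in track_b if start <= p[0] <= end]
    return overlap_a, overlap_b
```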
It should be noted that if the object moves in a one-dimensional space, the position of a trajectory point is represented by a single value; if the object moves in a two-dimensional space, the position requires two values; and if the object moves in a three-dimensional space, the position requires three values.
The following description will be given by taking an example in which an object moves in a two-dimensional space. At this time, the position of a certain track point can be represented by a position in a two-dimensional plane. The motion trajectory of an object includes two motion trajectories of the object in two dimensions of a two-dimensional plane. For example, the two-dimensional plane is represented as an XY coordinate system, so that the motion trajectory of an object includes a motion trajectory on the X axis and a motion trajectory on the Y axis. In this scenario, the first overlapping trajectory and the second overlapping trajectory can be represented by two motion trajectories, respectively, and will not be described in detail herein.
In addition, the sampling intervals of motion trajectories tracked by different cameras may differ. To make it easy to determine the overlapping trajectories quickly, the first motion trajectory and the second motion trajectory are preprocessed in advance so that their timestamp intervals are the same.
The following implementation will take the first motion trajectory as an example to illustrate how to pre-process the motion trajectory tracked by the camera.
In one possible implementation, the first motion trajectory is preprocessed as follows: interpolate the trajectory points of the first motion trajectory, adding and deleting points according to a reference timestamp interval so that the timestamp interval between any two adjacent points matches the reference interval. This preprocessing is also referred to as interpolation preprocessing.
After the second motion trajectory is also processed according to the reference timestamp interval, the timestamp intervals of the first motion trajectory and the second motion trajectory can be kept consistent.
Optionally, adding and deleting trajectory points according to the reference timestamp interval includes: inserting new trajectory points between existing points according to a trajectory interpolation formula, and removing points that do not satisfy the specified reference timestamp interval, yielding a new trajectory with uniform time intervals, i.e. a fixed-frequency trajectory.
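The following is a hedged sketch of this interpolation preprocessing. It assumes linear interpolation as the trajectory interpolation formula (the patent does not fix the formula) and the same (timestamp, x, y) tuple layout as above:

```python
import numpy as np

def resample(track, ref_interval):
    """Resample a trajectory to a fixed reference timestamp interval."""
    t = np.array([p[0] for p in track], dtype=float)
    x = np.array([p[1] for p in track], dtype=float)
    y = np.array([p[2] for p in track], dtype=float)
    new_t = np.arange(t[0], t[-1] + 1e-9, ref_interval)  # uniform timestamps
    new_x = np.interp(new_t, t, x)  # inserts/drops points so intervals match
    new_y = np.interp(new_t, t, y)
    return list(zip(new_t, new_x, new_y))
```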
For example, in an indoor monitoring scene, when an individual enters the field of view of one of the vertical surveillance cameras, that camera tracks the individual in real time, forming the individual's complete motion trajectory within the camera's monitoring range. This trajectory is also referred to as a single-camera person trajectory. Through the real-time tracking of each vertical surveillance camera, a complete single-camera person trajectory is generated for every individual within each camera's monitoring range.
After the motion trajectory of each individual tracked in real time by each vertical surveillance camera is obtained, it is preprocessed as described above, yielding motion trajectories with consistent timestamp intervals.
In addition, after the interpolation preprocessing is performed on the first motion trajectory, median filtering is optionally applied to it as well, to filter out trajectory points with large noise. The median filter works as follows: for a point i, the estimate of its unknown true value is the median of the k temporally consecutive nearby points (including the point's own observation). In the embodiments of this application, the coordinate of the current trajectory point is estimated as the median of the coordinates of the k temporally consecutive nearby points.
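A sketch of this median filter under the same assumed (timestamp, x, y) layout; each coordinate is replaced by the median of the k temporally consecutive nearby points, including the point itself:

```python
import numpy as np

def median_filter_track(track, k=5):
    coords = np.array([[p[1], p[2]] for p in track], dtype=float)  # x, y columns
    half = k // 2
    filtered = coords.copy()
    for i in range(len(coords)):
        lo, hi = max(0, i - half), min(len(coords), i + half + 1)
        filtered[i] = np.median(coords[lo:hi], axis=0)  # median of nearby points
    return [(p[0], x, y) for p, (x, y) in zip(track, filtered)]
```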
Step 202: a similarity between the first overlapping trajectory and the second overlapping trajectory is determined.
In one possible implementation, step 202 proceeds as follows: determine the spatiotemporal features of the first overlapping trajectory and the second overlapping trajectory respectively, the spatiotemporal features indicating how the shape of the corresponding motion trajectory changes over time, and then determine the similarity between the two overlapping trajectories based on their spatiotemporal features.
The spatiotemporal features comprise one or more of: the curvature of the trajectory points, the motion direction, the motion speed, the distance between two adjacent trajectory points on the corresponding motion trajectory, the length of the corresponding motion trajectory, and the duration of the corresponding motion trajectory.
The curvature of a trajectory point represents the degree of bending of the motion trajectory at that point's position, so the detailed shape of the trajectory can be determined from the curvatures of its points. The motion direction is the direction of the last trajectory point of the motion trajectory relative to the first, and captures the overall trend of the trajectory. The motion speed is the average speed of the object from the position indicated by the first trajectory point to the position indicated by the last, and captures the object's average motion state. In one possible implementation, the length of a motion trajectory is represented by the number of trajectory points it contains; the longer the overlapping trajectories, the greater the probability that they belong to the same object. The duration of a motion trajectory is the difference between the timestamps of its last and first trajectory points; the longer the duration of the overlapping trajectories, the greater the probability that they belong to the same object. It should be noted that trajectory length and trajectory duration essentially convey information in the same dimension, so in the embodiments of this application one of the two may be selected as the spatiotemporal feature.
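For the scalar features other than curvature, a minimal sketch under the assumed (timestamp, x, y) layout is given below (curvature is handled separately later); the function name and return convention are ours:

```python
import numpy as np

def simple_features(track):
    t0, x0, y0 = track[0]
    t1, x1, y1 = track[-1]
    direction = np.arctan2(y1 - y0, x1 - x0)         # last point relative to first
    duration = t1 - t0                               # last timestamp minus first
    dist = np.hypot(x1 - x0, y1 - y0)
    speed = dist / duration if duration > 0 else 0.0  # average motion speed
    length = len(track)                              # length = number of points
    return direction, speed, length, duration
```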
Thus, the embodiments of this application describe a motion trajectory with highly interpretable spatiotemporal features such as curvature, motion direction, speed, Euclidean distance, trajectory length, and trajectory duration. The similarity of motion trajectories can therefore be determined from multiple angles, including the shape, spatial distance, time dimension, and motion state of the trajectories, so that whether two motion trajectories belong to the same object can be judged accurately and quickly.
When the spatiotemporal features include the curvatures of the trajectory points on the corresponding motion trajectories, determining the spatiotemporal features of the first overlapping trajectory proceeds as follows: divide the first motion trajectory into multiple trajectory segments; determine a motion model for each segment, the motion model indicating how the shape of the corresponding segment changes over time; determine a dynamic coefficient for each trajectory point on the first motion trajectory according to the motion models of the segments, the dynamic coefficient indicating the correspondence between the position of the corresponding trajectory point and time; and determine the curvature of each trajectory point on the first overlapping trajectory according to the dynamic coefficients of the trajectory points on the first motion trajectory.
In one possible implementation, the object moves in a two-dimensional space. The position of a trajectory point can then be represented as a position in a two-dimensional plane, and the motion trajectory of the object comprises two component trajectories, one for each dimension of the plane. For example, with the plane represented as an XY coordinate system, the motion trajectory of the object comprises a trajectory on the X axis and a trajectory on the Y axis. The curvature of any trajectory point on the first overlapping trajectory can be determined by the following equation:
K = |x'·y'' - y'·x''| / (x'² + y'²)^(3/2)
where x' and y' are the first derivatives, with respect to time, of the motion trajectory on the x-axis and on the y-axis respectively, and x'' and y'' are the corresponding second derivatives.
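Given the quadratic segment models described later (At + Bt² + C = x and Dt + Et² + F = y), the derivatives in this formula have closed forms, so a point's curvature follows directly from its dynamic coefficients. A sketch under that assumption, with a small epsilon added to avoid division by zero:

```python
def point_curvature(A, B, D, E, t):
    """Curvature at time t from the dynamic coefficients of the fitted models
    x(t) = A*t + B*t**2 + C and y(t) = D*t + E*t**2 + F (C and F drop out)."""
    x1, x2 = A + 2 * B * t, 2 * B          # x'(t) and x''(t)
    y1, y2 = D + 2 * E * t, 2 * E          # y'(t) and y''(t)
    return abs(x1 * y2 - y1 * x2) / ((x1 ** 2 + y1 ** 2) ** 1.5 + 1e-9)
```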
To prevent the overfitting that would result from fitting a single motion model to the whole trajectory, the motion trajectory can be divided into multiple segments that are fitted separately. The technical effect is that more of the dynamics of the object's actual motion are learned, so the trajectory-point positions given by the determined motion models fit the actual movement better, improving the accuracy of the subsequent trajectory-similarity computation.
In one possible implementation, the motion trajectory is divided into multiple segments by a sliding-window operation. Specifically, the first motion trajectory is divided into multiple segments according to a reference window size and a reference sliding step, where the reference sliding step is smaller than the reference window size.
For example, suppose the first motion trajectory has 100 trajectory points, ordered from earliest to latest timestamp and denoted trajectory point 1 through trajectory point 100. With a reference window size of 7 trajectory points and a reference sliding step of 1 trajectory point, the sliding-window operation yields 94 segments: the segment of points 1-7, the segment of points 2-8, the segment of points 3-9, ..., the segment of points 93-99, and the segment of points 94-100.
The technical effect of this sliding-window division is that adjacent segments share overlapping trajectory portions, so multiple motion models are determined for each shared portion. The dynamic coefficient of a trajectory point in a shared portion can then be taken as the average of its dynamic coefficients across the multiple motion models, avoiding the inaccurate curvatures that a single mis-fitted motion model would produce.
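A sketch of the sliding-window division matching the example above (window size 7, step 1, so 100 points yield 94 segments); the parameter names are ours:

```python
def sliding_segments(track, window=7, step=1):
    """Divide a trajectory into overlapping segments; step < window guarantees
    that adjacent segments share trajectory points."""
    assert step < window, "reference sliding step must be smaller than the window size"
    return [track[i:i + window] for i in range(0, len(track) - window + 1, step)]
```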
It should be noted that the sliding-window operation is only one way of dividing the first motion trajectory into multiple segments. This application does not limit the division method, provided adjacent segments share overlapping trajectory portions.
In addition, in a possible implementation manner, the implementation process of determining the motion model of each of the plurality of segments of trajectories is as follows: and determining a motion model of each section of track in the plurality of sections of tracks in a ridge regression mode.
Ridge regression is a linear regression algorithm with an added regularization penalty term. By giving up the unbiasedness of the least-squares method, it obtains regression coefficients at the cost of losing some information and precision, which makes the regression more practical and reliable. The embodiments of this application do not limit the specific implementation of ridge regression.
In one possible implementation, the object moves in a two-dimensional space; a trajectory point's position is then a position in a two-dimensional plane, and, as above, the motion trajectory of the object comprises a trajectory on the X axis and a trajectory on the Y axis of an XY coordinate system. The specific process of fitting the motion model of a given segment with ridge regression is then as follows:
First, the timestamps of all trajectory points in the segment are normalized. The normalization method is not limited; the operation is intended to convert the timestamps into relative values to simplify computation. Second, the square of each trajectory point's timestamp is computed; for human walking, the relations between x and t and between y and t are expressed by functions whose highest-order term is quadratic, though they are not limited to such functions. Third, n trajectory points of the segment are selected (taking n = 7 as an example); the data of each point comprises t, t², x, and y. The ridge regression algorithm is then invoked to fit the relational expressions of x with t and of y with t, namely At + Bt² + C = x and Dt + Et² + F = y.
Taking the fit of x against t as an example: the training set is the two-dimensional data set of t and t² for the 7 trajectory points, labeled with the x values of those seven points. Feeding the training set and the labels into the ridge regression algorithm yields the values of the coefficients of the x-t expression. The relational expression of y with t is obtained in the same way.
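A hedged sketch of this fitting step using scikit-learn's Ridge estimator; the regularization strength alpha and the helper's interface are assumptions for illustration:

```python
import numpy as np
from sklearn.linear_model import Ridge

def fit_segment(t, x, y, alpha=1.0):
    t = np.asarray(t, dtype=float)
    t = t - t[0]                                   # normalize timestamps to relative values
    features = np.column_stack([t, t ** 2])        # training set: t and t^2 per point
    model_x = Ridge(alpha=alpha).fit(features, x)  # fits A*t + B*t^2 + C = x
    model_y = Ridge(alpha=alpha).fit(features, y)  # fits D*t + E*t^2 + F = y
    (A, B), C = model_x.coef_, model_x.intercept_
    (D, E), F = model_y.coef_, model_y.intercept_
    return (A, B, C), (D, E, F)
```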
It should be noted that the above takes the first motion trajectory as an example; the spatiotemporal features of the second motion trajectory are determined in the same way, which is not repeated here.
In addition, it should be noted that if the object moves in a multidimensional space, a motion model only needs to be fitted for each dimension when determining the motion model of the trajectory; the dimensions need not be considered separately when determining the curvature, motion direction, motion speed, Euclidean distance, and so on. For example, when determining the motion direction, what is taken into account is the direction of the vector between the spatial position of the last trajectory point and that of the first trajectory point on the motion trajectory (spatial positions in the multidimensional space).
In addition, when the spatiotemporal features include the curvature of the trajectory points, the motion direction, the motion speed, the distance between two adjacent trajectory points on the corresponding motion trajectory, the length of the corresponding motion trajectory, and the duration of the corresponding motion trajectory, the similarity between the first overlapping trajectory and the second overlapping trajectory can be determined from their spatiotemporal features through the following steps.
(1) Determine the curvature similarity, motion-direction similarity, motion-speed similarity, and distance similarity.
The curvature similarity indicates the difference between the curvatures of two trajectory points that have the same timestamp. Since the first overlapping trajectory and the second overlapping trajectory are composed of trajectory points that overlap in timestamp, the curvature similarity is optionally determined from the curvature differences between pairs of points with the same timestamp. Alternatively, the curvature similarity can be determined in other ways, which are not enumerated here.
For example, suppose the first overlapping trajectory and the second overlapping trajectory each include 10 trajectory points whose timestamps correspond one to one. For convenience, denote the 10 points of the first overlapping trajectory a1 through a10 and the 10 points of the second b1 through b10. Points with the same index have the same timestamp: a1 and b1 are two points with the same timestamp, a2 and b2 likewise, and so on.
The curvature similarity of the first overlapping trajectory and the second overlapping trajectory can then be expressed as follows:

1 / W_curvature = (S_a1 - S_b1)² + (S_a2 - S_b2)² + (S_a3 - S_b3)² + ... + (S_a10 - S_b10)²

where W_curvature denotes the curvature similarity of the first and second overlapping trajectories, and S denotes the curvature of the trajectory point indicated by its subscript.
Further, the motion-direction similarity can be determined from the motion directions of the first overlapping trajectory and the second overlapping trajectory. In one possible implementation, the larger the angle between the two motion directions, the smaller the motion-direction similarity; the smaller the angle, the greater the motion-direction similarity.
Further, the motion-speed similarity can be determined from the motion speeds of the first overlapping trajectory and the second overlapping trajectory. In one possible implementation, the smaller the difference between the two motion speeds, the greater the motion-speed similarity; the greater the difference, the smaller the motion-speed similarity.
Further, the distance similarity can be determined from the difference in Euclidean distance between two adjacent trajectory points having the same timestamp. In one possible implementation, the greater the difference in Euclidean distance between two adjacent trajectory points with the same timestamp, the smaller the distance similarity; the smaller the difference, the greater the distance similarity.
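Hedged sketches of some of these component similarities follow. The text only fixes their monotonic behaviour, so the exact forms below (an inverse sum of squared curvature differences, the cosine of the direction angle, an inverse absolute speed difference) are illustrative assumptions; direction vectors are assumed to be 2-D:

```python
import numpy as np

def curvature_similarity(curv_a, curv_b):
    # 1 / W_curvature = sum_i (S_ai - S_bi)^2, per the formula above
    diff = np.asarray(curv_a) - np.asarray(curv_b)
    return 1.0 / (np.sum(diff ** 2) + 1e-9)   # epsilon avoids division by zero

def direction_similarity(dir_a, dir_b):
    # larger angle between the two overall directions -> smaller similarity
    cos = np.dot(dir_a, dir_b) / (np.linalg.norm(dir_a) * np.linalg.norm(dir_b))
    return (1.0 + cos) / 2.0                  # maps angle 0..pi onto 1..0

def speed_similarity(v_a, v_b):
    # smaller speed difference -> larger similarity
    return 1.0 / (1.0 + abs(v_a - v_b))
```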
(2) Determine the similarity between the first overlapping trajectory and the second overlapping trajectory according to the curvature similarity, the motion-direction similarity, the motion-speed similarity, the distance similarity, the lengths of the corresponding motion trajectories, and the durations of the corresponding motion trajectories.
In one possible implementation, the similarity between the first overlapping trajectory and the second overlapping trajectory is positively correlated with each of the curvature similarity, the motion-direction similarity, the motion-speed similarity, the distance similarity, the length of the corresponding motion trajectory, and the duration of the corresponding motion trajectory.
For example, in one possible implementation, weights are preset for the curvature similarity, motion-direction similarity, motion-speed similarity, distance similarity, trajectory length, and trajectory duration. Each quantity is multiplied by its corresponding weight and the products are summed; the resulting value is the similarity between the first overlapping trajectory and the second overlapping trajectory.
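A minimal sketch of this weighted combination; the weight values are illustrative placeholders, not values given in the patent:

```python
# illustrative weights, one per spatiotemporal feature
WEIGHTS = {"curvature": 0.3, "direction": 0.2, "speed": 0.2,
           "distance": 0.1, "length": 0.1, "duration": 0.1}

def overall_similarity(component_scores):
    """component_scores: dict mapping feature name -> component similarity."""
    return sum(WEIGHTS[k] * component_scores[k] for k in WEIGHTS)
```

If overall_similarity(...) exceeds the similarity threshold, step 203 below judges the two motion trajectories to belong to the same object.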
Further, consider the scenario of determining whether the first motion trajectory belongs to the same object as the second motion trajectory or as a third motion trajectory, where the third overlapping trajectory is the portion of the third motion trajectory that overlaps the first motion trajectory in timestamp. Optionally, the similarity between the first and second overlapping trajectories, and the similarity between the first and third overlapping trajectories, may each be determined from the spatiotemporal features other than trajectory length and trajectory duration. If both similarities are greater than the similarity threshold, the trajectory length of the first and second overlapping trajectories is compared with the trajectory length of the first and third overlapping trajectories (the trajectory durations may be compared instead). If the former is greater, the first motion trajectory and the second motion trajectory are the motion trajectories of the same object; if the former is smaller, the first motion trajectory and the third motion trajectory are the motion trajectories of the same object. In this way, the efficiency of determining which trajectories belong to the same object can be improved.
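A compact sketch of this tie-breaking logic; the names and return convention are illustrative:

```python
def resolve_candidate(sim_ab, len_ab, sim_ac, len_ac, threshold):
    """Decide whether trajectory A matches B or C, comparing overlap lengths
    (trajectory durations could be compared instead) when both similarities pass."""
    if sim_ab > threshold and sim_ac > threshold:
        return "B" if len_ab > len_ac else "C"
    if sim_ab > threshold:
        return "B"
    if sim_ac > threshold:
        return "C"
    return None   # neither candidate matches
```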
Step 203: if the similarity between the first overlapping trajectory and the second overlapping trajectory is greater than the similarity threshold, determine that the first motion trajectory and the second motion trajectory are motion trajectories of the same object.
The similarity threshold is a preset value. It can be set by an administrator or adjusted adaptively during the determination of motion trajectories to improve its accuracy; this is not described in detail in the embodiments of this application.
The technical effects of the present application are described below by taking an indoor monitoring scene as an example.
In an indoor monitoring scene, because adjacent vertical surveillance cameras have monitoring overlap regions, an individual (or pedestrian) walking from the monitoring range of one vertical surveillance camera to that of an adjacent one necessarily passes through the region where the two cameras' ranges overlap. If the two cameras capture the same pedestrian at the same time, the trajectories they track for that pedestrian have strong similarity. By computing the feature similarity of the trajectories in the overlap region, the pedestrian's trajectories under different cameras can be associated globally, yielding the pedestrian's complete trajectory across the whole monitored range (multiple surveillance cameras).
Thus, in the embodiments of this application, an individual's motion model can be fitted dynamically and with high accuracy from the single-camera person trajectory by means of ridge regression and sliding windows. Curvature is then extracted from the motion model, and strongly interpretable features such as motion direction, motion speed, Euclidean distance, trajectory length, and overlap duration are determined. The similarity of pedestrian trajectories can therefore be computed from multiple angles, including the shape, spatial distance, time dimension, and motion state of the trajectories, so that whether two motion trajectories belong to the same pedestrian can be judged accurately and quickly.
In summary, in the embodiments of this application, whether two motion trajectories belong to the same object is judged based on the portions of the two trajectories that overlap in space and time. This avoids having to judge from the image features of the objects corresponding to different trajectories, thereby avoiding misjudgments based on image features and improving monitoring accuracy.
All the above optional technical solutions can be combined arbitrarily to form optional embodiments of this application, which are not described in detail again here.
Fig. 4 is a schematic structural diagram of an apparatus for determining a motion trajectory according to an embodiment of the present application, where the apparatus may be implemented by software, hardware, or a combination of the two. Optionally, the apparatus 400 comprises:
a first determining module 401, configured to determine, according to a first motion trajectory tracked by a first camera and a second motion trajectory tracked by a second camera, the portions of the two motion trajectories that occur within the same time period, to obtain a first overlapping trajectory and a second overlapping trajectory, where the first camera and the second camera are two cameras whose monitoring ranges are adjacent to each other;
a second determining module 402 for determining a similarity between the first overlapping trajectory and the second overlapping trajectory;
a third determining module 403, configured to determine that the first motion trajectory and the second motion trajectory are motion trajectories of the same object if a similarity between the first overlapping trajectory and the second overlapping trajectory is greater than a similarity threshold.
Optionally, the second determining module is configured to:
determining spatiotemporal features of the first overlapping trajectory and the second overlapping trajectory respectively, the spatiotemporal features indicating how the shape of the corresponding motion trajectory changes over time;
determining a similarity between the first overlapping trajectory and the second overlapping trajectory based on the spatiotemporal features of the first overlapping trajectory and the spatiotemporal features of the second overlapping trajectory.
Optionally, the spatiotemporal features include curvatures of respective trajectory points on the corresponding motion trajectories;
the second determination module is to:
dividing the first motion trajectory into multiple trajectory segments;
determining a motion model for each of the multiple trajectory segments, the motion model indicating how the shape of the corresponding segment changes over time;
determining a dynamic coefficient for each trajectory point on the first motion trajectory according to the motion models of the segments, the dynamic coefficient indicating the correspondence between the position of the corresponding trajectory point and time;
and determining the curvature of each trajectory point on the first overlapping trajectory according to the dynamic coefficients of the trajectory points on the first motion trajectory.
Optionally, the second determining module is configured to:
and dividing the first motion track into a plurality of sections of tracks in a sliding window mode according to the size of a reference window and a reference sliding step length, wherein the reference sliding step length is smaller than the size of the reference window.
Optionally, the second determining module is configured to:
and determining a motion model of each section of track in the plurality of sections of tracks in a ridge regression mode.
Optionally, the spatiotemporal features include one or more of a motion direction, a motion speed, a distance between two adjacent track points on the corresponding motion trajectory, a length of the corresponding motion trajectory, and a duration of the corresponding motion trajectory.
In summary, in the embodiments of this application, whether two motion trajectories belong to the same object is judged based on the portions of the two trajectories that overlap in space and time. This avoids having to judge from the image features of the objects corresponding to different trajectories, thereby avoiding misjudgments based on image features and improving monitoring accuracy.
It should be noted that when the apparatus for determining a motion trajectory provided in the above embodiment determines a motion trajectory, the division into the above functional modules is only an example. In practical applications, the above functions can be allocated to different functional modules as needed; that is, the internal structure of the device can be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus for determining a motion trajectory and the method for determining a motion trajectory provided by the above embodiments belong to the same concept; their specific implementation processes are detailed in the method embodiments and are not repeated here.
Fig. 5 shows a block diagram of a terminal 500 according to an embodiment of the present application. The terminal device shown in fig. 1 may be implemented by this terminal. The terminal 500 may be a smart phone, a tablet computer, an MP3 player (Moving Picture Experts Group Audio Layer III), an MP4 player (Moving Picture Experts Group Audio Layer IV), a notebook computer, or a desktop computer. Terminal 500 may also be referred to by other names such as user equipment, portable terminal, laptop terminal, or desktop terminal.
In general, the terminal 500 includes: a processor 501 and a memory 502.
The processor 501 may include one or more processing cores, such as a 4-core processor, an 8-core processor, and so on. The processor 501 may be implemented in at least one hardware form of a DSP (Digital Signal Processing), an FPGA (Field-Programmable Gate Array), and a PLA (Programmable Logic Array). The processor 501 may also include a main processor and a coprocessor, where the main processor is a processor for processing data in an awake state, and is also called a Central Processing Unit (CPU); a coprocessor is a low power processor for processing data in a standby state. In some embodiments, the processor 501 may be integrated with a GPU (Graphics Processing Unit), which is responsible for rendering and drawing the content required to be displayed on the display screen. In some embodiments, processor 501 may also include an AI (Artificial Intelligence) processor for processing computational operations related to machine learning.
Memory 502 may include one or more computer-readable storage media, which may be non-transitory. Memory 502 may also include high-speed random access memory, as well as non-volatile memory, such as one or more magnetic disk storage devices, flash memory storage devices. In some embodiments, a non-transitory computer readable storage medium in memory 502 is used to store at least one instruction for execution by processor 501 to implement the method of determining a motion profile provided by method embodiments herein.
In some embodiments, the terminal 500 may optionally further include a peripheral interface 503 and at least one peripheral. The processor 501, the memory 502, and the peripheral interface 503 may be connected by buses or signal lines. Each peripheral may be connected to the peripheral interface 503 by a bus, signal line, or circuit board. Specifically, the peripherals include at least one of: a radio frequency circuit 504, a touch display screen 505, a camera assembly 506, an audio circuit 507, a positioning component 508, and a power supply 509.
The peripheral interface 503 may be used to connect at least one peripheral related to I/O (Input/Output) to the processor 501 and the memory 502. In some embodiments, the processor 501, memory 502, and peripheral interface 503 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 501, the memory 502, and the peripheral interface 503 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The radio frequency circuit 504 is used for receiving and transmitting RF (Radio Frequency) signals, also called electromagnetic signals. The radio frequency circuit 504 communicates with communication networks and other communication devices via electromagnetic signals, converting an electrical signal into an electromagnetic signal for transmission, or converting a received electromagnetic signal into an electrical signal. Optionally, the radio frequency circuit 504 includes an antenna system, an RF transceiver, one or more amplifiers, a tuner, an oscillator, a digital signal processor, a codec chipset, a subscriber identity module card, and so on. The radio frequency circuit 504 may communicate with other terminals via at least one wireless communication protocol, including but not limited to: metropolitan area networks, mobile communication networks of various generations (2G, 3G, 4G, and 5G), wireless local area networks, and/or WiFi (Wireless Fidelity) networks. In some embodiments, the radio frequency circuit 504 may further include NFC (Near Field Communication) related circuits, which is not limited in this application.
The display screen 505 is used to display a UI (User Interface), which may include graphics, text, icons, video, and any combination thereof. When the display screen 505 is a touch display screen, it is also capable of capturing touch signals on or over its surface. Such a touch signal may be input to the processor 501 as a control signal for processing, in which case the display screen 505 may also provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 505, provided on the front panel of the terminal 500; in other embodiments, there may be at least two display screens 505, disposed on different surfaces of the terminal 500 or in a folded design; in still other embodiments, the display screen 505 may be a flexible display disposed on a curved or folded surface of the terminal 500. The display screen 505 may even be set as a non-rectangular irregular figure, i.e., an irregularly shaped screen. The display screen 505 may be made of materials such as LCD (Liquid Crystal Display) or OLED (Organic Light-Emitting Diode).
The camera assembly 506 is used to capture images or video. Optionally, the camera assembly 506 includes a front camera and a rear camera. Generally, the front camera is disposed on the front panel of the terminal and the rear camera is disposed on its rear surface. In some embodiments, there are at least two rear cameras, each being any one of a main camera, a depth-of-field camera, a wide-angle camera, and a telephoto camera, so that the main camera and the depth-of-field camera can be fused to realize a background blurring function, and the main camera and the wide-angle camera can be fused to realize panoramic shooting, VR (Virtual Reality) shooting, or other fusion shooting functions. In some embodiments, the camera assembly 506 may also include a flash, which may be a single-color-temperature flash or a dual-color-temperature flash. A dual-color-temperature flash combines a warm-light flash and a cold-light flash and can be used for light compensation at different color temperatures.
The audio circuit 507 may include a microphone and a speaker. The microphone collects sound waves from the user and the environment, converts them into electrical signals, and inputs them to the processor 501 for processing, or to the radio frequency circuit 504 to realize voice communication. For stereo sound collection or noise reduction, multiple microphones may be provided at different parts of the terminal 500. The microphone may also be an array microphone or an omnidirectional pickup microphone. The speaker converts electrical signals from the processor 501 or the radio frequency circuit 504 into sound waves. The speaker may be a traditional thin-film speaker or a piezoelectric ceramic speaker. A piezoelectric ceramic speaker can not only convert an electrical signal into sound waves audible to humans, but also convert an electrical signal into sound waves inaudible to humans for purposes such as distance measurement. In some embodiments, the audio circuit 507 may also include a headphone jack.
The positioning component 508 is used to locate the current geographic position of the terminal 500 for navigation or LBS (Location Based Service). The positioning component 508 may be based on the GPS (Global Positioning System) of the United States, the Beidou system of China, the GLONASS system of Russia, or the Galileo system of the European Union.
The power supply 509 is used to power the various components of the terminal 500. The power supply 509 may be alternating current, direct current, a disposable battery, or a rechargeable battery. When the power supply 509 includes a rechargeable battery, the battery may support wired or wireless charging and may also support fast-charging technology.
In some embodiments, terminal 500 also includes one or more sensors 510. The one or more sensors 510 include, but are not limited to: acceleration sensor 511, gyro sensor 512, pressure sensor 513, fingerprint sensor 514, optical sensor 515, and proximity sensor 516.
The acceleration sensor 511 may detect the magnitude of acceleration on the three coordinate axes of the coordinate system established with the terminal 500. For example, the acceleration sensor 511 may be used to detect the components of gravitational acceleration along the three coordinate axes. The processor 501 may control the touch display screen 505 to display the user interface in a landscape or portrait view according to the gravitational acceleration signal collected by the acceleration sensor 511. The acceleration sensor 511 may also be used to collect game or user motion data.
The gyro sensor 512 may detect a body direction and a rotation angle of the terminal 500, and the gyro sensor 512 may cooperate with the acceleration sensor 511 to acquire a 3D motion of the user on the terminal 500. The processor 501 may implement the following functions according to the data collected by the gyro sensor 512: motion sensing (such as changing the UI according to a user's tilting operation), image stabilization at the time of photographing, game control, and inertial navigation.
The pressure sensor 513 may be disposed on a side bezel of the terminal 500 and/or on a lower layer of the touch display screen 505. When the pressure sensor 513 is disposed on the side bezel of the terminal 500, the user's grip signal on the terminal 500 can be detected, and the processor 501 performs left/right-hand recognition or shortcut operations according to that grip signal. When the pressure sensor 513 is disposed on the lower layer of the touch display screen 505, the processor 501 controls operability controls on the UI according to the user's pressure operations on the touch display screen 505. The operability controls include at least one of a button control, a scroll bar control, an icon control, and a menu control.
The fingerprint sensor 514 is used to collect the user's fingerprint, and either the processor 501 or the fingerprint sensor 514 itself identifies the user according to the collected fingerprint. Upon recognizing the user's identity as trusted, the processor 501 authorizes the user to perform relevant sensitive operations, including unlocking the screen, viewing encrypted information, downloading software, making payments, changing settings, and so on. The fingerprint sensor 514 may be provided on the front, back, or side of the terminal 500. When a physical button or vendor logo is provided on the terminal 500, the fingerprint sensor 514 may be integrated with it.
The optical sensor 515 is used to collect the ambient light intensity. In one embodiment, the processor 501 may control the display brightness of the touch display screen 505 based on the ambient light intensity collected by the optical sensor 515: when the ambient light intensity is high, the display brightness is turned up; when it is low, the display brightness is turned down. In another embodiment, the processor 501 may also dynamically adjust the shooting parameters of the camera assembly 506 based on the ambient light intensity collected by the optical sensor 515.
The proximity sensor 516, also referred to as a distance sensor, is typically disposed on the front panel of the terminal 500 and is used to collect the distance between the user and the front surface of the terminal 500. In one embodiment, when the proximity sensor 516 detects that this distance gradually decreases, the processor 501 controls the touch display screen 505 to switch from the bright-screen state to the off-screen state; when the proximity sensor 516 detects that the distance gradually increases, the processor 501 controls the touch display screen 505 to switch from the off-screen state back to the bright-screen state.
Those skilled in the art will appreciate that the configuration shown in Fig. 5 does not limit the terminal 500, which may include more or fewer components than shown, combine some components, or use a different arrangement of components.
Embodiments of the present application further provide a non-transitory computer-readable storage medium, and when instructions in the storage medium are executed by a processor of a terminal, the terminal is enabled to execute the method for determining a motion trajectory provided in the above embodiments.
Embodiments of the present application further provide a computer program product containing instructions, which when run on a terminal, cause the terminal to perform the method for determining a motion trajectory provided in the foregoing embodiments.
Fig. 6 is a schematic structural diagram of a server according to an embodiment of the present application. The server in Fig. 1 may be implemented by the server shown in Fig. 6, which may be a server in a background server cluster. Specifically:
the server 600 includes a Central Processing Unit (CPU) 601, a system memory 604 including a Random Access Memory (RAM) 602 and a Read-Only Memory (ROM) 603, and a system bus 605 connecting the system memory 604 and the central processing unit 601. The server 600 also includes a basic input/output system (I/O system) 606, which facilitates the transfer of information between devices within the computer, and a mass storage device 607 that stores an operating system 613, application programs 614, and other program modules 615.
The basic input/output system 606 includes a display 608 for displaying information and an input device 609, such as a mouse or keyboard, for user input. The display 608 and the input device 609 are both connected to the central processing unit 601 through an input/output controller 610 that is connected to the system bus 605. The input/output controller 610 may also receive and process input from a number of other devices, such as a keyboard, mouse, or electronic stylus, and may similarly provide output to a display screen, a printer, or another type of output device.
The mass storage device 607 is connected to the central processing unit 601 through a mass storage controller (not shown) connected to the system bus 605. The mass storage device 607 and its associated computer-readable media provide non-volatile storage for the server 600. That is, mass storage device 607 may include a computer-readable medium (not shown), such as a hard disk or CD-ROM drive.
Without loss of generality, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes RAM, ROM, EPROM, EEPROM, flash memory or other solid state memory technology, CD-ROM, DVD, or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices. Of course, those skilled in the art will appreciate that computer storage media is not limited to the foregoing. The system memory 604 and mass storage device 607 described above may be collectively referred to as memory.
According to various embodiments of the present application, the server 600 may also operate by being connected, through a network such as the Internet, to a remote computer on that network. That is, the server 600 may be connected to the network 612 through a network interface unit 611 connected to the system bus 605, or the network interface unit 611 may be used to connect to other types of networks or remote computer systems (not shown).
The memory further includes one or more programs, which are stored in the memory and configured to be executed by the CPU. The one or more programs include instructions for performing the method for determining a motion trajectory provided by the embodiments of the present application.
Embodiments of the present application further provide a non-transitory computer-readable storage medium, and when instructions in the storage medium are executed by a processor of a server, the server is enabled to execute the method for determining a motion trajectory provided in the foregoing embodiments.
Embodiments of the present application further provide a computer program product containing instructions, which when run on a server, cause the server to execute the method for determining a motion trajectory provided in the foregoing embodiments.
It will be understood by those skilled in the art that all or part of the steps of the above embodiments may be implemented by hardware, or by a program instructing the relevant hardware, where the program may be stored in a computer-readable storage medium such as a read-only memory, a magnetic disk, or an optical disk.
The above description is only for the purpose of illustrating the preferred embodiments of the present application and is not to be construed as limiting the present application, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the embodiments of the present application should be included in the scope of the embodiments of the present application.

Claims (10)

1. A method of determining a motion trajectory, the method comprising:
determining, according to a first motion trajectory tracked by a first camera and a second motion trajectory tracked by a second camera, the portions of the first motion trajectory and the second motion trajectory that occur within the same time period, to obtain a first overlapping trajectory and a second overlapping trajectory, wherein the monitoring range of the first camera overlaps the monitoring range of the second camera;
determining a similarity between the first overlapping trajectory and the second overlapping trajectory;
and if the similarity between the first overlapping trajectory and the second overlapping trajectory is greater than a similarity threshold, determining that the first motion trajectory and the second motion trajectory are motion trajectories of the same object.
2. The method of claim 1, wherein the determining a similarity between the first overlapping trajectory and the second overlapping trajectory comprises:
determining spatiotemporal features of the first overlapping trajectory and of the second overlapping trajectory respectively, the spatiotemporal features indicating how the shape of the corresponding motion trajectory changes over time;
determining a similarity between the first overlapping trajectory and the second overlapping trajectory based on the spatiotemporal features of the first overlapping trajectory and the spatiotemporal features of the second overlapping trajectory.
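Purely as a sketch of one way claim 2 might be realized (the specific features chosen and the mapping of Euclidean feature distance to a similarity score are assumptions, not prescribed by the claim):

```python
import numpy as np

def spatiotemporal_features(traj):
    """Assumed feature vector: mean speed, mean heading, path length, duration."""
    dt = np.diff(traj[:, 0])
    dxy = np.diff(traj[:, 1:], axis=0)
    step = np.linalg.norm(dxy, axis=1)
    speed = step / np.maximum(dt, 1e-9)        # guard against zero time steps
    heading = np.arctan2(dxy[:, 1], dxy[:, 0])
    return np.array([speed.mean(), heading.mean(), step.sum(),
                     traj[-1, 0] - traj[0, 0]])

def feature_similarity(traj_a, traj_b):
    """Map the Euclidean feature distance into a similarity score in (0, 1]."""
    fa = spatiotemporal_features(traj_a)
    fb = spatiotemporal_features(traj_b)
    return 1.0 / (1.0 + np.linalg.norm(fa - fb))
```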
3. The method of claim 2, wherein the spatiotemporal features comprise curvatures of respective trajectory points on the respective motion trajectories;
said determining the spatiotemporal features of the first overlapping trajectory comprises:
dividing the first motion trajectory into a plurality of trajectory segments;
determining a motion model for each of the plurality of trajectory segments, the motion model indicating how the shape of the corresponding trajectory segment changes over time;
determining a dynamic coefficient of each trajectory point on the first motion trajectory according to the motion model of each trajectory segment, the dynamic coefficient indicating the correspondence between the position of the corresponding trajectory point and time;
and determining the curvature of each trajectory point on the first overlapping trajectory according to the dynamic coefficient of each trajectory point on the first motion trajectory.
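One plausible reading of claim 3, sketched under assumptions: fit per-segment polynomials x(t) and y(t) as the motion model, treat their coefficients as the dynamic coefficients, and evaluate the standard plane-curve curvature κ = |x′y″ − y′x″| / (x′² + y′²)^(3/2) at each trajectory point:

```python
import numpy as np

def curvature_at(coeff_x, coeff_y, t):
    """Curvature of the plane curve (x(t), y(t)), where coeff_x and coeff_y are
    polynomial coefficients in numpy order (highest degree first)."""
    dx = np.polyval(np.polyder(coeff_x), t)
    dy = np.polyval(np.polyder(coeff_y), t)
    ddx = np.polyval(np.polyder(coeff_x, 2), t)
    ddy = np.polyval(np.polyder(coeff_y, 2), t)
    return abs(dx * ddy - dy * ddx) / ((dx * dx + dy * dy) ** 1.5 + 1e-12)
```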
4. The method of claim 3, wherein said dividing the first motion trajectory into a plurality of trajectory segments comprises:
dividing the first motion trajectory into a plurality of trajectory segments in a sliding-window manner according to a reference window size and a reference sliding step, wherein the reference sliding step is smaller than the reference window size.
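A minimal sliding-window split consistent with claim 4 is sketched below; the window size and sliding step are free parameters here, and keeping the step smaller than the window makes consecutive segments overlap:

```python
def sliding_windows(points, window, step):
    """Split a sequence of trajectory points into overlapping segments."""
    assert step < window, "claim 4: the sliding step must be smaller than the window"
    segments = []
    for start in range(0, max(len(points) - window + 1, 1), step):
        segments.append(points[start:start + window])
    return segments
```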
5. The method of claim 3, wherein said determining a motion model for each of the plurality of trajectory segments comprises:
determining the motion model of each of the plurality of trajectory segments by means of ridge regression.
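As a hedged sketch only, the per-segment motion model of claim 5 could be obtained by ridge regression on a polynomial basis of time; scikit-learn's Ridge is an assumed choice of solver, and the quadratic degree and penalty alpha=1.0 are illustrative:

```python
import numpy as np
from sklearn.linear_model import Ridge

def fit_motion_model(segment, degree=2, alpha=1.0):
    """Fit x(t) and y(t) over one trajectory segment with ridge regression;
    `segment` is an array of (t, x, y) rows."""
    t = segment[:, 0] - segment[0, 0]  # shift time for numerical conditioning
    basis = np.vander(t, degree + 1)   # columns [t^2, t, 1] for degree 2
    model_x = Ridge(alpha=alpha, fit_intercept=False).fit(basis, segment[:, 1])
    model_y = Ridge(alpha=alpha, fit_intercept=False).fit(basis, segment[:, 2])
    return model_x, model_y
```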
6. The method according to any one of claims 2 to 5, wherein the spatiotemporal features comprise one or more of: motion direction, motion speed, the distance between two adjacent trajectory points on the corresponding motion trajectory, the length of the corresponding motion trajectory, and the duration of the corresponding motion trajectory.
7. A monitoring device, the device comprising:
a first determining module, configured to determine, according to a first motion trajectory tracked by a first camera and a second motion trajectory tracked by a second camera, the portions of the two motion trajectories that occur within the same time period, to obtain a first overlapping trajectory and a second overlapping trajectory, wherein the first camera and the second camera are two cameras with adjacent monitoring ranges;
a second determining module for determining a similarity between the first overlapping trajectory and the second overlapping trajectory;
a third determining module, configured to determine that the first motion trajectory and the second motion trajectory are motion trajectories of the same object if a similarity between the first overlapping trajectory and the second overlapping trajectory is greater than a similarity threshold.
8. The apparatus of claim 7, wherein the second determination module is to:
determining spatiotemporal features of the first overlapping trajectory and of the second overlapping trajectory respectively, the spatiotemporal features indicating how the shape of the corresponding motion trajectory changes over time;
determining a similarity between the first overlapping trajectory and the second overlapping trajectory based on the spatiotemporal features of the first overlapping trajectory and the spatiotemporal features of the second overlapping trajectory.
9. An apparatus for determining a motion trajectory, the apparatus comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the steps of the method of any of the preceding claims 1-6.
10. A computer-readable storage medium, having stored thereon instructions which, when executed by a processor, carry out the steps of the method of any of claims 1-6 above.
CN202010555795.8A 2020-06-17 2020-06-17 Method and device for determining motion trail and computer storage medium Pending CN111724412A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010555795.8A CN111724412A (en) 2020-06-17 2020-06-17 Method and device for determining motion trail and computer storage medium

Publications (1)

Publication Number Publication Date
CN111724412A (en) 2020-09-29

Family

ID=72567264

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010555795.8A Pending CN111724412A (en) 2020-06-17 2020-06-17 Method and device for determining motion trail and computer storage medium

Country Status (1)

Country Link
CN (1) CN111724412A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102201122A (en) * 2011-05-16 2011-09-28 大连大学 Motion capture system, data noise reduction method and system of motion capture
CN109309809A (en) * 2017-07-28 2019-02-05 阿里巴巴集团控股有限公司 The method and data processing method, device and system of trans-regional target trajectory tracking
CN109035299A (en) * 2018-06-11 2018-12-18 平安科技(深圳)有限公司 Method for tracking target, device, computer equipment and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
朱进 et al.: "Trajectory similarity measurement model based on multiple motion features" (基于多重运动特征的轨迹相似性度量模型), Geomatics and Information Science of Wuhan University (武汉大学学报(信息科学版)) *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112465866A (en) * 2020-11-27 2021-03-09 杭州海康威视数字技术股份有限公司 Multi-target track acquisition method, device, system and storage medium
CN112465866B (en) * 2020-11-27 2024-02-02 杭州海康威视数字技术股份有限公司 Multi-target track acquisition method, device, system and storage medium
CN112465869A (en) * 2020-11-30 2021-03-09 杭州海康威视数字技术股份有限公司 Track association method and device, electronic equipment and storage medium
CN112465869B (en) * 2020-11-30 2023-09-05 杭州海康威视数字技术股份有限公司 Track association method and device, electronic equipment and storage medium
CN114751151A (en) * 2021-01-12 2022-07-15 贵州中烟工业有限责任公司 Calculation method for installation area of detection device and storage medium
CN114751151B (en) * 2021-01-12 2024-03-26 贵州中烟工业有限责任公司 Calculation method of detection device installation area and storage medium
CN114898307A (en) * 2022-07-11 2022-08-12 浙江大华技术股份有限公司 Object tracking method and device, electronic equipment and storage medium
CN114898307B (en) * 2022-07-11 2022-10-28 浙江大华技术股份有限公司 Object tracking method and device, electronic equipment and storage medium
CN115081643A (en) * 2022-07-20 2022-09-20 北京瑞莱智慧科技有限公司 Countermeasure sample generation method, related device and storage medium
CN115081643B (en) * 2022-07-20 2022-11-08 北京瑞莱智慧科技有限公司 Confrontation sample generation method, related device and storage medium

Similar Documents

Publication Publication Date Title
CN110555883B (en) Repositioning method and device for camera attitude tracking process and storage medium
CN108682038B (en) Pose determination method, pose determination device and storage medium
CN110097576B (en) Motion information determination method of image feature point, task execution method and equipment
CN111724412A (en) Method and device for determining motion trail and computer storage medium
CN108876854B (en) Method, device and equipment for relocating camera attitude tracking process and storage medium
CN111641794B (en) Sound signal acquisition method and electronic equipment
CN111127509B (en) Target tracking method, apparatus and computer readable storage medium
CN112084811B (en) Identity information determining method, device and storage medium
CN111754386B (en) Image area shielding method, device, equipment and storage medium
CN110570465B (en) Real-time positioning and map construction method and device and computer readable storage medium
CN111753606A (en) Intelligent model upgrading method and device
CN112749590B (en) Object detection method, device, computer equipment and computer readable storage medium
CN111986227A (en) Trajectory generation method and apparatus, computer device and storage medium
CN110633336B (en) Method and device for determining laser data search range and storage medium
CN113938606B (en) Method and device for determining ball machine erection parameters and computer storage medium
CN113432620B (en) Error estimation method and device, vehicle-mounted terminal and storage medium
WO2021218926A1 (en) Image display method and apparatus, and computer device
CN111982293B (en) Body temperature measuring method and device, electronic equipment and storage medium
CN113706807B (en) Method, device, equipment and storage medium for sending alarm information
CN111179628B (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN113936240A (en) Method, device and equipment for determining sample image and storage medium
CN111127539B (en) Parallax determination method and device, computer equipment and storage medium
CN110263695B (en) Face position acquisition method and device, electronic equipment and storage medium
CN114093020A (en) Motion capture method, motion capture device, electronic device and storage medium
CN112861565A (en) Method and device for determining track similarity, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination