CN107274430B - Object motion trajectory prediction method and device - Google Patents



Publication number
CN107274430B
Authority
CN
China
Prior art keywords
target
image
pixels
pixel
position pixel
Prior art date
Legal status
Active
Application number
CN201710390805.5A
Other languages
Chinese (zh)
Other versions
CN107274430A (en)
Inventor
赵静
见良
彭旸
郑鹏程
Current Assignee
China Digital Video Beijing Ltd
Original Assignee
China Digital Video Beijing Ltd
Priority date
Filing date
Publication date
Application filed by China Digital Video Beijing Ltd
Priority to CN201710390805.5A
Publication of CN107274430A
Application granted
Publication of CN107274430B
Legal status: Active
Anticipated expiration

Classifications

    • G: Physics
    • G06: Computing; calculating or counting
    • G06T: Image data processing or generation, in general
    • G06T7/00: Image analysis
    • G06T7/20: Analysis of motion
    • G06T7/207: Analysis of motion for motion estimation over a hierarchy of resolutions
    • G06T7/262: Analysis of motion using transform domain methods, e.g. Fourier domain methods
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10016: Video; image sequence
    • G06T2207/30: Subject of image; context of image processing
    • G06T2207/30241: Trajectory

Abstract

An embodiment of the invention provides an object motion trajectory prediction method and device, the method comprising: acquiring a high-resolution initial image; determining some image pixels in the initial image as feature pixels, and generating a low-resolution target image from the feature pixels; determining an object to be predicted in the target image, and taking each feature pixel of the object to be predicted as a starting position pixel; searching, among the feature pixels in the target image, for a plurality of target termination position pixels matching the starting position pixels; extracting each target termination position pixel as the motion trajectory prediction result of the object to be predicted; adding a prediction result identifier to each target termination position pixel on the target image; and obtaining, from the target image with the prediction result identifiers added, an initial image with the prediction result identifiers added at the corresponding positions. The embodiment of the invention improves the motion trajectory prediction effect.

Description

Object motion trajectory prediction method and device
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method and an apparatus for predicting an object motion trajectory.
Background
At present, in image and video post-processing software, particle filter tracking algorithms are widely used to track the feature pixels of a moving object. When a particle filter tracking algorithm tracks a moving object, many feature pixels of the object must be searched and computed to predict its motion trajectory, and this process requires good real-time performance.
However, the applicant has found through research that when feature pixel tracking is performed on a high-resolution 4K image, the 4K image contains so many pixels that the processing capability of the computer cannot keep up with searching and computing such a large number of pixels at once; tracking real-time performance is therefore poor and the motion trajectory prediction is unstable. In other words, the current object motion trajectory prediction method suffers from a poor prediction effect because of the limited processing capability of the computer.
Disclosure of Invention
The technical problem to be solved by the embodiment of the invention is to provide an object motion trajectory prediction method and an object motion trajectory prediction device.
In order to solve the above problem, the present invention provides a method for predicting a motion trajectory of an object, the method comprising:
acquiring a high-resolution initial image; the initial image comprises a plurality of image pixels;
determining partial image pixels in the initial image as characteristic pixels, and generating a low-resolution target image by using the characteristic pixels;
determining an object to be predicted in the target image, and taking each feature pixel of the object to be predicted as an initial position pixel;
searching a plurality of target ending position pixels matched with the starting position pixels from a plurality of characteristic pixels in the target image;
extracting each target termination position pixel as a motion trail prediction result of the object to be predicted;
adding a prediction result identifier to each target termination position pixel on the target image;
and according to the target image added with the prediction result identification, obtaining an initial image added with the prediction result identification at the corresponding position.
Optionally, the step of generating a low-resolution target image by using the feature pixels includes:
generating a low-resolution intermediate image by using the characteristic pixels;
and performing image transformation with equal resolution on the intermediate image to obtain the target image, wherein the image transformation is time-frequency transformation or probability transformation.
Optionally, the searching for a plurality of target end position pixels matching with the start position pixels among the plurality of feature pixels in the target image includes:
taking a plurality of characteristic pixels adjacent to a certain starting position pixel as candidate ending position pixels;
calculating the estimated termination position pixel of a certain starting position pixel, and calculating the matching degree of the estimated termination position pixel and each candidate termination position pixel;
and selecting a certain candidate termination position pixel as the target termination position pixel according to the matching degree.
Optionally, the tracking particles have sampling weight values, and the calculating an estimated ending position pixel of a certain starting position pixel includes:
configuring a plurality of tracking particles corresponding to each initial position pixel;
determining the estimated moving speed of each tracking particle in each estimated moving direction;
calculating the estimated target position of each tracking particle at the next moment by adopting the sampling weight value, the estimated moving direction and the estimated moving speed of each tracking particle;
and taking the characteristic pixel on the estimated target position of the tracking particle as the estimated ending position pixel.
Optionally, the method further comprises:
searching tracking particles with sampling weight values smaller than a preset threshold value as target tracking particles;
and deleting the target tracking particles, reconfiguring the tracking particles according to the pixels at the starting positions corresponding to the target tracking particles, and performing the next calculation process of estimating the pixels at the ending positions.
Correspondingly, the invention also provides an object motion trail prediction device, which comprises:
the initial image acquisition module is used for acquiring a high-resolution initial image; the initial image comprises a plurality of image pixels;
the target image generation module is used for determining partial image pixels in the initial image as characteristic pixels and generating a low-resolution target image by adopting the characteristic pixels;
the initial position pixel determining module is used for determining an object to be predicted in the target image and taking each feature pixel of the object to be predicted as an initial position pixel;
a target end position pixel searching module, configured to search a plurality of target end position pixels that match each start position pixel, among a plurality of feature pixels in the target image;
a motion trail prediction result determining module, configured to extract each target termination position pixel as a motion trail prediction result of the object to be predicted;
the first identification adding module is used for adding a prediction result identification to each target termination position pixel on the target image;
and the second identification adding module is used for obtaining an initial image added with the prediction result identification at the corresponding position according to the target image added with the prediction result identification.
Optionally, the target image generation module includes:
the intermediate image generation submodule is used for generating an intermediate image with low resolution by adopting the characteristic pixels;
and the image transformation submodule is used for carrying out image transformation with equal resolution on the intermediate image to obtain the target image, and the image transformation is time-frequency transformation or probability transformation.
Optionally, the target termination location pixel search module includes:
a candidate ending position pixel determining submodule for taking a plurality of characteristic pixels adjacent to a certain starting position pixel as candidate ending position pixels;
the estimated termination position pixel calculation module is used for calculating an estimated termination position pixel of a certain starting position pixel and calculating the matching degree of the estimated termination position pixel and each candidate termination position pixel;
and the target termination position pixel determining submodule is used for selecting a certain candidate termination position pixel as the target termination position pixel according to the matching degree.
Optionally, the tracking particles have sampling weight values, and the estimated termination position pixel calculation sub-module includes:
the particle configuration unit is used for configuring a plurality of tracking particles corresponding to the pixels at the initial positions;
the speed determining unit is used for determining the estimated moving speed of each tracking particle in each estimated moving direction;
the estimated target position calculating unit is used for calculating the estimated target position of each tracking particle at the next moment by adopting the sampling weight value, the estimated moving direction and the estimated moving speed of each tracking particle;
and the estimated termination position pixel determining unit is used for taking the characteristic pixel on the estimated target position of the tracking particle as the estimated termination position pixel.
Optionally, the estimated termination position pixel calculation sub-module further includes:
the target tracking particle searching unit is used for searching tracking particles with sampling weight values smaller than a preset threshold value as target tracking particles;
and the reconfiguration particle unit is used for deleting the target tracking particles, reconfiguring the tracking particles aiming at the starting position pixels corresponding to the target tracking particles, and entering the next calculation process of estimating the ending position pixels.
Compared with the prior art, the embodiment of the invention has the following advantages:
according to the embodiment of the invention, the low-resolution image is formed by extracting partial pixels of the high-resolution image, the motion trail of the moving object is calculated based on the low-resolution image, and the pixel objects needing to be calculated and searched are greatly reduced, so that the calculated amount is reduced, the real-time performance of tracking the moving object is ensured, and the prediction effect of the motion trail is improved. Meanwhile, the calculation processing resource for predicting the motion trail is saved.
Drawings
Fig. 1 is a flowchart illustrating steps of a method for predicting a motion trajectory of an object according to a first embodiment of the present invention;
fig. 2 is a block diagram of an object motion trajectory prediction apparatus according to a second embodiment of the present invention;
fig. 3 is a flowchart illustrating steps of object motion trajectory prediction for a 4K image according to the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in further detail below.
Example one
Fig. 1 is a flowchart illustrating steps of a method for predicting a motion trajectory of an object according to a first embodiment of the present invention, where the method may specifically include the following steps:
step 101, obtaining a high-resolution initial image; the initial image comprises a plurality of image pixels.
In a specific implementation, a high resolution initial image may be acquired, for example, a certain 4K image may be acquired. In practical application, a certain frame of 4K image can be extracted from the 4K video as an initial image. Wherein the initial image comprises a plurality of image pixels.
Optionally, the step 101 includes:
and a substep S11 of extracting successive multi-frame images from the high-resolution video, and selecting one of the multi-frame images as the initial image.
In a specific implementation, multiple frames of images can be extracted from a high-resolution video, and the multiple frames of images can be continuous in time. One frame of image can be selected from the multiple frames of images as an initial image.
And step 102, determining partial image pixels in the initial image as characteristic pixels, and generating a low-resolution target image by using the characteristic pixels.
In a specific implementation, some image pixels in the initial image can be extracted as feature pixels, and a low-resolution target image is formed from these feature pixels. The target image is composed of feature pixels of the initial image, which reduces the resolution of the image while preserving the basic characteristics of the individual objects within the initial image. For example, the image pixels in odd rows and odd columns of the initial image may be extracted as feature pixels, and the target image composed from them. Of course, those skilled in the art can extract image pixels to form the target image in various other ways, for example by extracting the image pixels in even rows and even columns of the initial image as feature pixels and composing the target image from those.
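The odd-row/odd-column extraction described above can be sketched with numpy array slicing; the function name and the toy 4 x 4 image are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def downsample_feature_pixels(image: np.ndarray) -> np.ndarray:
    """Keep every second pixel (odd rows and odd columns in 1-based
    numbering), producing a half-resolution target image."""
    return image[::2, ::2]

initial = np.arange(16).reshape(4, 4)   # toy 4x4 "initial image"
target = downsample_feature_pixels(initial)
# target has shape (2, 2) and contains the pixels from rows 0, 2
# and columns 0, 2 of the initial image
```

Because only one pixel in four survives, the search and matching steps that follow operate on a quarter of the original pixel count.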
Optionally, the step of generating a low-resolution target image by using the feature pixels may include:
and a sub-step S21 of generating a low-resolution intermediate image using the characteristic pixels.
And a substep S22 of performing image transformation of equal resolution on the intermediate image to obtain the target image, wherein the image transformation is time-frequency transformation or probability transformation.
In practical application, the intermediate image with low resolution can be generated by using the characteristic pixels, and then the intermediate image can be converted into the target image with equal resolution by a time-frequency conversion or probability conversion method.
The time-frequency transformation (ALT) applies the fast Fourier transform and its inverse repeatedly in the time-frequency domain until a convergent result is obtained. The probability transformation is also called the Probabilistic Hough Transform (Progressive Probabilistic Hough Transform); during this transformation, foreground points on an edge image are first sampled at random and mapped to curves drawn in a polar coordinate system. When an intersection point in the polar coordinate system reaches the minimum vote count, the straight line L corresponding to that point is found in the x-y coordinate system. Foreground points on the edge image are then searched, points on the line L whose mutual distance is less than maxLineGap are connected into line segments, all of those points are deleted, and the parameters of each line segment (its start point and end point) are recorded, where each segment must satisfy the minimum length. These steps are repeated until the transformation is complete.
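The vote-accumulation idea behind the Probabilistic Hough Transform can be sketched as follows; the `hough_votes` helper and its integer discretization of (theta, rho) are illustrative assumptions, not the patent's implementation:

```python
import numpy as np

def hough_votes(points, n_theta=180):
    """Accumulate votes in (theta, rho) space for a set of edge points.
    A point (x, y) maps to the curve rho = x*cos(theta) + y*sin(theta);
    the curves of collinear points all pass through one (theta, rho)
    cell, so that cell collects one vote per point on the line."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    votes = {}
    for x, y in points:
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        for t_idx, rho in enumerate(rhos):
            key = (t_idx, int(rho))
            votes[key] = votes.get(key, 0) + 1
    return votes

# Three collinear points on the line y = x: the best cell collects
# exactly one vote from each of the three points.
best = max(hough_votes([(1, 1), (2, 2), (3, 3)]).values())
```

When the winning cell reaches the minimum vote count, the corresponding line in x-y space is the one along which segments are then connected.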
In practical application, each pixel in the image can be labeled by its coordinates, so an intermediate image can be obtained by transforming the initial image from the Canonical coordinate system to the Pixel coordinate system. The following transformation formulas can be used:
px = cx / par * 1/2;
py = cy * 1/2;
where px and py represent the coordinate positions of a pixel in the Pixel coordinate system, and cx and cy represent its coordinate positions in the Canonical coordinate system. With the above formulas, an intermediate image with 1/2 the resolution of the initial image is obtained. Time-frequency transformation or probability transformation is then applied to the intermediate image to obtain a target image with 1/2 the resolution of the initial image.
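The Canonical-to-Pixel mapping above translates directly into code; treating `par` as a pixel aspect ratio is an assumption read off the formula, since the source does not define it:

```python
def canonical_to_pixel(cx: float, cy: float, par: float = 1.0):
    """Map Canonical coordinates (cx, cy) to Pixel coordinates (px, py)
    while halving the resolution; par divides only the x coordinate."""
    px = cx / par * 0.5
    py = cy * 0.5
    return px, py

# A 4K-sized canonical coordinate maps to a half-resolution pixel position.
px, py = canonical_to_pixel(3840.0, 2160.0)
```

With `par = 1.0` this is a pure halving, matching the 1/2-resolution intermediate image described in the text.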
Step 103, determining an object to be predicted in the target image, and taking each feature pixel of the object to be predicted as a start position pixel.
In a specific implementation, an object to be predicted in a target image may be determined. Generally, an object in a certain area may be manually selected as an object to be predicted, or a certain object may be automatically selected as an object to be predicted.
After the object to be predicted is determined, each feature pixel forming the object to be predicted can be determined, and each feature pixel is taken as a current starting position pixel of the object.
And 104, searching a plurality of target ending position pixels matched with the starting position pixels from a plurality of characteristic pixels in the target image.
In a specific implementation, feature pixels matching with each start position pixel may be searched as target end position pixels. Because the target image is a low-resolution image, the searching space is greatly reduced compared with the initial image, and the efficiency of searching the pixels at the end position of the target is improved.
Optionally, the step 104 includes:
a substep S31 of using a plurality of feature pixels adjacent to a certain start position pixel as candidate end position pixels;
substep S32, calculating an estimated ending position pixel of a certain starting position pixel, and calculating the matching degree of the estimated ending position pixel and each candidate ending position pixel;
and a substep S33 of selecting a candidate ending point pixel as the target ending point pixel according to the matching degree.
In a specific implementation, several feature pixels adjacent to a certain starting position pixel may be taken as candidate termination position pixels. The estimated termination position pixel of that starting position pixel can then be calculated, the matching degree between the estimated termination position pixel and each candidate termination position pixel computed, and the target termination position pixel determined from the matching degree. The matching degree may be measured as the sum of squared gray-level differences between the target areas at the pixels' coordinates.
For example, the estimated termination position pixel of a starting position pixel is first calculated; then 10 candidate termination position pixels are formed within a 3 × 3 pixel range around the starting position pixel, the Sum of Squared Differences (SSD) of target area gray levels between each candidate termination position pixel and the estimated termination position pixel is calculated, and the candidate termination position pixel with the smallest sum is taken as the target termination position pixel.
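The SSD-based selection step can be sketched like this; the patch shapes and the candidate list are illustrative, not taken from the patent:

```python
import numpy as np

def ssd(a: np.ndarray, b: np.ndarray) -> float:
    """Sum of squared gray-level differences between two equal-size areas."""
    d = a.astype(np.float64) - b.astype(np.float64)
    return float(np.sum(d * d))

def pick_target_end(estimated: np.ndarray, candidates) -> int:
    """Return the index of the candidate termination-position area whose
    SSD against the estimated termination-position area is smallest."""
    return int(np.argmin([ssd(estimated, c) for c in candidates]))

estimated = np.array([[10, 20], [30, 40]])
candidates = [estimated + 7, estimated.copy(), estimated - 3]
best_idx = pick_target_end(estimated, candidates)   # the exact copy wins
```

The same `ssd` measure reappears later when the particle-filter estimate is refined over the 3 × 3 search rectangle.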
Optionally, the tracking particles have sample weight values, and the sub-step S32 includes:
substep S32-1, configuring a plurality of tracking particles corresponding to each of the start position pixels;
the substep S32-2, determining the estimated moving speed of each tracking particle in each estimated moving direction;
step S32-3, calculating the estimated target position of each tracking particle at the next moment by adopting the sampling weight value, the estimated moving direction and the estimated moving speed of each tracking particle;
and a substep S32-4 of using the feature pixel at the estimated target position of the tracking particle as the estimated end position pixel.
In practical applications, a tracking algorithm combined with an RGB (Red Green Blue) color histogram may be used to predict the real position of the object in the next frame. Specifically, the tracking algorithm may be a particle filter whose inputs are the number of particles N, the estimated movement rate of the particles in the x direction, and the estimated movement rate of the particles in the y direction. The particle number N determines how many randomly distributed particles are used to compute the RGB color histogram. The estimated movement rates in the x and y directions are used to estimate the region over which the particles are distributed at the next moment, and the feature pixel at a particle's position can be taken as the estimated termination position pixel. The configured particles carry different sampling weight values, which adjust how much each particle contributes to the final result.
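A minimal particle-propagation sketch of these steps follows; the particle count, movement rates, and noise spread are illustrative values, and the RGB-histogram weighting is reduced to uniform weights for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

def propagate(particles: np.ndarray, vx: float, vy: float,
              spread: float = 0.5) -> np.ndarray:
    """Move every particle by the estimated (vx, vy) movement rate plus
    Gaussian noise, giving the region the particles occupy next moment."""
    noise = rng.normal(0.0, spread, particles.shape)
    return particles + np.array([vx, vy]) + noise

def estimate_position(particles: np.ndarray, weights: np.ndarray):
    """Weighted mean of particle positions; the weight sum plays the
    role of the normalization coefficient."""
    return (particles * weights[:, None]).sum(axis=0) / weights.sum()

n = 200
particles = np.zeros((n, 2))          # all particles start at the origin
particles = propagate(particles, vx=3.0, vy=1.0)
weights = np.ones(n) / n              # uniform stand-in for histogram weights
pos = estimate_position(particles, weights)   # close to (3.0, 1.0)
```

In the full method the uniform weights would be replaced by each particle's RGB-histogram similarity to the tracked object.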
Optionally, the sub-step S32 further includes:
substep S32-5, searching the tracking particles with the sampling weight value smaller than a preset threshold value as target tracking particles;
and a substep S32-6, deleting the target tracking particles, reconfiguring the tracking particles according to the starting position pixels corresponding to the target tracking particles, and entering the next calculation process of estimating the ending position pixels.
In practical applications, particles suffer from the degeneracy phenomenon: after several iterations of computing the sampling weight values, the variance of the weights increases, and all but a few particles end up with very small weights. Particles with small sampling weight values contribute too little to the final result, and keeping them as calculation inputs wastes processing resources. The purpose of reconfiguring the tracking particles is therefore to remove the low-weight particles and save computing resources.
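The threshold-based reconfiguration can be sketched as follows; redrawing replacements from a unit Gaussian around the starting position pixel, and assigning them the mean surviving weight, are assumptions of this sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

def reconfigure(particles: np.ndarray, weights: np.ndarray,
                start_pixel: np.ndarray, threshold: float):
    """Delete particles whose sampling weight is below `threshold` and
    redraw the same number of fresh particles around the start-position
    pixel, then renormalize the weights."""
    keep = weights >= threshold
    n_new = int((~keep).sum())
    fresh = start_pixel + rng.normal(0.0, 1.0, (n_new, 2))
    new_particles = np.vstack([particles[keep], fresh])
    new_weights = np.concatenate(
        [weights[keep], np.full(n_new, weights[keep].mean())])
    return new_particles, new_weights / new_weights.sum()

particles = np.zeros((4, 2))
weights = np.array([0.4, 0.4, 0.1, 0.1])
new_p, new_w = reconfigure(particles, weights, np.array([5.0, 5.0]), 0.2)
```

The particle count stays constant, so the next estimated-termination-position calculation runs with a full, non-degenerate particle set.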
After reconfiguring the tracking particles, the estimated movement rate of the particles may be calculated using the following formulas:
v_x = Δx_{t-1} / vec_unit;
v_y = Δy_{t-1} / vec_unit;
wherein Δx_{t-1} and Δy_{t-1} respectively represent the position offsets of the moving object at time t-1, vec_unit represents a unit pixel vector, v_x represents the estimated movement rate in the x direction, and v_y represents the estimated movement rate in the y direction.
Assuming that the estimated movement rate of the tracked object does not change greatly from time t-1 to time t, the estimated target position of each particle at the next moment is obtained according to the following formulas:
x_t = x_{t-1} + v_x + r * w_p;
y_t = y_{t-1} + v_y + r * h_p;
wherein r is a Gaussian random number, w_p is the particle width, and h_p is the particle height.
Finally, the weight of each particle can be calculated according to the RGB histogram, and the positions of the N particles are then weighted and averaged by those weights to obtain the tracking result of the particle filter:
x' = (1/f) * Σ_{i=1..N} w_i * x_i;
y' = (1/f) * Σ_{i=1..N} w_i * y_i;
wherein f is the normalization coefficient:
f = Σ_{i=1..N} w_i.
after calculating the estimated end position pixel, forming 10 search positions with the input rectangular range of 3 x 3 pixels around the initial position at t-1, and finding a new position with the least Square Sum (SSD) of the gray difference of the target area at the last frame t-1. Specifically, the following formula can be adopted for calculation:
S(x,y)=(∫∫w|(J(X)-I(X))|);
Figure BDA0001307474780000099
the new position is used as a tracking result of the particle filter, so that the tracking result can be stabilized, and particularly, the object can be accurately positioned when the object with small motion is tracked.
And 105, extracting each target termination position pixel as a motion trail prediction result of the object to be predicted.
In a specific implementation, each target termination position pixel may be extracted, and each extracted target termination position pixel is a motion trajectory prediction result of the object to be predicted at the next moment.
And 106, adding a prediction result identifier to each target termination position pixel on the target image.
And 107, obtaining an initial image added with the prediction result identifier at the corresponding position according to the target image added with the prediction result identifier.
In a specific implementation, a prediction result identifier may be added to each target termination position pixel on the target image according to the motion trajectory prediction result of the object to be predicted. For the target image with the prediction result identifiers added, the feature-pixel extraction previously used to produce the target image can be inverted to map the target image back to the initial image. According to this inverse mapping, a prediction result identifier is added at the position of the initial image corresponding to each feature pixel, thereby obtaining an initial image with the prediction result identifiers added at the corresponding positions.
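Under the odd-row/odd-column extraction used earlier, the inverse mapping is simply a doubling of each coordinate; this sketch assumes that convention, and the mark value 255 is an assumption:

```python
import numpy as np

def mark_initial_image(initial: np.ndarray, target_marks) -> np.ndarray:
    """Project prediction-result identifiers from the half-resolution
    target image back to the initial image: target pixel (r, c) came
    from initial pixel (2r, 2c), so the identifier is added there."""
    marked = initial.copy()
    for r, c in target_marks:
        marked[2 * r, 2 * c] = 255   # assumed identifier value
    return marked

initial = np.zeros((4, 4), dtype=np.uint8)
marked = mark_initial_image(initial, [(1, 1)])   # marks initial pixel (2, 2)
```

The original image is left untouched; only the returned copy carries the identifiers, mirroring how the method produces a marked initial image from the marked target image.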
According to the embodiment of the invention, the low-resolution image is formed by extracting partial pixels of the high-resolution image, the motion trail of the moving object is calculated based on the low-resolution image, and the pixel objects needing to be calculated and searched are greatly reduced, so that the calculated amount is reduced, the real-time performance of tracking the moving object is ensured, and the prediction effect of the motion trail is improved. Meanwhile, the calculation processing resource for predicting the motion trail is saved.
To facilitate understanding of embodiments of the present invention by those skilled in the art, reference will now be made to the specific example of FIG. 3.
Fig. 3 shows a flowchart of the steps of object motion trajectory prediction for a 4K image according to the present invention. As the figure shows, the high-resolution 4K image first undergoes pixel-reduction processing; then, at time t, N particles are reselected according to the decay of the weights of the N particles, and the particles are probabilistically redistributed using the movement rate at time t. The weights of the N particles are calculated from the RGB histogram, and a weighted sum finally yields the estimated position of each particle at the next moment. A new position is then searched around the estimated position to stabilize the prediction result. After the prediction result is obtained, the image is restored to a 4K image. These steps are repeated for the 4K image at time t+1, so that successive frames of 4K images in a 4K video are processed.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Example two
Correspondingly, the second embodiment of the invention also provides an object motion trail prediction device. Fig. 2 is a block diagram illustrating a structure of an object motion trajectory prediction apparatus according to a second embodiment of the present invention, where the apparatus may specifically include the following modules:
an initial image obtaining module 201, configured to obtain a high-resolution initial image; the initial image comprises a plurality of image pixels;
a target image generation module 202, configured to determine partial image pixels in the initial image as feature pixels, and generate a low-resolution target image by using the feature pixels;
a start position pixel determining module 203, configured to determine an object to be predicted in the target image, and use each feature pixel of the object to be predicted as a start position pixel;
a target end position pixel searching module 204, configured to search a plurality of target end position pixels that match each start position pixel, among a plurality of feature pixels in the target image;
a motion trajectory prediction result determining module 205, configured to extract each target termination position pixel as a motion trajectory prediction result of the object to be predicted;
a first identifier adding module 206, configured to add a prediction result identifier to each target termination position pixel on the target image;
and a second identifier adding module 207, configured to obtain, according to the target image added with the prediction result identifier, an initial image added with the prediction result identifier at a corresponding position.
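The pipeline of modules 201 and 202 can be sketched as follows. This is a minimal illustration only: the embodiment does not fix how feature pixels are selected, so the stride-based sampling below (one pixel per stride-by-stride block) is an assumed, simple choice of feature-pixel extraction.

```python
import numpy as np

def downsample_to_target(initial_image: np.ndarray, stride: int = 4) -> np.ndarray:
    """Take every `stride`-th image pixel as a feature pixel and assemble
    the feature pixels into a low-resolution target image."""
    # Keeping one pixel per stride x stride block shrinks both axes by `stride`.
    return initial_image[::stride, ::stride]

# A toy 8x8 "high-resolution" image reduced to a 2x2 target image.
high_res = np.arange(64, dtype=np.float32).reshape(8, 8)
low_res = downsample_to_target(high_res, stride=4)
print(low_res.shape)  # (2, 2)
```

Because the subsequent search runs over the far smaller set of feature pixels, the number of candidate positions examined per step drops by roughly the square of the stride.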
Optionally, the target image generation module 202 includes:
an intermediate image generation submodule, configured to generate a low-resolution intermediate image using the feature pixels;
and an image transformation submodule, configured to perform a resolution-preserving image transformation on the intermediate image to obtain the target image, where the image transformation is a time-frequency transformation or a probability transformation.
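As one concrete reading of the probability transformation named above, the intermediate image can be mapped to a probability distribution over the same pixel grid, leaving the resolution unchanged. The normalization below is an assumption for illustration; the embodiment does not specify the transform's formula.

```python
import numpy as np

def probability_transform(intermediate: np.ndarray) -> np.ndarray:
    """Resolution-preserving transform: map pixel intensities to a
    probability distribution over the same pixel grid."""
    shifted = intermediate - intermediate.min()  # make all values non-negative
    total = shifted.sum()
    if total == 0:
        # Flat image: fall back to a uniform distribution over all pixels.
        return np.full_like(shifted, 1.0 / shifted.size, dtype=np.float64)
    return shifted / total

img = np.array([[1.0, 3.0], [2.0, 2.0]])
p = probability_transform(img)
print(p.shape == img.shape, abs(p.sum() - 1.0) < 1e-9)  # True True
```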
Optionally, the target termination position pixel searching module 204 includes:
a candidate termination position pixel determining submodule, configured to take a plurality of feature pixels adjacent to a given starting position pixel as candidate termination position pixels;
an estimated termination position pixel calculation submodule, configured to calculate an estimated termination position pixel for the given starting position pixel, and to calculate the matching degree between the estimated termination position pixel and each candidate termination position pixel;
and a target termination position pixel determining submodule, configured to select one of the candidate termination position pixels as the target termination position pixel according to the matching degrees.
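The candidate selection performed by module 204 can be sketched as follows. The embodiment does not define the matching-degree formula, so the inverse Euclidean distance used here is an assumed stand-in; the neighbourhood is assumed to be the eight feature pixels around the starting position pixel.

```python
import numpy as np

def select_target_end_pixel(estimated, candidates):
    """Pick the candidate termination pixel closest to the estimated
    termination pixel; inverse distance stands in for 'matching degree'."""
    est = np.asarray(estimated, dtype=float)
    scores = [1.0 / (1.0 + np.linalg.norm(np.asarray(c, dtype=float) - est))
              for c in candidates]
    return candidates[int(np.argmax(scores))]

# Candidates: the 8 feature pixels adjacent to the starting position (5, 5).
start = (5, 5)
neighbours = [(r, c) for r in (4, 5, 6) for c in (4, 5, 6) if (r, c) != start]
best = select_target_end_pixel((6.2, 5.1), neighbours)
print(best)  # (6, 5)
```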
Optionally, the tracking particles have sampling weight values, and the estimated termination position pixel calculation submodule includes:
a particle configuration unit, configured to configure a plurality of tracking particles corresponding to each starting position pixel;
a speed determining unit, configured to determine the estimated moving speed of each tracking particle in each estimated moving direction;
an estimated target position calculating unit, configured to calculate the estimated target position of each tracking particle at the next moment using the sampling weight value, the estimated moving direction, and the estimated moving speed of each tracking particle;
and an estimated termination position pixel determining unit, configured to take the feature pixel at the estimated target position of a tracking particle as the estimated termination position pixel.
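The particle-based position estimate described by the units above can be sketched as a weighted one-step prediction. How the sampling weights, directions, and speeds are combined is not specified in the embodiment, so the weighted mean of per-particle predictions below is an assumption:

```python
import numpy as np

def estimate_end_position(positions, weights, directions, speeds, dt=1.0):
    """Advance each tracking particle one step along its estimated moving
    direction at its estimated speed, then fuse the per-particle
    predictions using the (normalised) sampling weights."""
    positions = np.asarray(positions, dtype=float)     # (n, 2) particle positions
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()                  # normalise sampling weights
    directions = np.asarray(directions, dtype=float)   # (n, 2) unit direction vectors
    speeds = np.asarray(speeds, dtype=float)[:, None]  # (n, 1) estimated speeds
    predicted = positions + directions * speeds * dt   # per-particle next position
    return (weights[:, None] * predicted).sum(axis=0)  # weight-fused estimate

# Two equally weighted particles moving right and down at different speeds.
pos = [(0.0, 0.0), (0.0, 0.0)]
w = [0.5, 0.5]
d = [(1.0, 0.0), (0.0, 1.0)]
s = [2.0, 4.0]
print(estimate_end_position(pos, w, d, s))  # [1. 2.]
```

The feature pixel nearest the fused estimate would then serve as the estimated termination position pixel.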
Optionally, the estimated termination position pixel calculation submodule further includes:
a target tracking particle searching unit, configured to search for tracking particles whose sampling weight values are smaller than a preset threshold, and to take them as target tracking particles;
and a particle reconfiguration unit, configured to delete the target tracking particles, reconfigure tracking particles for the starting position pixels corresponding to the deleted particles, and enter the next round of estimated termination position pixel calculation.
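The delete-and-reconfigure step can be sketched as below. Redrawing replacement particles from a Gaussian around the starting position pixel is an assumption; the embodiment only states that new particles are configured for the corresponding starting position pixel.

```python
import numpy as np

rng = np.random.default_rng(0)

def reconfigure_particles(particles, weights, start_pixel, threshold=0.05,
                          spread=1.0):
    """Delete tracking particles whose sampling weight fell below
    `threshold` and redraw replacements near the start position pixel."""
    particles = np.asarray(particles, dtype=float)
    weights = np.asarray(weights, dtype=float)
    keep = weights >= threshold
    n_new = int((~keep).sum())
    # Replacement particles are scattered around the start position pixel.
    fresh = np.asarray(start_pixel, dtype=float) + rng.normal(0.0, spread, (n_new, 2))
    new_particles = np.vstack([particles[keep], fresh])
    # Reset to uniform sampling weights for the next calculation round.
    new_weights = np.full(len(new_particles), 1.0 / len(new_particles))
    return new_particles, new_weights

p = [(1.0, 1.0), (9.0, 9.0), (1.2, 0.8)]
w = [0.6, 0.01, 0.39]  # the second particle drifted away; its weight collapsed
new_p, new_w = reconfigure_particles(p, w, start_pixel=(1.0, 1.0))
print(len(new_p), abs(new_w.sum() - 1.0) < 1e-9)  # 3 True
```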
According to the embodiment of the invention, a low-resolution image is formed by extracting some of the pixels of a high-resolution image, and the motion trajectory of the moving object is calculated on the low-resolution image. This greatly reduces the number of pixels that must be computed and searched, which lowers the computational load, preserves the real-time performance of moving-object tracking, and improves the motion trajectory prediction. It also saves the processing resources consumed by motion trajectory prediction.
As the device embodiment is substantially similar to the method embodiment, its description is relatively brief; for relevant details, refer to the corresponding parts of the method embodiment description.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory. The memory may include volatile memory in a computer-readable medium, random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium. Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technologies, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media, such as modulated data signals and carrier waves.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional alterations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concept. Therefore, the appended claims are intended to be interpreted as including the preferred embodiments and all alterations and modifications falling within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second are used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or terminal that comprises the element.
The technical solutions provided by the present invention have been described in detail above. Specific examples are used herein to explain the principle and implementation of the present invention, and the descriptions of the above embodiments are intended only to help in understanding the method and core idea of the present invention. Meanwhile, a person skilled in the art may, according to the idea of the present invention, make changes to the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (8)

1. A method for predicting a motion trajectory of an object, the method comprising:
acquiring a high-resolution initial image; the initial image comprises a plurality of image pixels;
determining partial image pixels in the initial image as characteristic pixels, and generating a low-resolution target image by using the characteristic pixels;
determining an object to be predicted in the target image, and taking each feature pixel of the object to be predicted as an initial position pixel;
searching a plurality of target ending position pixels matched with the starting position pixels from a plurality of characteristic pixels in the target image;
extracting each target termination position pixel as a motion trajectory prediction result of the object to be predicted;
adding a prediction result identifier to each target termination position pixel on the target image;
according to the target image added with the prediction result identification, obtaining an initial image added with the prediction result identification at a corresponding position;
wherein the searching for a plurality of target ending position pixels matching with each starting position pixel among the plurality of feature pixels in the target image comprises:
taking a plurality of characteristic pixels adjacent to a certain starting position pixel as candidate ending position pixels;
calculating an estimated ending position pixel for a certain starting position pixel, and calculating the matching degree between the estimated ending position pixel and each candidate ending position pixel;
and selecting one of the candidate ending position pixels as the target ending position pixel according to the matching degrees.
2. The method of claim 1, wherein the step of generating a low resolution target image using the feature pixels comprises:
generating a low-resolution intermediate image by using the characteristic pixels;
and performing image transformation with equal resolution on the intermediate image to obtain the target image, wherein the image transformation is time-frequency transformation or probability transformation.
3. The method of claim 1, wherein tracking particles have sample weight values, and wherein calculating an estimated ending position pixel for a starting position pixel comprises:
configuring a plurality of tracking particles corresponding to each initial position pixel;
determining the estimated moving speed of each tracking particle in each estimated moving direction;
calculating the estimated target position of each tracking particle at the next moment by adopting the sampling weight value, the estimated moving direction and the estimated moving speed of each tracking particle;
and taking the characteristic pixel on the estimated target position of the tracking particle as the estimated ending position pixel.
4. The method of claim 3, further comprising:
searching tracking particles with sampling weight values smaller than a preset threshold value as target tracking particles;
and deleting the target tracking particles, reconfiguring tracking particles for the starting position pixels corresponding to the target tracking particles, and entering the next calculation round for the estimated ending position pixel.
5. An apparatus for predicting a motion trajectory of an object, the apparatus comprising:
the initial image acquisition module is used for acquiring a high-resolution initial image; the initial image comprises a plurality of image pixels;
the target image generation module is used for determining partial image pixels in the initial image as characteristic pixels and generating a low-resolution target image by adopting the characteristic pixels;
the initial position pixel determining module is used for determining an object to be predicted in the target image and taking each feature pixel of the object to be predicted as an initial position pixel;
a target end position pixel searching module, configured to search a plurality of target end position pixels that match each start position pixel, among a plurality of feature pixels in the target image;
a motion trajectory prediction result determining module, configured to extract each target termination position pixel as a motion trajectory prediction result of the object to be predicted;
the first identification adding module is used for adding a prediction result identification to each target termination position pixel on the target image;
the second identification adding module is used for obtaining an initial image added with the prediction result identification at the corresponding position according to the target image added with the prediction result identification;
wherein the target termination location pixel search module comprises:
a candidate ending position pixel determining submodule for taking a plurality of characteristic pixels adjacent to a certain starting position pixel as candidate ending position pixels;
the estimated termination position pixel calculation submodule is used for calculating an estimated termination position pixel of a certain starting position pixel and calculating the matching degree of the estimated termination position pixel and each candidate termination position pixel;
and the target termination position pixel determining submodule is used for selecting a certain candidate termination position pixel as the target termination position pixel according to the matching degree.
6. The apparatus of claim 5, wherein the target image generation module comprises:
the intermediate image generation submodule is used for generating an intermediate image with low resolution by adopting the characteristic pixels;
and the image transformation submodule is used for carrying out image transformation with equal resolution on the intermediate image to obtain the target image, and the image transformation is time-frequency transformation or probability transformation.
7. The apparatus of claim 5, wherein the tracking particles have sample weight values, and wherein the estimated end position pixel computation sub-module comprises:
the particle configuration unit is used for configuring a plurality of tracking particles corresponding to the pixels at the initial positions;
the speed determining unit is used for determining the estimated moving speed of each tracking particle in each estimated moving direction;
the estimated target position calculating unit is used for calculating the estimated target position of each tracking particle at the next moment by adopting the sampling weight value, the estimated moving direction and the estimated moving speed of each tracking particle;
and the estimated termination position pixel determining unit is used for taking the characteristic pixel on the estimated target position of the tracking particle as the estimated termination position pixel.
8. The apparatus of claim 7, wherein the estimated end position pixel computation sub-module further comprises:
the target tracking particle searching unit is used for searching tracking particles with sampling weight values smaller than a preset threshold value as target tracking particles;
and the reconfiguration particle unit is used for deleting the target tracking particles, reconfiguring the tracking particles aiming at the starting position pixels corresponding to the target tracking particles, and entering the next calculation process of estimating the ending position pixels.
CN201710390805.5A 2017-05-27 2017-05-27 Object motion trajectory prediction method and device Active CN107274430B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710390805.5A CN107274430B (en) 2017-05-27 2017-05-27 Object motion trajectory prediction method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710390805.5A CN107274430B (en) 2017-05-27 2017-05-27 Object motion trajectory prediction method and device

Publications (2)

Publication Number Publication Date
CN107274430A CN107274430A (en) 2017-10-20
CN107274430B true CN107274430B (en) 2020-07-03

Family

ID=60064827

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710390805.5A Active CN107274430B (en) 2017-05-27 2017-05-27 Object motion trajectory prediction method and device

Country Status (1)

Country Link
CN (1) CN107274430B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111260759B (en) * 2020-01-10 2023-02-24 北京金山安全软件有限公司 Path determination method and device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6307959B1 (en) * 1999-07-14 2001-10-23 Sarnoff Corporation Method and apparatus for estimating scene structure and ego-motion from multiple images of a scene using correlation
US7035434B2 (en) * 2000-12-11 2006-04-25 Texas Instruments Incorporated Hough transform based motion detection image recording system and method
JP2007149092A (en) * 2005-11-23 2007-06-14 Sonosite Inc Multiple resolution adaptive filtering
CN101577006B (en) * 2009-06-15 2015-03-04 北京中星微电子有限公司 Loitering detecting method and loitering detecting system in video monitoring
CN106127210A (en) * 2016-06-17 2016-11-16 广东顺德中山大学卡内基梅隆大学国际联合研究院 A kind of significance detection method based on multiple features

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Rapid Human Movement Tracking using a Mean-Shift Algorithm Based on Skin Color Model;Jin Lu等;《IEEE》;20120709;第1891-1894页 *

Also Published As

Publication number Publication date
CN107274430A (en) 2017-10-20

Similar Documents

Publication Publication Date Title
US11055535B2 (en) Method and device for video classification
US10803554B2 (en) Image processing method and device
Zhao et al. Trajectory convolution for action recognition
US9990546B2 (en) Method and apparatus for determining target region in video frame for target acquisition
CN111598779B (en) Image super-resolution processing method and device, electronic equipment and storage medium
CN111523447B (en) Vehicle tracking method, device, electronic equipment and storage medium
CN110176024B (en) Method, device, equipment and storage medium for detecting target in video
Liu et al. Crowd counting with fully convolutional neural network
CN114359665B (en) Training method and device of full-task face recognition model and face recognition method
KR101296318B1 (en) Apparatus and method for object tracking by adaptive block partitioning
CN107274430B (en) Object motion trajectory prediction method and device
CN111523533B (en) Method and device for determining area of object from image
US20190068888A1 (en) Auto-focusing
CN114611565A (en) Data processing method, device, equipment and storage medium
Li et al. A fast video stabilization method based on feature matching and histogram clustering
CN113129332A (en) Method and apparatus for performing target object tracking
Lee et al. Pedestrian detection using multi-scale squeeze-and-excitation module
Arun A comparative analysis on the applicability of entropy in remote sensing
Li et al. Research on Crowd Counting Based on Attention Mechanism and Dilation Convolution
CN110866431B (en) Training method of face recognition model, and face recognition method and device
CN110826472B (en) Image detection method and device
AlMarzooqi et al. Increase the exploitation of mars satellite images via deep learning techniques
Wang et al. An optimal coverage model for the deployment of iot devices in feature-based video transmission systems
Li et al. Parameter Selection for Denoising Algorithms Using NR-IQA with CNN
CN117011325A (en) Video processing method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant