CN111179301B - Motion trend analysis method based on computer video - Google Patents
Motion trend analysis method based on computer video
- Publication number: CN111179301B
- Application number: CN201911338801.8A
- Authority
- CN
- China
- Prior art keywords
- image
- motion
- calculating
- carrying
- counting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/207—Analysis of motion for motion estimation over a hierarchy of resolutions
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
- G06T2207/10016—Image acquisition modality: video; image sequence
- G06T2207/20024—Special algorithmic details: filtering details
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Abstract
The invention discloses a motion trend analysis method based on computer video, comprising the following steps: S1, preprocessing: initializing the system, establishing the processing flow, and initializing the image feature detector; S2, motion distribution analysis: calculating an image difference, obtaining a grayscale result image, and applying Gaussian filtering, thresholding, image dilation and image erosion; S3, motion trend analysis: calculating the motion of feature points based on the LK optical-flow pyramid, then counting and analyzing the motion pattern. By calculating the image difference and stabilizing the result through thresholding and filtering, and by selecting feature points according to the motion distribution, the method minimizes the amount of computation and attends only to the image regions of greatest interest. Tracking the feature points with the LK optical-flow method and analyzing them statistically yields a more stable and accurate motion-vector description and therefore a better motion trend analysis result.
Description
Technical Field
The invention relates to the technical field of computers, in particular to a motion trend analysis method based on computer videos.
Background
Many video programs, such as television news interviews, live sports broadcasts and other productions with relatively fixed scenes, require the current activity in the scene to be tracked in real time during shooting, with focusing, zooming in and close-ups provided when necessary. In the past this work was usually done manually; if it could be handed to a computer for automatic processing, a large number of workers would undoubtedly be freed from the task, increasing the utilization efficiency of human resources and reducing cost.
To achieve the above objective, the greatest difficulty is knowing how movement is distributed in a scene. Analyzing such movement often requires building complex models to identify scenes or people; running such models usually demands considerable resources, and the results are often not ideal.
Disclosure of Invention
The aim of the invention is achieved by the following technical scheme.
According to a first aspect of the present invention, there is provided a motion trend analysis method based on computer video, comprising the steps of:
s1, preprocessing: initializing the system, establishing the processing flow, and initializing the image feature detector;
s2, motion distribution analysis: calculating an image difference, obtaining a grayscale result image, and applying Gaussian filtering, thresholding, image dilation and image erosion;
s3, motion trend analysis: calculating the motion of feature points based on the LK optical-flow pyramid, then counting and analyzing the motion pattern.
Further, the motion profile analysis step S2 includes:
s21, calculating an image difference value: acquiring and storing an initial frame P0 of an input image sequence, and calculating a difference value between a current frame and a previous frame from a second frame P1 of the image sequence;
s22, difference image processing: graying the difference image, applying a threshold set between 20 and 45, Gaussian-filtering the grayed image, then performing an image dilation operation on the filtered difference image followed by an image erosion operation to obtain the result P-Mask;
s23, statistical analysis: counting the image distribution on the P-Mask, where white areas represent motion, performing heat statistics on the P-Mask with a window whose size is 1/25 of the picture, and sorting the windows by heat.
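The windowed heat statistics of step S23 can be sketched as follows. This is a minimal illustration, assuming the mask is a nested list of 0/1 pixels and a 5x5 grid of windows (each 1/25 of the picture area); the function name and data representation are illustrative, not part of the patent:

```python
def heat_statistics(mask, grid=5):
    """Count white (moving) pixels in each of grid*grid windows of a
    binary mask and return the windows sorted by heat, hottest first."""
    h, w = len(mask), len(mask[0])
    wh, ww = h // grid, w // grid  # window height/width = 1/5 of each side
    heats = []
    for gy in range(grid):
        for gx in range(grid):
            count = sum(
                mask[y][x]
                for y in range(gy * wh, (gy + 1) * wh)
                for x in range(gx * ww, (gx + 1) * ww)
            )
            heats.append(((gy, gx), count))
    heats.sort(key=lambda t: t[1], reverse=True)  # S23: sort by heat
    return heats
```

A real implementation would typically slide the window over an OpenCV mask image instead of nested lists, but the counting-and-sorting logic is the same.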
Further, the gaussian filter has a filter kernel size between 3 and 7.
Further, the motion trend analysis step S3 includes:
s31, feature point selection: overlapping the P1 image by using the P-Mask as a Mask map, and extracting corner points and edge features from the overlapped result; encoding the extracted features, and inputting the encoded features into an LK optical flow pyramid for initialization;
s32, feature point tracking: saving P1, acquiring the next frame image P2, performing LK optical-flow pyramid calculation on P2 to obtain a new feature point sequence, and subtracting the previous feature point sequence from it to obtain all motion vectors;
s33, pattern matching: and calculating all motion vectors, carrying out distribution statistics, and matching the most probable motion trend by taking different motion modes as templates.
Further, the edge feature is a Brisk feature.
Further, the pattern matching step S33 includes:
s331, motion vector calculation: the characteristic point tracking result is differenced with the original characteristic point set to obtain a motion vector set;
s332, motion vector thresholding: thresholding and screening the motion vector set, removing motion vectors whose magnitude is too small or too large, the accepted range being [5, 50];
s333, motion vector distribution statistics: normalizing the directions of the motion vectors with respect to the full plane coordinate system, dividing the plane into 8 direction ranges of 45 degrees each, and counting all vectors falling in each of the 8 ranges;
s334, result statistics and template matching: and counting the vector distribution, calculating the matching degree of the current motion mode and the prefabricated template, and outputting the motion mode with the highest matching degree.
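Steps S332 and S333 can be sketched together: vectors whose magnitude falls outside [5, 50] are discarded, and the survivors are binned into eight 45-degree direction ranges. The function name and the exact bin convention (bin 0 starting at 0 degrees) are illustrative assumptions:

```python
import math

def direction_histogram(vectors, lo=5.0, hi=50.0):
    """Screen motion vectors by magnitude (S332) and count the rest
    in eight 45-degree direction bins over the full plane (S333)."""
    hist = [0] * 8
    for dx, dy in vectors:
        mag = math.hypot(dx, dy)
        if not (lo <= mag <= hi):
            continue  # S332: magnitude too small or too large
        angle = math.degrees(math.atan2(dy, dx)) % 360.0
        hist[int(angle // 45) % 8] += 1  # S333: one of 8 ranges
    return hist
```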
According to a second aspect of the present invention, there is provided a motion trend analysis method based on computer video, comprising the steps of:
B1. establishing a processing flow and initializing an image feature detector;
B2. inputting an image sequence to be processed, preprocessing the input image sequence,
B3. taking out the next frame of image;
B4. judging whether the acquired image is the first frame; if so, outputting the image as it is and jumping to B11, otherwise continuing to B5;
B5. calculating the difference value between the current image and the previous frame image to obtain a difference image, and calculating the distribution condition of the motion area according to the difference image;
B6. judging whether the feature point set is empty, if so, acquiring the motion area feature point set, otherwise, continuing to perform the step B7;
B7. tracking a feature point set by using an LK optical flow method, calculating a motion vector, carrying out direction normalization on the motion vector, taking the whole plane coordinate system as a reference, taking 45 degrees as a direction range interval, dividing 8 direction ranges, and counting all vectors in the 8 direction range intervals;
B8. counting motion vectors, matching the counting result with a prefabricated template, and outputting a motion mode with highest matching degree;
B9. deleting invalid feature points, namely feature points with motion vectors of 0;
B10. outputting the motion mode as a result;
B11. judging whether the frame is the last frame, if not, jumping to B3, and if yes, executing B12;
B12. End.
According to a third aspect of the present invention, a computer video-based motion trend analysis system comprises:
the preprocessing module is used for initializing the system, establishing a processing flow and initializing the image feature detector;
the motion distribution analysis module is used for calculating an image difference, obtaining a grayscale result image, and carrying out thresholding, image dilation and image erosion;
and the motion trend analysis module is used for calculating the motion condition of the feature points by taking the LK optical flow pyramid as a basic algorithm, counting and analyzing the motion mode.
Further, the motion profile analysis module includes:
and a difference value calculation module: performing difference calculation on the front frame image and the rear frame image, and outputting a graying result image;
and a subsequent processing module: removing noise from the difference grayscale image through Gaussian filtering, removing pixels below the gray threshold, and then performing dilation and erosion operations on the image to obtain the maximum connected region.
Further, the movement trend analysis module includes:
the feature point selection module: selecting image feature points of a mask part based on the mask image generated by the motion analysis module;
and a feature point tracking module: performing LK optical flow method tracking on the selected image characteristic points;
and a pattern matching module: calculating the motion vector of the feature point, counting the distribution condition of the motion vector, and matching the preset motion mode.
The invention has the advantages that: by calculating the image difference and stabilizing the result through thresholding and filtering, and by selecting feature points according to the motion distribution, the method minimizes the amount of computation and attends only to the image regions of greatest interest. Tracking the feature points with the LK optical-flow method and analyzing them statistically yields a more stable and accurate motion-vector description and therefore a better motion trend analysis result.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention. Also, like reference numerals are used to designate like parts throughout the figures. In the drawings:
FIG. 1 is a flow chart of a method for analyzing motion trend based on computer video according to an embodiment of the present invention;
FIG. 2 shows a flowchart of the pretreatment steps according to an embodiment of the present invention;
FIG. 3 shows a flow chart of the motion profile analysis steps according to an embodiment of the present invention;
FIG. 4 shows a flow chart of the steps of motion trend analysis according to an embodiment of the present invention;
FIG. 5 shows a flow chart of pattern matching steps according to an embodiment of the invention;
fig. 6 shows a block diagram of a motion trend analysis system based on computer video according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
By combining and optimizing basic algorithms, the method achieves a good result while consuming few resources.
As shown in fig. 1, the invention provides a motion trend analysis method based on computer video, which comprises the following steps:
s1, preprocessing: initializing the system, establishing the processing flow, and initializing the image feature detector;
s2, motion distribution analysis: calculating an image difference, obtaining a grayscale result image, and applying Gaussian filtering, thresholding, image dilation, image erosion and the like;
s3, motion trend analysis: calculating the motion of feature points based on the LK optical-flow pyramid, then counting and analyzing the motion pattern.
As shown in fig. 2, the motion profile analysis step S2 includes:
s21, image difference calculation: acquiring and storing the initial frame P0 of the input image sequence and, starting from the second frame P1, calculating the difference between the current frame and the previous frame, e.g. the difference P-Diff between P1 and P0;
s22, difference image processing: graying P-Diff, applying a threshold set between 20 and 45, Gaussian-filtering the grayed image with a kernel size between 3 and 7, then performing an image dilation operation on the filtered P-Diff image followed by an image erosion operation to obtain the result P-Mask;
s23, statistical analysis: counting the image distribution on the P-Mask, where white areas represent motion, performing heat statistics on the P-Mask with a window whose size is 1/25 of the picture, and sorting the windows by heat.
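Steps S21 and S22 can be sketched on tiny grayscale frames represented as nested lists; a real implementation would use OpenCV (cv2.absdiff, cv2.GaussianBlur, cv2.threshold, cv2.dilate, cv2.erode). The 3x3 structuring element and the threshold of 30 are illustrative choices within the 20 to 45 range named in the text:

```python
def abs_diff(prev, cur):
    """S21: per-pixel absolute difference between two frames."""
    return [[abs(a - b) for a, b in zip(pr, cr)] for pr, cr in zip(prev, cur)]

def threshold(img, t=30):
    """S22: keep only pixels at or above the gray threshold."""
    return [[1 if v >= t else 0 for v in row] for row in img]

def _morph(mask, keep):
    # Apply a 3x3 neighborhood rule, clamped at the image borders.
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            neigh = [mask[j][i]
                     for j in range(max(0, y - 1), min(h, y + 2))
                     for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = keep(neigh)
    return out

def dilate(mask):
    """S22: pixel becomes white if any neighbor is white."""
    return _morph(mask, lambda n: 1 if any(n) else 0)

def erode(mask):
    """S22: pixel stays white only if all neighbors are white."""
    return _morph(mask, lambda n: 1 if all(n) else 0)
```

Chaining `erode(dilate(threshold(abs_diff(P0, P1))))` yields a P-Mask-style binary map with small noise blobs closed into connected regions.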
As shown in fig. 3, the movement trend analysis step S3 includes:
s31, feature point selection: overlapping the P1 image by using the P-Mask as a Mask map, and extracting corner points and edge features, such as Brisk features, of the overlapped result; encoding the extracted features, and inputting the encoded features into an LK optical flow pyramid for initialization;
s32, feature point tracking: saving P1, acquiring the next frame image P2, performing LK optical-flow pyramid calculation on P2 to obtain a new feature point sequence, and subtracting the previous feature point sequence from it to obtain all motion vectors;
s33, pattern matching: and calculating all motion vectors and carrying out distribution statistics, and matching the most probable motion trend by taking different motion modes such as aggregation, dispersion, group equidirectional motion, individual motion and the like as templates.
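The vector-extraction part of step S32 reduces to subtracting the previous point sequence from the newly tracked one. This minimal sketch assumes the LK tracker (e.g. cv2.calcOpticalFlowPyrLK in OpenCV) preserves point order, which is how that API behaves for successfully tracked points:

```python
def motion_vectors(prev_pts, new_pts):
    """S32: one (dx, dy) motion vector per tracked feature point,
    obtained by subtracting old positions from new ones."""
    return [(nx - px, ny - py)
            for (px, py), (nx, ny) in zip(prev_pts, new_pts)]
```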
As shown in fig. 4, the pattern matching step S33 includes:
s331, motion vector calculation: the characteristic point tracking result is differenced with the original characteristic point set to obtain a motion vector set;
s332, motion vector thresholding: thresholding and screening the motion vector set, removing motion vectors whose magnitude is too small or too large, the accepted range being [5, 50];
s333, motion vector distribution statistics: normalizing the directions of the motion vectors with respect to the full plane coordinate system, dividing the plane into 8 direction ranges of 45 degrees each, and counting all vectors falling in each of the 8 ranges;
s334, result statistics and template matching: counting the vector distribution, calculating the degree of match between the current motion pattern and the prefabricated templates, which include patterns such as aggregation, dispersion, group same-direction motion and individual motion, and outputting the motion pattern with the highest degree of match.
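Step S334 can be sketched as scoring the observed 8-bin direction histogram against prefabricated template histograms. The template names follow the text, but the bin values and the cosine-similarity scoring are illustrative assumptions, not the patented matching method:

```python
import math

# Hypothetical template histograms: one dominant bin suggests group
# same-direction motion; a uniform spread suggests dispersion.
TEMPLATES = {
    "group_same_direction": [8, 1, 0, 0, 0, 0, 0, 1],
    "dispersion":           [1, 1, 1, 1, 1, 1, 1, 1],
}

def best_match(hist, templates=TEMPLATES):
    """Return the template name whose histogram is most similar
    (by cosine similarity) to the observed direction histogram."""
    def cos(a, b):
        num = sum(x * y for x, y in zip(a, b))
        den = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return num / den if den else 0.0
    return max(templates, key=lambda name: cos(hist, templates[name]))
```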
FIG. 5 shows a detailed flow chart of a motion trend analysis method according to an embodiment of the invention, comprising:
B1. establishing a processing flow and initializing an image feature detector;
B2. inputting an image sequence to be processed, preprocessing the input image sequence by a preprocessor (equivalent to a preprocessing module in the system),
B3. taking out the next frame of image;
B4. judging whether the acquired image is the first frame; if so, outputting the image as it is and jumping to B11, otherwise continuing to B5;
B5. calculating the difference value between the current image and the previous frame image to obtain a difference image, and calculating the distribution condition of the motion area according to the difference image;
B6. judging whether the feature point set is empty, if so, acquiring the motion area feature point set, otherwise, continuing to perform the step B7;
B7. tracking a feature point set by using an LK optical flow method, calculating a motion vector, carrying out direction normalization on the motion vector, taking the whole plane coordinate system as a reference, taking 45 degrees as a direction range interval, dividing 8 direction ranges, and counting all vectors in the 8 direction range intervals;
B8. counting the motion vectors and matching the counting result with the prefabricated templates, which include patterns such as aggregation, dispersion, group same-direction motion and individual motion, and outputting the motion pattern with the highest degree of match;
B9. deleting invalid feature points, namely feature points with motion vectors of 0;
B10. outputting the motion pattern as a result;
B11. judging whether the frame is the last frame, if not, jumping to B3, and if yes, executing B12;
B12. End.
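The B1 to B12 flow above can be condensed into a per-frame loop. All helper names here are placeholders standing in for the modules described elsewhere in the document, not functions defined by the patent:

```python
def analyse(frames, diff_fn, select_fn, track_fn, match_fn):
    """Schematic sketch of the B1-B12 flow with stubbed processing steps."""
    prev, points, modes = None, [], []
    for frame in frames:
        if prev is None:                     # B4: first frame passes through
            prev = frame
            continue
        mask = diff_fn(prev, frame)          # B5: motion distribution
        if not points:
            points = select_fn(frame, mask)  # B6: (re)select feature points
        else:
            points, vectors = track_fn(prev, frame, points)  # B7: LK tracking
            modes.append(match_fn(vectors))                  # B8/B10: matching
            points = [p for p, v in zip(points, vectors)
                      if v != (0, 0)]        # B9: drop static (invalid) points
        prev = frame
    return modes                             # B12: end after the last frame
```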
This method yields a fast and accurate description of the image motion area and motion trend, helping video production staff or automated equipment to control camera positions. Specifically, the motion area analysis reflects the regions of a scene where movement is most intense, making it easy to switch camera positions, while the motion trend analysis reflects a specific movement that is happening in the scene, making it easy to track and focus on it.
As shown in fig. 6, the present invention further discloses a motion trend analysis system 100 based on computer video, including:
the preprocessing module 101: the system is responsible for initializing a system, establishing a processing flow and initializing an image feature detector;
motion profile analysis module 102: responsible for calculating the image difference, obtaining a grayscale result image, and performing thresholding, image dilation, image erosion and similar processing;
motion trend analysis module 103: and the method is responsible for calculating the motion condition of the feature points by taking the LK optical flow pyramid as a basic algorithm, counting and analyzing the motion mode.
In the motion trend analysis system as described above, the motion profile analysis module 102 includes: a difference calculation module: performing difference calculation on two successive frame images and outputting a grayed result image; and a subsequent processing module: removing noise from the difference grayscale image through Gaussian filtering and removing pixels below the gray threshold (experiments show good results with the threshold set between 20 and 45), then performing dilation and erosion operations on the image to obtain the maximum connected region.
In the movement trend analysis system as described above, the movement trend analysis module 103 includes: the feature point selection module: selecting image feature points of a mask part based on the mask image generated by the motion analysis module; and a feature point tracking module: performing LK optical flow method tracking on the selected image characteristic points; and a pattern matching module: calculating the motion vector of the feature point, counting the distribution condition of the motion vector, and matching the preset motion mode.
The present invention is not limited to the above-mentioned embodiments, and any changes or substitutions that can be easily understood by those skilled in the art within the technical scope of the present invention are intended to be included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (6)
1. The motion trend analysis method based on the computer video is characterized by comprising the following steps of:
s1, preprocessing: initializing the system, establishing the processing flow, and initializing the image feature detector;
s2, motion distribution analysis: calculating an image difference, obtaining a grayscale result image, and applying Gaussian filtering, thresholding, image dilation and image erosion;
s3, motion trend analysis: calculating the motion of feature points with the LK optical-flow pyramid as the basic algorithm, then counting and analyzing the motion pattern;
the motion profile analysis step S2 includes:
s21, calculating an image difference value: acquiring and storing an initial frame P0 of an input image sequence, and calculating a difference value between a current frame and a previous frame from a second frame P1 of the image sequence;
s22, difference image processing: graying the difference image, applying a threshold set between 20 and 45, Gaussian-filtering the grayed image, then performing an image dilation operation on the filtered difference image followed by an image erosion operation to obtain the result P-Mask;
s23, statistical analysis: counting the image distribution on the P-Mask, where white areas represent motion, performing heat statistics on the P-Mask with a window whose size is 1/25 of the picture, and sorting the windows by heat;
the motion trend analysis step S3 includes:
s31, feature point selection: overlapping the P1 image by using the P-Mask as a Mask map, and extracting corner points and edge features from the overlapped result; encoding the extracted features, and inputting the encoded features into an LK optical flow pyramid for initialization;
s32, feature point tracking: saving P1, acquiring the next frame image P2, performing LK optical-flow pyramid calculation on P2 to obtain a new feature point sequence, and subtracting the previous feature point sequence from it to obtain all motion vectors;
s33, pattern matching: and calculating all motion vectors, carrying out distribution statistics, and matching the most probable motion trend by taking different motion modes as templates.
2. The method for analyzing motion trend based on computer video according to claim 1, wherein,
the gaussian filter has a filter kernel size between 3 and 7.
3. The method for analyzing motion trend based on computer video according to claim 1, wherein,
the edge feature is a Brisk feature.
4. The method for analyzing motion trend based on computer video according to claim 1, wherein,
the pattern matching step S33 includes:
s331, motion vector calculation: the characteristic point tracking result is differenced with the original characteristic point set to obtain a motion vector set;
s332, motion vector thresholding: thresholding and screening the motion vector set, removing motion vectors whose magnitude is too small or too large, the accepted range being [5, 50];
s333, motion vector distribution statistics: normalizing the directions of the motion vectors with respect to the full plane coordinate system, dividing the plane into 8 direction ranges of 45 degrees each, and counting all vectors falling in each of the 8 ranges;
s334, result statistics and template matching: and counting the vector distribution, calculating the matching degree of the current motion mode and the prefabricated template, and outputting the motion mode with the highest matching degree.
5. The motion trend analysis method based on the computer video is characterized by comprising the following steps of:
B1. establishing a processing flow and initializing an image feature detector;
B2. inputting an image sequence to be processed, preprocessing the input image sequence,
B3. taking out the next frame of image;
B4. judging whether the acquired image is the first frame; if so, outputting the image as it is and jumping to B11, otherwise continuing to B5;
B5. calculating the difference value between the current image and the previous frame image to obtain a difference image, and calculating the distribution condition of the motion area according to the difference image;
B6. judging whether the feature point set is empty, if so, acquiring the motion area feature point set, otherwise, continuing to perform the step B7;
B7. tracking a feature point set by using an LK optical flow method, calculating a motion vector, carrying out direction normalization on the motion vector, taking the whole plane coordinate system as a reference, taking 45 degrees as a direction range interval, dividing 8 direction ranges, and counting all vectors in the 8 direction range intervals;
B8. counting motion vectors, matching the counting result with a prefabricated template, and outputting a motion mode with highest matching degree;
B9. deleting invalid feature points, namely feature points with motion vectors of 0;
B10. outputting the motion mode as a result;
B11. judging whether the frame is the last frame, if not, jumping to B3, and if yes, executing B12;
B12. End.
6. A computer video-based motion trend analysis system, comprising:
the preprocessing module is used for initializing the system, establishing a processing flow and initializing the image feature detector;
the motion distribution analysis module is used for calculating an image difference, obtaining a grayscale result image, and carrying out thresholding, image dilation and image erosion;
the motion trend analysis module is used for calculating the motion condition of the feature points by taking the LK optical flow pyramid as a basic algorithm, counting and analyzing the motion mode;
the motion profile analysis module comprises:
and a difference value calculation module: acquiring and storing an initial frame P0 of an input image sequence, and calculating a difference value between a current frame and a previous frame from a second frame P1 of the image sequence;
and a subsequent processing module: graying the difference image, applying a threshold set between 20 and 45, Gaussian-filtering the grayed image, then performing an image dilation operation on the filtered difference image followed by an image erosion operation to obtain the result P-Mask; counting the image distribution on the P-Mask, where white areas represent motion, performing heat statistics on the P-Mask with a window whose size is 1/25 of the picture, and sorting the windows by heat; the motion trend analysis module comprises:
the feature point selection module: overlapping the P1 image by using the P-Mask as a Mask map, and extracting corner points and edge features from the overlapped result; encoding the extracted features, and inputting the encoded features into an LK optical flow pyramid for initialization;
and a feature point tracking module: saving P1, acquiring the next frame image P2, performing LK optical-flow pyramid calculation on P2 to obtain a new feature point sequence, and subtracting the previous feature point sequence from it to obtain all motion vectors;
and a pattern matching module: and calculating all motion vectors, carrying out distribution statistics, and matching the most probable motion trend by taking different motion modes as templates.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911338801.8A CN111179301B (en) | 2019-12-23 | 2019-12-23 | Motion trend analysis method based on computer video |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111179301A CN111179301A (en) | 2020-05-19 |
CN111179301B true CN111179301B (en) | 2023-06-30 |
Family
ID=70657451
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911338801.8A Active CN111179301B (en) | 2019-12-23 | 2019-12-23 | Motion trend analysis method based on computer video |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111179301B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109697387B (en) * | 2017-10-23 | 2021-07-30 | 北京京东尚科信息技术有限公司 | Motion direction prediction method and device, electronic equipment and storage medium |
CN111833320A (en) * | 2020-07-06 | 2020-10-27 | 涵古观智能科技(苏州)有限公司 | Method, device and equipment for detecting running state of steel strip and storage medium |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103370937A (en) * | 2011-02-18 | 2013-10-23 | 西门子公司 | Coding method and image coding device for the compression of an image sequence |
CN103946732A (en) * | 2011-09-26 | 2014-07-23 | 微软公司 | Video display modification based on sensor input for a see-through near-to-eye display |
CN104331151A (en) * | 2014-10-11 | 2015-02-04 | 中国传媒大学 | Optical flow-based gesture motion direction recognition method |
CN104463191A (en) * | 2014-10-30 | 2015-03-25 | 华南理工大学 | Robot visual processing method based on attention mechanism |
CN104835115A (en) * | 2015-05-07 | 2015-08-12 | 中国科学院长春光学精密机械与物理研究所 | Imaging method for aerial camera, and system thereof |
CN106780565A (en) * | 2016-11-15 | 2017-05-31 | 天津大学 | A kind of many students based on light stream and k means clusters rise and sit detection method |
CN107292911A (en) * | 2017-05-23 | 2017-10-24 | 南京邮电大学 | A kind of multi-object tracking method merged based on multi-model with data correlation |
CN107871315A (en) * | 2017-10-09 | 2018-04-03 | 中国电子科技集团公司第二十八研究所 | A kind of video image motion detection method and device |
CN108416798A (en) * | 2018-03-05 | 2018-08-17 | 山东大学 | A kind of vehicle distances method of estimation based on light stream |
CN108537212A (en) * | 2018-07-04 | 2018-09-14 | 南京邮电大学 | Students ' behavior detection method based on estimation |
CN109433641A (en) * | 2018-09-30 | 2019-03-08 | 南通大学 | The filling omission intelligent detecting method of tablet capsule based on machine vision |
CN110009624A (en) * | 2019-04-11 | 2019-07-12 | 成都四方伟业软件股份有限公司 | Method for processing video frequency, video process apparatus and electronic equipment |
CN110517283A (en) * | 2019-07-18 | 2019-11-29 | 平安科技(深圳)有限公司 | Attitude Tracking method, apparatus and computer readable storage medium |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20130031574A (en) * | 2011-09-21 | 2013-03-29 | 삼성전자주식회사 | Image processing method and image processing apparatus |
Non-Patent Citations (1)
Title |
---|
Moving object detection and tracking algorithm based on the optical flow method; Xiao Jun et al.; Journal of Northeastern University (Natural Science); 2016-06-15; Vol. 37, No. 06; pp. 770-774 *
Also Published As
Publication number | Publication date |
---|---|
CN111179301A (en) | 2020-05-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110378264B (en) | Target tracking method and device | |
US11450146B2 (en) | Gesture recognition method, apparatus, and device | |
JP4766495B2 (en) | Object tracking device and object tracking method | |
US10327045B2 (en) | Image processing method, image processing device and monitoring system | |
Ma et al. | An α-matte boundary defocus model-based cascaded network for multi-focus image fusion | |
JP5213486B2 (en) | Object tracking device and object tracking method | |
CN109492577B (en) | Gesture recognition method and device and electronic equipment | |
Shyam et al. | Towards domain invariant single image dehazing | |
CN111160202B (en) | Identity verification method, device, equipment and storage medium based on AR equipment | |
CN111179301B (en) | Motion trend analysis method based on computer video | |
JP5578816B2 (en) | Image processing device | |
Appiah et al. | A single-chip FPGA implementation of real-time adaptive background model | |
CN112989910A (en) | Power target detection method and device, computer equipment and storage medium | |
JP4427052B2 (en) | Image processing apparatus and area tracking program | |
JP2021517281A (en) | Multi-gesture fine division method for smart home scenes | |
KR20190078890A (en) | Method and apparatus for estimating plane based on grids | |
Liu et al. | Scene background estimation based on temporal median filter with Gaussian filtering | |
CN110472608A (en) | Image recognition tracking processing method and system | |
US20200074612A1 (en) | Image analysis apparatus, image analysis method, and recording medium | |
CN108446653B (en) | Method and apparatus for processing face image | |
CN114387670A (en) | Gait recognition method and device based on space-time feature fusion and storage medium | |
CN112085025B (en) | Object segmentation method, device and equipment | |
CN114170090A (en) | Method and system for reconstructing high-resolution image from fuzzy monitoring video | |
CN113435248A (en) | Mask face recognition base enhancement method, device, equipment and readable storage medium | |
Hatimi et al. | New approach for detecting and tracking a moving object |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||