CN114359265A - Screw counting method and system based on target tracking - Google Patents


Info

Publication number
CN114359265A
Authority
CN
China
Prior art keywords
screw
module
color
counting
sum
Prior art date
Legal status
Granted
Application number
CN202210205405.3A
Other languages
Chinese (zh)
Other versions
CN114359265B (en)
Inventor
滕君秀
Current Assignee
Guangdong Shunde Fuyide Intelligent Packaging Technology Co ltd
Original Assignee
Guangdong Shunde Fuyide Intelligent Packaging Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Shunde Fuyide Intelligent Packaging Technology Co ltd
Priority to CN202210205405.3A
Publication of CN114359265A
Application granted
Publication of CN114359265B
Legal status: Active


Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to a screw counting method and system based on target tracking. The method provides a Lab color-space algorithm that integrates an average color distance with a sub-region growing algorithm to realize rapid segmentation of the screw foreground from the background; the system implements the method. Compared with other methods in the prior art, the invention can quickly and accurately track and count rigid objects moving at variable speed, effectively improving the counting capability, counting accuracy and counting efficiency for such objects.

Description

Screw counting method and system based on target tracking
Technical Field
The invention relates to the field of machine vision, in particular to a screw counting method and system based on target tracking.
Background
With the rapid development of China's metallurgy, machinery, electronics, automobile and construction industries, the demand for and production of fasteners have grown steadily. Market data show that the Chinese fastener market grew from about 95.82 billion yuan in 2012 to about 146.55 billion yuan in 2020, a compound annual growth rate of roughly 5.45%. Large quantities of Chinese fastener products are exported worldwide, while fastener products from around the world continue to enter the Chinese market. Fasteners are among the Chinese products with the largest import and export volumes, and this international exchange is of practical and strategic significance for helping Chinese fastener enterprises go global and participate fully in international cooperation and competition. The demand for fastener counting has risen sharply in recent years, giving counting technology a broad application prospect.
The conventional counting methods for fasteners such as screws can be roughly divided into hardware counting and machine vision counting.
In one prior screw counting system, a torque sensor is used to count the screws driven by a screw gun; the application scenario is narrow and hard to generalize. In one prior screw counter, counting is realized by counting feedback electrical signals; the application conditions are demanding and the counting speed is slow. The prior "screw measurement equipment" counts screws by weighing them as a whole; its counting precision is difficult to raise to a high level, and it places high demands on the strength of the mechanical mechanism. Miao Jiaojiao et al., in "Research on screw counting methods based on image processing", count screws by edge detection and template matching, but owing to the complexity of the imaging environment the image-processing procedure still requires manual intervention and the efficiency is low. The prior "bar counting method based on machine vision" counts bars with a connected-domain method, which works well for stationary objects but cannot achieve a good counting effect for moving objects, especially objects moving at variable speed.
In view of the above technical problems, there is a need for a screw counting method that improves the counting efficiency for rigid objects in motion, especially in variable-speed motion, and that supports more application scenarios under less restrictive application conditions.
Disclosure of Invention
The invention provides a screw counting method and system based on target tracking, which are used for improving the screw counting efficiency.
The technical solution adopted by the invention to achieve this purpose is as follows: a screw counting method based on target tracking comprises the following steps (an overall code sketch is given after the step list):
Step S100, acquiring falling video image data of the screw;
Step S200, separating the foreground and the background of the screw based on color;
Step S300, determining the size and the center coordinates of the search box;
Step S400, realizing the fast iteration of the center coordinates;
In this step, the fast iteration of the center coordinates is realized by an adaptive movement algorithm for the search-box center coordinates, which comprises the following steps:
Step S410: performing a correlation comparison between the search box and the frame image line by line, and obtaining for each pixel a correlation result in the range [0, 1];
Step S420: mapping the correlation result from [0, 1] to [0, 255] to obtain a back-projection gray-scale map of the image, of size J x K, where J and K are natural numbers;
Step S430: according to x and y, solving line by line for the expected movements ΔXi and ΔYi of the search-box center [the formulas for ΔXi and ΔYi are given only as images and are not reproduced here], where n indexes the columns of the back-projection gray-scale map, i indexes its rows, and the formulas weight the gray values of the back-projection map at row i, column n;
Step S440: summing the ΔXi and ΔYi of each row to obtain Δsum_x and Δsum_y [formulas given as images];
Step S450: moving the search box to {x + Δsum_x, y + Δsum_y};
Step S460: because the screw undergoes a horizontal projectile motion, its displacement in the vertical direction changes more with time while its displacement in the horizontal direction changes less; therefore, judging whether the vertical displacement Δsum_y is smaller than a first adaptive threshold and whether Δsum_x is smaller than a second adaptive threshold [both thresholds are given as images]; if both conditions are satisfied, going to step S500; otherwise, taking {x + Δsum_x, y + Δsum_y} as the new search-box center coordinates and repeating steps S430 to S450;
Step S500, repeating steps S100 to S400 to realize accurate and rapid tracking of the screw;
Step S600, adding a judgment area and accurately counting the screws passing through the judgment area.
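For illustration, the overall flow of steps S100 to S600 can be organized as in the following Python sketch. The frame source (OpenCV VideoCapture) and the four callables segment, init_boxes, update_box and crosses_count_region are assumed stand-ins for steps S200, S300, S400 and S600; they are not part of the patent.

import cv2

def count_screws(video_source, segment, init_boxes, update_box, crosses_count_region):
    """Sketch of steps S100-S600: acquire frames, segment the screw foreground,
    track every screw with its own search box, and count each box once when it
    passes through the judgment area. The four callables are assumed stand-ins."""
    cap = cv2.VideoCapture(video_source)        # S100: falling-screw video stream
    tracks = []                                 # one active search box per screw
    total = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mask = segment(frame)                   # S200: Lab-based foreground/background split
        for box in init_boxes(mask, tracks):    # S300: a new search box for each new screw
            tracks.append({"box": box, "counted": False})
        for t in tracks:
            t["box"] = update_box(frame, t["box"])           # S400: adaptive center iteration
            if not t["counted"] and crosses_count_region(frame, t["box"]):
                total += 1                      # S600: each screw is counted exactly once
                t["counted"] = True
    cap.release()
    return total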
Further, in the above screw counting method based on target tracking: in step S100, a line-scan camera is adopted to collect the falling video image data of the screw and transmit it to an upper computer in real time, wherein the imaging area of the line-scan camera is parallel to the falling plane of the screw.
Further, in the above screw counting method based on target tracking: in step S200, the color-based separation of the screw foreground and background is realized by a Lab-based color algorithm, which includes:
Step S210: dividing the image into N x N sub-blocks, where N is a natural number;
Step S220: converting the color space of each sub-block from RGB to Lab to obtain the three mutually perpendicular color components L, a and b;
Step S230: calculating the color distance d1 between two adjacent sub-blocks u and v according to formulas that are given only as images and are not reproduced here; the quantities appearing in those formulas are: the mean values of L, a and b over all pixels of sub-blocks u and v respectively; the mean values of R, G and B of sub-block u; the mean values of R, G and B of sub-block v; and an operator taking the smaller of two values (a reference formula is noted after this list);
Step S240: judging whether d1 is smaller than a set threshold; if it is smaller, the two regions are merged; otherwise they are not merged;
Step S250: repeating steps S230 to S240 and applying the merge-or-not decision to all sub-blocks, finally segmenting the screw target.
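The color-distance formulas of steps S230 to S240 are published only as images. For orientation, a conventional CIELAB distance between the mean colors of two sub-blocks, the kind of quantity an "average color distance" is usually built on, can be written as (an illustrative assumption, not the patent's exact formula):

d_{Lab}(u, v) = \sqrt{(\bar{L}_u - \bar{L}_v)^2 + (\bar{a}_u - \bar{a}_v)^2 + (\bar{b}_u - \bar{b}_v)^2}

The description also lists the mean R, G and B values of both sub-blocks and a smaller-of-two operator, suggesting that d1 combines this Lab distance with an analogous RGB-space distance; the exact combination is not recoverable from the published images.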
Further, in the above screw counting method based on target tracking: in step S300, the size and center coordinates of the search box are determined as follows: the centroid of the segmented screw target is found and, with the centroid as the center, a rectangle is gradually enlarged until it covers the whole image area occupied by the screw target; this rectangle is the search box.
The invention also provides a screw counting system based on target tracking, which comprises:
the device comprises a camera device for collecting falling video image data of the screw and an upper computer for processing the falling video image data; the camera device transmits the collected falling video image data of the screw to the upper computer;
the host computer include:
a module to achieve color-based separation of the foreground and background of the screw;
a module for determining the size and the center coordinates of the search box;
a module for implementing fast iteration of the central coordinates;
a module for realizing accurate and rapid tracking of the screw;
and a module for counting screws.
Further, in the above screw counting system based on target tracking: the camera device comprises a line scanner;
the system further comprises a conveyor belt for conveying the screws, and the imaging area of the line scanner is parallel to the falling plane of the screws as they fall from the conveyor belt.
Further, in the above screw counting system based on target tracking: the module for realizing the color-based separation of the screw foreground and background performs the separation through a Lab-based color algorithm and comprises:
a module for dividing the image into N x N sub-blocks, N being a natural number;
a color-space conversion module, which converts the color space of each sub-block from RGB to Lab to obtain the three mutually perpendicular color components L, a and b;
a color-distance calculation module, which calculates the color distance d1 between two adjacent sub-blocks u and v according to formulas that are given only as images and are not reproduced here; and
a comparison and merging module, which judges whether d1 is smaller than a set threshold and, if so, merges the two regions.
Further, in the above screw counting system based on target tracking: the module for determining the size and center coordinates of the search box is connected to the module for realizing the color-based separation of the screw foreground and background, and determines the size and center coordinates of the search box from the segmentation result produced by that module.
Further, in the above screw counting system based on target tracking: the module for realizing fast iteration of the center coordinates is connected to the module for determining the size and center coordinates of the search box, and realizes the fast iteration of the center coordinates through an adaptive movement algorithm for the search-box center coordinates.
Compared with other methods in the prior art, the invention can quickly and accurately track and count rigid objects moving at variable speed. The invention provides a Lab color-space algorithm that integrates an average color distance with a sub-region growing algorithm to realize rapid segmentation of the screw foreground from the background. It further provides an adaptive movement algorithm for the search-box center coordinates, which takes a weighted average of the per-pixel similarity values of the image and sets adaptive thresholds that reflect the difference between the horizontal and vertical variable motions, so that the center coordinates are iterated quickly and accurately and the screw target is tracked quickly and accurately. The method effectively improves the ability to count rigid objects in variable-speed motion and improves both counting accuracy and counting efficiency.
The present invention will be described in detail below with reference to the accompanying drawings and specific embodiments.
Drawings
FIG. 1 is a flow chart of example 1 of the present invention.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described examples are only a part of the embodiments of the present application, but not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Embodiment 1: this embodiment is a screw counting system based on target tracking. The system includes a camera device for collecting falling video image data of the screws and an upper computer for processing the falling video image data; the camera device transmits the collected falling video image data of the screws to the upper computer. The camera device is a line scanner, and the imaging area of the line scanner is approximately parallel to the falling plane of the screws when they fall from the conveyor belt.
The upper computer includes:
a module to achieve color-based separation of the foreground and background of the screw;
a module for determining the size and the center coordinates of the search box;
a module for implementing fast iteration of the central coordinates;
a module for realizing accurate and rapid tracking of the screw;
and a module for counting screws.
The screw counting method based on target tracking, using the screw counting system of this embodiment, comprises the following steps:
Step S100: after being transported by the conveyor belt, the screws drop into the imaging area, and the line-scan camera collects the falling video image data of the screws and transmits it to the upper computer in real time. In this step, after the screws leave the conveyor belt, the falling process follows a flat (horizontally launched) parabolic trajectory and is a variable-speed motion. The imaging area of the line-scan camera is approximately parallel to the falling plane of the screws; the falling video image data is collected with an image size of m x n and transmitted to the upper computer.
In the upper computer, a counting algorithm realizes real-time counting; it comprises steps S200 to S600.
Step S200: the color-based separation of the screw foreground and background is realized through a Lab-based color algorithm; the specific process is as follows:
Step S210: the image is divided into N x N sub-blocks; the size N can be adjusted dynamically according to the size of the screw, and in this example N = 4.
Step S220: and converting the color space of each sub-block from RGB to Lab to obtain three mutually perpendicular color components of L, a and b.
Specifically, the conversion proceeds as follows: the RGB color space is first converted to the XYZ color space, and the XYZ image is then converted to a Lab image [the conversion matrices and formulas are given only as images and are not reproduced here]. Finally, L, a and b are averaged, i.e. summed over all pixel points (4 x 4) of the sub-block and divided by the pixel count, to obtain the mean L, a and b values of the sub-block (a conversion sketch follows).
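The conversion matrices above are published only as images. A minimal sketch of the per-sub-block mean-Lab computation, assuming the standard sRGB to XYZ to Lab conversion as implemented by OpenCV (the patent's exact matrices and white point may differ), is:

import cv2
import numpy as np

def block_mean_lab(image_bgr, n=4):
    """Split the image into n x n pixel sub-blocks and return the mean (L, a, b)
    of each block. Uses OpenCV's standard sRGB->Lab conversion as a stand-in for
    the conversion formulas given as images in the patent; for 8-bit input OpenCV
    scales L to [0, 255] and offsets a and b by 128."""
    lab = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2LAB).astype(np.float32)
    h, w = lab.shape[:2]
    h, w = h - h % n, w - w % n                  # crop so the image tiles into n x n blocks
    lab = lab[:h, :w]
    blocks = lab.reshape(h // n, n, w // n, n, 3)
    return blocks.mean(axis=(1, 3))              # one mean (L, a, b) triple per sub-block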
Step S230: the color distance d1 between two adjacent sub-blocks u and v is calculated, respectively. Wherein
Figure 639965DEST_PATH_IMAGE009
.
Figure 657600DEST_PATH_IMAGE010
Figure 948904DEST_PATH_IMAGE011
In the formula:
Figure 684779DEST_PATH_IMAGE012
respectively are the average values of L, a and b of all pixel points in the sub-blocks u and v;
Figure 86941DEST_PATH_IMAGE013
Figure 959082DEST_PATH_IMAGE014
Figure 155708DEST_PATH_IMAGE015
represents the average of R, G, B of sub-block u,
Figure 113300DEST_PATH_IMAGE016
Figure 319154DEST_PATH_IMAGE017
Figure 45801DEST_PATH_IMAGE018
represents the average of R, G, B of the sub-block v;
Figure 944487DEST_PATH_IMAGE019
represents: the smaller of the two;
step S240: judging whether d1 is smaller than a set threshold value; if the number of the areas is smaller than the preset number, the two areas are merged; otherwise, no merging is performed.
Step S250: and repeating the steps S230 to S240, and performing merging or non-merging operation on all the sub-blocks to finally obtain the foreground sub-block and the background sub-block of the screw target.
Step S260: and after the foreground and the background are separated, segmenting the screw target.
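Since the distance formulas are available only as images, the following sketch uses a plain Euclidean distance between block-mean Lab values and treats the largest merged region as background; both choices are assumptions made to illustrate the merge-and-grow procedure of steps S230 to S260. The input block_means is the array produced by the block_mean_lab sketch above.

import numpy as np

def segment_foreground(block_means, merge_threshold):
    """Steps S230-S260 sketch: merge adjacent sub-blocks whose mean-Lab distance is
    below merge_threshold, then mark every block outside the largest merged region
    as screw foreground. Distance metric and background rule are assumptions."""
    rows, cols = block_means.shape[:2]
    parent = list(range(rows * cols))            # union-find over sub-block labels

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]        # path compression
            i = parent[i]
        return i

    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):      # right and down neighbours only
                rr, cc = r + dr, c + dc
                if rr < rows and cc < cols:
                    d = np.linalg.norm(block_means[r, c] - block_means[rr, cc])
                    if d < merge_threshold:      # S240: merge similar adjacent blocks
                        parent[find(r * cols + c)] = find(rr * cols + cc)
    roots = np.array([find(i) for i in range(rows * cols)]).reshape(rows, cols)
    background = np.bincount(roots.ravel()).argmax()   # largest region taken as background (assumption)
    return roots != background                   # True on foreground (screw) sub-blocks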
Step S300: the size and center coordinates of the search box are determined from the segmentation result: the centroid of the segmented screw target is found and, with the centroid as the center, a rectangle is gradually enlarged until it covers the whole image area occupied by the screw target; this rectangle is the search box.
In this embodiment the centroid need not end up as the center of the search box; it merely serves as the seed from which the rectangle is grown. The center coordinates of the search box are determined as follows: take the centroid of the screw target as the starting point and grow the four sides of the rectangle outward by equal amounts; when a side passes beyond the boundary of the screw target it stops growing while the remaining sides continue, until every side lies outside the image area covered by the screw target. The resulting rectangle is the search box, and the intersection of its diagonals is the center of the search box (a sketch follows).
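A minimal sketch of step S300, assuming the segmented screw is given as a per-pixel boolean mask and that each side of the rectangle grows by one pixel per iteration (the growth step size is not stated in the patent):

import numpy as np

def init_search_box(mask):
    """Step S300 sketch: seed a rectangle at the centroid of the screw pixels and
    grow its four sides outward; a side stops once it lies beyond the screw.
    Returns the box (left, top, right, bottom) and its center (diagonal intersection)."""
    ys, xs = np.nonzero(mask)
    cy, cx = int(round(ys.mean())), int(round(xs.mean()))   # centroid of the screw target
    top = bottom = cy
    left = right = cx
    h, w = mask.shape
    while True:
        grew = False                             # one-pixel growth per side per iteration (assumed step)
        if top > 0 and top >= ys.min():          # screw pixels still at or above the top edge
            top -= 1
            grew = True
        if bottom < h - 1 and bottom <= ys.max():
            bottom += 1
            grew = True
        if left > 0 and left >= xs.min():
            left -= 1
            grew = True
        if right < w - 1 and right <= xs.max():
            right += 1
            grew = True
        if not grew:                             # every side is past the screw or at the image border
            break
    center = ((left + right) // 2, (top + bottom) // 2)
    return (left, top, right, bottom), center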
Step S400: the fast iteration of the center coordinates is realized through an adaptive movement algorithm for the search-box center coordinates, comprising the following steps:
Step S410: performing a correlation comparison, line by line, between the search box and the frame image (the next frame image after the one containing the search box), and obtaining for each pixel a correlation result in the range [0, 1].
Step S420: mapping the correlation result from [0, 1] to [0, 255] to obtain a back-projection gray-scale map of the image, of size J x K, where J and K are natural numbers.
Step S430: according to x and y, solving line by line for the expected movements ΔXi and ΔYi of the search-box center [the formulas are given only as images and are not reproduced here], where n indexes the columns of the back-projection gray-scale map, i indexes its rows, and the formulas weight the gray values of the back-projection map at row i, column n.
Step S440: summing the ΔXi and ΔYi of each row to obtain Δsum_x and Δsum_y [formulas given as images].
Step S450: moving the search box to {x + Δsum_x, y + Δsum_y}.
Step S460: because the screw undergoes a horizontal projectile motion, its displacement in the vertical direction changes more with time while its displacement in the horizontal direction changes less; therefore, it is judged whether the vertical displacement Δsum_y is smaller than a first adaptive threshold and whether Δsum_x is smaller than a second adaptive threshold [both thresholds are given as images]. If both conditions are satisfied, the process goes to step S500; otherwise {x + Δsum_x, y + Δsum_y} is taken as the new search-box center coordinates and steps S430 to S450 are repeated (a code sketch of this center-update iteration follows).
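The ΔXi/ΔYi formulas and the adaptive thresholds of steps S430 to S460 are published only as images, so the sketch below stands in for step S400 with a mean-shift-style update: the back-projection values inside the search box serve as weights, their weighted centroid gives Δsum_x and Δsum_y, and iteration stops once both shifts fall below caller-supplied tolerances (with a larger vertical tolerance reflecting the projectile motion). The use of cv2.matchTemplate for the correlation map and the default tolerance values are assumptions, not the patent's formulas.

import cv2
import numpy as np

def update_search_box(frame_gray, template_gray, box, tol_x=1.0, tol_y=2.0, max_iter=20):
    """Step S400 sketch (mean-shift-style stand-in; exact patent formulas not published).
    box = (x, y, w, h) with (x, y) the top-left corner of the search box and
    template_gray the image patch inside the previous search box."""
    # S410-S420: per-position correlation mapped onto a [0, 255] back-projection map
    corr = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    backproj = cv2.normalize(corr, None, 0, 255, cv2.NORM_MINMAX)
    x, y, w, h = box
    for _ in range(max_iter):
        x0, y0 = max(0, int(round(x))), max(0, int(round(y)))
        win = backproj[y0:y0 + int(h), x0:x0 + int(w)]
        if win.size == 0 or win.sum() == 0:
            break                                    # box left the map; give up on this frame
        rows = np.arange(win.shape[0])[:, None]
        cols = np.arange(win.shape[1])[None, :]
        # S430-S440: gray values weight the expected movement of the box center
        dsum_x = (win * cols).sum() / win.sum() - (win.shape[1] - 1) / 2.0
        dsum_y = (win * rows).sum() / win.sum() - (win.shape[0] - 1) / 2.0
        x, y = x + dsum_x, y + dsum_y                # S450: move the search box
        # S460: the vertical shift is allowed a larger residual than the horizontal one (assumption)
        if abs(dsum_x) < tol_x and abs(dsum_y) < tol_y:
            break
    return (x, y, w, h)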
Step S500: continuously inputting subsequent frame images, and repeating the process to realize accurate and rapid tracking of the screw; specifically, the next frame image is input, the updated search box obtained in the above process is used as a new search box, and steps S410 to S460 are repeated.
For each new frame image, whether a new screw has appeared is detected through steps S210 to S310; if new screws appear, an additional search box is assigned for their subsequent tracking.
Step S600: finally, a judgment area is added and the screws passing through it are accurately counted. In the very middle of the imaging area, i.e. with (m/2, n/2) as the center coordinate, a rectangular frame of size (a, b/10) is added, and a screw is counted when the center coordinate of its search box passes through this rectangular judgment frame (a counting sketch follows).
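A minimal counting sketch for step S600. The judgment rectangle is centred at (m/2, n/2); its size follows the (a, b/10) convention above, read here as the full image width and one tenth of the image height, which is an assumption since a and b are not defined in the published text.

def make_judgment_region(img_w, img_h):
    """Step S600 sketch: a horizontal strip centred in the imaging area.
    Reading (a, b/10) as (image width, image height / 10) is an assumption."""
    rw, rh = img_w, max(1, img_h // 10)
    x0, y0 = (img_w - rw) // 2, (img_h - rh) // 2
    return (x0, y0, x0 + rw, y0 + rh)

def count_screw_once(box_center, region, already_counted, total):
    """Increment the count the first time the center of a screw's search box
    lies inside the judgment region; later hits of the same screw are ignored."""
    cx, cy = box_center
    x0, y0, x1, y1 = region
    if not already_counted and x0 <= cx <= x1 and y0 <= cy <= y1:
        return True, total + 1
    return already_counted, total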

Claims (9)

1. A screw counting method based on target tracking, characterized in that the method comprises the following steps:
step S100, acquiring falling video image data of the screw;
step S200, separating the foreground and the background of the screw based on color;
step S300, determining the size and the center coordinates of the search box;
step S400, realizing the fast iteration of the center coordinates;
in which step the fast iteration of the center coordinates is realized by an adaptive movement algorithm for the search-box center coordinates, comprising the following steps:
step S410: performing a correlation comparison between the search box and the frame image line by line, and obtaining for each pixel a correlation result in the range [0, 1];
step S420: mapping the correlation result from [0, 1] to [0, 255] to obtain a back-projection gray-scale map of the image, of size J x K, where J and K are natural numbers;
step S430: according to x and y, solving line by line for the expected movements ΔXi and ΔYi of the search-box center [the formulas for ΔXi and ΔYi are given only as images and are not reproduced here], where n indexes the columns of the back-projection gray-scale map, i indexes its rows, and the formulas weight the gray values of the back-projection map at row i, column n;
step S440: summing the ΔXi and ΔYi of each row to obtain Δsum_x and Δsum_y [formulas given as images];
step S450: moving the search box to {x + Δsum_x, y + Δsum_y};
step S460: because the screw undergoes a horizontal projectile motion, its displacement in the vertical direction changes more with time while its displacement in the horizontal direction changes less; therefore, judging whether the vertical displacement Δsum_y is smaller than a first adaptive threshold and whether Δsum_x is smaller than a second adaptive threshold [both thresholds are given as images]; if both conditions are satisfied, going to step S500; otherwise, taking {x + Δsum_x, y + Δsum_y} as the new search-box center coordinates and repeating steps S430 to S450;
step S500, repeating steps S100 to S400 to realize accurate and rapid tracking of the screw;
step S600, adding a judgment area and accurately counting the screws passing through the judgment area.
2. The target tracking-based screw counting method of claim 1, wherein: in step S100, a line-scan camera is adopted to collect the falling video image data of the screw and transmit it to an upper computer in real time, and the imaging area of the line-scan camera is parallel to the falling plane of the screw.
3. The target tracking-based screw counting method of claim 2, wherein in step S200 the color-based separation of the screw foreground and background is realized by a Lab-based color algorithm, which includes:
step S210: dividing the image into N x N sub-blocks, where N is a natural number;
step S220: converting the color space of each sub-block from RGB to Lab to obtain the three mutually perpendicular color components L, a and b;
step S230: calculating the color distance d1 between two adjacent sub-blocks u and v according to formulas that are given only as images and are not reproduced here; the quantities appearing in those formulas are: the mean values of L, a and b over all pixels of sub-blocks u and v respectively; the mean values of R, G and B of sub-block u; the mean values of R, G and B of sub-block v; and an operator taking the smaller of two values;
step S240: judging whether d1 is smaller than a set threshold; if it is smaller, merging the two regions; otherwise not merging;
step S250: repeating steps S230 to S240, applying the merge-or-not decision to all sub-blocks, and finally segmenting the screw target.
4. The target tracking-based screw counting method of claim 3, wherein in step S300 the size and center coordinates of the search box are determined as follows: finding the centroid of the segmented screw target and, with the centroid as the center, gradually enlarging a rectangle until it covers the whole image area occupied by the screw target, the rectangle being the search box.
5. A screw counting system based on target tracking, characterized in that it comprises:
the device comprises a camera device for collecting falling video image data of the screw and an upper computer for processing the falling video image data; the camera device transmits the collected falling video image data of the screw to the upper computer;
the upper computer includes:
a module to achieve color-based separation of the foreground and background of the screw;
a module for determining the size and the center coordinates of the search box;
a module for implementing fast iteration of the central coordinates;
a module for realizing accurate and rapid tracking of the screw;
and a module for counting screws.
6. The target tracking-based screw counting system of claim 5, wherein the camera device comprises a line scanner;
the system further comprises a conveyor belt for conveying the screws, and the imaging area of the line scanner is parallel to the falling plane of the screws as they fall from the conveyor belt.
7. The target tracking-based screw counting system of claim 6, wherein the module for realizing the color-based separation of the screw foreground and background performs the separation through a Lab-based color algorithm and comprises:
a module for dividing the image into N x N sub-blocks, N being a natural number;
a color-space conversion module, which converts the color space of each sub-block from RGB to Lab to obtain the three mutually perpendicular color components L, a and b;
a color-distance calculation module, which calculates the color distance d1 between two adjacent sub-blocks u and v according to formulas that are given only as images and are not reproduced here; and
a comparison and merging module, which judges whether d1 is smaller than a set threshold and, if so, merges the two regions.
8. The target tracking-based screw counting system of claim 7, wherein the module for determining the size and center coordinates of the search box is connected to the module for realizing the color-based separation of the screw foreground and background, and determines the size and center coordinates of the search box from the segmentation result produced by that module.
9. The target tracking-based screw counting system of claim 8, wherein the module for realizing fast iteration of the center coordinates is connected to the module for determining the size and center coordinates of the search box, and realizes the fast iteration of the center coordinates through an adaptive movement algorithm for the search-box center coordinates.
CN202210205405.3A 2022-03-04 2022-03-04 Screw counting method and system based on target tracking Active CN114359265B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210205405.3A CN114359265B (en) 2022-03-04 2022-03-04 Screw counting method and system based on target tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210205405.3A CN114359265B (en) 2022-03-04 2022-03-04 Screw counting method and system based on target tracking

Publications (2)

Publication Number Publication Date
CN114359265A true CN114359265A (en) 2022-04-15
CN114359265B CN114359265B (en) 2022-05-24

Family

ID=81094596

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210205405.3A Active CN114359265B (en) 2022-03-04 2022-03-04 Screw counting method and system based on target tracking

Country Status (1)

Country Link
CN (1) CN114359265B (en)


Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5434617A (en) * 1993-01-29 1995-07-18 Bell Communications Research, Inc. Automatic tracking camera control system
US6100925A (en) * 1996-11-27 2000-08-08 Princeton Video Image, Inc. Image insertion in video streams using a combination of physical sensors and pattern recognition
US6724915B1 (en) * 1998-03-13 2004-04-20 Siemens Corporate Research, Inc. Method for tracking a video object in a time-ordered sequence of image frames
US6392694B1 (en) * 1998-11-03 2002-05-21 Telcordia Technologies, Inc. Method and apparatus for an automatic camera selection system
US20010008561A1 (en) * 1999-08-10 2001-07-19 Paul George V. Real-time object tracking system
WO2002025576A1 (en) * 2000-09-20 2002-03-28 Daimlerchrysler Ag System for detecting a line of vision using image data
JP2008299834A (en) * 2007-05-02 2008-12-11 Nikon Corp Photographic subject tracking program and photographic subject tracking device
JP2011093076A (en) * 2009-11-02 2011-05-12 Honda Motor Co Ltd Method and apparatus for information processing, and program
CN102398244A (en) * 2010-09-13 2012-04-04 鸿富锦精密工业(深圳)有限公司 Screw counter
US20120120196A1 (en) * 2010-11-15 2012-05-17 Chi-Hung Tsai Image counting method and apparatus
CN103708062A (en) * 2013-12-11 2014-04-09 大连运明自动化技术有限公司 Blanking counting device for high-speed intelligent screw packaging machine
US20170083762A1 (en) * 2015-06-22 2017-03-23 Photomyne Ltd. System and Method for Detecting Objects in an Image
CN105100727A (en) * 2015-08-14 2015-11-25 河海大学 Real-time tracking method for specified object in fixed position monitoring image
US20190004597A1 (en) * 2017-06-28 2019-01-03 Shandong University Multi-human tracking system and method with single kinect for supporting mobile virtual reality application
US20190347767A1 (en) * 2018-05-11 2019-11-14 Boe Technology Group Co., Ltd. Image processing method and device
US20190378283A1 (en) * 2018-06-09 2019-12-12 Lot Spot Inc. System and method for transforming video data into directional object count
CN109816692A (en) * 2019-01-11 2019-05-28 南京理工大学 A kind of motion target tracking method based on Camshift algorithm
CN110009023A (en) * 2019-03-26 2019-07-12 杭州电子科技大学上虞科学与工程研究院有限公司 Wagon flow statistical method in wisdom traffic
CN210042031U (en) * 2019-06-12 2020-02-07 南京四维音符科技有限公司 Vehicle counting device capable of dynamically tracking number of people

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
史东承 et al., "Target tracking algorithm based on improved color histogram mapping", 《计算机仿真》 (Computer Simulation) *
周庆芳, "Research on moving target tracking technology in complex backgrounds", 《文理导航(下旬)》 *
唐自力 et al., "Intelligent tracking algorithm based on high-accuracy search of key frames of moving targets", 《光子学报》 (Acta Photonica Sinica) *
苗姣姣 (Miao Jiaojiao) et al., "Research on screw counting methods based on image processing", 《液晶与显示》 (Chinese Journal of Liquid Crystals and Displays) *
陈鑫 et al., "A new GrabCut color image segmentation algorithm based on improved SLICO", 《计算机应用研究》 (Application Research of Computers) *

Also Published As

Publication number Publication date
CN114359265B (en) 2022-05-24

Similar Documents

Publication Publication Date Title
CN106780620B (en) Table tennis motion trail identification, positioning and tracking system and method
CN113034548B (en) Multi-target tracking method and system suitable for embedded terminal
CN104282020B (en) A kind of vehicle speed detection method based on target trajectory
CN107804514B (en) Toothbrush sorting method based on image recognition
CN107392929B (en) Intelligent target detection and size measurement method based on human eye vision model
CN106296700B (en) A kind of steel cord conveyor belt connector twitch detection method
CN105957107A (en) Pedestrian detecting and tracking method and device
CN111709301B (en) Curling ball motion state estimation method
Li et al. An image recognition approach for coal and gangue used in pick-up robot
CN111353448A (en) Pedestrian multi-target tracking method based on relevance clustering and space-time constraint
CN107808391B (en) Video dynamic target extraction method based on feature selection and smooth representation clustering
CN110458785B (en) Magnetic levitation ball levitation gap detection method based on image sensing
CN111028263B (en) Moving object segmentation method and system based on optical flow color clustering
CN114359265B (en) Screw counting method and system based on target tracking
CN117576597B (en) Visual identification method and system based on unmanned aerial vehicle driving
CN114648557A (en) Multi-target cooperative tracking method based on high-altitude visual angle and ground visual angle
CN107563371B (en) Method for dynamically searching interesting region based on line laser light strip
CN112338898B (en) Image processing method and device of object sorting system and object sorting system
CN113643206A (en) Cow breathing condition detection method
CN110322467B (en) Algorithm for improving point cloud density of 3D contour sensor on calculated plate surface
CN116105813A (en) Slag volume flow measuring device and measuring method
CN107067411B (en) Mean-shift tracking method combined with dense features
CN113066085B (en) Method, device, equipment and medium for measuring speed of target real-time video
CN113763474B (en) Indoor monocular depth estimation method based on scene geometric constraint
CN112348853B (en) Particle filter tracking method based on infrared saliency feature fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant