CN111476336B - Method, device and equipment for counting clothes - Google Patents

Method, device and equipment for counting clothes

Info

Publication number
CN111476336B
Authority
CN
China
Prior art keywords
image data
moving object
clothing
pixel
frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910063798.7A
Other languages
Chinese (zh)
Other versions
CN111476336A (en)
Inventor
赵永飞
龙一民
徐博文
吴剑
胡露露
张民英
神克乐
陈新
尹宁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Alibaba Group Holding Ltd
Original Assignee
Alibaba Group Holding Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Alibaba Group Holding Ltd filed Critical Alibaba Group Holding Ltd
Priority to CN201910063798.7A priority Critical patent/CN111476336B/en
Priority to TW108142548A priority patent/TW202030641A/en
Priority to PCT/CN2020/071926 priority patent/WO2020151530A1/en
Publication of CN111476336A publication Critical patent/CN111476336A/en
Application granted granted Critical
Publication of CN111476336B publication Critical patent/CN111476336B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06M: COUNTING MECHANISMS; COUNTING OF OBJECTS NOT OTHERWISE PROVIDED FOR
    • G06M7/00: Counting of objects carried by a conveyor
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing

Abstract

Embodiments of the invention provide a piece counting method, apparatus and device for garments. The method comprises the following steps: acquiring at least one frame of image data of a quality inspection operation performed on a garment; identifying a moving object in the image data and the working area in which the moving object is located; and performing a piece counting operation on the inspected garments according to the moving object and the working area. By acquiring the image data, identifying the moving object and its working area, and counting the inspected garments accordingly, the method reliably provides piece counting statistics for garments undergoing quality inspection, reduces the production management cost of garment counting, guarantees the efficiency and accuracy of the count, facilitates production management in factories and improves their management efficiency; meanwhile, users can obtain production process data at any time, follow the production progress of orders, and ultimately achieve efficient coordination between production and sales.

Description

Method, device and equipment for counting clothes
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method, an apparatus, and a device for counting items of clothing.
Background
In the production of finished garments, the factory workflow mainly comprises three processes: fabric cutting, assembly-line or whole-piece sewing, and finishing. At present, the piece counting operation is performed in the finishing stage, and the usual ways of implementing it are: traditional intrusive barcode scanners, radio frequency identification (RFID) technology, or assigning dedicated staff to enter the data manually.
However, counting garments with barcode scanners or RFID technology adds operation and training time for workers and greatly increases the factory's garment processing cost, while manual counting is inefficient, increases labor cost and reduces the factory's garment management efficiency.
Disclosure of Invention
Embodiments of the invention provide a piece counting method, apparatus and device for garments, which reduce the cost of counting garments, ensure the efficiency of the count, and thereby improve the factory's garment management efficiency.
In a first aspect, an embodiment of the present invention provides a method for counting items of clothing, including:
acquiring at least one frame of image data for performing quality inspection operation on the clothing;
Identifying a moving object in the image data and a working area where the moving object is located;
and performing piece counting operation on the clothing after quality inspection according to the moving object and the working area.
In a second aspect, an embodiment of the present invention provides a piece counting device for clothing, including:
the acquisition module is used for acquiring at least one frame of image data for quality inspection of the clothing;
the identification module is used for identifying the moving object in the image data and the working area where the moving object is located;
and the counting module is used for counting the clothing after quality inspection according to the moving object and the working area.
In a third aspect, an embodiment of the present invention provides an electronic device, including: a memory, a processor; wherein the memory is configured to store one or more computer instructions, wherein the one or more computer instructions, when executed by the processor, implement the method of counting items of clothing in the first aspect described above.
In a fourth aspect, an embodiment of the present invention provides a computer storage medium, configured to store a computer program, where the computer program makes a computer execute the piece counting method of the garment in the first aspect.
By acquiring at least one frame of image data of the quality inspection operation performed on a garment, identifying the moving object in the image data and the working area in which it is located, and performing the piece counting operation on the inspected garments according to the moving object and the working area, piece counting statistics for garments undergoing quality inspection are reliably obtained, the production management cost of garment counting is reduced, the efficiency and accuracy of the count are guaranteed, production management in factories is facilitated and their management efficiency is improved; meanwhile, users can obtain production process data at any time, follow the production progress of orders, and ultimately achieve efficient coordination between production and sales.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are some embodiments of the present invention, and other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a piece counting method of a garment provided by an embodiment of the invention;
FIG. 2 is a flowchart of identifying a moving object in the image data according to an embodiment of the present invention;
FIG. 3 is a flowchart of identifying a moving object in the image data of each frame according to at least one frame of the image data and the background model image according to an embodiment of the present invention;
FIG. 4 is a flowchart of determining a moving object in the image data according to the first pixel value and the second pixel value according to an embodiment of the present invention;
FIG. 5 is a flowchart of identifying a working area where a moving object is located in the image data according to an embodiment of the present invention;
FIG. 6 is a flowchart of establishing a statistical matrix for representing motion change frequency of the moving object according to an embodiment of the present invention;
FIG. 7 is a flowchart of determining a working area where the moving object is located according to the statistical matrix according to an embodiment of the present invention;
FIG. 8 is a flowchart of updating the statistical matrix according to an embodiment of the present invention;
FIG. 9 is a flowchart of a piece counting operation for the clothing after quality inspection according to the moving object and the working area according to an embodiment of the present invention;
FIG. 10 is a schematic structural view of a piece counting device for clothing according to an embodiment of the present invention;
Fig. 11 is a schematic structural diagram of an electronic device corresponding to the piece counting device of the garment provided in the embodiment shown in fig. 10.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments of the present invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise, the "plurality" generally includes at least two, but does not exclude the case of at least one.
It should be understood that the term "and/or" as used herein merely describes an association between the associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, the character "/" herein generally indicates an "or" relationship between the objects before and after it.
Depending on the context, the word "if" as used herein may be interpreted as "when", "upon", "in response to determining" or "in response to detecting". Similarly, the phrases "if it is determined" or "if (a stated condition or event) is detected" may be interpreted as "when it is determined", "in response to determining", "when (the stated condition or event) is detected" or "in response to detecting (the stated condition or event)", depending on the context.
It should also be noted that the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a product or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a product or system. Without further limitation, an element introduced by the phrase "comprising a(n) ..." does not exclude the presence of additional identical elements in the product or system that comprises it.
In addition, the sequence of steps in the method embodiments described below is only an example and is not strictly limited.
FIG. 1 is a flow chart of a piece counting method for a garment provided by an embodiment of the invention; referring to fig. 1, this embodiment provides a piece counting method for garments whose execution subject is a piece counting device. When the device executes this method, piece counting statistics for garments undergoing quality inspection can be obtained in the downstream stage of garment production and processing. Specifically, the piece counting method may include:
s101: at least one frame of image data is acquired for performing a quality inspection operation on the garment.
The image data in this embodiment may be acquired in real time, for example: when the quality inspection operation is carried out on the clothing, the image acquisition device can be installed at a preset position, the image acquisition device can be a camera, and at the moment, the image data for the quality inspection operation of the clothing can be obtained in real time through the image acquisition device. Alternatively, the image data in the embodiment may be acquired in non-real time, and at this time, the image data may be acquired by a preset image acquisition device and stored in a preset storage area, and the image data may be obtained by accessing the storage area; alternatively, the image data may be sent actively or passively by the image acquisition device, in which case the image data may be stored on a predetermined area of the image acquisition device. Of course, the person skilled in the art can also select other modes to acquire the image data of the quality inspection operation on the clothing according to specific design requirements and application scenes, so long as the accuracy and reliability of the image data acquisition can be ensured, and details are not repeated here.
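As a minimal sketch only, not part of the patent text, the real-time acquisition described above could be done with OpenCV roughly as follows; the library choice, the camera device index and the generator structure are all assumptions.

```python
import cv2

def capture_frames(device_index=0):
    """Yield frames from a camera in real time (illustrative sketch)."""
    cap = cv2.VideoCapture(device_index)  # assumed camera index
    if not cap.isOpened():
        raise RuntimeError("camera could not be opened")
    try:
        while True:
            ok, frame = cap.read()  # one frame of image data
            if not ok:
                break
            yield frame
    finally:
        cap.release()
```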
Optionally, after acquiring at least one frame of image data for quality inspection of the garment, in order to ensure quality and efficiency of processing the image data, the method in this embodiment may further include adjusting a resolution of the at least one frame of image data such that the resolution of the image data meets a preset criterion.
The preset standard is configured in advance, and a person skilled in the art can set different resolution standards according to specific application requirements; for example, the preset standard may be a resolution of 320×240 pixels, or 640×480 pixels, and so on. A resolution that meets the preset standard ensures accurate and reliable analysis and identification of the image data. After the image data is acquired, its resolution can be obtained first. When the resolution is greater than the preset standard, the image is a high-resolution image and processing it involves a large amount of computation; therefore, to guarantee the quality and efficiency of processing, the resolution can be reduced, which lowers the computational load and ensures real-time, reliable processing. When the resolution is smaller than the preset standard, the image is a low-resolution image, and its resolution can be increased to ensure accurate identification of the moving object and the working area in the image data.
For example, when the resolution of the acquired image data is 1280×720 pixels, which is greater than the preset standard, the image can be downsampled from 1280×720 to 320×240 pixels, reducing the computational load, ensuring real-time and reliable processing, and allowing an accurate result to be obtained. When the resolution of the acquired image data is 160×120 pixels, which is smaller than the preset standard, the image can be upsampled from 160×120 to 320×240 pixels, which effectively ensures accurate identification of the image data and an accurate result.
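As an illustrative sketch only, the resolution adjustment could be written as follows; the target size corresponds to the example preset standard above, and the interpolation choices are assumptions.

```python
import cv2

TARGET_SIZE = (320, 240)  # assumed preset standard (width, height) in pixels

def normalize_resolution(frame, target_size=TARGET_SIZE):
    """Down- or up-sample a frame so its resolution matches the preset standard."""
    h, w = frame.shape[:2]
    if (w, h) == target_size:
        return frame
    # INTER_AREA is commonly preferred for shrinking, INTER_LINEAR for enlarging
    interp = cv2.INTER_AREA if w > target_size[0] else cv2.INTER_LINEAR
    return cv2.resize(frame, target_size, interpolation=interp)
```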
Optionally, before identifying the moving object and the working area where the moving object is located in the image data, the method in the embodiment may further include: and filtering and denoising the at least one frame of image data.
When the image data is captured, the working environment of the moving object and other factors may introduce considerable noise into the acquired images. To improve the accuracy of subsequent processing, a Gaussian filter can be applied to the at least one frame of image data to remove the noise mixed into the images and obtain relatively clean image data.
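A minimal sketch of the Gaussian filtering step, assuming OpenCV's Gaussian blur as the filter; the kernel size and sigma are assumed values not specified in the text.

```python
import cv2

def denoise(frame, kernel_size=(5, 5), sigma=0):
    """Gaussian filtering to suppress noise before motion analysis.
    Kernel size and sigma are assumed values, not taken from the patent."""
    return cv2.GaussianBlur(frame, kernel_size, sigma)
```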
S102: a moving object in the image data is identified and a work area in which the moving object is located.
After the image data is acquired, the image data may be analyzed to identify the moving object and the working area in which the moving object is located in the image data. The moving object refers to a worker performing quality inspection operation on the clothing, and the working area where the moving object is located refers to an area where the worker performs quality inspection operation on the clothing.
The specific implementation process of identifying the moving object and the working area is not limited in this embodiment, and a person skilled in the art may select different implementation manners according to specific design requirements and application scenarios, for example: when identifying a moving object and a working area where the moving object is located in image data, contour information of all objects in the image data can be acquired first, and the contour information of all objects is analyzed and compared with preset standard contour information, wherein the standard contour information is pre-stored contour information corresponding to a worker, and it can be understood that the standard contour information can be one or more; determining an object corresponding to the profile information matched with the at least one standard profile information as a moving object; and then, acquiring time information of all areas of the moving object in the image data, and determining the area with the time information larger than or equal to a preset time threshold as a working area of the moving object. Of course, those skilled in the art may also identify the moving object and the working area in other manners, so long as accuracy of the moving object and the working area can be ensured, and details thereof will not be repeated herein.
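Purely as an illustration of the contour-matching alternative described above (not the statistical approach detailed later), a sketch using OpenCV contour functions might look as follows; the match threshold and function names are assumptions.

```python
import cv2

MATCH_THRESHOLD = 0.2  # assumed similarity limit for cv2.matchShapes

def find_worker_contour(binary_mask, standard_contours):
    """Compare every contour in the frame against pre-stored standard worker
    contours and return the best-matching contour, or None if nothing matches."""
    contours, _ = cv2.findContours(binary_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    best, best_score = None, MATCH_THRESHOLD
    for c in contours:
        for std in standard_contours:
            score = cv2.matchShapes(c, std, cv2.CONTOURS_MATCH_I1, 0.0)
            if score < best_score:  # lower score means more similar shapes
                best, best_score = c, score
    return best
```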
S103: and performing piece counting operation on the clothing subjected to quality inspection according to the moving object and the working area.
After the moving object and the working area are obtained, they can be analyzed, and whether to perform the piece counting operation on the garment is decided according to the analysis result. Specifically, referring to fig. 9, performing the piece counting operation on the inspected garments according to the moving object and the working area may include:
s1031: it is detected whether the moving object is located within the work area.
One possible implementation determines this by position: whether the moving object is located within the working area refers to whether its position lies inside the working area. In this case, the current position information of the moving object is acquired first and compared with the working area; if the current position lies inside the working area, the moving object is determined to be within it, and if not, the moving object is determined not to be within it.
Another possible implementation determines this by motion range: whether the moving object is located within the working area refers to whether the range of its motion stays inside the working area. In this case, the motion amplitude of the moving object is acquired first and compared with the working area; if the motion does not extend beyond the working area, the moving object is determined to be within it, and if the motion extends beyond the working area, the moving object is determined not to be within it.
Of course, a person skilled in the art can also detect whether the moving object is located in the working area by adopting other modes according to specific application scenarios and design requirements, so long as the accuracy and reliability of detection can be ensured, and details are not repeated here.
S1032: and if the moving object is not in the working area, performing counting operation on the clothing after quality inspection.
When the moving object places a garment that has finished quality inspection, the frequency of this placing action is significantly lower than the frequency of the repeated checking actions carried out while the garment is still being inspected. Based on this regularity, completion of the quality inspection can be recognized: when the moving object moves beyond the working area, the quality inspection of the current garment is regarded as finished, and the piece counting operation can then be performed on it, so that the number of garments that have undergone quality inspection is obtained.
It will be appreciated that the method in this embodiment may further include: if the moving object is in the working area, the counting operation is not executed.
Further, after the moving object is not in the working area, the method in this embodiment may further include:
s1033: and if the moving object is positioned in the preset first area, performing piece counting operation on qualified clothing after quality inspection.
The first preset area is used for placing the clothing with qualified quality inspection results, and can be located at a preset position of the working area, for example, the first area can be located on the left side or the right side of the working area; when the moving object is not in the working area and is positioned in the first area, the moving object is indicated to finish the quality inspection operation of the clothing, and the quality inspection result of the clothing is qualified, so that the moving object is executing the operation of placing the qualified clothing in the first area, and at the moment, the counting operation can be performed on the qualified clothing.
S1034: and if the moving object is positioned in the preset second area, performing counting operation on unqualified clothing after quality inspection.
The preset second area is used for placing the clothing with unqualified quality inspection results, and can be positioned at the preset position of the working area, and the setting positions of the second area and the first area are different; for example, when the first region is located on the left side of the working region, the second region may be located on the right side of the working region; when the first region is located on the right side of the working region, the second region may be located on the left side of the working region; when the moving object is not in the working area and is positioned in the second area, the moving object is indicated to finish the quality inspection operation of the clothing, and the quality inspection result of the clothing is unqualified, so that the moving object is executing the operation of placing the unqualified clothing in the second area, and at the moment, the counting operation can be performed on the unqualified clothing.
It should be noted that the counting operation in this embodiment may include three counts: (1) a count P of all garments that have undergone quality inspection; (2) a count P1 of qualified garments; (3) a count P2 of unqualified garments. It will be appreciated that, in general, P = P1 + P2.
When the detection result is that the moving object is located in the working area, the moving object is indicated to be performing quality inspection operation on the clothing at the moment, and therefore piece counting operation on the clothing is not performed.
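A minimal sketch of how the counting logic of steps S1031 to S1034 could be organized, assuming that the region-membership tests (inside the working area, inside the first area, inside the second area) are computed elsewhere from the identified moving object and working area; the class and attribute names are illustrative only.

```python
class PieceCounter:
    """A garment is counted once each time the moving object leaves the
    working area, and attributed to the qualified (P1) or unqualified (P2)
    pile depending on which preset region it then enters."""

    def __init__(self):
        self.P = 0    # garments that completed quality inspection
        self.P1 = 0   # garments judged qualified
        self.P2 = 0   # garments judged unqualified
        self._was_in_work_area = True

    def update(self, in_work_area, in_first_area, in_second_area):
        # count only on the transition: inside the working area -> outside it
        if self._was_in_work_area and not in_work_area:
            self.P += 1
            if in_first_area:
                self.P1 += 1   # placed in the qualified-garment area
            elif in_second_area:
                self.P2 += 1   # placed in the unqualified-garment area
        self._was_in_work_area = in_work_area
```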
Optionally, the method in this embodiment may further include:
s104: at least one frame of image data of the moving object for performing quality inspection operation on the clothing is stored.
When counting and counting the clothes in the quality inspection operation, corresponding image data can be stored for each piece of clothes, so that a user can conveniently check or call related records of related counting at any time.
It will be appreciated that the method in this embodiment may further include:
s105: at least one frame of image data of the moving object placed qualified garment is stored.
S106: at least one frame of image data of a moving object with disqualified clothing is stored.
Similarly, when the clothing subjected to quality inspection is placed, statistics can be performed on clothing of quality inspection results of different placed quality inspection results, and image data corresponding to each piece of clothing when the statistics is performed can be stored, so that a user can check or call related records of related pieces at any time.
According to the piece counting method for garments provided by this embodiment, at least one frame of image data of the quality inspection operation performed on a garment is acquired, the moving object in the image data and the working area in which it is located are identified, and the piece counting operation is performed on the inspected garments according to the moving object and the working area. This reliably provides piece counting statistics for garments undergoing quality inspection, reduces the production management cost of garment counting, guarantees the efficiency and accuracy of the count, facilitates production management in factories and improves their management efficiency; meanwhile, users can obtain production process data at any time, follow the production progress of orders, and ultimately achieve efficient coordination between production and sales.
FIG. 2 is a flowchart for identifying a moving object in image data according to an embodiment of the present invention; FIG. 3 is a flowchart of identifying a moving object in each frame of image data based on at least one frame of image data and a background model image according to an embodiment of the present invention; FIG. 4 is a flowchart of determining a moving object in image data according to a first pixel value and a second pixel value according to an embodiment of the present invention; as can be seen from the foregoing embodiments with continued reference to fig. 2 to 4, the specific implementation process of identifying the moving object in the image data in this embodiment is not limited, and those skilled in the art may set the implementation process according to specific design requirements, and preferably, the identifying the moving object in the image data in this embodiment may include:
S1021: a background model image is created based on the at least one frame of image data.
Wherein, a Gaussian mixture background modeling method can be adopted to build a background model image for the acquired at least one frame of image data, and the background model image comprises the working environment of the moving object. Specifically, a background model image may be created based on all of the image data; alternatively, at least one background model image may also be created based on at least one frame of image data, namely: a background model image may be created based on part of the image data in the at least one frame of image data, and other background model images may be created based on other image data in the at least one frame of image data, preferably a background model image may be created based on all of the image data.
In addition, the background model image has the same size as the image data. For example, given 1000 frames of image data, a corresponding background model image can be created from the 1000 frames, with the same size as the image data; alternatively, a first background model image and a second background model image can be created from the 1000 frames, where the first background model image has the same size as its corresponding image data and the second background model image has the same size as its corresponding image data.
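As an illustrative sketch only, under the assumption that OpenCV's mixture-of-Gaussians background subtractor (MOG2) is one acceptable realization of the Gaussian mixture background modeling mentioned above; the history length and variance threshold are assumed values.

```python
import cv2

# Assumed parameters; the patent does not specify history length or threshold.
bg_model = cv2.createBackgroundSubtractorMOG2(history=1000, varThreshold=16,
                                              detectShadows=False)

def update_background(frame):
    """Feed one frame into the mixture-of-Gaussians model and return the
    current background model image, which has the same size as the frame."""
    bg_model.apply(frame)                 # update the model with this frame
    return bg_model.getBackgroundImage()  # background model image
```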
S1022: a moving object in each frame of image data is identified based on at least one frame of image data and the background model image.
After the background model image is established, each frame of image data may be analytically compared with the background model image, thereby identifying moving objects in each frame of image data. Specifically, identifying the moving object in each frame of image data from at least one frame of image data and the background model image may include:
s10221: and acquiring a first pixel value of each pixel point in at least one frame of image data and a second pixel value of the same pixel point in the background model image.
Since the background model image is the same size as the image data, for each pixel in the image data, there is a corresponding pixel in the background model image. In order to identify the moving object, a first pixel value of each pixel in the image data may be obtained, and a second pixel value of the corresponding pixel in the background model image may be obtained, specifically, the pixel values may be obtained in a manner in the prior art, which is not described herein.
S10222: a moving object in the image data is determined from the first pixel value and the second pixel value.
After the first pixel value and the second pixel value are acquired, the first pixel value and the second pixel value may be subjected to analysis processing, so that a moving object in the image data is determined according to a result of the analysis processing. Specifically, determining a moving object in the image data from the first pixel value and the second pixel value may include:
s102221: and obtaining a difference value between the first pixel value and the second pixel value.
The difference value is a difference degree between the first pixel value and the second pixel value, and specifically, the difference value may be a difference value between the first pixel value and the second pixel value, or the difference value may also be a ratio of the first pixel value to the second pixel value; of course, those skilled in the art may also use other ways to embody the difference between the first pixel value and the second pixel value, which will not be described herein.
S102222: searching all pixel points with difference values larger than or equal to a preset pixel threshold value in the image data, wherein all pixel points form a moving object in the image data.
For each image data, the image data includes a dynamic area and a static area, wherein the dynamic area is an area where pixel points in the image data can change, that is: the dynamic region is composed of dynamic pixel points; the static area is an area where pixel points in the image data are not substantially changed, that is: the static area is composed of static pixel points. The above can show that the region where the moving object is located is a dynamic region in the image data, and further, when the difference value between the first pixel value and the second pixel value is greater than or equal to the pixel threshold value, it is indicated that the difference between the pixel point in the image data and the corresponding pixel point in the background model image is greater, and the pixel point can be determined to be a dynamic pixel point, so that all the dynamic pixel points in the image data can be obtained, and at this time, all the dynamic pixel points form the moving object in the image data.
In addition, the pixel threshold value in the embodiment is preset, and a person skilled in the art can determine a specific numerical range of the pixel threshold value according to specific design requirements and application scenarios, and it can be understood that different difference values can correspond to different pixel threshold values; example one: when the difference value is the difference value between the first pixel value and the second pixel value, the corresponding pixel threshold value may be TH1, and then the moving object in the image data may be determined according to the following formula:
Foreground(i, j) = 255, if |CurrentFrame(i, j) − Background(i, j)| ≥ TH1; Foreground(i, j) = 0, otherwise.
where CurrentFrame(i, j) is the first pixel value of a pixel in a given frame of image data, Background(i, j) is the second pixel value of the corresponding pixel in the background model image, and Foreground(i, j) marks the difference region of the image data relative to the background model image; this difference region is the moving object in the image data.
Example two: when the difference value is the ratio of the first pixel value to the second pixel value, the corresponding pixel threshold may be TH2, and then the moving object in the image data may be determined according to the following formula:
Foreground(i, j) = 255, if CurrentFrame(i, j) / Background(i, j) ≥ TH2; Foreground(i, j) = 0, otherwise.
where CurrentFrame(i, j) is the first pixel value of a pixel in a given frame of image data, Background(i, j) is the second pixel value of the corresponding pixel in the background model image, and Foreground(i, j) marks the difference region of the image data relative to the background model image; this difference region is the moving object in the image data.
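The two thresholding formulas above could be implemented for single-channel (grayscale) frames roughly as follows; the concrete values of TH1 and TH2 are assumptions, and NumPy is used for the per-pixel comparison.

```python
import numpy as np

TH1 = 30    # assumed absolute-difference pixel threshold
TH2 = 1.2   # assumed ratio pixel threshold

def foreground_by_difference(current, background, th1=TH1):
    """Example 1: mark a pixel as dynamic (255) when the absolute difference
    between the first and second pixel values reaches the threshold TH1."""
    diff = np.abs(current.astype(np.int32) - background.astype(np.int32))
    return np.where(diff >= th1, 255, 0).astype(np.uint8)

def foreground_by_ratio(current, background, th2=TH2):
    """Example 2: mark a pixel as dynamic (255) when the ratio of the first
    pixel value to the second pixel value reaches the threshold TH2."""
    ratio = current.astype(np.float32) / (background.astype(np.float32) + 1e-6)
    return np.where(ratio >= th2, 255, 0).astype(np.uint8)
```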
By the method, the moving objects in the image data are identified, and the accuracy and the reliability of the identification of the moving objects in each frame of the image data are effectively ensured, so that the accuracy of the counting operation of the clothing is ensured.
FIG. 5 is a flowchart of identifying a working area where a moving object is located in image data according to an embodiment of the present invention; FIG. 6 is a flowchart of establishing a statistical matrix for representing motion change frequencies of a moving object according to an embodiment of the present invention; FIG. 7 is a flowchart of determining a working area where a moving object is located according to a statistical matrix according to an embodiment of the present invention; based on the foregoing embodiments, as can be seen with continued reference to fig. 5 to 7, the specific implementation process of identifying the working area where the moving object is located in the image data in this embodiment is not limited, and those skilled in the art may set the working area according to specific design requirements, and preferably, the identifying the working area where the moving object is located in the image data in this embodiment may include:
s1023: and establishing a statistical matrix for reflecting the action change frequency of the moving object, wherein the size of the statistical matrix is the same as that of the image data.
Wherein, referring to fig. 6, establishing a statistical matrix for representing the motion change frequency of the moving object may include:
S10231: a statistics corresponding to each pixel point in at least one frame of image data is obtained.
S10232: a statistics matrix is established based on the statistics values.
Wherein, the statistics value can be determined based on the motion characteristic of each pixel point in each frame of image data, and because each image data comprises a dynamic area and a static area, the pixel points in the dynamic area can be dynamic pixel points, the pixel points in the static area can be static pixel points, and the motion characteristic of the pixel points is that the pixel points are dynamic pixel points or static pixel points; when the pixel point is a dynamic pixel point, a preset statistics value can be corresponding; when the pixel point is a static pixel point, another preset statistics value can be corresponding. After a plurality of statistics values are acquired, a statistics matrix may be established based on the statistics values, the established statistics matrix having the same size as the image data.
For example, suppose there are 500 frames of image data, each with a size of 320×240 pixels. An initial statistical matrix of size 320×240 can first be established, with every element set to 0, that is, the initial statistics value preset for each element is 0. If pixel A in the first frame is identified as a dynamic pixel, the element corresponding to pixel A in the initial statistical matrix is incremented by 1, so the statistics value corresponding to pixel A becomes 1; if pixel A in the second frame is identified as a static pixel, the corresponding element is left unchanged, and building on the first frame, the statistics value corresponding to pixel A remains 1. If pixel B in the first frame is identified as a dynamic pixel, the element corresponding to pixel B is incremented by 1, giving a statistics value of 1; if pixel B in the second frame is again a dynamic pixel, the statistics value corresponding to pixel B becomes 2. Specifically, the statistics values in the statistical matrix satisfy the following relation:
MotionGround(i, j) = MotionGround(i, j) + 1, if Foreground(i, j) = 255; MotionGround(i, j) unchanged, if Foreground(i, j) = 0.
where MotionGround(i, j) is the statistics value preset in the statistical matrix and Foreground(i, j) is the motion characteristic of each pixel in the image data: when Foreground(i, j) = 0 the pixel is a static pixel, and when Foreground(i, j) = 255 the pixel is a dynamic pixel.
After the pixel points in all the image data are analyzed and processed according to the relational expression, statistics values corresponding to the pixel points in all the image data can be obtained, and a statistics matrix can be established based on the statistics values.
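A minimal sketch of the accumulation rule above: the statistical matrix starts as a zero matrix of the image size and is incremented at every dynamic pixel of each frame; the array names are illustrative.

```python
import numpy as np

def accumulate_motion(motion_ground, foreground):
    """Add 1 to every element of the statistical matrix whose pixel is dynamic
    (Foreground == 255) in the current frame; static pixels stay unchanged."""
    motion_ground[foreground == 255] += 1
    return motion_ground

# usage sketch: the matrix starts as zeros with the same size as the image data
# motion_ground = np.zeros((240, 320), dtype=np.float32)
```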
S1024: and determining the working area where the moving object is located according to the statistical matrix.
After the statistical matrix is obtained, the working area may be determined by using the statistical matrix, and specifically, determining, according to the statistical matrix, the working area where the moving object is located may include:
s10241: and carrying out normalization processing on the statistical matrix to obtain a pixel gray value corresponding to each statistical value in the statistical matrix.
S10242: when the pixel gray value is greater than or equal to a preset gray threshold value, determining a pixel area corresponding to the pixel gray value as a working area where the moving object is located.
The gray threshold is a preset limit value, and the specific numerical range of the gray threshold is not limited in this embodiment, and a person skilled in the art may set any gray threshold according to specific design requirements, for example: the gray threshold may be 20, 30, 40, or the like. After normalization processing is performed on the statistical matrix, the statistical matrix can be displayed in an image mode, the gray value of each pixel point in the image is 0-255, when the gray value of the pixel is greater than or equal to the gray threshold value, the pixel area corresponding to the gray value of the pixel is indicated to have higher change frequency, and then the pixel area corresponding to the gray value of the pixel can be determined to be the working area where the moving object is located.
The working area of the moving object is identified through the statistical matrix: specifically, the matrix reveals which pixel regions of the image data change frequently and which barely change at all, and the range of the working area in which the moving object normally performs quality inspection can be estimated from this change frequency. The regions with a higher motion change frequency are identified as the working area of the quality inspection operation, which effectively ensures accurate and reliable identification of the working area and further improves the accuracy of the piece counting method.
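As an illustrative sketch, the normalization and gray-threshold step could be written as follows; the gray threshold is one of the example values mentioned above, and the function names are assumptions.

```python
import cv2
import numpy as np

GRAY_THRESHOLD = 30  # assumed preset gray threshold (examples given: 20, 30, 40)

def working_area_mask(motion_ground, gray_threshold=GRAY_THRESHOLD):
    """Normalize the statistical matrix to gray values 0-255 and keep the
    pixel regions whose motion change frequency meets the gray threshold."""
    gray = cv2.normalize(motion_ground, None, 0, 255,
                         cv2.NORM_MINMAX).astype(np.uint8)
    # pixels with gray value >= gray_threshold form the working area
    _, mask = cv2.threshold(gray, gray_threshold - 1, 255, cv2.THRESH_BINARY)
    return mask  # 255 inside the working area, 0 elsewhere
```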
In practice, the values of the statistical matrix cannot be allowed to grow without bound; once its elements grow too large, the accuracy of processing the image data can be affected. In addition, when the moving object performs quality inspection on garments, the working area in which it is located is not fixed and may change at any time as the moving object changes; therefore, to ensure accurate identification of the working area, the statistical matrix can be updated periodically. Specifically, the method in this embodiment may further include:
S201: and updating the statistical matrix.
Wherein, referring to fig. 8, updating the statistical matrix may include:
s2011: and acquiring a preset updating coefficient, wherein the updating coefficient is a positive number smaller than 1.
The update coefficient is preset, the specific numerical range of the update coefficient is not limited in this embodiment, and a person skilled in the art can set the update coefficient arbitrarily according to specific application requirements, so long as the update coefficient can be ensured to meet the requirements, namely: the update coefficient may be any value satisfying the above condition, for example: 0.1, 0.2, 0.3, 0.5, 0.8, etc., for convenience of explanation, the update coefficient of 0.5 will be described as an example.
S2012: and multiplying all statistics values included in the statistics matrix with the update coefficients respectively to obtain updated values.
S2013: and obtaining an updated statistical matrix based on the updated values.
After the update coefficient is obtained, the statistics values in the statistical matrix can be adjusted according to it so as to reduce their numerical values, i.e., MotionGround(A) = 0.5 × MotionGround(B), where MotionGround(A) is the statistics value after adjustment, MotionGround(B) is the statistics value before adjustment, and 0.5 is the update coefficient.
When the statistical matrix is updated, the statistical matrix can be updated according to a preset period or a preset fixed frame number, so that the update operation of the whole numerical value of the statistical matrix according to a preset update coefficient is realized, the working area can be identified by using the updated statistical matrix, the working area of a moving object is estimated in a self-adaptive mode, and the accuracy and the reliability of the determination of the working area are effectively ensured.
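A minimal sketch of the periodic update, assuming the update is applied every fixed number of frames; both the update coefficient and the period are assumed values.

```python
UPDATE_COEFFICIENT = 0.5      # assumed preset update coefficient (positive, < 1)
UPDATE_EVERY_N_FRAMES = 500   # assumed update period in frames

def maybe_decay(motion_ground, frame_index,
                coeff=UPDATE_COEFFICIENT, period=UPDATE_EVERY_N_FRAMES):
    """Every `period` frames, multiply every statistics value by the update
    coefficient so the matrix adapts when the working area shifts."""
    if frame_index > 0 and frame_index % period == 0:
        motion_ground *= coeff
    return motion_ground
```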
In a concrete application, a camera can be installed at a preset position in the factory to capture, in real time, image data of the moving object performing quality inspection on garments. After the image data is acquired, it can be filtered and denoised with a Gaussian filter; a Gaussian mixture background modeling method is then used to build a background model image of the working environment of the moving object, and the moving object in the image data is determined from this background model image. Next, the region with a higher motion change frequency is identified from the motion change frequency of the moving object; this region is the normal working area for garment quality inspection. Because the frequency of the action of placing a garment after inspection is significantly lower than the frequency of the repeated checking actions during inspection, the quality inspection state of the moving object can be judged from this regularity; once a garment has been inspected, the piece counting operation can be performed on it and the image data of the quality inspection operation can be stored.
The method provided by the application embodiment can effectively reduce the cost and the transformation difficulty of the digital transformation of the factory, has the characteristics of light deployment and strong replicability, can acquire the real-time progress of the processing of the factory clothes in real time under the condition of not changing the original working mode of workers, and synchronizes the real-time progress to producers, platform sides and consumers, thereby achieving efficient production and marketing coordination and being beneficial to accurately matching, optimizing and improving the working condition of the workers.
FIG. 10 is a schematic structural view of a piece counting device for clothing according to an embodiment of the present invention; referring to fig. 10, the present embodiment provides a piece counting device for clothing, where the piece counting device may perform the piece counting method for clothing, and specifically the piece counting device may include:
an acquisition module 11, configured to acquire at least one frame of image data for quality inspection of clothing;
an identification module 12 for identifying a moving object in the image data and a working area where the moving object is located;
and the counting module 13 is used for counting the clothes after quality inspection according to the moving object and the working area.
Wherein, when the recognition module 12 recognizes a moving object in the image data, the recognition module 12 may be configured to perform: establishing a background model image based on at least one frame of image data; a moving object in each frame of image data is identified based on at least one frame of image data and the background model image.
Alternatively, in the identifying module 12 identifying a moving object in each frame of image data based on at least one frame of image data and the background model image, the identifying module 12 may be configured to perform: acquiring a first pixel value of each pixel point in at least one frame of image data and a second pixel value of the same pixel point in a background model image; a moving object in the image data is determined from the first pixel value and the second pixel value.
Alternatively, when the recognition module 12 determines a moving object in the image data according to the first pixel value and the second pixel value, the recognition module 12 may be configured to perform: acquiring a difference value between the first pixel value and the second pixel value; searching all pixel points with difference values larger than or equal to a preset pixel threshold value in the image data, wherein all pixel points form a moving object in the image data.
Alternatively, when the recognition module 12 recognizes a working area where a moving object is located in the image data, the recognition module 12 may be configured to perform: establishing a statistical matrix for reflecting the motion change frequency of the moving object, wherein the size of the statistical matrix is the same as that of the image data; and determining the working area where the moving object is located according to the statistical matrix.
Wherein, when the recognition module 12 establishes a statistical matrix for representing the motion change frequency of the moving object, the recognition module 12 may be configured to perform: obtaining a statistics corresponding to each pixel point in at least one frame of image data; a statistics matrix is established based on the statistics values.
In addition, when the recognition module 12 determines the working area where the moving object is located according to the statistical matrix, the recognition module 12 may be configured to perform: normalizing the statistical matrix to obtain a pixel gray value corresponding to each statistical value in the statistical matrix; when the pixel gray value is greater than or equal to a preset gray threshold value, determining a pixel area corresponding to the pixel gray value as a working area where the moving object is located.
Optionally, the identification module 12 in the present embodiment is further configured to perform: and updating the statistical matrix.
Specifically, when the identification module 12 updates the statistical matrix, the identification module 12 may be configured to perform: acquiring a preset updating coefficient, wherein the updating coefficient is a positive number smaller than 1; multiplying all statistics values included in the statistics matrix with update coefficients respectively to obtain updated values; and obtaining an updated statistical matrix based on the updated values.
Optionally, when the counting module 13 performs counting operation on the clothing after quality inspection according to the moving object and the working area, the counting module 13 may be configured to perform: detecting whether a moving object is positioned in a working area; and if the moving object is not in the working area, performing counting operation on the clothing after quality inspection.
Optionally, after the moving object is not in the working area, the piece counting module 13 in the present embodiment may be further configured to perform: if the moving object is located in a preset first area, performing piece counting operation on qualified clothing after quality inspection; or if the moving object is positioned in the preset second area, performing counting operation on unqualified clothing after quality inspection.
Optionally, the acquiring module 11 in this embodiment is further configured to, after acquiring at least one frame of image data for quality inspection of clothing, adjust a resolution of the at least one frame of image data so that the resolution of the image data meets a preset criterion.
Optionally, the acquiring module 11 in this embodiment is further configured to perform filtering denoising processing on at least one frame of image data before identifying a moving object and a working area where the moving object is located in the image data.
Optionally, the piece counting module 13 in the present embodiment is further configured to perform: at least one frame of image data of the moving object for performing quality inspection operation on the clothing is stored.
The apparatus of fig. 10 may perform the method of the embodiment of fig. 1-9, and reference is made to the relevant description of the embodiment of fig. 1-9 for parts of this embodiment not described in detail. The implementation process and the technical effect of this technical solution are described in the embodiments shown in fig. 1 to 9, and are not described herein.
In one possible design, the piece counting device of the garment shown in fig. 10 may be implemented as an electronic device, which may be a mobile phone, a tablet computer, a server, or other devices. As shown in fig. 11, the electronic device may include: a processor 21 and a memory 22. Wherein the memory 22 is for storing a program for supporting the electronic device to perform the piece counting method of the garment provided in the embodiment shown in fig. 1-9 described above, the processor 21 is configured for executing the program stored in the memory 22.
The program comprises one or more computer instructions, wherein the one or more computer instructions, when executed by the processor 21, are capable of performing the steps of:
acquiring at least one frame of image data for performing quality inspection operation on the clothing;
identifying a moving object in the image data and a working area where the moving object is located;
and performing piece counting operation on the clothing subjected to quality inspection according to the moving object and the working area.
Optionally, the processor 21 is further configured to perform all or part of the steps in the embodiments shown in fig. 1-9.
The electronic device may further include a communication interface 23 in its structure for communicating with other devices or with a communication network.
In addition, an embodiment of the present invention provides a computer storage medium, which is used for storing computer software instructions for an electronic device, and includes a program for executing the method for counting items of clothing in the method embodiment shown in fig. 1 to fig. 9.
The apparatus embodiments described above are merely illustrative, wherein the elements illustrated as separate elements may or may not be physically separate, and the elements shown as elements may or may not be physical elements, may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art will understand and implement the present invention without undue burden.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by adding necessary general purpose hardware platforms, or may be implemented by a combination of hardware and software. Based on such understanding, the foregoing aspects, in essence and portions contributing to the art, may be embodied in the form of a computer program product, which may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or nonvolatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media include permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media (transmission media), such as modulated data signals and carrier waves.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit them. Although the invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (12)

1. A piece counting method for clothing, comprising:
acquiring at least one frame of image data for performing quality inspection operation on the clothing;
identifying a moving object in the image data and a working area where the moving object is located;
detecting whether the moving object is positioned in the working area;
if the moving object is not in the working area, performing counting operation on the clothing after quality inspection;
identifying a working area where a moving object is located in the image data comprises:
obtaining a statistical value corresponding to each pixel point in at least one frame of the image data;
establishing a statistical matrix based on the statistical value, wherein the size of the statistical matrix is the same as the size of the image data;
normalizing the statistical matrix to obtain a pixel gray value corresponding to each statistical value in the statistical matrix;
and when the pixel gray value is greater than or equal to a preset gray threshold value, determining a pixel area corresponding to the pixel gray value as a working area where the moving object is located.
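The following NumPy sketch is offered purely as an illustration of the working-area step recited in claim 1 and forms no part of the claims; the choice of statistic (how often each pixel was detected as moving) and the gray threshold of 128 are assumptions of the sketch.

```python
import numpy as np

def estimate_work_area(moving_masks, gray_threshold=128):
    """Sketch of the working-area step: accumulate a per-pixel statistic
    over frames, normalize the statistical matrix to gray values, and keep
    the pixels whose gray value reaches the preset gray threshold."""
    stats = np.zeros(moving_masks[0].shape, dtype=np.float64)  # same size as the image data
    for mask in moving_masks:          # one boolean mask of moving pixels per frame
        stats += mask                  # assumed statistic: how often each pixel was "moving"
    span = max(stats.max() - stats.min(), 1e-9)
    gray = 255.0 * (stats - stats.min()) / span   # normalized pixel gray values
    return gray >= gray_threshold      # True where the working area is taken to be
```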
2. The method of claim 1, wherein identifying a moving object in the image data comprises:
establishing a background model image based on at least one frame of the image data;
and identifying a moving object in the image data of each frame according to at least one frame of the image data and the background model image.
3. The method of claim 2, wherein identifying moving objects in the image data for each frame from at least one frame of the image data and the background model image comprises:
acquiring a first pixel value of each pixel point in at least one frame of the image data and a second pixel value of the same pixel point in the background model image;
and determining a moving object in the image data according to the first pixel value and the second pixel value.
4. A method according to claim 3, wherein determining a moving object in the image data from the first pixel value and the second pixel value comprises:
acquiring a difference value between the first pixel value and the second pixel value;
searching all pixel points with the difference value larger than or equal to a preset pixel threshold value in the image data, wherein all pixel points form a moving object in the image data.
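Purely as an illustrative sketch of claims 2-4 (NumPy assumed, not part of the claims): the background model image is built here as a per-pixel median of several frames, and each pixel of the current frame is compared with the corresponding background pixel; the median choice and the threshold value of 30 are assumptions.

```python
import numpy as np

def build_background(frames):
    """Sketch of claim 2: a background model image built from several frames,
    here as the per-pixel median (the median itself is an assumption)."""
    return np.median(np.stack([f.astype(np.float32) for f in frames]), axis=0)

def detect_moving_object(frame, background, pixel_threshold=30):
    """Sketch of claims 3-4: compare the first pixel value (current frame)
    with the second pixel value (background model) at each position and keep
    every pixel whose difference reaches the preset pixel threshold."""
    diff = np.abs(frame.astype(np.float32) - background)
    return diff >= pixel_threshold     # True pixels together form the moving object
```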
5. The method according to claim 1, wherein the method further comprises:
and updating the statistical matrix.
6. The method of claim 5, wherein updating the statistical matrix comprises:
acquiring a preset updating coefficient, wherein the updating coefficient is a positive number smaller than 1;
multiplying each statistical value included in the statistical matrix by the update coefficient to obtain updated values;
and obtaining an updated statistical matrix based on the updated values.
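A minimal sketch of the update of claims 5 and 6, assuming NumPy and offered for illustration only; the coefficient value of 0.95 is an assumed example.

```python
import numpy as np

def update_statistical_matrix(stats, update_coefficient=0.95):
    """Sketch of claims 5-6: multiply every statistical value included in the
    statistical matrix by a preset update coefficient (a positive number
    smaller than 1) to obtain the updated matrix."""
    assert 0.0 < update_coefficient < 1.0, "the coefficient must be a positive number smaller than 1"
    return stats * update_coefficient
```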
7. The method of claim 1, wherein after the moving object is not within the work area, the method further comprises:
if the moving object is located in a preset first area, performing a counting operation on clothing that is qualified after quality inspection; or
if the moving object is located in a preset second area, performing a counting operation on clothing that is unqualified after quality inspection.
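The following sketch illustrates one possible reading of claim 7 (NumPy assumed, illustrative only): after the object has left the working area, its overlap with a preset first area and a preset second area decides whether the qualified or the unqualified counter is incremented. The masks and the pixel-overlap rule are assumptions of the sketch.

```python
import numpy as np

def classify_drop_zone(moving_mask, first_area_mask, second_area_mask):
    """Sketch of claim 7: once the object is no longer in the working area,
    check whether it overlaps the preset first area (qualified clothing) or
    the preset second area (unqualified clothing)."""
    in_first = np.logical_and(moving_mask, first_area_mask).sum()
    in_second = np.logical_and(moving_mask, second_area_mask).sum()
    if in_first > in_second:
        return "qualified"      # increment the qualified-garment counter
    if in_second > in_first:
        return "unqualified"    # increment the unqualified-garment counter
    return "undetermined"       # neither area clearly contains the object
```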
8. The method according to any one of claims 1-7, wherein after acquiring at least one frame of image data for performing the quality inspection operation on the clothing, the method further comprises:
and adjusting the resolution of at least one frame of the image data so that the resolution of the image data meets a preset standard.
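As an illustration of claim 8 (OpenCV assumed, not part of the claim), the resolution adjustment could be as simple as resizing each frame to a preset standard; the 640x480 target is a hypothetical value.

```python
import cv2

def normalize_resolution(frame, target_size=(640, 480)):
    """Sketch of claim 8: resize a frame so its resolution meets a preset
    standard; the 640x480 target is only an assumed example value."""
    return cv2.resize(frame, target_size, interpolation=cv2.INTER_AREA)
```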
9. The method according to any one of claims 1-7, wherein prior to identifying the moving object in the image data and the work area in which the moving object is located, the method further comprises:
and filtering and denoising at least one frame of the image data.
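As an illustration of claim 9 (OpenCV assumed, not part of the claim), the filtering and denoising step could be a simple Gaussian blur; the 5x5 kernel is an assumed parameter.

```python
import cv2

def denoise(frame):
    """Sketch of claim 9: filter and denoise a frame before identification;
    the 5x5 Gaussian kernel is an assumed parameter."""
    return cv2.GaussianBlur(frame, (5, 5), 0)
```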
10. The method according to any one of claims 1-7, further comprising:
and storing at least one frame of image data of the moving object for performing quality inspection operation on the clothing.
11. A piece counting device for clothing, comprising:
the acquisition module is used for acquiring at least one frame of image data for quality inspection of the clothing;
the identification module is used for identifying the moving object in the image data and the working area where the moving object is located;
the counting module is used for detecting whether the moving object is positioned in the working area; if the moving object is not in the working area, performing counting operation on the clothing after quality inspection;
the identification module is used for obtaining a statistical value corresponding to each pixel point in at least one frame of the image data; establishing a statistical matrix based on the statistical value, wherein the size of the statistical matrix is the same as the size of the image data; normalizing the statistical matrix to obtain a pixel gray value corresponding to each statistical value in the statistical matrix; and when the pixel gray value is greater than or equal to a preset gray threshold value, determining a pixel area corresponding to the pixel gray value as the working area where the moving object is located.
12. An electronic device, comprising: a memory and a processor; wherein the memory is configured to store one or more computer instructions which, when executed by the processor, implement the piece counting method for clothing of any one of claims 1 to 10.
CN201910063798.7A 2019-01-23 2019-01-23 Method, device and equipment for counting clothes Active CN111476336B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201910063798.7A CN111476336B (en) 2019-01-23 2019-01-23 Method, device and equipment for counting clothes
TW108142548A TW202030641A (en) 2019-01-23 2019-11-22 Method, apparatus and device for counting clothing by number of pieces
PCT/CN2020/071926 WO2020151530A1 (en) 2019-01-23 2020-01-14 Method, apparatus and device for counting clothing by number of pieces

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910063798.7A CN111476336B (en) 2019-01-23 2019-01-23 Method, device and equipment for counting clothes

Publications (2)

Publication Number Publication Date
CN111476336A CN111476336A (en) 2020-07-31
CN111476336B true CN111476336B (en) 2023-06-20

Family

ID=71736310

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910063798.7A Active CN111476336B (en) 2019-01-23 2019-01-23 Method, device and equipment for counting clothes

Country Status (3)

Country Link
CN (1) CN111476336B (en)
TW (1) TW202030641A (en)
WO (1) WO2020151530A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114640753B (en) * 2022-04-01 2023-10-27 北京市疾病预防控制中心 Nematode pharyngeal pump movement frequency automatic identification method based on experimental video processing


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9662084B2 (en) * 2015-06-18 2017-05-30 Toshiba Medical Systems Corporation Method and apparatus for iteratively reconstructing tomographic images from electrocardiographic-gated projection data

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101325690A (en) * 2007-06-12 2008-12-17 上海正电科技发展有限公司 Method and system for detecting human flow analysis and crowd accumulation process of monitoring video flow
WO2013075295A1 (en) * 2011-11-23 2013-05-30 浙江晨鹰科技有限公司 Clothing identification method and system for low-resolution video
CN102609689A (en) * 2012-02-03 2012-07-25 江苏科海智能系统有限公司 Video driveway background modeling method based on multi-frame counting
CN102930279A (en) * 2012-09-29 2013-02-13 广西工学院 Image identification method for detecting product quantity
JP2014157524A (en) * 2013-02-18 2014-08-28 Nippon Telegr & Teleph Corp <Ntt> Image display system, server device, information processing terminal and control method
CN104616290A (en) * 2015-01-14 2015-05-13 合肥工业大学 Target detection algorithm in combination of statistical matrix model and adaptive threshold
CN104899557A (en) * 2015-05-25 2015-09-09 浙江工业大学 Intersection background image extraction method based on video
CN108453731A (en) * 2017-02-17 2018-08-28 发那科株式会社 Robot system
CN108345842A (en) * 2018-01-24 2018-07-31 成都鼎智汇科技有限公司 A kind of processing method based on big data

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
危自福; 毕笃彦; 张明; 何林远. Segmentation of multiple moving objects based on background reconstruction and level sets. Opto-Electronic Engineering, 2009, No. 7, full text. *
彭长生; 詹智财; 张松松; 程碧淳. A lane background modeling method based on multi-frame statistics. Computer Applications and Software, 2013, No. 5, full text. *
王修岩; 程婷婷. Research on target recognition for industrial robots based on monocular vision. Machinery Design & Manufacture, 2011, No. 4, full text. *

Also Published As

Publication number Publication date
WO2020151530A1 (en) 2020-07-30
TW202030641A (en) 2020-08-16
CN111476336A (en) 2020-07-31

Similar Documents

Publication Publication Date Title
CN107305774A (en) Speech detection method and device
CN108876813B (en) Image processing method, device and equipment for detecting object in video
CN105574891B (en) The method and system of moving target in detection image
US20190236336A1 (en) Facial recognition method, facial recognition system, and non-transitory recording medium
CN110135514B (en) Workpiece classification method, device, equipment and medium
CN109493367A (en) The method and apparatus that a kind of pair of target object is tracked
CN116309757B (en) Binocular stereo matching method based on machine vision
CN111476336B (en) Method, device and equipment for counting clothes
CN109978855A (en) A kind of method for detecting change of remote sensing image and device
CN110797046B (en) Method and device for establishing prediction model of voice quality MOS value
CN111476059A (en) Target detection method and device, computer equipment and storage medium
CN116091874B (en) Image verification method, training method, device, medium, equipment and program product
CN112420066A (en) Noise reduction method, noise reduction device, computer equipment and computer readable storage medium
CN111353577B (en) Multi-task-based cascade combination model optimization method and device and terminal equipment
CN112183224A (en) Model training method for image recognition, image recognition method and device
US10789477B2 (en) Method and apparatus for real-time detection of a scene
CN111353526A (en) Image matching method and device and related equipment
CN108230284B (en) Motion trail determination method and device
CN107784363B (en) Data processing method, device and system
CN114759904A (en) Data processing method, device, equipment, readable storage medium and program product
CN115830048A (en) Image edge detection method and device and related equipment
CN112037198B (en) Hot-rolled bar fixed support separation detection method, system, medium and terminal
US10977482B2 (en) Object attribution analyzing method and related object attribution analyzing device
CN113784411A (en) Link quality evaluation method, link switching method, device and storage medium
CN110111286A (en) The determination method and apparatus of image optimization mode

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant