WO2020199775A1 - Method and Device for Determining Shelf Status, and Storage Medium - Google Patents
- Publication number: WO2020199775A1 (application PCT/CN2020/075805)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- label
- shelf
- image
- feature
- information
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/46—Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
- G06V10/462—Salient features, e.g. scale invariant feature transforms [SIFT]
Definitions
- the present disclosure relates to the field of image processing technology, in particular to a method and device for determining shelf status, and a storage medium.
- the empty shelf calculation method in the related art usually uses a direct comparison between the shelf image and the shelf template image to obtain the empty position. However, this calculation method cannot accurately determine the specific empty position of the shelf and is prone to errors.
- one of the objectives of the embodiments of the present disclosure is to provide a shelf state determination method, device, and storage medium, which can calculate more accurate shelf changes.
- the first aspect of the embodiments of the present disclosure provides a method for determining shelf status, including:
- the method for determining shelf status further includes:
- the shelf change position information includes shelf out-of-stock position information.
- obtaining shelf partition information according to the label backplane information and label information includes:
- shelf partition information is obtained.
- performing label backplane detection on the shelf image includes:
- falsely detected false label backplane edge straight lines are removed from the straight line detection result, and broken label backplane edge straight lines are connected, to obtain the label backplane information of the shelf.
- performing label detection on the shelf image includes:
- a pending label area whose second correlation coefficient is greater than the preset coefficient threshold is determined to be a label.
- the first image feature is an image feature including at least one of a brightness feature, a color feature, a direction feature, and a gradient feature.
- the second image feature is a texture feature including at least one of an angular second moment feature, a contrast feature, an inverse difference moment feature, a correlation feature, and an entropy feature.
- detecting the goods change area of the shelf includes:
- the goods change area is detected.
- detecting and obtaining the goods change area according to the shelf image and the reference image includes:
- a device for determining shelf status including:
- the memory stores instructions that can be executed by the processor, and when the instructions are executed by the processor, they cause the processor to:
- when the instructions are executed by the processor, they further cause the processor to push the shelf change position information to a designated user.
- the shelf change position information includes shelf out-of-stock position information.
- the instructions when executed by the processor, also cause the processor to:
- shelf partition information is obtained.
- the instructions when executed by the processor, also cause the processor to:
- falsely detected false label backplane edge straight lines are removed from the straight line detection result, and broken label backplane edge straight lines are connected, to obtain the label backplane information of the shelf.
- the instructions when executed by the processor, also cause the processor to:
- a pending label area whose second correlation coefficient is greater than the preset coefficient threshold is determined to be a label.
- the first image feature is an image feature including at least one of a brightness feature, a color feature, a direction feature, and a gradient feature.
- the second image feature is a texture feature including at least one of an angular second moment feature, a contrast feature, an inverse difference moment feature, a correlation feature, and an entropy feature.
- the instructions when executed by the processor, also cause the processor to:
- the goods change area is detected.
- the instructions when executed by the processor, also cause the processor to:
- the third aspect of the embodiments of the present disclosure provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the shelf state determination method.
- FIG. 1 is a schematic flowchart of a method for determining a shelf state provided by an embodiment of the disclosure.
- FIG. 2A is a schematic diagram of a shelf image in an embodiment of the disclosure.
- FIG. 2B is a schematic diagram of a label template image in an embodiment of the disclosure.
- FIG. 2C is a schematic diagram of a reference image in an embodiment of the disclosure.
- FIG. 2D is a schematic diagram of detecting the edge straight lines of the label backplane in an embodiment of the disclosure.
- FIG. 3A is a schematic diagram of a process of detecting labels in an embodiment of the disclosure.
- FIG. 3B is a schematic diagram of a process of calculating a first correlation coefficient in an embodiment of the disclosure.
- FIG. 3C is a schematic diagram of a specific flow of the step of determining a label in an embodiment of the disclosure.
- FIG. 3D is a schematic flowchart of calculating shelf partition information in an embodiment of the disclosure.
- FIG. 3E is a schematic diagram of a process of detecting a goods change area in an embodiment of the disclosure.
- FIG. 4 is a schematic diagram of the functional modules of a device for determining a shelf state provided by an embodiment of the disclosure.
- FIG. 5 is a schematic diagram of the hardware structure of an apparatus for executing the method for determining shelf status provided by an embodiment of the disclosure.
- a method for determining shelf status is proposed, which can calculate more accurate shelf changes.
- the method for determining shelf status includes:
- Step 11 Obtain shelf images, refer to Figure 2A.
- the shelf image 200 may be captured in real time by a camera installed in front of the shelf, and the captured shelf image 200 may be obtained through a corresponding wired or wireless transmission method.
- preprocessing such as noise removal and image enhancement may also be performed first.
- the shelf image 200 may also be obtained in other ways, for example, from a remote server, generated in real time in a virtual reality (VR) environment, or copied through a portable storage device.
- the shelf in the shelf image 200 may include a side bracket 210, a label back plate 220, a label 230, and goods 240 of the shelf.
- reference numeral 250 indicates the position of part of the out-of-stock goods in the shelf image 200.
- Step 12 Perform label backplane detection and label detection on the shelf image 200 to obtain label backplane information and label information of the shelf.
- the label backplane 220 is a rectangular area between two shelf layers in which labels are placed, and the label backplane area 220 is distinguishable to a certain degree from the other areas of the shelf.
- the input of the label backplane detection is the shelf image 200, and the output is the information of the four edge line segments of the label backplane 220.
- the label 230 here refers to a label that records the goods information of each kind of goods 240.
- the label can usually be a price tag, in which case it may also include price information; a plain label usually records only information related to the goods and does not include the price information of the goods.
- performing label backplane detection on the shelf image 200 may include the following steps:
- the straight line detection may be implemented using the Canny algorithm, the Sobel operator, the Laplacian operator, the Hough transform, and other algorithms;
- a line diagram of the label backplane 220 as shown in FIG. 2D may be obtained.
- the line diagram may include, for example, the detected actual edge straight lines 260 of the label backplane 220, broken portions 270 where edge straight lines were missed, and/or erroneously detected non-edge straight lines 280.
- causes of the false detection and/or missed detection of the edge lines include (but are not limited to) lighting changes, occlusions, excessive image preprocessing, and so on.
- the method of removing falsely detected false label backplane edge lines may include (but is not limited to), for example, judging the line length: the length of a detected line, or its ratio to the width (or height) of the shelf image, is compared with a predetermined threshold value, and when it is less than the predetermined threshold value the line can be regarded as a falsely detected false label backplane edge straight line. This is because the label backplane usually traverses the entire shelf image, so the length of a label backplane edge line, or its ratio relative to the shelf image, should be sufficiently large.
- the label backplane is usually rectangular (or trapezoidal, as shown in Figure 2A, Figure 2C, and Figure 2D, due to image distortion), so an edge line that cannot form a rectangular (or trapezoidal) shape with the other edge lines can be regarded as a falsely detected false label backplane edge line.
- the method of judging whether a line is a false label backplane edge is not limited to the above-mentioned embodiments.
- when two edge straight lines lying substantially on one straight line are judged to have a gap shorter than a threshold length between them, the gap can be regarded as a missed detection of the edge straight line, and the two are automatically connected into one straight line.
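The filtering and connecting steps above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the function name, threshold values, and the representation of horizontal edge segments as (x1, y, x2, y) tuples are all assumptions.

```python
def filter_and_connect(segments, img_width, min_len_ratio=0.3, max_gap=20):
    """Connect collinear segments separated by small gaps (missed detections),
    then drop lines too short relative to the image width (false detections).

    segments: list of (x1, y, x2, y) horizontal segments (y constant per line).
    """
    # Group segments by their row: edge lines of one backplane share a y value
    by_row = {}
    for x1, y, x2, _ in segments:
        by_row.setdefault(y, []).append((min(x1, x2), max(x1, x2)))

    merged = []
    for y, spans in by_row.items():
        spans.sort()
        cur_a, cur_b = spans[0]
        for a, b in spans[1:]:
            if a - cur_b <= max_gap:          # gap small enough: missed detection
                cur_b = max(cur_b, b)         # connect into one straight line
            else:
                merged.append((cur_a, y, cur_b, y))
                cur_a, cur_b = a, b
        merged.append((cur_a, y, cur_b, y))

    # Remove lines whose length ratio to the image width is below the threshold
    return [(x1, y, x2, y2) for x1, y, x2, y2 in merged
            if (x2 - x1) / img_width >= min_len_ratio]

segs = [(0, 100, 400, 100), (415, 100, 900, 100),   # broken backplane edge
        (50, 300, 120, 300)]                         # short false detection
print(filter_and_connect(segs, img_width=1000))      # → [(0, 100, 900, 100)]
```

The broken edge on row 100 is rejoined, while the short spurious segment on row 300 fails the length-ratio test and is discarded.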
- performing label detection on the shelf image includes:
- Step 121 Obtain a label template image, as shown in FIG. 2B.
- the label template image may be collected in advance according to the pattern of the goods label actually used.
- FIG. 2B shows an example of a label image of a commodity in a supermarket, and the label template image can be obtained by replacing specific words with symbols.
- the label template images of different layers on the shelf can be extracted from top to bottom.
- Step 122 Extract the first image features of the label template image and the shelf image.
- the first image feature is an image feature including at least one of a brightness feature, a color feature, a direction feature, and a gradient feature.
- the method for extracting the brightness feature includes:
- if the label template image and the shelf image are color images, the grayscale average of the three color channels red, green, and blue (RGB) is calculated to generate a grayscale image, and the grayscale image is normalized, i.e. each pixel value of the image is divided by the maximum pixel value of the image, to obtain the brightness feature image;
- if the label template image and the shelf image are grayscale images, the grayscale images are normalized directly to obtain the brightness features.
- the method for extracting the color feature includes:
- if the label template image and the shelf image are color images, then for a pixel (x, y), where x is the row value and y is the column value, let r be the value of the red channel of the image at pixel (x, y), g the value of the green channel at pixel (x, y), and b the value of the blue channel at pixel (x, y); the following four-dimensional color features are extracted at pixel (x, y):
- the above operation is performed on all pixels in the image to generate the corresponding four color feature images, and the four color feature images are respectively normalized to obtain the color features.
- if the label template image and the shelf image are grayscale images, these color features are not extracted.
- the method for extracting the direction feature includes:
- the Gabor wavelet transform is used to extract features in four directions (0 degrees, 45 degrees, 90 degrees, and 135 degrees) from the label template image and the shelf image respectively, and normalization is performed to obtain the direction features.
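As an illustration of the direction-feature step, the sketch below filters an image with real-valued Gabor kernels at 0, 45, 90, and 135 degrees and normalizes each response map. It is a simplified stand-in for the Gabor wavelet transform described above, and the kernel parameters (size, sigma, wavelength, aspect ratio) are illustrative assumptions.

```python
import numpy as np

def gabor_kernel(theta, ksize=9, sigma=2.0, lambd=4.0, gamma=0.5):
    """Real part of a Gabor kernel oriented at angle theta (radians)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    return g * np.cos(2 * np.pi * xr / lambd)

def direction_features(img):
    """Filter img with Gabor kernels at 0, 45, 90, and 135 degrees and
    normalize each response map to [0, 1]."""
    feats = []
    for deg in (0, 45, 90, 135):
        k = gabor_kernel(np.deg2rad(deg))
        pad = k.shape[0] // 2
        p = np.pad(img, pad)                      # 'same'-size correlation
        resp = np.zeros_like(img, dtype=float)
        for i in range(img.shape[0]):
            for j in range(img.shape[1]):
                resp[i, j] = np.sum(p[i:i + k.shape[0], j:j + k.shape[1]] * k)
        resp = np.abs(resp)
        feats.append(resp / resp.max() if resp.max() > 0 else resp)
    return feats

img = np.zeros((12, 12)); img[:, ::2] = 1.0       # vertical stripe pattern
feats = direction_features(img)
print([f.shape for f in feats])
```

In practice the per-pixel loop would be replaced by a vectorized or FFT-based convolution; it is written out here only to keep the sketch self-contained.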
- the method for extracting the gradient feature includes:
- if the label template image and the shelf image are color images, they are first converted into grayscale images; the gradient magnitude feature of each grayscale image is then extracted and normalized to obtain the gradient feature;
- if the label template image and the shelf image are grayscale images, the gradient features of the grayscale images are extracted directly and normalized to obtain the gradient features.
- the gradient of the image function f(x, y) at the point (x, y) is a vector with magnitude and direction.
- let G_x and G_y denote the gradients of the image function f(x, y) in the x direction and the y direction; the gradient vector can be expressed as ∇f(x, y) = [G_x, G_y]^T, with magnitude |∇f(x, y)| = sqrt(G_x^2 + G_y^2).
- the direction of the gradient is the direction in which the function f(x, y) changes fastest.
- where there are edges in the image f(x, y), there must be large gradient values; conversely, in relatively smooth parts of the image the gray values change little and the corresponding gradients are also small.
- the magnitude of the gradient is often referred to simply as the gradient, and the image composed of image gradients is called the gradient image.
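A minimal sketch of the gradient-feature extraction described above, using central differences for G_x and G_y; the patent does not specify the derivative operator, so this choice (and the function name) is illustrative — Sobel or other operators would serve equally.

```python
import numpy as np

def gradient_feature(gray):
    """Gradient magnitude |∇f| = sqrt(Gx^2 + Gy^2), normalized to [0, 1]."""
    gy, gx = np.gradient(gray.astype(float))   # central differences per axis
    mag = np.sqrt(gx**2 + gy**2)
    return mag / mag.max() if mag.max() > 0 else mag

# A step edge produces large gradient values at the edge, zero in flat regions
img = np.zeros((5, 6))
img[:, 3:] = 1.0
g = gradient_feature(img)
print(g[0])   # → [0. 0. 1. 1. 0. 0.]
```

The normalized response peaks on the two columns straddling the step and vanishes elsewhere, matching the edge/smooth-region behavior described in the text.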
- a first image feature including the brightness, color, direction, and gradient features is not the only embodiment of the present disclosure; the features included in the first image feature can be adjusted as needed, for example by adding other features or deleting some of them.
- Step 123 Calculate the first correlation coefficient according to the first image feature of the label template image and the shelf image, and generate a label saliency map.
- calculating the first correlation coefficient according to the first image feature of the label template image and the shelf image includes:
- Step 1231 On the shelf image, move the label template image pixel by pixel to traverse the shelf image;
- Step 1232 Calculate the first correlation coefficient between the label template image and the area of the shelf image it covers at each position of the traversal; that is, one first correlation coefficient is calculated at each position.
- all the first correlation coefficients are combined with the positions on the shelf image to which the center point of the label template image corresponding to each first correlation coefficient has moved, to generate a label saliency map.
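Steps 1231-1232 amount to template matching with a normalized correlation coefficient. Below is a minimal numpy sketch; the function name is illustrative, and the map is indexed by the top-left corner of the template window (the center-point convention in the text is a fixed offset from this).

```python
import numpy as np

def label_saliency_map(shelf, template):
    """Slide the label template over the shelf image pixel by pixel and
    compute a normalized correlation coefficient at each position."""
    th, tw = template.shape
    H, W = shelf.shape
    t = template - template.mean()
    tn = np.sqrt((t**2).sum())
    out = np.zeros((H - th + 1, W - tw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            w = shelf[i:i + th, j:j + tw]
            wc = w - w.mean()
            d = tn * np.sqrt((wc**2).sum())
            out[i, j] = (t * wc).sum() / d if d > 0 else 0.0
    return out

rng = np.random.default_rng(0)
shelf = rng.random((20, 30))
template = shelf[5:10, 12:18].copy()   # plant the template in the image
sal = label_saliency_map(shelf, template)
i, j = np.unravel_index(sal.argmax(), sal.shape)
print(i, j, round(sal.max(), 3))       # → 5 12 1.0
```

Where the template exactly matches the covered region the coefficient reaches 1.0, so peaks of the saliency map mark candidate label positions.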
- Step 124 Perform adaptive threshold segmentation on the label saliency map to obtain a set of center points of the undetermined label area.
- Step 125 Determine the pending label area in the shelf image according to the central point set of the pending label area.
- the label saliency map is generated from the first correlation coefficients and the positions on the shelf image to which the center point of the label template image corresponding to each first correlation coefficient has moved; that is, the plane coordinates of each point of the label saliency map are the position on the shelf image of the center point of the label template image corresponding to that first correlation coefficient.
- the points of the label saliency map obtained by adaptive threshold segmentation form a set of discrete points, and the area of the size of the label template image centered at each of these points is a pending label area.
- performing adaptive threshold segmentation on the label saliency map to obtain the set of center points of the pending label areas includes:
- performing binary segmentation on the label saliency map using an adaptive threshold segmentation algorithm (such as OTSU, also known as the Otsu method or the maximum between-class variance method), where the foreground area is the set of center points of the pending label areas.
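A self-contained sketch of Otsu segmentation of the saliency map, implementing the maximum between-class variance criterion directly; the scaling of the map to 0-255 and the helper name are illustrative assumptions.

```python
import numpy as np

def otsu_threshold(saliency):
    """Otsu (maximum between-class variance) threshold on a saliency map
    scaled to 0..255; returns the threshold and the foreground mask,
    whose True pixels form the set of pending-label center points."""
    img = np.clip(saliency * 255, 0, 255).astype(np.uint8)
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                    # class-0 probability omega(k)
    mu = np.cumsum(p * np.arange(256))      # class-0 cumulative mean mu(k)
    mu_t = mu[-1]                           # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b = np.nan_to_num(sigma_b)        # between-class variance per k
    t = int(np.argmax(sigma_b))
    return t, img > t

# Bimodal toy map: low background plus a bright blob of label responses
sal = np.full((8, 8), 0.1)
sal[2:4, 3:6] = 0.9
t, fg = otsu_threshold(sal)
print(t, fg.sum())   # → 25 6  (foreground is exactly the 6 blob pixels)
```

The foreground mask separates the high-correlation blob from the background without any manually chosen threshold, which is the point of using an adaptive method here.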
- Step 126 Extract the second image feature of the pending label area and the label template image.
- the second image feature of the pending label areas may also be obtained by processing the entire shelf image (which contains the pending label areas) in advance, that is, extracting the second image feature from the shelf image at the beginning instead of after obtaining the pending label areas; extraction is not limited to the pending label areas only.
- both of these second image feature extraction methods can be applied to the present disclosure, and they are not specifically limited here.
- the second image feature is a texture feature including at least one of an angular second moment feature, a contrast feature, an inverse difference moment feature, a correlation feature, and an entropy feature.
- the method of extracting the texture feature includes:
- the gray level co-occurrence matrix is a common method for describing texture by studying the spatial correlation characteristics of gray levels. Since texture is formed by the repeated occurrence of a gray-level distribution in space, a certain gray-level relationship, i.e. the spatial correlation characteristic of gray levels in the image, exists between two pixels separated by a certain distance in image space.
- the gray-level histogram is the statistical result of single pixels on the image having a certain gray level, while the gray-level co-occurrence matrix is obtained by counting the occurrences of pairs of pixels that are a certain distance apart and have certain gray levels.
- the generation of the gray-level co-occurrence matrix is briefly introduced as follows:
- for a displacement (a, b), the probability of two pixel gray levels occurring together transforms the spatial coordinates (x, y) into a description of "gray pairs" (g1, g2), forming the gray-level co-occurrence matrix.
- different numerical combinations of the displacement (a, b) yield joint probability matrices for different situations; the value of (a, b) should be selected according to the period distribution of the texture. For finer textures, small displacement values such as (1, 0), (1, 1), or (2, 0) are selected.
- if the image is composed of blocks of pixels with similar gray values, the diagonal elements of the gray-level co-occurrence matrix will have relatively large values; if the gray values of the image pixels vary locally, the elements off the diagonal will have larger values.
- some scalars can be used to characterize the gray-level co-occurrence matrix; in the common feature definitions below, let G denote the gray-level co-occurrence matrix.
- the angular second moment (ASM) is the sum of the squares of the element values of the gray-level co-occurrence matrix, so it is also called energy; it reflects the uniformity of the image gray-level distribution and the coarseness of the texture.
- when the value distribution in G is relatively uniform (such as in an image with severe noise), the ASM value is small; conversely, if some of the values are large and the other values are small, that is, the elements of the co-occurrence matrix are concentrated, the ASM value is large.
- a large ASM value indicates a relatively uniform and regularly varying texture pattern.
- contrast reflects the sharpness of the image and the depth of the texture grooves. The deeper the texture grooves, the greater the contrast and the clearer the visual effect; conversely, the lower the contrast, the shallower the grooves and the more blurred the effect.
- the inverse difference moment reflects the homogeneity of the image texture and measures how much the image texture changes locally.
- it also reflects the clarity and regularity of the texture: a clear, regular texture that is easy to describe yields a large value, while a messy texture that is difficult to describe yields a small value. A large value indicates that the image texture lacks variation between different regions and is locally very uniform.
- correlation reflects the consistency of the image texture and measures the degree of similarity of the gray-level co-occurrence matrix elements in the row or column direction; the correlation value therefore reflects the local gray-level correlation in the image. The larger the value, the greater the correlation. When the matrix element values are uniformly equal, the correlation value is large; conversely, if the matrix element values differ greatly, the correlation value is small. For example, if there are horizontal textures in the image, the correlation of the horizontal-displacement matrix is greater than the correlation values of the other matrices.
- entropy is a measure of the amount of information in the image, and texture information belongs to the information of the image; it is a measure of randomness. When all elements of the co-occurrence matrix have the greatest randomness, i.e. all values in the matrix are almost equal and the elements are dispersed, the entropy is large. It represents the degree of non-uniformity or complexity of the texture in the image.
- a feature vector can be used to integrate the above features.
- the integrated feature vector can be regarded as a description of the image texture, which can be further used for classification, identification, retrieval, etc.
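The texture features above can be sketched with a small gray-level co-occurrence matrix implementation. The quantization to 8 levels, the displacement (1, 0), and the subset of descriptors shown are illustrative choices, not the patent's specification.

```python
import numpy as np

def glcm(gray, dx=1, dy=0, levels=8):
    """Gray-level co-occurrence matrix for displacement (dx, dy),
    normalized to a joint probability matrix."""
    g = np.minimum((gray.astype(float) / (gray.max() + 1e-9) * levels).astype(int),
                   levels - 1)
    H, W = g.shape
    M = np.zeros((levels, levels))
    for y in range(H - dy):
        for x in range(W - dx):
            M[g[y, x], g[y + dy, x + dx]] += 1
    return M / M.sum()

def texture_features(P):
    """Scalar GLCM descriptors used as the second image feature."""
    i, j = np.indices(P.shape)
    asm = (P**2).sum()                          # angular second moment / energy
    contrast = ((i - j) ** 2 * P).sum()         # contrast
    idm = (P / (1.0 + (i - j) ** 2)).sum()      # inverse difference moment
    ent = -(P[P > 0] * np.log(P[P > 0])).sum()  # entropy
    return {"ASM": asm, "contrast": contrast, "IDM": idm, "entropy": ent}

flat = np.ones((8, 8), dtype=int)               # perfectly uniform region
noisy = np.arange(64).reshape(8, 8) % 8         # rapidly varying gray levels
f_flat = texture_features(glcm(flat))
f_noisy = texture_features(glcm(noisy))
print(f_flat["ASM"] > f_noisy["ASM"], f_flat["entropy"] < f_noisy["entropy"])  # → True True
```

The uniform patch concentrates all co-occurrences in one matrix cell (high ASM, zero entropy), while the varying patch spreads them out, matching the qualitative behavior described for each descriptor.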
- Step 127 Calculate a second correlation coefficient according to the second image feature of the pending label area and the label template image.
- Step 128 Determine that the pending label area where the second correlation coefficient is greater than the preset coefficient threshold is a label.
- the determining that the undetermined label region where the second correlation coefficient is greater than the preset coefficient threshold is a label includes:
- Step 1281 Sort the pending label areas in descending order of their first correlation coefficients;
- Step 1282 According to the arrangement order, sequentially calculate a second correlation coefficient between each pending label area and the second image feature of the label template image;
- Step 1283 Determine a pending label area whose second correlation coefficient is greater than the preset coefficient threshold to be a label, and stop calculating second correlation coefficients as soon as a second correlation coefficient is less than the preset coefficient threshold. In this way, calculation time can be saved and calculation efficiency improved.
- the preset coefficient threshold can be set as required, such as 0.8, but it is not specifically limited here.
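Steps 1281-1283 can be sketched as a sort-then-early-stop loop. The data representation and the callback for the second coefficient are illustrative; the example shows that areas ranked after the first failure are never evaluated, which is where the time saving comes from.

```python
def confirm_labels(pending, second_corr, threshold=0.8):
    """Sort pending label areas by their first correlation coefficient in
    descending order, then compute the (more expensive) second correlation
    coefficient one by one, stopping at the first one below the threshold.

    pending: list of (area_id, first_coefficient) pairs.
    second_corr: callable returning the second coefficient for an area id.
    """
    labels = []
    for area_id, _ in sorted(pending, key=lambda p: p[1], reverse=True):
        c2 = second_corr(area_id)
        if c2 > threshold:
            labels.append(area_id)
        else:
            break          # early stop: skip the remaining calculations
    return labels

c2 = {"A": 0.95, "B": 0.9, "C": 0.4, "D": 0.85}
areas = [("C", 0.6), ("A", 0.99), ("D", 0.5), ("B", 0.8)]
print(confirm_labels(areas, c2.get))   # → ['A', 'B']
```

Note the trade-off the early stop implies: area "D" would have passed the threshold but is never examined because "C" fails first, mirroring the assumption that areas with lower first coefficients are unlikely to qualify.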
- in summary, the first correlation coefficient is calculated and the label saliency map is generated; the pending label areas are then obtained from the label saliency map; next, the second image features of the pending label areas and of the label template are extracted and their second correlation coefficients are calculated; finally, the pending label areas whose second correlation coefficient is greater than the preset coefficient threshold are determined to be labels.
- in this way, more accurate label positions in the shelf image can be obtained, so that the shelf image can be segmented according to the label positions, which facilitates subsequent image comparison and the calculation of a more accurate shelf vacancy rate.
- Step 13 Obtain shelf partition information according to the label backplane information and label information.
- obtaining shelf partition information according to the label backplane information and label information includes:
- Step 131 Obtain shelf layering information according to the label backplane information
- Step 132 According to the label information, combine the shelf layering information to obtain shelf partition information.
- the step of using the label template image and the shelf image to obtain the shelf partition information can be performed offline: the display of the shelf goods and the positions of label placement are usually unchanged for a period of time, so the shelf partition information can be used continuously for that period.
- Step 14 Detect the goods change area of the shelf.
- detecting the goods change area of the shelf includes:
- the goods change area is detected.
- the reference image 200' is an image of the shelf in a fully stocked state after the goods have been normally placed, and is used for comparison with the shelf image 200 collected in real time to obtain the goods change area, that is, the rough change of the goods.
- the step of obtaining the reference image 200' may be performed at the same time as the step of obtaining the shelf image 200 to improve processing efficiency.
- detecting the goods change area according to the shelf image 200 and the reference image 200' includes:
- Step 141 Extract third image features (such as color, texture, etc.) of the shelf image 200 and the reference image 200';
- Step 142 Perform image change detection based on the third image features, and determine the goods change area based on the detected image change areas.
- the image change detection may use feature-level change detection technology.
- feature-level change detection first uses a certain algorithm to extract feature information, such as edges, shapes, contours, and textures, from the original images, and then performs comprehensive analysis and change detection on this feature information. Because feature-level change detection performs correlation processing on the features and classifies them into meaningful combinations, it has higher credibility and accuracy in judging feature attributes. Depending on how the features are described, different methods can be used to compare the two sets of features.
- specifically: (1) when numerical features are used to describe the detection object, statistical pattern recognition can be used to judge the similarity of the two groups of features and determine the change information of the detection object; (2) when structural features are used to describe the detection object, structural pattern recognition can be used to judge the similarity of the two groups of features and determine the change information of the detection object.
- the image change detection can also be pixel-based change detection.
- the advantages of the pixel-based image change detection method are that it is simple and fast and obtains the change area easily, although the type and nature of the image change cannot be determined. Its specific algorithms include the difference method, the ratio method, the correlation coefficient method, the regression analysis method, and so on.
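A minimal sketch of the difference method mentioned above: absolute difference between the current shelf image and the reference image, thresholding into a binary change mask, and removal of tiny connected components as noise. The threshold and minimum-area values are illustrative assumptions.

```python
import numpy as np

def change_mask(shelf, reference, thresh=0.2, min_area=4):
    """Pixel-based change detection by the difference method, with
    small-component suppression via a 4-connectivity flood fill."""
    diff = np.abs(shelf.astype(float) - reference.astype(float))
    mask = diff > thresh
    seen = np.zeros_like(mask, dtype=bool)
    out = np.zeros_like(mask)
    H, W = mask.shape
    for sy in range(H):
        for sx in range(W):
            if mask[sy, sx] and not seen[sy, sx]:
                stack, comp = [(sy, sx)], []
                seen[sy, sx] = True
                while stack:
                    y, x = stack.pop()
                    comp.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < H and 0 <= nx < W and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                if len(comp) >= min_area:       # keep only sizeable regions
                    for y, x in comp:
                        out[y, x] = True
    return out

ref = np.ones((10, 10))          # fully stocked reference image
cur = ref.copy()
cur[2:5, 2:5] = 0.0              # goods removed: a 3x3 out-of-stock patch
cur[8, 8] = 0.0                  # single-pixel noise
mask = change_mask(cur, ref)
print(mask.sum())                # → 9  (the noise pixel is filtered out)
```

This illustrates the stated trade-off: the method is simple and quickly yields the change area, but the mask alone cannot say what kind of change occurred.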
- the detection of the goods change area of the shelf can also be performed by using deep learning to perform image recognition on the acquired shelf image.
- the areas of the shelf in which no goods are recognized are the out-of-stock areas, that is, the goods change areas.
- Step 15 Compare the cargo change area with the shelf partition information to obtain shelf change position information.
- the goods change area characterizes the positions on the shelf where changes occur, and the shelf partition information distinguishes the positions of the goods on the shelf based on the label backplane and the labels; when a change position matches a certain partition of the shelf, it means that the goods placed in that partition have changed, so that the information of the changed goods on the shelf, that is, the shelf change position information, can be accurately obtained.
- the method for determining shelf status further includes step 16: Pushing the shelf change position information to a designated user.
- the designated user may be any person preset to receive change information, such as a supermarket administrator, a replenisher, and so on.
- the shelf change position information includes shelf out-of-stock position information, which can directly reflect the out-of-stock status of the shelf, so that relevant personnel can be alerted to replenish goods as soon as possible.
- the shelf status determination method uses the label backplane and the label positions to partition the shelf, and then compares the shelf partition information with the goods change area to obtain the shelf change position information.
- the change position can thereby be calculated more accurately, and it is more convenient for the user to learn the specific position of the shelf change and confirm the goods that need to be adjusted at that position, which is more convenient to use.
- a shelf state determination device is also proposed, which can calculate more accurate shelf changes.
- although the shelf state determination device is shown in the form of functional modules in the embodiment of FIG. 4, the actual hardware structure is not limited to this; it can also adopt the processor-plus-memory hardware architecture shown in FIG. 5. In other words, the processor shown in FIG. 5 can execute the instructions stored in the memory to perform the functions of the modules shown in FIG. 4.
- the shelf state determination device includes:
- the obtaining module 21 is used to obtain shelf images
- the partition information calculation module 22 is used to perform label backplane detection and label detection on the shelf image to obtain the label backplane information and label information of the shelf, and to obtain the shelf partition information according to the label backplane information and label information;
- the change area detection module 23 is used to detect the goods change area of the shelf
- the change information calculation module 24 is used to compare the goods change area with the shelf partition information to obtain shelf change location information.
- the shelf state determination device may further include a pushing module 25 for pushing the shelf change position information to a designated user.
- the shelf change position information includes shelf out-of-stock position information.
- the shelf status determination device uses the label backplane and the label positions to partition the shelf, and then compares the shelf partition information with the goods change area to obtain shelf change position information. The change position can thus be calculated more accurately, and the user can more easily learn the specific position of the shelf change and confirm which goods at that position need to be adjusted, which is more convenient to use.
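The comparison step described above can be illustrated with a minimal sketch. The partition names, coordinates, and the overlap-ratio rule below are our own assumptions for illustration, not the patent's actual data structures or matching rule:

```python
# Hypothetical sketch: given shelf partitions (derived from label backplane
# rows and label positions) and a detected goods-change bounding box, report
# which partitions the change falls in, by rectangle overlap.

def overlap_area(a, b):
    """Intersection area of two (x1, y1, x2, y2) rectangles."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return w * h if w > 0 and h > 0 else 0

def locate_change(partitions, change_box, min_ratio=0.2):
    """Return ids of partitions whose overlap with the change box exceeds
    min_ratio of the partition's own area."""
    hits = []
    for pid, box in partitions.items():
        area = (box[2] - box[0]) * (box[3] - box[1])
        if area and overlap_area(box, change_box) / area >= min_ratio:
            hits.append(pid)
    return hits

# Layer 1 of the shelf has two partitions split by a label at x = 100.
partitions = {"layer1-slot1": (0, 0, 100, 50), "layer1-slot2": (100, 0, 200, 50)}
print(locate_change(partitions, (80, 10, 150, 40)))  # → ['layer1-slot2']
```

Because each partition carries an id derived from its layer and slot, the hit list directly gives the "shelf change position information" that can be pushed to a user.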
- the partition information calculation module 22 is configured to: obtain shelf layering information according to the label backplane information; and obtain shelf partition information according to the label information in combination with the shelf layering information.
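One way to picture this combination of layering and label information is the sketch below. The data layout (backplane edge heights delimiting layers, label x-positions splitting each layer into slots) is our assumption about a plausible representation, not the patent's specified one:

```python
# Illustrative construction of shelf partition rectangles: consecutive
# backplane edge heights delimit layers, and label x-positions within a
# layer delimit the partitions (slots) on that layer.

def build_partitions(layer_ys, labels_x_per_layer, shelf_width):
    """layer_ys: sorted y-coordinates of backplane edges, top to bottom;
    labels_x_per_layer: for each layer, sorted label x-positions."""
    partitions = {}
    for i, (y1, y2) in enumerate(zip(layer_ys, layer_ys[1:])):
        xs = [0] + labels_x_per_layer[i] + [shelf_width]
        for j, (x1, x2) in enumerate(zip(xs, xs[1:])):
            partitions[f"layer{i + 1}-slot{j + 1}"] = (x1, y1, x2, y2)
    return partitions

# Two layers: one label on layer 1, two labels on layer 2.
print(build_partitions([0, 50, 100], [[100], [60, 140]], 200))
```

This yields five named rectangles (two slots on layer 1, three on layer 2), each of which can later be compared against a detected goods change area.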
- the partition information calculation module 22 is configured to: perform straight-line detection on the shelf image to obtain a line detection result; and, based on the edge characteristics and shape characteristics of the label backplane, remove falsely detected pseudo label backplane edge lines from the line detection result and connect broken label backplane edge lines, to obtain the label backplane information of the shelf.
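The "connect broken edge lines" step can be sketched as follows. This is a simplified assumption on our part (merging nearly-collinear horizontal segments separated by small gaps), not the patent's exact connection criterion; the tolerances are invented:

```python
# Merge horizontal line segments from line detection that lie at nearly the
# same height and are separated by at most max_gap pixels, treating the
# result as one continuous label backplane edge line.

def connect_segments(segments, y_tol=3, max_gap=20):
    """segments: list of (x1, x2, y) horizontal segments; returns merged list."""
    segs = sorted(segments, key=lambda s: (s[2], s[0]))
    merged = []
    for x1, x2, y in segs:
        if merged:
            mx1, mx2, my = merged[-1]
            if abs(y - my) <= y_tol and x1 - mx2 <= max_gap:
                # Extend the previous segment instead of starting a new one.
                merged[-1] = (mx1, max(mx2, x2), my)
                continue
        merged.append((x1, x2, y))
    return merged

# Two fragments of one backplane edge, plus a separate lower edge.
print(connect_segments([(0, 90, 100), (110, 200, 101), (0, 200, 150)]))
```

Short spurious segments that survive this step would still need the edge/shape-based filtering the patent describes; only the reconnection is shown here.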
- the partition information calculation module 22 is configured to: acquire a label template image; extract first image features of the label template image and the shelf image; calculate a first correlation coefficient according to the first image features and generate a label saliency map; perform adaptive threshold segmentation on the label saliency map to obtain a set of center points of pending label areas; determine the pending label areas in the shelf image according to the set of center points; extract second image features of the pending label areas and the label template image; calculate a second correlation coefficient according to the second image features; and determine a pending label area whose second correlation coefficient is greater than a preset coefficient threshold to be a label.
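A hedged sketch of the first-pass label search follows: slide the label template over the shelf image and record a correlation coefficient at each position, producing a map that plays the role of the label saliency map, whose peaks give candidate (pending) label areas. The feature extraction and adaptive thresholding steps of the patent are omitted, and the toy image is invented:

```python
import math

def corr(a, b):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb) if va and vb else 0.0

def saliency_map(image, template):
    """Correlation of the template with every patch of the image."""
    th, tw = len(template), len(template[0])
    flat_t = [v for row in template for v in row]
    out = []
    for y in range(len(image) - th + 1):
        row = []
        for x in range(len(image[0]) - tw + 1):
            patch = [image[y + j][x + i] for j in range(th) for i in range(tw)]
            row.append(corr(patch, flat_t))
        out.append(row)
    return out

image = [[0, 0, 0, 0],
         [0, 9, 1, 0],
         [0, 1, 9, 0],
         [0, 0, 0, 0]]
template = [[9, 1], [1, 9]]
smap = saliency_map(image, template)
best = max((v, (x, y)) for y, r in enumerate(smap) for x, v in enumerate(r))
print(best)  # highest correlation at offset (1, 1), where the template occurs
```

In the patent's pipeline, a second, texture-based correlation over each candidate area then confirms or rejects it as a label.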
- the first image feature is an image feature including at least one of a brightness feature, a color feature, a direction feature, and a gradient feature.
- the second image feature is a texture feature including at least one of an angular second moment feature, a contrast feature, an inverse difference moment feature, a correlation feature, and an entropy feature of a gray-level co-occurrence matrix.
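For concreteness, the sketch below computes three of the texture features named above from a gray-level co-occurrence matrix (GLCM). The 2x2 matrix is a toy example of our own; real GLCMs are built from quantized image gray levels at a chosen pixel offset:

```python
import math

def glcm_features(glcm):
    """Angular second moment, contrast, inverse difference moment, and
    entropy of a (normalized) gray-level co-occurrence matrix."""
    total = sum(sum(row) for row in glcm)
    p = [[v / total for v in row] for row in glcm]
    n = len(p)
    asm = sum(v * v for row in p for v in row)          # angular second moment
    contrast = sum(((i - j) ** 2) * p[i][j]
                   for i in range(n) for j in range(n))
    idm = sum(p[i][j] / (1 + (i - j) ** 2)              # inverse difference moment
              for i in range(n) for j in range(n))
    entropy = -sum(v * math.log(v) for row in p for v in row if v > 0)
    return asm, contrast, idm, entropy

print(glcm_features([[2, 1], [1, 2]]))
```

A high angular second moment indicates uniform texture (as on an empty backplane), while high contrast and entropy indicate busy texture (as on a printed label), which is why these features can separate pending label areas from the background.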
- the acquiring module 21 is also used to acquire a reference image
- the change area detection module 23 is used to detect the change area of the goods according to the shelf image and the reference image.
- the change area detection module 23 is configured to: extract third image features of the shelf image and the reference image; perform image change detection according to the third image features; and determine the goods change area according to the detected image change area.
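A simplified stand-in for this change detection is sketched below: difference the shelf image against the reference image, threshold, and take the bounding box of changed pixels as the goods change area. A real system would compare extracted third image features rather than raw pixels, and the threshold here is an invented value:

```python
def change_bbox(shelf, reference, thresh=30):
    """shelf/reference: 2D lists of gray values of equal size;
    returns the (x1, y1, x2, y2) bounding box of changed pixels, or None."""
    xs, ys = [], []
    for y, (row_a, row_b) in enumerate(zip(shelf, reference)):
        for x, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) > thresh:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (min(xs), min(ys), max(xs) + 1, max(ys) + 1)

ref = [[200] * 5 for _ in range(4)]   # fully stocked reference
cur = [row[:] for row in ref]
cur[1][2] = 40                        # goods removed: pixels darken
cur[2][3] = 40
print(change_bbox(cur, ref))          # → (2, 1, 4, 3)
```

The resulting box is exactly the shape that the comparison with the shelf partition information expects as input.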
- Each embodiment of the device for determining a shelf state has basically the same effect as the foregoing method for determining a shelf state, and will not be repeated here.
- a third aspect of the embodiments of the present disclosure provides an embodiment of a device for executing the shelf status determination method.
- FIG. 5 is a schematic diagram of the hardware structure of an embodiment of the device for executing the shelf status determination method provided by the present disclosure.
- the device includes:
- one or more processors 31 and a memory 32; one processor 31 is taken as an example in FIG. 5.
- the device for executing the method for determining the shelf state may further include: an input device 33 and an output device 34.
- the processor 31, the memory 32, the input device 33, and the output device 34 may be connected by a bus or in other ways.
- the connection by a bus is taken as an example.
- the memory 32 can be used to store non-volatile software programs, non-volatile computer-executable programs, and modules, such as the program instructions/modules corresponding to the shelf state determination method in the embodiments of the present application (for example, the acquisition module 21, the partition information calculation module 22, the change area detection module 23, and the change information calculation module 24 shown in FIG. 4).
- the processor 31 executes various functional applications and data processing of the server by running non-volatile software programs, instructions, and modules stored in the memory 32, that is, implements the shelf state determination method of the foregoing method embodiment.
- the memory 32 may include a storage program area and a storage data area.
- the storage program area may store an operating system and an application program required by at least one function; the storage data area may store data created according to the use of the shelf state determination device.
- the memory 32 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage devices.
- the memory 32 may optionally include memories remotely provided with respect to the processor 31, and these remote memories may be connected to the shelf state determination device via a network. Examples of such networks include, but are not limited to, the Internet, corporate intranets, local area networks, mobile communication networks, and combinations thereof.
- the input device 33 can receive inputted numeric or character information, and generate key signal inputs related to user settings and function control of the shelf state determining device.
- the output device 34 may include a display device such as a display screen. In some embodiments, one or both of the input device 33 and the output device 34 may be optional. In some embodiments, the input device 33 and the output device 34 may be at least partially the same hardware.
- the one or more modules are stored in the memory 32, and when executed by the one or more processors 31, the shelf state determination method in any of the foregoing method embodiments is executed.
- the technical effect of the embodiment of the device for executing the method for determining the shelf status is the same as or similar to any of the foregoing method embodiments.
- An embodiment of the present application provides a non-transitory computer storage medium that stores computer-executable instructions, and the computer-executable instructions can execute the shelf state determination method in any of the foregoing method embodiments.
- the technical effect of the embodiment of the non-transitory computer storage medium is the same as or similar to any of the foregoing method embodiments.
- the programs can be stored in a computer-readable storage medium; when the program is executed, it may include the procedures of the above-mentioned method embodiments.
- the storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.
- the embodiment of the computer program has the same or similar technical effect as any of the foregoing method embodiments.
- the devices and apparatuses described in the present disclosure may be various electronic terminal devices, such as mobile phones, personal digital assistants (PDAs), tablet computers (PADs), and smart TVs, or large-scale terminal devices such as servers; therefore, the protection scope of the present disclosure should not be limited to a specific type of device or equipment.
- the client described in the present disclosure may be applied to any of the above electronic terminal devices in the form of electronic hardware, computer software or a combination of both.
- the method according to the present disclosure may also be implemented as a computer program executed by a CPU, and the computer program may be stored in a computer-readable storage medium.
- the computer program executes the above-mentioned functions defined in the method of the present disclosure.
- the above method steps and modules can also be implemented using a controller and a computer-readable storage medium for storing a computer program that enables the controller to implement the above steps or unit functions.
- non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
- Volatile memory can include random access memory (RAM), which can act as external cache memory.
- RAM can be obtained in various forms, such as synchronous RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link DRAM (SLDRAM), and direct Rambus RAM (DRRAM).
- the storage devices of the disclosed aspects are intended to include, but are not limited to, these and other suitable types of memory.
- the various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein may be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
- a general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
- the processor may also be implemented as a combination of computing devices, for example, a combination of a DSP and a microprocessor, multiple microprocessors, one or more microprocessors combined with a DSP core, or any other such configuration.
- the steps of the method or algorithm described in combination with the disclosure herein may be directly included in hardware, a software module executed by a processor, or a combination of the two.
- the software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, removable disk, CD-ROM, or any other form of storage medium known in the art.
- An exemplary storage medium is coupled to the processor such that the processor can read information from or write information to the storage medium.
- the storage medium may be integrated with the processor.
- the processor and the storage medium may reside in the ASIC.
- the ASIC can reside in the user terminal.
- the processor and the storage medium may reside as discrete components in the user terminal.
- the functions described may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions can be stored as one or more instructions or codes on a computer-readable medium or transmitted through the computer-readable medium.
- Computer-readable media include computer storage media and communication media, including any media that facilitates the transfer of a computer program from one location to another.
- a storage medium may be any available medium that can be accessed by a general-purpose or special-purpose computer.
- the computer-readable medium may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer or a general-purpose or special-purpose processor. Also, any connection is properly termed a computer-readable medium.
- for example, if the software is sent from a website, server, or other remote source using coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium.
- as used herein, disks and discs include compact discs (CDs), laser discs, optical discs, digital versatile discs (DVDs), floppy disks, and Blu-ray discs; disks generally reproduce data magnetically, while discs reproduce data optically with lasers.
- the combination of the above content should also be included in the scope of computer-readable media.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Life Sciences & Earth Sciences (AREA)
- Artificial Intelligence (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- General Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Image Analysis (AREA)
Claims (19)
- 1. A shelf state determination method, comprising: acquiring a shelf image; performing label backplane detection and label detection on the shelf image to obtain label backplane information and label information of the shelf; obtaining shelf partition information according to the label backplane information and the label information; detecting a goods change area of the shelf; and comparing the goods change area with the shelf partition information to obtain shelf change position information.
- 2. The method according to claim 1, wherein obtaining shelf partition information according to the label backplane information and the label information comprises: obtaining shelf layering information according to the label backplane information; and obtaining the shelf partition information according to the label information in combination with the shelf layering information.
- 3. The method according to claim 1, wherein performing label backplane detection on the shelf image comprises: performing straight-line detection on the shelf image to obtain a line detection result; and, based on edge characteristics and shape characteristics of the label backplane, removing falsely detected pseudo label backplane edge lines from the line detection result and connecting broken label backplane edge lines, to obtain the label backplane information of the shelf.
- 4. The method according to claim 1, wherein performing label detection on the shelf image comprises: acquiring a label template image; extracting first image features of the label template image and the shelf image; calculating a first correlation coefficient according to the first image features of the label template image and the shelf image, and generating a label saliency map; performing adaptive threshold segmentation on the label saliency map to obtain a set of center points of pending label areas; determining the pending label areas in the shelf image according to the set of center points of the pending label areas; extracting second image features of the pending label areas and the label template image; calculating a second correlation coefficient according to the second image features of the pending label areas and the label template image; and determining a pending label area whose second correlation coefficient is greater than a preset coefficient threshold to be a label.
- 5. The method according to claim 4, wherein the first image feature is an image feature comprising at least one of a brightness feature, a color feature, a direction feature, and a gradient feature; and/or the second image feature is a texture feature comprising at least one of an angular second moment feature, a contrast feature, an inverse difference moment feature, a correlation feature, and an entropy feature of a gray-level co-occurrence matrix.
- 6. The method according to claim 1, wherein detecting the goods change area of the shelf comprises: acquiring a reference image; and detecting the goods change area according to the shelf image and the reference image.
- 7. The method according to claim 6, wherein detecting the goods change area according to the shelf image and the reference image comprises: extracting third image features of the shelf image and the reference image; and performing image change detection according to the third image features, and determining the goods change area according to the detected image change area.
- 8. The method according to claim 1, further comprising: pushing the shelf change position information to a designated user.
- 9. The method according to claim 1 or 8, wherein the shelf change position information comprises shelf out-of-stock position information.
- 10. A shelf state determination device, comprising: a processor; and a memory storing instructions executable by the processor, wherein the instructions, when executed by the processor, cause the processor to: acquire a shelf image; perform label backplane detection and label detection on the shelf image to obtain label backplane information and label information of the shelf; obtain shelf partition information according to the label backplane information and the label information; detect a goods change area of the shelf; and compare the goods change area with the shelf partition information to obtain shelf change position information.
- 11. The device according to claim 10, wherein the instructions, when executed by the processor, further cause the processor to: obtain shelf layering information according to the label backplane information; and obtain the shelf partition information according to the label information in combination with the shelf layering information.
- 12. The device according to claim 10, wherein the instructions, when executed by the processor, further cause the processor to: perform straight-line detection on the shelf image to obtain a line detection result; and, based on edge characteristics and shape characteristics of the label backplane, remove falsely detected pseudo label backplane edge lines from the line detection result and connect broken label backplane edge lines, to obtain the label backplane information of the shelf.
- 13. The device according to claim 10, wherein the instructions, when executed by the processor, further cause the processor to: acquire a label template image; extract first image features of the label template image and the shelf image; calculate a first correlation coefficient according to the first image features of the label template image and the shelf image, and generate a label saliency map; perform adaptive threshold segmentation on the label saliency map to obtain a set of center points of pending label areas; determine the pending label areas in the shelf image according to the set of center points of the pending label areas; extract second image features of the pending label areas and the label template image; calculate a second correlation coefficient according to the second image features of the pending label areas and the label template image; and determine a pending label area whose second correlation coefficient is greater than a preset coefficient threshold to be a label.
- 14. The device according to claim 13, wherein the first image feature is an image feature comprising at least one of a brightness feature, a color feature, a direction feature, and a gradient feature; and/or the second image feature is a texture feature comprising at least one of an angular second moment feature, a contrast feature, an inverse difference moment feature, a correlation feature, and an entropy feature.
- 15. The device according to claim 10, wherein the instructions, when executed by the processor, further cause the processor to: acquire a reference image; and detect the goods change area according to the shelf image and the reference image.
- 16. The device according to claim 15, wherein the instructions, when executed by the processor, further cause the processor to: extract third image features of the shelf image and the reference image; and perform image change detection according to the third image features, and determine the goods change area according to the detected image change area.
- 17. The device according to claim 10, wherein the instructions, when executed by the processor, further cause the processor to: push the shelf change position information to a designated user.
- 18. The device according to claim 10 or 17, wherein the shelf change position information comprises shelf out-of-stock position information.
- 19. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1-9.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910249985.4 | 2019-03-29 | ||
CN201910249985.4A CN109961101B (zh) | 2019-03-29 | 2019-03-29 | 货架状态确定方法及装置、电子设备、存储介质 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020199775A1 true WO2020199775A1 (zh) | 2020-10-08 |
Family
ID=67025345
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/075805 WO2020199775A1 (zh) | 2019-03-29 | 2020-02-19 | 货架状态确定方法及装置、存储介质 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109961101B (zh) |
WO (1) | WO2020199775A1 (zh) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112883955A (zh) * | 2021-03-10 | 2021-06-01 | 洛伦兹(北京)科技有限公司 | 货架布局检测方法、装置及计算机可读存储介质 |
CN113762701A (zh) * | 2021-03-26 | 2021-12-07 | 北京京东拓先科技有限公司 | 物品补货方法、装置以及存储介质 |
CN114955355A (zh) * | 2022-07-07 | 2022-08-30 | 中轻长泰(长沙)智能科技股份有限公司 | 堆垛机认址方法、装置、设备及存储介质 |
CN116758578A (zh) * | 2023-08-18 | 2023-09-15 | 上海楷领科技有限公司 | 机械制图信息提取方法、装置、系统及存储介质 |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109961101B (zh) * | 2019-03-29 | 2021-04-27 | 京东方科技集团股份有限公司 | 货架状态确定方法及装置、电子设备、存储介质 |
CN110472486B (zh) * | 2019-07-03 | 2021-05-11 | 北京三快在线科技有限公司 | 一种货架障碍物识别方法、装置、设备及可读存储介质 |
CN110472515B (zh) * | 2019-07-23 | 2021-04-13 | 创新先进技术有限公司 | 货架商品检测方法及系统 |
US11069073B2 (en) | 2019-07-23 | 2021-07-20 | Advanced New Technologies Co., Ltd. | On-shelf commodity detection method and system |
CN113128813A (zh) * | 2019-12-31 | 2021-07-16 | 杭州海康机器人技术有限公司 | 调度货架的方法、装置、仓库系统及存储介质 |
CN111340078B (zh) * | 2020-02-18 | 2024-03-01 | 平安科技(深圳)有限公司 | 证件信息自动归类的方法、装置、介质及电子设备 |
CN111462125B (zh) * | 2020-04-03 | 2021-08-20 | 杭州恒生数字设备科技有限公司 | 一种增强活体检测图像处理系统 |
CN111832454A (zh) * | 2020-06-30 | 2020-10-27 | 苏州罗伯特木牛流马物流技术有限公司 | 利用工业相机视觉识别实现地面货位管理的系统和方法 |
CN111985559A (zh) * | 2020-08-19 | 2020-11-24 | 合肥工业大学 | 一种基于边界特性的轮胎花纹结构相似性检测方法 |
CN112836578B (zh) * | 2020-12-31 | 2022-09-23 | 广西慧云信息技术有限公司 | 一种基于表观特征的货架缺货检测方法 |
CN113674336A (zh) * | 2021-07-27 | 2021-11-19 | 浙江大华技术股份有限公司 | 货架空置信息确定方法、计算机设备及存储装置 |
CN114708291A (zh) * | 2022-04-02 | 2022-07-05 | 北京京东乾石科技有限公司 | 图像处理方法、装置、电子设备及存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104732253A (zh) * | 2013-12-20 | 2015-06-24 | 中国移动通信集团湖北有限公司 | 基于无源rfid的定位识别方法与系统及装置及中间件 |
US20170278056A1 (en) * | 2014-09-30 | 2017-09-28 | Nec Corporation | Information processing apparatus, control method, and program |
CN108846401A (zh) * | 2018-05-30 | 2018-11-20 | 京东方科技集团股份有限公司 | 商品检测终端、方法、系统以及计算机设备、可读介质 |
CN109330284A (zh) * | 2018-09-21 | 2019-02-15 | 京东方科技集团股份有限公司 | 一种货架系统 |
CN109961101A (zh) * | 2019-03-29 | 2019-07-02 | 京东方科技集团股份有限公司 | 货架状态确定方法及装置、电子设备、存储介质 |
Family Cites Families (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104112216A (zh) * | 2013-04-22 | 2014-10-22 | 学思行数位行销股份有限公司 | 存货管理与行销的图像辨识方法 |
US9158988B2 (en) * | 2013-06-12 | 2015-10-13 | Symbol Technologies, LLC | Method for detecting a plurality of instances of an object |
CN203733140U (zh) * | 2014-01-21 | 2014-07-23 | 北京迪吉特展览展示有限公司 | 一种零售店铺产品推介多媒体系统及应用其的货架 |
US9536167B2 (en) * | 2014-12-10 | 2017-01-03 | Ricoh Co., Ltd. | Realogram scene analysis of images: multiples for scene analysis |
US9524486B2 (en) * | 2015-03-04 | 2016-12-20 | Xerox Corporation | System and method for retail store promotional price tag detection and maintenance via heuristic classifiers |
CN105787930B (zh) * | 2016-02-17 | 2019-01-18 | 上海文广科技(集团)有限公司 | 基于锐利度的针对虚化图像的显著性检测方法及系统 |
JP2019513274A (ja) * | 2016-03-29 | 2019-05-23 | ボサ ノバ ロボティクス アイピー, インク.Bossa Nova Robotics Ip, Inc. | 品物の設置、特定および計数のためのシステムおよび方法 |
CN106446993A (zh) * | 2016-11-10 | 2017-02-22 | 李锐渊 | 一种智能实时盘点货架系统 |
CN108345893B (zh) * | 2018-03-15 | 2021-01-26 | 京东方科技集团股份有限公司 | 一种直线检测方法、装置、计算机存储介质及终端 |
CN208172900U (zh) * | 2018-04-08 | 2018-11-30 | 上海小亦网络科技有限公司 | 一种智能售货柜 |
CN108564557B (zh) * | 2018-05-31 | 2020-08-25 | 京东方科技集团股份有限公司 | 图像校正方法及装置 |
CN208188867U (zh) * | 2018-06-01 | 2018-12-04 | 西安未来鲜森智能信息技术有限公司 | 一种用于无人自动售货的商品识别系统 |
CN109040539B (zh) * | 2018-07-10 | 2020-12-01 | 京东方科技集团股份有限公司 | 图像采集装置、货架及图像识别方法 |
CN108937366A (zh) * | 2018-07-18 | 2018-12-07 | 广州凌翔网络科技有限公司 | 一种商品的分类存放架 |
CN109377550A (zh) * | 2018-10-11 | 2019-02-22 | 泉州市宏恩新能源汽车科技有限公司 | 一种三维立体无人超市 |
-
2019
- 2019-03-29 CN CN201910249985.4A patent/CN109961101B/zh active Active
-
2020
- 2020-02-19 WO PCT/CN2020/075805 patent/WO2020199775A1/zh active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104732253A (zh) * | 2013-12-20 | 2015-06-24 | 中国移动通信集团湖北有限公司 | 基于无源rfid的定位识别方法与系统及装置及中间件 |
US20170278056A1 (en) * | 2014-09-30 | 2017-09-28 | Nec Corporation | Information processing apparatus, control method, and program |
CN108846401A (zh) * | 2018-05-30 | 2018-11-20 | 京东方科技集团股份有限公司 | 商品检测终端、方法、系统以及计算机设备、可读介质 |
CN109330284A (zh) * | 2018-09-21 | 2019-02-15 | 京东方科技集团股份有限公司 | 一种货架系统 |
CN109961101A (zh) * | 2019-03-29 | 2019-07-02 | 京东方科技集团股份有限公司 | 货架状态确定方法及装置、电子设备、存储介质 |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112883955A (zh) * | 2021-03-10 | 2021-06-01 | 洛伦兹(北京)科技有限公司 | 货架布局检测方法、装置及计算机可读存储介质 |
CN112883955B (zh) * | 2021-03-10 | 2024-02-02 | 洛伦兹(北京)科技有限公司 | 货架布局检测方法、装置及计算机可读存储介质 |
CN113762701A (zh) * | 2021-03-26 | 2021-12-07 | 北京京东拓先科技有限公司 | 物品补货方法、装置以及存储介质 |
CN114955355A (zh) * | 2022-07-07 | 2022-08-30 | 中轻长泰(长沙)智能科技股份有限公司 | 堆垛机认址方法、装置、设备及存储介质 |
CN114955355B (zh) * | 2022-07-07 | 2023-09-26 | 中轻长泰(长沙)智能科技股份有限公司 | 堆垛机认址方法、装置、设备及存储介质 |
CN116758578A (zh) * | 2023-08-18 | 2023-09-15 | 上海楷领科技有限公司 | 机械制图信息提取方法、装置、系统及存储介质 |
CN116758578B (zh) * | 2023-08-18 | 2023-11-07 | 上海楷领科技有限公司 | 机械制图信息提取方法、装置、系统及存储介质 |
Also Published As
Publication number | Publication date |
---|---|
CN109961101A (zh) | 2019-07-02 |
CN109961101B (zh) | 2021-04-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020199775A1 (zh) | 货架状态确定方法及装置、存储介质 | |
WO2020199776A1 (zh) | 货架空置率计算方法及装置、存储介质 | |
US20210334574A1 (en) | Commodity detection terminal, commodity detection method, system, computer device, and computer readable medium | |
US9576210B1 (en) | Sharpness-based frame selection for OCR | |
Zamberletti et al. | Robust angle invariant 1d barcode detection | |
WO2020199777A1 (zh) | 价签检测方法及装置、存储介质 | |
US9418316B1 (en) | Sharpness-based frame selection for OCR | |
CN105701519A (zh) | 基于超像素的图像的实际货架图景象分析 | |
WO2020052270A1 (zh) | 一种视频审核的方法、装置和设备 | |
US10169673B2 (en) | Region-of-interest detection apparatus, region-of-interest detection method, and recording medium | |
CN108985190B (zh) | 目标识别方法和装置、电子设备、存储介质 | |
US11816946B2 (en) | Image based novelty detection of material samples | |
CN111784675A (zh) | 物品纹理信息处理的方法、装置、存储介质及电子设备 | |
Ngoc et al. | Saliency-based detection of identy documents captured by smartphones | |
CN108960247B (zh) | 图像显著性检测方法、装置以及电子设备 | |
Lopez-Rincon et al. | Binary large object‐based approach for QR code detection in uncontrolled environments | |
Devadethan et al. | Face detection and facial feature extraction based on a fusion of knowledge based method and morphological image processing | |
WO2022121021A1 (zh) | 一种身份证号码检测方法、装置、可读存储介质和终端 | |
WO2024016632A1 (zh) | 亮点定位方法、亮点定位装置、电子设备及存储介质 | |
CN111402177A (zh) | 一种清晰度检测方法、系统、设备和介质 | |
CN113239738B (zh) | 一种图像的模糊检测方法及模糊检测装置 | |
CN115456988A (zh) | 一种缺陷检测方法、终端设备及存储介质 | |
CN110276260B (zh) | 一种基于深度摄像头的商品检测方法 | |
CN113840135A (zh) | 色偏检测方法、装置、设备及存储介质 | |
Wang et al. | Database of human segmented images and its application in boundary detection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20784282 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 20784282 Country of ref document: EP Kind code of ref document: A1 |
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 04/02/2022) |