WO2020048492A1 - Product Status Recognition - Google Patents

Product Status Recognition

Info

Publication number
WO2020048492A1
WO2020048492A1 (PCT/CN2019/104425, also filed as CN2019104425W)
Authority
WO
WIPO (PCT)
Prior art keywords: real-time, position information, calibration, product
Prior art date
Application number
PCT/CN2019/104425
Other languages
English (en)
French (fr)
Inventor
康丽萍
马彬
魏晓明
Original Assignee
北京三快在线科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京三快在线科技有限公司
Publication of WO2020048492A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F9/00Details other than those peculiar to special kinds or types of apparatus
    • G07F9/002Vending machines being part of a centrally controlled network of vending machines

Definitions

  • Embodiments of the present disclosure relate to the field of network technologies, and in particular, to a method, an apparatus, an electronic device, and a readable storage medium for identifying a product status.
  • the patent publication US 2018/0060803 A1 proposes a system and method for managing retail products.
  • the main steps include: first, obtaining two images of the monitored shelf; then, comparing the brightness, contrast, and illuminance of the two images to estimate the amount of product consumed; and finally, estimating the consumption rate of the product from the time stamps, and
  • sending the consumption rate of the goods on the shelf to the central device, where staff judge whether replenishment is needed based on the consumption rate.
  • Embodiments of the present disclosure provide a method, an apparatus, an electronic device, and a readable storage medium for identifying a commodity state, so as to solve the above-mentioned problems of the prior art commodity state recognition.
  • an embodiment of the present disclosure provides a method for identifying a product status, including:
  • an embodiment of the present disclosure provides a product status recognition device, including:
  • An image acquisition module configured to acquire a real-time status image, and acquire calibration information corresponding to the real-time status image
  • a commodity location acquisition module configured to input the real-time status image into a target detection model obtained in advance to obtain real-time target location information, where the real-time target location information includes real-time commodity location information;
  • the commodity state determination module is configured to determine commodity state information according to the calibration information and the real-time commodity position information, and the calibration information corresponds to the real-time state image.
  • an electronic device including:
  • a processor, a memory, and a computer program stored on the memory and executable on the processor, where executing the computer program on the processor implements the foregoing method for identifying a commodity state.
  • an embodiment of the present disclosure provides a computer program including computer-readable code that, when the computer-readable code runs on a computing processing device, causes the computing processing device to execute the foregoing method for identifying a commodity state.
  • an embodiment of the present disclosure provides a computer-readable storage medium in which the foregoing computer program is stored.
  • An embodiment of the present disclosure provides a method for identifying a product status: acquiring a real-time status image and the calibration information corresponding to it; inputting the real-time status image into a target detection model obtained in advance to obtain real-time target position information, which includes real-time product position information; and determining product status information according to the calibration information and the real-time product position information.
  • the embodiments of the present disclosure can indicate the degree of stock shortage according to the product status information, which helps improve the accuracy of the replenishment prompt.
  • FIG. 1 is a flowchart of specific steps of a method for identifying a product status according to an embodiment of the present disclosure
  • FIG. 2 is a flowchart illustrating specific steps of another method for identifying a product status according to an embodiment of the present disclosure
  • FIG. 3 is a structural diagram of a product status recognition device according to an embodiment of the present disclosure.
  • FIG. 4 is a structural diagram of another product status recognition device according to an embodiment of the present disclosure.
  • FIG. 5 schematically illustrates a block diagram of an electronic device for performing a method according to the present disclosure.
  • FIG. 6 schematically illustrates a storage unit for holding or carrying program code implementing a method according to the present disclosure.
  • FIG. 1 shows a flowchart of steps in a method for identifying a product status according to an embodiment of the present disclosure, including:
  • Step 101 Acquire a real-time status image.
  • the real-time status image is a real-time image taken of the product display area; a camera shoots the product display area continuously at a certain time interval.
  • the time interval can be set according to the actual application scenario and can be related to the speed of product sales: when products sell quickly, the interval is small; when products sell slowly, the interval is large.
  • after capturing real-time status images, the cameras send them to a server for storage.
  • Step 102 The real-time status image is input into a target detection model obtained in advance to obtain real-time target position information, and the real-time target position information includes real-time product position information.
  • the input of the target detection model is a real-time state image
  • the output is the position information corresponding to each target in the image.
  • the real-time product position information is the position information of the frontmost product in a display column, that is, the product with no other product in front of it. When consumers buy goods, they usually take the frontmost product, so the position of the frontmost product can reflect the degree of stock shortage.
  • targets include merchandise and labels.
  • the label is used to display the unique identification, name, price and other information of the product, and can be an ink label, which is pasted on the shelf layer.
  • labels are usually rectangular, although other shapes can be specified; as a result, labels can be identified on the shelf layers by their shape.
  • the products and labels are marked with rectangular frames
  • the position information may be the coordinates of the lower left corner and the upper right corner of the rectangular frame, or the coordinates of the upper left corner and the lower right corner of the rectangular frame.
  • the coordinates of the upper left corner of the image are (0,0)
  • one position information item can be expressed as the lower left corner (12, 100) and the upper right corner (80, 20), or as the upper left corner (12, 20) and the lower right corner (80, 100).
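As a sketch, the two corner conventions above can be converted into one another; with the image origin at the top-left corner, the "upper" edge of a box has the smaller vertical coordinate (the function name is illustrative, not from the patent):

```python
def to_top_left_bottom_right(lower_left, upper_right):
    """Convert a box given as (lower-left, upper-right) corners into
    (upper-left, lower-right) corners.  The image origin (0, 0) is the
    top-left corner, so the upper edge has the smaller y value."""
    (x0, y_bottom), (x1, y_top) = lower_left, upper_right
    return (x0, y_top), (x1, y_bottom)

# The example from the text: lower left (12, 100), upper right (80, 20)
# is the same box as upper left (12, 20), lower right (80, 100).
print(to_top_left_bottom_right((12, 100), (80, 20)))  # ((12, 20), (80, 100))
```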
  • at least one product and/or at least one label is detected from the real-time status image,
  • so the result obtained is a set of position information items.
  • Step 103 Determine commodity status information according to the calibration information and the real-time product location information, where the calibration information corresponds to the real-time status image.
  • the calibration information is used as a reference when determining the status information of the product.
  • straight lines marking different degrees of stock shortage appear in the image.
  • the calibration information includes the position of the straight line and the degree of stock shortage.
  • the corresponding calibration information needs to be determined in advance and stored bound to each camera; therefore, the corresponding calibration information can be obtained for each camera.
  • the product position information can be compared with each calibration information to obtain calibration lines on both sides of the product position; then the product status information corresponding to the product position can be determined.
  • the product status information corresponds to the calibration information.
  • if the calibration lines on both sides of the product position correspond to 20% and 30% stock shortage respectively, the product status information is a stock shortage in the range of 20% to 30%.
  • a message prompting replenishment may be sent to the central server.
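A minimal sketch of the comparison described above, assuming calibration information is represented as a list of (ordinate, shortage-percent) pairs (the representation and values are illustrative, not the patent's implementation):

```python
def shortage_range(calibration, product_y):
    """Find the calibration lines on both sides of the frontmost product's
    ordinate and return the shortage range they bound.
    calibration: list of (ordinate, shortage_percent) pairs."""
    lines = sorted(calibration)
    at_or_below = [pct for y, pct in lines if y <= product_y]
    above = [pct for y, pct in lines if y > product_y]
    low = at_or_below[-1] if at_or_below else None
    high = above[0] if above else None
    return low, high

# Calibration lines marking 20% and 30% shortage on either side of a
# frontmost product at ordinate 50: status is "20% to 30% out of stock".
print(shortage_range([(40, 20), (60, 30)], 50))  # (20, 30)
```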
  • an embodiment of the present disclosure provides a method for identifying a product status.
  • the method includes: acquiring a real-time status image and the calibration information corresponding to it; inputting the real-time status image into a target detection model obtained in advance to obtain real-time target position information, which includes real-time product position information; and determining product status information according to the calibration information and the real-time product position information. The degree of stock shortage can be indicated according to the product status information, which helps improve the accuracy of replenishment prompts.
  • This embodiment of the present application describes an optional method for identifying a product state.
  • FIG. 2 shows a flowchart of specific steps of another method for identifying a product status according to an embodiment of the present disclosure.
  • Step 201 Perform deep learning on a pre-collected product state image set to obtain a target detection model.
  • Each product state image in the product state image set is labeled with product position information and/or label position information.
  • the product status image set may be an image taken for a real product sales area.
  • Each complete product and each label in the image is marked by a rectangular frame, and the position information of the rectangular frame is represented by the lower-left and upper-right corner coordinates, or by the upper-left and lower-right corner coordinates.
  • the tag information also includes the category of the rectangular frame, indicating that the rectangular frame corresponds to a product or a label.
  • the marked products are the products arranged at the forefront, so that the product position can reflect the out-of-stock state. If the coordinate of the upper left corner of the image is (0, 0), then the smaller the vertical coordinate of the product position in a display column, the more out of stock the column is.
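The labeled training data described above might look as follows; the dictionary layout and file name are hypothetical, chosen only to illustrate the rectangular-frame annotation with a class per frame:

```python
# Hypothetical annotation for one product state image: each rectangular
# frame is given by upper-left and lower-right corners plus a class that
# says whether the frame marks a product or a label.
annotation = {
    "image": "shelf_001.jpg",  # illustrative file name
    "frames": [
        {"corners": ((12, 20), (80, 100)), "cls": "product"},
        {"corners": ((10, 105), (60, 120)), "cls": "label"},
    ],
}

# With the origin (0, 0) at the top-left of the image, a smaller vertical
# coordinate of the frontmost product means the display column is more
# out of stock.
product_tops = [f["corners"][0][1] for f in annotation["frames"]
                if f["cls"] == "product"]
print(min(product_tops))  # 20
```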
  • the embodiments of the present disclosure may be trained using Faster R-CNN (Faster Region-based Convolutional Neural Network). It can be understood that other deep learning models that can detect labeled targets in images, such as R-CNN (Region-based Convolutional Neural Network) or Fast R-CNN (Fast Region-based Convolutional Neural Network), can also be used.
  • Step 202 Detect a calibration line from a calibration image to obtain position information of the calibration line and a corresponding reference ratio.
  • the calibration image is obtained by photographing a shelf layer provided with the calibration line; the calibration image and the real-time status image correspond to the same shooting area.
  • the calibration line may be a straight line indicating a shortage ratio. It can be understood that the calibration image may include only the calibration area and the calibration line, without any commodity; products may also be placed outside the calibration area. However, to ensure that the calibration area and the calibration line can be detected, no commodity should be placed on the calibration area.
  • the calibration area can be outlined with lines of a special style and color, and the calibration line can be represented by a straight line of a characteristic style and color.
  • a calibration area can first be determined from the calibration image; then a straight line of the special style and color is detected in that area to obtain the calibration line.
  • the calibration area can be an area with a calibration line on the left side of each shelf layer: horizontally, it extends from the position of the first ink label on the left side of the shelf layer to the first product position to the right of that label; vertically, it extends from this shelf layer to the next shelf layer.
  • a calibration line can be detected in a calibration area, which can avoid detecting a line similar to the calibration line, thereby improving the detection accuracy of the calibration line.
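One way to sketch this detection, assuming the calibration line has a known characteristic color and the calibration area has already been located (a pure-NumPy stand-in for a full line detector; the thresholds are illustrative assumptions):

```python
import numpy as np

def detect_calibration_lines(image, area, line_color, tol=10, min_frac=0.8):
    """Within the calibration area (x0, y0, x1, y1), return the ordinates of
    rows dominated by pixels of the calibration line's characteristic color.
    image: H x W x 3 uint8 array."""
    x0, y0, x1, y1 = area
    patch = image[y0:y1, x0:x1].astype(int)
    match = (np.abs(patch - np.asarray(line_color)) <= tol).all(axis=2)
    row_fraction = match.mean(axis=1)  # fraction of matching pixels per row
    return [y0 + int(r) for r in np.flatnonzero(row_fraction >= min_frac)]

# Synthetic example: a red line drawn at row 5 inside the calibration area.
img = np.zeros((20, 30, 3), dtype=np.uint8)
img[5, 2:28] = (255, 0, 0)
print(detect_calibration_lines(img, (2, 0, 28, 20), (255, 0, 0)))  # [5]
```

Restricting the search to the calibration area, as the text notes, avoids matching similarly colored lines elsewhere in the image.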
  • the calibration line is deleted after the calibration image is captured, or is present in the real-time status image.
  • the calibration line can be erased from the shelf after the calibration image is shot, thereby keeping the shelf layer clean; of course, it can also be left in place.
  • Step 203 Acquire a real-time status image.
  • For this step, refer to the detailed description of step 101; details are not repeated here.
  • Step 204 Perform geometric correction on the real-time status image, so that the shelf layer in the real-time status image is at a horizontal position.
  • the real-time status image needs to be geometrically corrected.
  • Step 205 Input the real-time status image into the target detection model obtained in advance to obtain real-time target position information, where the real-time target position information includes real-time product position information.
  • For this step, refer to the detailed description of step 102; details are not repeated here.
  • Step 206 Determine the out-of-stock ratio according to the position information of the calibration line, a corresponding reference ratio, and real-time product position information.
  • the calibration lines on the upper and lower sides of the product are determined according to the product position information; then the reference ratios corresponding to the upper and lower calibration lines are obtained; finally, the out-of-stock ratio is determined from the two reference ratios, for example, as lying between the two reference ratios.
  • the above-mentioned step 206 includes sub-steps 2061 to 2064:
  • Sub-step 2061 determining position information of each shelf layer according to the real-time label position information.
  • the labels are located on the shelf layer.
  • the width of a shelf layer is often at most a few label heights, so the position and number of shelf layers can be determined from the differences between the vertical coordinates of different labels.
  • the position information of the shelf layer is determined according to all the label position information on each shelf layer.
  • the preset multiple may be determined according to the width of the shelf layer and the average height of the label; for example, if the width of the shelf layer is twice the average height of the label, the preset multiple is two.
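A sketch of this grouping, assuming labels on the same shelf layer have vertical coordinates that differ by less than a preset gap (the gap value below is an illustrative assumption):

```python
def group_shelf_layers(label_ys, max_gap):
    """Cluster label ordinates into shelf layers: consecutive (sorted)
    ordinates closer than max_gap belong to the same layer.  Each layer is
    represented by the mean ordinate of its labels."""
    layers = []
    for y in sorted(label_ys):
        if layers and y - layers[-1][-1] < max_gap:
            layers[-1].append(y)
        else:
            layers.append([y])
    return [sum(layer) / len(layer) for layer in layers]

# Five labels whose ordinates fall into two bands -> two shelf layers.
print(group_shelf_layers([100, 102, 98, 200, 201], max_gap=20))  # [100.0, 200.5]
```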
  • Sub-step 2062 For each display column of each shelf layer in the real-time status image, if the shelf layer is a target monitoring shelf layer, extract a reference vertical coordinate from the real-time product position information of the display column.
  • each item of product position information corresponds to one display column.
  • the reference ordinate can be the ordinate of the lower left corner or the ordinate of the upper right corner, or the ordinate of the upper left corner or the ordinate of the lower right corner.
  • the target monitoring shelf layer is obtained by the following steps:
  • step A1 a calibration state image is input into the target detection model to obtain calibration target position information, where the calibration target position information includes calibration label position information and calibration product position information.
  • This step may refer to the detailed description of step 102, which is not repeated here.
  • the calibration state image needs to be geometrically corrected first.
  • this step differs from step 102 only in that the input image is different.
  • Step A2 Determine the height of each shelf layer according to the position information of calibration labels on adjacent shelf layers.
  • the position information of each shelf layer is determined according to the detailed description of the sub-step 2061; then the difference between the position information of two adjacent shelf layers is calculated to obtain the height of the shelf layer.
  • Step A3 The number of calibration display columns of each shelf layer is determined according to the calibration product position information and the calibration label position information.
  • the products are divided into shelf layers according to the ordinate of the product position information. For example, if the ordinate of the product location information is greater than the ordinate of shelf layer A and smaller than the ordinate of shelf layer B, the product is located at shelf layer A. Therefore, the number of product position information on each shelf layer can be counted to obtain the number of columns of the shelf layer.
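The assignment described above can be sketched as follows, assuming each shelf layer is described by an ordinate interval (the intervals and values are illustrative):

```python
def count_columns_per_layer(product_ys, layer_bounds):
    """Assign each frontmost-product ordinate to the shelf layer whose
    ordinate interval contains it and count the display columns per layer.
    layer_bounds: list of (low, high) ordinate intervals, one per layer."""
    counts = [0] * len(layer_bounds)
    for y in product_ys:
        for i, (low, high) in enumerate(layer_bounds):
            if low <= y < high:
                counts[i] += 1
                break
    return counts

# Three frontmost products; layer A spans ordinates [0, 30), layer B [30, 60).
print(count_columns_per_layer([10, 15, 40], [(0, 30), (30, 60)]))  # [2, 1]
```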
  • Step A4 For each shelf layer, when the height of the shelf layer is greater than a shelf height threshold, or when the number of calibration display columns of the shelf layer is greater than a display-column number threshold, determine that the shelf layer is a target monitoring shelf layer.
  • the shelf height threshold may be determined according to the average height of shelf layers, and the display-column number threshold may be determined according to the average number of display columns per shelf layer.
  • shelf layers that are too short or hold too few display columns are not monitored in the embodiments of the present disclosure.
  • Sub-step 2063 Determine a reference calibration line according to the position information of the reference vertical coordinate and the calibration line.
  • the position information of the calibration line mainly refers to the ordinate of the calibration line.
  • two upper and lower calibration lines closest to the reference ordinate are selected from a plurality of calibration lines. For example, if the ordinate of the calibration line A is 20, the ordinate of B is 35, the ordinate of C is 50, the ordinate of D is 75, and the reference ordinate is 40, then the reference calibration lines are B and C.
  • the reference calibration line can also be a single line: if the reference ordinate is 50, the reference calibration line is C.
  • the out-of-stock ratio corresponding to the display column is determined according to the reference ratio corresponding to the reference calibration line.
  • each calibration line corresponds to a stock-out ratio.
  • if the reference ratio of calibration line B is 65% and the reference ratio of C is 50%, the out-of-stock ratio is between 50% and 65%; if the reference ordinate is exactly 50, the out-of-stock ratio is 50%.
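The selection of reference calibration lines and the resulting ratio can be sketched with the ordinates and ratios from the example above (the function itself is an illustrative reading of the text, not the patent's implementation):

```python
def out_of_stock_ratio(calibration, ref_y):
    """calibration: list of (ordinate, reference_ratio) pairs.  Returns the
    out-of-stock range bounded by the calibration lines nearest the
    reference ordinate, collapsing to a single value when the reference
    ordinate falls exactly on a calibration line."""
    lines = sorted(calibration)
    for y, ratio in lines:
        if y == ref_y:
            return (ratio, ratio)
    below = [ratio for y, ratio in lines if y < ref_y]
    above = [ratio for y, ratio in lines if y > ref_y]
    # a smaller ordinate means more out of stock, so the line below the
    # reference ordinate gives the upper bound of the range
    low = above[0] if above else None
    high = below[-1] if below else None
    return (low, high)

# Calibration line B (ordinate 35, ratio 65%) and C (ordinate 50, ratio 50%):
print(out_of_stock_ratio([(35, 65), (50, 50)], 40))  # (50, 65)
print(out_of_stock_ratio([(35, 65), (50, 50)], 50))  # (50, 50)
```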
  • Step 207 For each shelf layer in the real-time status image, if the shelf layer is a target monitoring shelf layer, determine the number of real-time display columns of the shelf layer according to the real-time product position information.
  • the product position information is the position information of the frontmost product in each display column, so each item of product position information corresponds to one display column; that is, the number of product position information items equals the number of display columns.
  • Step 208 Determine the number of display columns with unplaced products according to the number of real-time display columns and the standard number of display columns of the shelf layer.
  • the number of unplaced display columns is the difference between the standard number of display columns and the real-time number of display columns.
  • the number of unplaced merchandise columns can also be sent to a designated system for display to prompt staff to replenish.
  • the above-mentioned standard number of display columns is obtained by the following steps:
  • step B1 a standard state image is obtained, and the standard state image is input into a target detection model trained in advance to obtain standard target position information, where the standard target position information includes standard product position information.
  • the standard state image may be an image in which at least one product is placed in every display column, including an image of a shelf layer fully stocked with products.
  • step B2 the number of standard exhibits is determined according to the standard commodity position information.
  • the embodiments of the present disclosure determine the standard number of display columns using an image in which every column contains products.
  • For the relationship between the number of display columns and the product position information, refer to the detailed description of step 207; details are not repeated here. The difference is that the standard number of display columns is determined using an image with products in every column, while the image in step 207 may have products in every column or may have at least one column without products.
  • an embodiment of the present disclosure provides a method for identifying a product status.
  • the method includes: acquiring a real-time status image and the calibration information corresponding to it; inputting the real-time status image into a target detection model obtained in advance to obtain real-time target position information, which includes real-time product position information; and determining product status information according to the calibration information and the real-time product position information. The degree of stock shortage can be indicated according to the product status information, which helps improve the accuracy of replenishment prompts.
  • FIG. 3 shows a structural diagram of a product status recognition device, as follows.
  • the image acquisition module 301 is configured to acquire a real-time status image.
  • the commodity position acquisition module 302 is configured to input the real-time status image into a target detection model obtained in advance to obtain real-time target position information, where the real-time target position information includes real-time product position information.
  • the commodity status determination module 303 is configured to determine commodity status information according to the calibration information and the real-time commodity location information, where the calibration information corresponds to the real-time status image.
  • an embodiment of the present disclosure provides a product status recognition device including: an image acquisition module configured to acquire a real-time status image and the calibration information corresponding to it; a product location acquisition module configured to input the real-time status image into a target detection model obtained in advance to obtain real-time target position information, which includes real-time product position information; and a product status determination module configured to determine product status information according to the calibration information and the real-time product position information. The degree of stock shortage can be indicated according to the product status information, which helps improve the accuracy of replenishment prompts.
  • FIG. 4 shows a structural diagram of another product status recognition device, as follows.
  • the target detection model training module 401 is configured to perform deep learning through a pre-collected product state image set to obtain a target detection model.
  • Each product state image in the product state image set is labeled with product position information and/or label position information.
  • a calibration line detection module 402 is configured to detect a calibration line from a calibration image to obtain position information of the calibration line and a corresponding reference ratio; the calibration image is obtained by photographing a shelf layer provided with the calibration line, and the calibration image corresponds to the same shooting area as the real-time status image.
  • the image acquisition module 403 is configured to acquire a real-time status image.
  • the geometric correction module 404 is configured to geometrically correct the real-time state image, so that the shelf layer in the real-time state image is at a horizontal position.
  • the commodity location acquisition module 405 is configured to input the real-time status image into a target detection model obtained in advance to obtain real-time target location information, where the real-time target location information includes real-time commodity location information.
  • the commodity state determination module 406 is configured to determine commodity state information according to the calibration information and the real-time commodity location information, where the calibration information corresponds to the real-time state image.
  • the above-mentioned commodity status determination module 406 includes:
  • the out-of-stock ratio determination submodule 4061 is configured to determine the out-of-stock ratio according to the position information of the calibration line, a corresponding reference ratio, and real-time product position information.
  • the real-time display column determination module 407 is configured to determine, for each shelf layer in the real-time status image, the number of real-time display columns of the shelf layer according to the real-time product position information when the shelf layer is a target monitoring shelf layer.
  • the number of empty display columns determining module 408 is configured to determine the number of display columns of unplaced products according to the number of real-time display columns and the standard display number of the shelf layer.
  • the out-of-stock ratio determination submodule 4061 includes:
  • a shelf layer determining unit is configured to determine position information of each shelf layer according to the real-time label position information.
  • the reference vertical coordinate extraction unit is configured to, for each display column of each shelf layer in the real-time status image, extract the reference ordinate from the real-time product position information of the display column when the shelf layer is a target monitoring shelf layer.
  • a reference calibration line extraction unit is configured to determine a reference calibration line according to the reference ordinate and position information of the calibration line.
  • the out-of-stock ratio determining unit is configured to determine the out-of-stock ratio corresponding to the display according to the reference ratio corresponding to the reference calibration line.
  • the target monitoring shelf layer is obtained through the following modules:
  • a target position acquisition module is configured to input a calibration state image into the target detection model to obtain calibration target position information, where the calibration target position information includes calibration label position information and calibration product position information.
  • the shelf layer height determination module is used to determine the height of each shelf layer according to the position information of the calibration labels adjacent to each other.
  • the calibration display determination module is configured to determine the number of calibration display of each shelf layer according to the calibration product position information and calibration label position information.
  • the monitoring shelf layer determination module is configured to determine, for each shelf layer, that the shelf layer is a target monitoring shelf layer when the height of the shelf layer is greater than a shelf height threshold, or when the number of calibration display columns of the shelf layer is greater than a display-column number threshold.
  • the above-mentioned standard number of display columns is obtained through the following modules:
  • the standard commodity position acquisition module is configured to acquire a standard state image and input the standard state image into a target detection model trained in advance to obtain standard target position information, where the standard target position information includes standard product position information.
  • the standard display number determining module is configured to determine the standard display number according to the standard commodity position information.
  • the calibration line is deleted after the calibration image is captured, or is present in the real-time status image.
  • an embodiment of the present disclosure provides a product status recognition device including: an image acquisition module configured to acquire a real-time status image and the calibration information corresponding to it; a product location acquisition module configured to input the real-time status image into a target detection model obtained in advance to obtain real-time target position information, which includes real-time product position information; and a product status determination module configured to determine product status information according to the calibration information and the real-time product position information. The degree of stock shortage can be indicated according to the product status information, which helps improve the accuracy of replenishment prompts.
  • An embodiment of the present disclosure further provides an electronic device including a processor, a memory, and a computer program stored on the memory and executable on the processor.
  • when the processor executes the computer program, the product status recognition method of the foregoing embodiments is implemented.
  • An embodiment of the present disclosure also provides a computer program including computer-readable code that, when the computer-readable code runs on a computing processing device, causes the computing processing device to execute the method for identifying a product status of the foregoing embodiment.
  • An embodiment of the present disclosure also provides a computer-readable storage medium in which the computer program of the foregoing embodiment is stored.
  • since the apparatus embodiments are basically similar to the method embodiments, their description is relatively simple.
  • for related parts, reference may be made to the description of the method embodiments.
  • the apparatus embodiments described above are merely illustrative; the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed across multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solutions of the embodiments. Those of ordinary skill in the art can understand and implement them without creative effort.
  • the various component embodiments of the present disclosure may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof.
  • a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all functions of some or all components in a computing processing device according to an embodiment of the present disclosure.
  • the present disclosure may also be implemented as a device or device program (e.g., a computer program and a computer program product) for performing part or all of the methods described herein.
  • Such a program that implements the present disclosure may be stored on a computer-readable storage medium or may have the form of one or more signals. Such signals can be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
  • FIG. 5 illustrates a computing processing device that can implement a method according to the present disclosure.
  • the computing processing device typically includes a processor 510 and a computer program product or computer-readable storage medium in the form of a memory 520.
  • the memory 520 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read Only Memory), an EPROM, a hard disk, or a ROM.
  • the memory 520 has a storage space 530 of program code 531 for performing any of the method steps in the above method.
  • the storage space 530 for program code may include respective program codes 531 respectively for implementing various steps in the above method. These program codes can be read from or written into one or more computer program products.
  • Such computer program products include program code carriers such as hard disks, compact disks (CDs), memory cards or floppy disks.
  • Such a computer program product is typically a portable or fixed storage unit as described with reference to FIG. 6.
  • the storage unit may have a storage section, a storage space, and the like arranged similar to the memory 520 in the computing processing device of FIG. 5.
  • the program code may be compressed, for example, in a suitable form.
  • the storage unit includes computer-readable code 531', that is, code that can be read by a processor such as the processor 510; when run by a computing processing device, these codes cause the computing processing device to execute the steps of the method described above.
  • reference herein to "one embodiment", "an embodiment", or "one or more embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Note that instances of the phrase "in one embodiment" herein do not necessarily all refer to the same embodiment.
  • any reference signs placed between parentheses shall not be construed as limiting the claim.
  • the word “comprising” does not exclude the presence of elements or steps not listed in a claim.
  • the word “a” or “an” preceding an element does not exclude the presence of a plurality of such elements.
  • the disclosure may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claim listing several devices, several of these devices may be embodied by the same hardware item.
  • the use of the words first, second, and third does not imply any order. These words can be interpreted as names.


Abstract

Embodiments of the present disclosure provide a product status recognition method, relating to the field of network technologies. The product status recognition method of the embodiments of the present disclosure includes: acquiring a real-time status image, and acquiring calibration information corresponding to the real-time status image; inputting the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information; and determining product status information according to the calibration information and the real-time product position information, where the calibration information corresponds to the real-time status image.

Description

Product status recognition
This application claims priority to Chinese Patent Application No. 201811034093.4, titled "Product status recognition method, apparatus, electronic device, and readable storage medium", filed with the Chinese Patent Office on September 5, 2018, the entire contents of which are incorporated herein by reference.
Technical Field
Embodiments of the present disclosure relate to the field of network technologies, and in particular, to a product status recognition method, an apparatus, an electronic device, and a readable storage medium.
Background
As image recognition technology matures, it is being applied ever more widely; for example, it can be applied to product monitoring in supermarkets to judge whether a product is out of stock and to prompt timely replenishment.
In the related art, patent application US 2018/0060803 A1 proposes a system and method for managing retail products. The main steps include: first, acquiring two images of a monitored shelf; then, estimating the consumed quantity of products by comparing the brightness, contrast, and illuminance between the two images; and finally, estimating the consumption rate of the products according to timestamps and sending the consumption rate of shelf products to a central device, where staff judge whether replenishment is needed according to the consumption rate.
However, estimating the consumed quantity by comparing the brightness, contrast, and illuminance between two images has low accuracy, which makes the judgment of whether replenishment is needed inaccurate.
Summary
Embodiments of the present disclosure provide a product status recognition method, an apparatus, an electronic device, and a readable storage medium, to solve the above problems of product status recognition in the related art.
In a first aspect, an embodiment of the present disclosure provides a product status recognition method, including:
acquiring a real-time status image, and acquiring calibration information corresponding to the real-time status image;
inputting the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information; and
determining product status information according to the calibration information and the real-time product position information, where the calibration information corresponds to the real-time status image.
In a second aspect, an embodiment of the present disclosure provides a product status recognition apparatus, including:
an image acquisition module, configured to acquire a real-time status image and acquire calibration information corresponding to the real-time status image;
a product position acquisition module, configured to input the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information; and
a product status determination module, configured to determine product status information according to the calibration information and the real-time product position information, where the calibration information corresponds to the real-time status image.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including:
a processor, a memory, and a computer program stored on the memory and executable on the processor, where the processor implements the foregoing product status recognition method when executing the computer program.
In a fourth aspect, an embodiment of the present disclosure provides a computer program, including computer-readable code that, when run on a computing processing device, causes the computing processing device to execute the foregoing product status recognition method.
In a fifth aspect, an embodiment of the present disclosure provides a computer-readable storage medium in which the foregoing computer program is stored.
Embodiments of the present disclosure provide a product status recognition method that acquires a real-time status image and calibration information corresponding to the real-time status image; inputs the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information; and determines product status information according to the calibration information and the real-time product position information, where the calibration information corresponds to the real-time status image. Embodiments of the present disclosure can indicate the degree of being out of stock according to the product status information, which helps improve the accuracy of replenishment prompts.
The above description is merely an overview of the technical solutions of the present disclosure. To understand the technical means of the present disclosure more clearly so that they can be implemented according to the contents of this specification, and to make the above and other objects, features, and advantages of the present disclosure more comprehensible, specific embodiments of the present disclosure are set forth below.
Brief Description of the Drawings
To describe the technical solutions of the embodiments of the present disclosure more clearly, the accompanying drawings required in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present disclosure; those of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a flowchart of specific steps of a product status recognition method provided by an embodiment of the present disclosure;
FIG. 2 is a flowchart of specific steps of another product status recognition method provided by an embodiment of the present disclosure;
FIG. 3 is a structural diagram of a product status recognition apparatus provided by an embodiment of the present disclosure;
FIG. 4 is a structural diagram of another product status recognition apparatus provided by an embodiment of the present disclosure;
FIG. 5 schematically shows a block diagram of an electronic device for executing the method according to the present disclosure; and
FIG. 6 schematically shows a storage unit for holding or carrying program code that implements the method according to the present disclosure.
Detailed Description
To make the objects, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions in the embodiments of the present disclosure are described clearly and completely below with reference to the accompanying drawings of the embodiments of the present disclosure. Obviously, the described embodiments are some, rather than all, of the embodiments of the present disclosure. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
Embodiment 1
Referring to FIG. 1, which shows a flowchart of steps of a product status recognition method provided by an embodiment of the present disclosure, the method includes:
Step 101: acquire a real-time status image.
Here, the real-time status image is a real-time image captured of a product display area; a camera continuously captures the product display area at a certain time period. It can be understood that the time period may be set according to the actual application scenario and may be related to the product sales speed: when products sell quickly, the time period is small; when products sell slowly, the time period is large.
In addition, after capturing real-time status images, the cameras uniformly send them to a server for storage.
Step 102: input the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information.
Here, the input of the target detection model is the real-time status image, and the output is the position information corresponding to each target in the image.
The real-time product position information is the position information of the frontmost product in a display column: there is no product in front of this product, and there are products behind it. When purchasing, consumers usually take the frontmost product, so the position of the frontmost product can reflect the degree of being out of stock.
In the embodiments of the present disclosure, targets include products and labels. A label is used to show information such as the unique identifier, name, and price of a product; it may be an ink label (e.g., an electronic-ink price tag) affixed to a shelf layer.
It can be understood that labels are usually rectangular, although they may also have other specified shapes, so that labels can be recognized on a shelf layer by shape.
In the embodiments of the present disclosure, both products and labels are marked with rectangular boxes. The position information may be the coordinates of the bottom-left and top-right corners of the rectangular box, or the coordinates of its top-left and bottom-right corners. For example, if the coordinates of the top-left corner of the image are (0, 0), then one piece of position information may be expressed as bottom-left (12, 100) and top-right (80, 20), or as top-left (12, 20) and bottom-right (80, 100).
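The two corner conventions above describe the same rectangle. As a minimal sketch (the function name and tuple layout are illustrative, not from the patent), converting between them only swaps the vertical coordinates, assuming image coordinates with the origin (0, 0) at the top-left corner so that y grows downward:

```python
def to_tlbr(bottom_left, top_right):
    """Convert a (bottom-left, top-right) box to (top-left, bottom-right).

    Assumes the image origin (0, 0) is at the top-left corner, so the
    y-axis points downward, as in the example in the text.
    """
    (x1, y1), (x2, y2) = bottom_left, top_right
    # The x-coordinates stay with their corners; only the y-values swap.
    return (x1, y2), (x2, y1)

# The example from the text: bottom-left (12, 100), top-right (80, 20)
# becomes top-left (12, 20), bottom-right (80, 100).
print(to_tlbr((12, 100), (80, 20)))
```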
It can be understood that when at least one product and/or at least one label is detected in the real-time status image, the result obtained is a set of position information.
Step 103: determine product status information according to the calibration information and the real-time product position information, where the calibration information corresponds to the real-time status image.
The calibration information serves as a reference when determining the product status information. For example, straight lines marking different degrees of being out of stock are calibrated in the image; the calibration information includes the positions of the lines, the degrees of being out of stock, and so on.
In practical applications, the corresponding calibration information needs to be determined for each camera that captures a product display area, and the calibration information is bound to the camera and saved, so that the corresponding calibration information can be obtained for each camera.
Specifically, for each display column, the product position information may be compared with each piece of calibration information to obtain the calibration lines on either side of the product position; then the product status information corresponding to the product position can be determined.
It can be understood that the product status information corresponds to the calibration information. For example, if the calibration lines on either side of the product position correspond to 20% and 30% out of stock respectively, the range of the product status information is 20% to 30% out of stock.
In practical applications, replenishment may be prompted when the out-of-stock ratio reaches a certain threshold. Specifically, a replenishment prompt message may be sent to a central server.
In summary, an embodiment of the present disclosure provides a product status recognition method, including: acquiring a real-time status image, and acquiring calibration information corresponding to the real-time status image; inputting the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information; and determining product status information according to the calibration information and the real-time product position information, where the calibration information corresponds to the real-time status image. The degree of being out of stock can be indicated according to the product status information, which helps improve the accuracy of replenishment prompts.
Embodiment 2
This embodiment describes an optional product status recognition method.
Referring to FIG. 2, which shows a flowchart of specific steps of another product status recognition method provided by an embodiment of the present disclosure:
Step 201: perform deep learning with a pre-collected product status image set to obtain a target detection model, where each product status image in the product status image set is annotated with product position information and/or label position information.
Here, the product status image set may consist of images captured of a real product sales area, in which each complete product and each label is marked with a rectangular box, and the position information of the rectangular box is expressed by the coordinates of its bottom-left and top-right corners, or of its top-left and bottom-right corners.
In practical applications, the annotation information further includes the category of the rectangular box, indicating whether the box corresponds to a product or a label.
In the embodiments of the present disclosure, the annotated product is the frontmost product in its column, so that the product position can reflect the out-of-stock state. If the coordinates of the top-left corner of the image are (0, 0), then the smaller the vertical coordinate of a product position in a display column, the more out of stock that display column is.
The embodiments of the present disclosure may use Faster R-CNN (Faster Region-based Convolutional Neural Network) for training. It can be understood that other deep learning models capable of detecting annotated targets in images, such as R-CNN (Region-based Convolutional Neural Network) and Fast R-CNN, may also be used.
Step 202: detect a calibration line from a calibration image to obtain the position information of the calibration line and the corresponding reference ratio, where the calibration image is obtained by photographing a shelf layer provided with calibration lines, and the calibration image and the real-time status image correspond to the same shooting area.
Here, the calibration image contains a calibration region, and calibration lines exist within the calibration region; a calibration line may be a straight line indicating an out-of-stock ratio. It can be understood that the calibration image may contain only the calibration region and calibration lines without products, or products may be placed outside the calibration region; however, to ensure detection of the calibration region and calibration lines, no products may be placed on the calibration region.
In practical applications, the calibration region may be outlined with lines of a special style and color, and a calibration line may be represented by a straight line of a characteristic style and color.
Specifically, the calibration region may first be determined from the calibration image; then straight lines of the special style and color are detected within that region to obtain the calibration lines. The calibration region may be the region with calibration lines on the left side of each shelf layer: horizontally, the region between the position of the first ink label on the left of the shelf layer and the position of the first product to the right of that ink label on the layer; vertically, the region between the position of the shelf layer and the next shelf layer.
The embodiments of the present disclosure can detect calibration lines within the calibration region, which avoids detecting lines similar to the calibration lines and thus improves the detection accuracy of the calibration lines.
Optionally, in another embodiment of the present disclosure, the calibration line is deleted after the calibration image is captured, or remains present in the real-time status image.
It can be understood that the calibration lines may be wiped off the shelf after the calibration image has been captured, which keeps the shelf layer clean; of course, they may also be left in place.
Step 203: acquire a real-time status image.
For this step, reference may be made to the detailed description of step 101, which is not repeated here.
Step 204: perform geometric correction on the real-time status image so that the shelf layers in the real-time status image are in a horizontal position.
In practical applications, the camera may be tilted when shooting, so that the captured real-time status image is not horizontal. To represent the positions of shelf layers, labels, and products accurately, the real-time status image needs to be geometrically corrected.
Geometric correction is a relatively mature technique, and the embodiments of the present disclosure do not elaborate on it.
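The patent does not specify how the correction is computed, so the following is only a minimal stand-in: it assumes a pure rotation is enough to level the shelf, rotating coordinates so that a detected shelf edge through two points becomes horizontal. The function name and interface are illustrative.

```python
import math

def level_points(points, edge_p0, edge_p1):
    """Rotate coordinates so the shelf edge through edge_p0/edge_p1 is horizontal.

    A minimal sketch of the geometric-correction step, assuming the tilt is a
    pure in-plane rotation about edge_p0 (a full correction would use a
    homography estimated from more reference points).
    """
    (x0, y0), (x1, y1) = edge_p0, edge_p1
    angle = -math.atan2(y1 - y0, x1 - x0)   # rotate the tilt away
    c, s = math.cos(angle), math.sin(angle)
    out = []
    for x, y in points:
        dx, dy = x - x0, y - y0
        # Standard 2-D rotation about edge_p0.
        out.append((x0 + c * dx - s * dy, y0 + s * dx + c * dy))
    return out
```

For example, a point on a 45-degree shelf edge is mapped onto the horizontal axis while preserving its distance from the rotation center.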
Step 205: input the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information.
For this step, reference may be made to the detailed description of step 102, which is not repeated here.
Step 206: determine the out-of-stock ratio according to the position information of the calibration lines, the corresponding reference ratios, and the real-time product position information.
Specifically, first the calibration lines above and below the product are determined according to the product position information; then the reference ratios corresponding to those calibration lines are obtained; finally the out-of-stock ratio is determined from the two reference ratios, for example, the out-of-stock ratio lies between the two reference ratios.
Optionally, in another embodiment of the present disclosure, step 206 includes sub-steps 2061 to 2064:
Sub-step 2061: determine the position information of each shelf layer according to the real-time label position information.
It can be understood that labels are located on shelf layers. For one shelf layer, at most a few labels can usually be affixed across its width, so the position and number of shelf layers can be determined according to the differences between the vertical coordinates of different labels.
Specifically, first, the vertical coordinates of all labels are sorted in ascending order; then, the difference between the vertical coordinates of each pair of adjacent labels is computed one by one. If the difference between the vertical coordinates of two adjacent labels is less than or equal to a preset multiple of the average label height, the two labels are on the same shelf layer; if the difference is greater than the preset multiple of the average label height, the two labels are not on the same shelf layer. Finally, the position information of each shelf layer is determined from the position information of all labels on that layer.
It can be understood that the preset multiple may be determined according to the width of the shelf layer and the average label height; for example, if the width of the shelf layer is twice the average label height, the preset multiple is 2.
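The sort-and-threshold procedure of sub-step 2061 can be sketched as follows. This is only an illustration of the stated steps; the function name and the data layout (a flat list of label y-coordinates) are assumptions.

```python
def group_labels_into_layers(label_ys, avg_label_height, factor=2.0):
    """Cluster label y-coordinates into shelf layers (sub-step 2061).

    Sorts the label y-coordinates in ascending order, then starts a new
    layer whenever the gap to the previous label exceeds a preset multiple
    (`factor`) of the average label height.
    """
    ys = sorted(label_ys)
    layers, current = [], [ys[0]]
    for prev, y in zip(ys, ys[1:]):
        if y - prev > factor * avg_label_height:
            # Gap larger than the threshold: this label begins a new layer.
            layers.append(current)
            current = []
        current.append(y)
    layers.append(current)
    return layers
```

For instance, with an average label height of 10 and the default multiple of 2, labels at y = 98, 100, 102 form one layer and labels at y = 203, 205 form another.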
Sub-step 2062: for each display column of each shelf layer in the real-time status image, when the shelf layer is a target monitoring shelf layer, extract a reference vertical coordinate from the real-time product position information of the display column.
In the embodiments of the present disclosure, each product corresponds to one display column.
Specifically, for a rectangular box, the reference vertical coordinate may be the vertical coordinate of the bottom-left corner or the top-right corner, or the vertical coordinate of the top-left corner or the bottom-right corner.
In practical applications, an image may contain partially captured shelf layers or other irregular shelf layers; in the embodiments of the present disclosure, such shelf layers are not monitored, which ensures detection accuracy.
Optionally, in another embodiment of the present disclosure, the target monitoring shelf layer is obtained through the following steps:
Step A1: input a calibration status image into the target detection model to obtain calibration target position information, where the calibration target position information includes calibration label position information and calibration product position information.
For this step, reference may be made to the detailed description of step 102, which is not repeated here.
It can be understood that the calibration status image needs to be geometrically corrected first.
It can be understood that this step differs from step 102 in that the input image is different.
Step A2: determine the height of each shelf layer according to the position information of vertically adjacent calibration labels.
Specifically, the position information of each shelf layer is first determined as described in sub-step 2061; then the difference between the position information of two adjacent shelf layers is computed to obtain the height of the shelf layer.
Step A3: determine the number of calibrated display columns of each shelf layer according to the calibration product position information and the calibration label position information.
Specifically, products are assigned to shelf layers according to the vertical coordinates of the product position information. For example, if the vertical coordinate of a product's position information is greater than the vertical coordinate of shelf layer A and less than that of shelf layer B, the product is on shelf layer A. Counting the number of pieces of product position information on each shelf layer then gives the number of display columns of that layer.
Step A4: for each shelf layer, when the height of the shelf layer is greater than a shelf height threshold, or the number of calibrated display columns of the shelf layer is greater than a display column number threshold, determine the shelf layer to be a target monitoring shelf layer.
Here, the shelf height threshold may be determined according to the average height of the shelf layers, and the display column number threshold may be determined according to the average number of display columns of the shelf layers.
The embodiments of the present disclosure do not monitor shelf layers that are too low or too small.
Sub-step 2063: determine reference calibration lines according to the reference vertical coordinate and the position information of the calibration lines.
In the embodiments of the present disclosure, the position information of a calibration line mainly refers to its vertical coordinate.
Specifically, the two calibration lines above and below that are nearest to the reference vertical coordinate are selected from the calibration lines. For example, if the vertical coordinates of calibration lines A, B, C, and D are 20, 35, 50, and 75 respectively, and the reference vertical coordinate is 40, then the reference calibration lines are B and C.
Of course, in practical applications, there may also be a single reference calibration line; for example, if the reference vertical coordinate is 50, the reference calibration line is C.
Sub-step 2064: determine the out-of-stock ratio corresponding to the display column according to the reference ratios corresponding to the reference calibration lines.
In the embodiments of the present disclosure, each calibration line corresponds to one out-of-stock ratio. For example, for the example in sub-step 2063, if the reference ratio of calibration line B is 65% and that of C is 50%, then with a reference vertical coordinate of 40 the out-of-stock ratio is 50% to 65%, and with a reference vertical coordinate of 50 the out-of-stock ratio is 50%.
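Sub-steps 2063 and 2064 amount to a lookup among (y-coordinate, reference ratio) pairs. The following sketch uses the example values from the text for lines B and C; the ratios assigned to lines A and D, the function name, and the data layout are illustrative assumptions.

```python
def out_of_stock_range(calib_lines, ref_y):
    """Look up the out-of-stock ratio for one display column.

    calib_lines: (y, ratio) pairs for the calibration lines of one shelf
    layer, sorted by y. Returns the exact ratio when ref_y lies on a line
    (sub-step 2063's single-line case), otherwise the (low, high) ratio
    range of the two enclosing lines (sub-step 2064).
    """
    for y, ratio in calib_lines:
        if y == ref_y:
            return ratio
    below = [(y, r) for y, r in calib_lines if y < ref_y]
    above = [(y, r) for y, r in calib_lines if y > ref_y]
    if not below or not above:
        return None  # ref_y lies outside the calibrated region
    (_, r_lo), (_, r_hi) = max(below), min(above)
    return tuple(sorted((r_lo, r_hi)))

# Lines A-D at y = 20, 35, 50, 75; B = 65% and C = 50% are from the text,
# the ratios for A and D are made up for illustration.
lines = [(20, 0.80), (35, 0.65), (50, 0.50), (75, 0.30)]
```

With these values, a reference y-coordinate of 40 yields the range 50% to 65%, and a reference y-coordinate of 50 yields exactly 50%, matching the worked example above.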
Step 207: for each shelf layer in the real-time status image, when the shelf layer is a target monitoring shelf layer, determine the real-time number of display columns of the shelf layer according to the real-time product position information.
In the embodiments of the present disclosure, each piece of product position information is the position information of the frontmost product in a display column, so each piece of product position information corresponds to one display column; that is, the number of pieces of product position information equals the number of display columns.
Step 208: determine the number of display columns on which no products are placed according to the real-time number of display columns and the standard number of display columns of the shelf layer.
Specifically, the number of display columns without products is the difference between the standard number of display columns and the real-time number of display columns.
It can be understood that the larger the number of display columns without products, the more serious the out-of-stock situation; the smaller that number, the less serious it is.
In practical applications, the number of display columns without products may also be sent to a designated system for display, to prompt staff to replenish.
Optionally, in another embodiment of the present disclosure, the standard number of display columns is obtained through the following steps:
Step B1: acquire a standard status image, and input the standard status image into a pre-trained target detection model to obtain standard target position information, where the standard target position information includes standard product position information.
Here, the standard status image may be an image in which every display column holds at least one product, including images in which the shelf layer is fully stocked.
It can be understood that the standard status image needs to be geometrically corrected first.
Step B2: determine the standard number of display columns according to the standard product position information.
The embodiments of the present disclosure determine the standard number of display columns from an image in which every column holds products.
For the relationship between the number of display columns and the product position information, reference may be made to the detailed description of step 207, which is not repeated here. The difference is that the standard number of display columns is determined from an image in which every column holds products, whereas in the image of step 207 every column may hold products, or at least one column may be empty.
In summary, an embodiment of the present disclosure provides a product status recognition method, including: acquiring a real-time status image, and acquiring calibration information corresponding to the real-time status image; inputting the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information; and determining product status information according to the calibration information and the real-time product position information, where the calibration information corresponds to the real-time status image. The degree of being out of stock can be indicated according to the product status information, which helps improve the accuracy of replenishment prompts.
Embodiment 3
Referring to FIG. 3, which shows a structural diagram of a product status recognition apparatus, specifically as follows:
an image acquisition module 301, configured to acquire a real-time status image;
a product position acquisition module 302, configured to input the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information; and
a product status determination module 303, configured to determine product status information according to the calibration information and the real-time product position information, where the calibration information corresponds to the real-time status image.
In summary, an embodiment of the present disclosure provides a product status recognition apparatus, including: an image acquisition module, configured to acquire a real-time status image and acquire calibration information corresponding to the real-time status image; a product position acquisition module, configured to input the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information; and a product status determination module, configured to determine product status information according to the calibration information and the real-time product position information, where the calibration information corresponds to the real-time status image. The degree of being out of stock can be indicated according to the product status information, which helps improve the accuracy of replenishment prompts.
Embodiment 4
Referring to FIG. 4, which shows a structural diagram of another product status recognition apparatus, specifically as follows:
a target detection model training module 401, configured to perform deep learning with a pre-collected product status image set to obtain a target detection model, where each product status image in the product status image set is annotated with product position information and/or label position information;
a calibration line detection module 402, configured to detect a calibration line from a calibration image to obtain the position information of the calibration line and the corresponding reference ratio, where the calibration image is obtained by photographing a shelf layer provided with calibration lines, and the calibration image and the real-time status image correspond to the same shooting area;
an image acquisition module 403, configured to acquire a real-time status image;
a geometric correction module 404, configured to perform geometric correction on the real-time status image so that the shelf layers in the real-time status image are in a horizontal position;
a product position acquisition module 405, configured to input the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information; and
a product status determination module 406, configured to determine product status information according to the calibration information and the real-time product position information, where the calibration information corresponds to the real-time status image. Optionally, in another embodiment of the present disclosure, the product status determination module 406 includes:
an out-of-stock ratio determination sub-module 4061, configured to determine the out-of-stock ratio according to the position information of the calibration lines, the corresponding reference ratios, and the real-time product position information;
a real-time display column determination module 407, configured to, for each shelf layer in the real-time status image, determine the real-time number of display columns of the shelf layer according to the real-time product position information when the shelf layer is a target monitoring shelf layer; and
an empty display column number determination module 408, configured to determine the number of display columns on which no products are placed according to the real-time number of display columns and the standard number of display columns of the shelf layer.
Optionally, in another embodiment of the present disclosure, the out-of-stock ratio determination sub-module 4061 includes:
a shelf layer determination unit, configured to determine the position information of each shelf layer according to the real-time label position information;
a reference vertical coordinate extraction unit, configured to, for each display column of each shelf layer in the real-time status image, extract a reference vertical coordinate from the real-time product position information of the display column when the shelf layer is a target monitoring shelf layer;
a reference calibration line extraction unit, configured to determine reference calibration lines according to the reference vertical coordinate and the position information of the calibration lines; and
an out-of-stock ratio determination unit, configured to determine the out-of-stock ratio corresponding to the display column according to the reference ratios corresponding to the reference calibration lines.
Optionally, in another embodiment of the present disclosure, the target monitoring shelf layer is obtained through the following modules:
a target position acquisition module, configured to input a calibration status image into the target detection model to obtain calibration target position information, where the calibration target position information includes calibration label position information and calibration product position information;
a shelf layer height determination module, configured to determine the height of each shelf layer according to the position information of vertically adjacent calibration labels;
a calibrated display column determination module, configured to determine the number of calibrated display columns of each shelf layer according to the calibration product position information and the calibration label position information; and
a monitoring shelf layer determination module, configured to, for each shelf layer, determine the shelf layer to be a target monitoring shelf layer when the height of the shelf layer is greater than a shelf height threshold, or the number of calibrated display columns of the shelf layer is greater than a display column number threshold.
Optionally, in another embodiment of the present disclosure, the standard number of display columns is obtained through the following modules:
a standard product position acquisition module, configured to acquire a standard status image and input the standard status image into a pre-trained target detection model to obtain standard target position information, where the standard target position information includes standard product position information; and
a standard display column number determination module, configured to determine the standard number of display columns according to the standard product position information.
Optionally, in another embodiment of the present disclosure, the calibration line is deleted after the calibration image is captured, or remains present in the real-time status image.
In summary, an embodiment of the present disclosure provides a product status recognition apparatus, including: an image acquisition module, configured to acquire a real-time status image and acquire calibration information corresponding to the real-time status image; a product position acquisition module, configured to input the real-time status image into a pre-trained target detection model to obtain real-time target position information, where the real-time target position information includes real-time product position information; and a product status determination module, configured to determine product status information according to the calibration information and the real-time product position information, where the calibration information corresponds to the real-time status image. The degree of being out of stock can be indicated according to the product status information, which helps improve the accuracy of replenishment prompts.
An embodiment of the present disclosure further provides an electronic device, including: a processor, a memory, and a computer program stored on the memory and executable on the processor, where the processor implements the product status recognition method of the foregoing embodiments when executing the computer program.
An embodiment of the present disclosure further provides a computer program, including computer-readable code that, when run on a computing processing device, causes the computing processing device to execute the product status recognition method of the foregoing embodiments.
An embodiment of the present disclosure further provides a computer-readable storage medium in which the computer program of the foregoing embodiment is stored.
Since the apparatus embodiments are basically similar to the method embodiments, their description is relatively simple; for related parts, reference may be made to the description of the method embodiments.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the objectives of the solutions of the embodiments. Those of ordinary skill in the art can understand and implement them without creative effort.
The various component embodiments of the present disclosure may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art should understand that a microprocessor or a digital signal processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in a computing processing device according to the embodiments of the present disclosure. The present disclosure may also be implemented as a device or apparatus program (for example, a computer program and a computer program product) for performing part or all of the methods described herein. Such a program implementing the present disclosure may be stored on a computer-readable storage medium, or may take the form of one or more signals. Such signals may be downloaded from an Internet website, provided on a carrier signal, or provided in any other form.
For example, FIG. 5 shows a computing processing device that can implement the method according to the present disclosure. The computing processing device typically includes a processor 510 and a computer program product or computer-readable storage medium in the form of a memory 520. The memory 520 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read-Only Memory), an EPROM, a hard disk, or a ROM. The memory 520 has a storage space 530 for program code 531 for performing any of the steps of the above methods. For example, the storage space 530 for program code may include individual pieces of program code 531 for implementing the various steps of the above methods respectively. These program codes may be read from or written into one or more computer program products. These computer program products include program code carriers such as hard disks, compact discs (CDs), memory cards, or floppy disks. Such a computer program product is typically a portable or fixed storage unit as described with reference to FIG. 6. The storage unit may have storage segments, storage space, and the like arranged similarly to the memory 520 in the computing processing device of FIG. 5. The program code may, for example, be compressed in an appropriate form. Typically, the storage unit includes computer-readable code 531', that is, code that can be read by a processor such as the processor 510; when run by a computing processing device, these codes cause the computing processing device to execute the steps of the methods described above.
Reference herein to "one embodiment", "an embodiment", or "one or more embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Note that instances of the phrase "in one embodiment" herein do not necessarily all refer to the same embodiment.
Numerous specific details are set forth in the specification provided here. However, it is understood that the embodiments of the present disclosure may be practiced without these specific details. In some instances, well-known methods, structures, and techniques have not been shown in detail so as not to obscure the understanding of this specification.
In the claims, any reference signs placed between parentheses shall not be construed as limiting the claims. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The present disclosure may be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a unit claim enumerating several devices, several of these devices may be embodied by one and the same item of hardware. The use of the words first, second, third, and so on does not indicate any order; these words may be interpreted as names.
Finally, it should be noted that the above embodiments are merely intended to illustrate, rather than to limit, the technical solutions of the present disclosure. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent replacements of some technical features therein, and such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure.

Claims (14)

  1. A product out-of-stock recognition method, comprising:
    acquiring a real-time status image;
    inputting the real-time status image into a pre-trained target detection model to obtain real-time target position information, wherein the real-time target position information comprises real-time product position information; and
    determining product status information according to calibration information and the real-time product position information, wherein the calibration information corresponds to the real-time status image.
  2. The method according to claim 1, further comprising:
    detecting a calibration line from a calibration image to obtain position information of the calibration line and a corresponding reference ratio, wherein the calibration image is obtained by photographing a shelf layer provided with calibration lines, and the calibration image and the real-time status image correspond to the same shooting area.
  3. The method according to claim 2, wherein the product status information comprises an out-of-stock ratio, and the determining product status information according to the calibration information and the real-time product position information comprises:
    determining the out-of-stock ratio according to the position information of the calibration line, the corresponding reference ratio, and the real-time product position information.
  4. The method according to claim 3, wherein the real-time target position information comprises real-time label position information, and the determining the out-of-stock ratio according to the position information of the calibration line, the corresponding reference ratio, and the real-time product position information comprises:
    determining position information of each shelf layer according to the real-time label position information;
    for each display column of each shelf layer in the real-time status image, when the shelf layer is a target monitoring shelf layer, extracting a reference vertical coordinate from the real-time product position information of the display column;
    determining reference calibration lines according to the reference vertical coordinate and the position information of the calibration lines; and
    determining the out-of-stock ratio corresponding to the display column according to reference ratios corresponding to the reference calibration lines.
  5. The method according to claim 1, before the inputting the real-time status image into a pre-trained target detection model, further comprising:
    performing geometric correction on the real-time status image so that shelf layers in the real-time status image are in a horizontal position.
  6. The method according to claim 4, wherein the target monitoring shelf layer is obtained through the following steps:
    inputting a calibration status image into the target detection model to obtain calibration target position information, wherein the calibration target position information comprises calibration label position information and calibration product position information;
    determining a height of each shelf layer according to position information of vertically adjacent calibration labels;
    determining a number of calibrated display columns of each shelf layer according to the calibration product position information and the calibration label position information; and
    for each shelf layer, when the height of the shelf layer is greater than a shelf height threshold, or the number of calibrated display columns of the shelf layer is greater than a display column number threshold, determining the shelf layer to be a target monitoring shelf layer.
  7. The method according to claim 4, further comprising:
    for each shelf layer in the real-time status image, when the shelf layer is a target monitoring shelf layer, determining a real-time number of display columns of the shelf layer according to the real-time product position information; and
    determining a number of display columns on which no products are placed according to the real-time number of display columns and a standard number of display columns of the shelf layer.
  8. The method according to claim 7, wherein the standard number of display columns is obtained through the following steps:
    acquiring a standard status image, and inputting the standard status image into a pre-trained target detection model to obtain standard target position information, wherein the standard target position information comprises standard product position information; and
    determining the standard number of display columns according to the standard product position information.
  9. The method according to claim 1, wherein the target detection model is trained through the following step:
    performing deep learning with a pre-collected product status image set to obtain the target detection model, wherein each product status image in the product status image set is annotated with product position information and/or label position information.
  10. The method according to claim 2, wherein the calibration line is deleted after the calibration image is captured, or is present in the real-time status image.
  11. A product status recognition apparatus, comprising:
    an image acquisition module, configured to acquire a real-time status image and acquire calibration information corresponding to the real-time status image;
    a product position acquisition module, configured to input the real-time status image into a pre-trained target detection model to obtain real-time target position information, wherein the real-time target position information comprises real-time product position information; and
    a product status determination module, configured to determine product status information according to the calibration information and the real-time product position information, wherein the calibration information corresponds to the real-time status image.
  12. An electronic device, comprising:
    a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the processor implements the product status recognition method according to one or more of claims 1 to 8 when executing the program.
  13. A computer program, comprising computer-readable code that, when run on a computing processing device, causes the computing processing device to execute the product status recognition method according to any one of claims 1 to 10.
  14. A computer-readable storage medium, in which the computer program according to claim 13 is stored.
PCT/CN2019/104425 2018-09-05 2019-09-04 Product status recognition WO2020048492A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811034093.4 2018-09-05
CN201811034093.4A CN109446883B (zh) 2018-09-05 2018-09-05 Product status recognition method and apparatus, electronic device, and readable storage medium

Publications (1)

Publication Number Publication Date
WO2020048492A1 true WO2020048492A1 (zh) 2020-03-12

Family

ID=65530337

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/104425 WO2020048492A1 (zh) 2018-09-05 2019-09-04 Product status recognition

Country Status (2)

Country Link
CN (1) CN109446883B (zh)
WO (1) WO2020048492A1 (zh)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111291834A (zh) * 2020-03-27 2020-06-16 华士磐典科技(上海)有限公司 一种快速生成货架数字陈列图的方法
CN111612000A (zh) * 2020-05-26 2020-09-01 创新奇智(西安)科技有限公司 一种商品分类方法、装置、电子设备及存储介质
CN111666927A (zh) * 2020-07-08 2020-09-15 广州织点智能科技有限公司 商品识别方法、装置、智能货柜和可读存储介质
CN111898417A (zh) * 2020-06-17 2020-11-06 厦门华联电子股份有限公司 货柜系统、货品检测装置及方法
CN112883955A (zh) * 2021-03-10 2021-06-01 洛伦兹(北京)科技有限公司 货架布局检测方法、装置及计算机可读存储介质
CN115359117A (zh) * 2022-08-30 2022-11-18 创新奇智(广州)科技有限公司 商品陈列位置确定方法、装置及可读存储介质

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109446883B (zh) * 2018-09-05 2022-08-23 北京三快在线科技有限公司 商品状态识别方法、装置、电子设备及可读存储介质
CN109998296A (zh) * 2019-04-03 2019-07-12 中南大学 实验室智能物料柜
CN110046566A (zh) * 2019-04-10 2019-07-23 中南大学 一种基于图形识别及物联网技术的实验室智能物料柜
CN110189319B (zh) * 2019-05-31 2021-08-13 北京百度网讯科技有限公司 置物架分割方法、装置、设备和存储介质
CN110287928B (zh) * 2019-07-01 2021-09-14 创优数字科技(广东)有限公司 商品缺货检测方法及装置
CN110472515B (zh) * 2019-07-23 2021-04-13 创新先进技术有限公司 货架商品检测方法及系统
US11069073B2 (en) 2019-07-23 2021-07-20 Advanced New Technologies Co., Ltd. On-shelf commodity detection method and system
CN110533028B (zh) * 2019-07-31 2021-08-13 北京三快在线科技有限公司 商品陈列状态的检测方法、装置、电子设备及存储介质
CN110516628A (zh) * 2019-08-29 2019-11-29 上海扩博智能技术有限公司 货架空缺位置商品信息获取方法、系统、设备及存储介质
CN112580411A (zh) * 2019-09-30 2021-03-30 深圳云天励飞技术有限公司 一种货架缺货告警方法、装置、货架、系统及电子设备
CN111753614A (zh) * 2019-11-01 2020-10-09 北京京东尚科信息技术有限公司 一种商品货架的监控方法和装置
CN110852240A (zh) * 2019-11-06 2020-02-28 创新奇智(成都)科技有限公司 零售商品检测系统及检测方法
CN110991273B (zh) * 2019-11-18 2021-03-19 支付宝(杭州)信息技术有限公司 层板调控设备、方法及装置
CN110910567A (zh) * 2019-11-29 2020-03-24 合肥美的智能科技有限公司 扣付方法、装置、电子设备、计算机可读存储介质和货柜
CN111553889A (zh) * 2020-04-16 2020-08-18 上海扩博智能技术有限公司 货架上商品摆放位置的对比方法、系统、设备和存储介质
CN111931674B (zh) * 2020-08-18 2024-04-02 创新奇智(成都)科技有限公司 物品识别管理方法、装置、服务器及可读存储介质
CN111738245B (zh) * 2020-08-27 2020-11-20 创新奇智(北京)科技有限公司 商品识别管理方法、装置、服务器及可读存储介质
CN112488608A (zh) * 2020-11-18 2021-03-12 北京三快在线科技有限公司 补货管理方法、装置、存储介质及终端
CN112699778A (zh) * 2020-12-29 2021-04-23 上海零眸智能科技有限公司 一种基于深度学习的冰柜库存情况监督识别方法
CN114022999A (zh) * 2021-10-27 2022-02-08 北京云迹科技有限公司 一种自动售货机的缺货检测方法、装置、设备和介质
CN116935198B (zh) * 2023-09-12 2023-11-24 汉朔科技股份有限公司 货架商品的缺货检测方法、装置及机器人

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105701519A (zh) * 2014-12-10 2016-06-22 株式会社理光 基于超像素的图像的实际货架图景象分析
CN107045641A (zh) * 2017-04-26 2017-08-15 广州图匠数据科技有限公司 一种基于图像识别技术的货架识别方法
CN107516111A (zh) * 2017-08-23 2017-12-26 昆山塔米机器人有限公司 一种售卖机缺货检测方法及装置
US20180060803A1 (en) * 2016-08-23 2018-03-01 Wal-Mart Stores, Inc. System and method for managing retail products
CN109446883A (zh) * 2018-09-05 2019-03-08 北京三快在线科技有限公司 商品状态识别方法、装置、电子设备及可读存储介质

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4445265B2 (ja) * 2002-01-23 2010-04-07 ヴュー・テクノロジー・インコーポレーテッド 在庫管理システム
US8189855B2 (en) * 2007-08-31 2012-05-29 Accenture Global Services Limited Planogram extraction based on image processing
CN102982332A (zh) * 2012-09-29 2013-03-20 顾坚敏 基于云处理方式的零售终端货架影像智能分析系统
CN102930264B (zh) * 2012-09-29 2015-10-28 李炳华 基于图像识别技术的商品陈列信息采集分析系统及方法
RU2015133464A (ru) * 2013-01-11 2017-02-17 Тагнэтикс, Инк. Датчик отсутствия товаров
CN105654211A (zh) * 2016-03-10 2016-06-08 北京小米移动软件有限公司 信息推送方法及装置
CN107822400B (zh) * 2016-09-15 2020-12-15 东芝泰格有限公司 库存管理装置及控制方法、终端设备
TWI618916B (zh) * 2016-09-23 2018-03-21 啟碁科技股份有限公司 貨架庫存估測方法與系統
CN107131925A (zh) * 2017-04-28 2017-09-05 南京邮电大学 一种基于图像处理的水位实时监测方法
CN107358313A (zh) * 2017-06-16 2017-11-17 深圳市盛路物联通讯技术有限公司 一种超市管理方法及装置
CN107121074B (zh) * 2017-06-26 2019-07-26 辽宁科技大学 一种利用机器视觉进行尾矿库干滩长度测量的方法
CN107886285A (zh) * 2017-12-25 2018-04-06 康美药业股份有限公司 供销管理方法、设备及计算机可读存储介质

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105701519A (zh) * 2014-12-10 2016-06-22 株式会社理光 基于超像素的图像的实际货架图景象分析
US20180060803A1 (en) * 2016-08-23 2018-03-01 Wal-Mart Stores, Inc. System and method for managing retail products
CN107045641A (zh) * 2017-04-26 2017-08-15 广州图匠数据科技有限公司 一种基于图像识别技术的货架识别方法
CN107516111A (zh) * 2017-08-23 2017-12-26 昆山塔米机器人有限公司 一种售卖机缺货检测方法及装置
CN109446883A (zh) * 2018-09-05 2019-03-08 北京三快在线科技有限公司 商品状态识别方法、装置、电子设备及可读存储介质

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111291834A (zh) * 2020-03-27 2020-06-16 华士磐典科技(上海)有限公司 一种快速生成货架数字陈列图的方法
CN111291834B (zh) * 2020-03-27 2022-06-10 华士磐典科技(上海)有限公司 一种快速生成货架数字陈列图的方法
CN111612000A (zh) * 2020-05-26 2020-09-01 创新奇智(西安)科技有限公司 一种商品分类方法、装置、电子设备及存储介质
CN111612000B (zh) * 2020-05-26 2023-09-12 创新奇智(西安)科技有限公司 一种商品分类方法、装置、电子设备及存储介质
CN111898417A (zh) * 2020-06-17 2020-11-06 厦门华联电子股份有限公司 货柜系统、货品检测装置及方法
CN111898417B (zh) * 2020-06-17 2023-08-08 厦门华联电子股份有限公司 货柜系统、货品检测装置及方法
CN111666927A (zh) * 2020-07-08 2020-09-15 广州织点智能科技有限公司 商品识别方法、装置、智能货柜和可读存储介质
CN112883955A (zh) * 2021-03-10 2021-06-01 洛伦兹(北京)科技有限公司 货架布局检测方法、装置及计算机可读存储介质
CN112883955B (zh) * 2021-03-10 2024-02-02 洛伦兹(北京)科技有限公司 货架布局检测方法、装置及计算机可读存储介质
CN115359117A (zh) * 2022-08-30 2022-11-18 创新奇智(广州)科技有限公司 商品陈列位置确定方法、装置及可读存储介质

Also Published As

Publication number Publication date
CN109446883A (zh) 2019-03-08
CN109446883B (zh) 2022-08-23

Similar Documents

Publication Publication Date Title
WO2020048492A1 (zh) 商品状态识别
US11640576B2 (en) Shelf monitoring device, shelf monitoring method, and shelf monitoring program
US11587029B2 (en) Determining product placement compliance
WO2019165892A1 (zh) 自动售货方法、装置和计算机可读存储介质
US9697429B2 (en) Method and apparatus for image processing to avoid counting shelf edge promotional labels when counting product labels
TWI618916B (zh) 貨架庫存估測方法與系統
US11288734B2 (en) Intelligent shelf display system
WO2019087792A1 (ja) 棚札検出装置、棚札検出方法、及び、棚札検出プログラム
WO2020211499A1 (zh) 一种商品的自助收银方法和设备
WO2021185281A1 (zh) 货架交互方法和货架
US11176684B2 (en) Customer behavior analyzing method and customer behavior analyzing system
JP2019094191A (ja) 棚割生成プログラム、棚割生成方法及び棚割生成装置
JP2018131331A (ja) 物品管理装置及び物品管理方法
CN110895747B (zh) 商品信息识别、显示、信息关联、结算方法及系统
CN113935774A (zh) 图像处理方法、装置、电子设备及计算机存储介质
WO2021179138A1 (zh) 商超货架上商品的分析方法和系统
JP2022183305A (ja) 情報処理装置、情報処理方法、およびプログラム
CN108629318A (zh) 一种基于图像识别技术的货架陈列识别方法、装置和系统
CN111428743B (zh) 商品识别方法、商品处理方法、装置及电子设备
JP7130945B2 (ja) 在庫検出プログラム、在庫検出方法及び在庫検出装置
TWI710968B (zh) 商品影像辨識與數量監控系統
JP7404038B2 (ja) 情報処理システムと情報処理装置と情報処理プログラムと情報処理方法
CN111191551A (zh) 商品检测的方法及装置
CN114693335A (zh) 信息处理装置
JP2022040557A (ja) 情報処理装置と情報処理システムと情報処理プログラムと情報処理方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19857903

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19857903

Country of ref document: EP

Kind code of ref document: A1