US20190009987A1 - Shelf space allocation management device and shelf space allocation management method - Google Patents

Shelf space allocation management device and shelf space allocation management method

Info

Publication number
US20190009987A1
US20190009987A1 (application US16/125,345)
Authority
US
United States
Prior art keywords
product
shelf
image
allocation
imaging device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/125,345
Inventor
Nobuyuki Yamashita
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to US16/125,345
Publication of US20190009987A1
Legal status: Abandoned

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B65CONVEYING; PACKING; STORING; HANDLING THIN OR FILAMENTARY MATERIAL
    • B65GTRANSPORT OR STORAGE DEVICES, e.g. CONVEYORS FOR LOADING OR TIPPING, SHOP CONVEYOR SYSTEMS OR PNEUMATIC TUBE CONVEYORS
    • B65G1/00Storing articles, individually or in orderly arrangement, in warehouses or magazines
    • B65G1/02Storage devices
    • B65G1/04Storage devices mechanical
    • B65G1/137Storage devices mechanical with arrangements or automatic control means for selecting which articles are to be removed
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/0094Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot involving pointing a payload, e.g. camera, weapon, sensor, towards a fixed or moving target
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures
    • G06K9/00342
    • G06K9/00624
    • G06K9/6202
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0639Item locations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/17Image acquisition using hand-held instruments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/20Scenes; Scene-specific elements in augmented reality scenes
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00Type of UAV
    • B64U10/10Rotorcrafts
    • B64U10/17Helicopters
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B64U2101/31UAVs specially adapted for particular uses or applications for imaging, photography or videography for surveillance
    • G06K9/00288
    • G06K9/209
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/20Movements or behaviour, e.g. gesture recognition
    • G06V40/23Recognition of whole body movements, e.g. for sport training

Definitions

  • the present invention relates to a shelf space allocation management device and a shelf space allocation management method for managing products allocated on shelves (i.e., shelf space allocation of products).
  • clerks have managed products by allocating products on shelves (i.e., shelf space allocation of products) in distribution-retail stores. They may visually recognize products so as to manually count and check the number of products displayed on shelves. To automatically check shelves by way of image recognition, clerks or operators of marketing research may have recognized objects or products from images captured by imaging devices. In addition, they may have determined the presence/absence of products by use of electronic readers for RFID (Radio Frequency Identification) tags attached to products.
  • Patent Literature 1 discloses a product monitoring system that processes images representing statuses of displaying products so as to determine the timings of supplementing products based on time-series variations of images.
  • Patent Literature 2 discloses an inventory management method using image recognition of products displayed on shelves and captured by multiple cameras. This method uses mobile cameras in stores, and therefore cameras are moved under a predetermined procedure so as to capture images of products and thereby check the number of products by way of image recognition.
  • Patent Literature 3 discloses a technology of recognizing the type of products being imaged by way of comparison between the images captured by cameras and the features of products stored on a database.
  • Patent Literature 1 Japanese Patent Application No. H05-81552
  • Patent Literature 2 Japanese Patent Application No. 2001-88912
  • Patent Literature 3 Japanese Patent Application No. 2014-164594
  • Patent Literature 2 needs to capture images of products under a predetermined procedure, and therefore it is difficult to efficiently carry out image recognition and thereby grasp the status of products displayed on shelves in real time.
  • the present invention aims to provide a shelf space allocation management device and a shelf space allocation management method, which can efficiently carry out image recognition of products so as to check the allocation of products and the number of products on shelves.
  • a first aspect of the present invention relates to a shelf space allocation management device for managing products allocated on a shelf.
  • the shelf space allocation management device includes an image acquisition part configured to acquire an image including a position assumed to be changed in an allocation status of each product on the shelf; an allocation status determination part configured to determine whether a type and an allocation status of each product reflected in the image match a predetermined type and a predetermined allocation status of each product; and an execution determination part configured to execute a product allocation inspection based on the determination result of the allocation status determination part.
  • a second aspect of the present invention relates to a shelf space allocation management system for managing products allocated on a shelf.
  • the shelf space allocation management system includes an imaging device configured to capture an image of each product while moving along the shelf; and the aforementioned shelf space allocation management device.
  • a third aspect of the present invention relates to a shelf space allocation management method for managing products allocated on a shelf.
  • the shelf space allocation management method includes the steps of: acquiring an image including a position assumed to be changed in allocation status of each product on the shelf; determining whether a type and an allocation status of each product reflected in the image match a predetermined type and a predetermined allocation status of each product; and determining whether to execute a product allocation inspection based on the determination result.
  • according to the present invention, it is possible to automatically check the status of allocating products on shelves arranged in a store by way of image recognition. In addition, it is possible to efficiently check the allocation of products at an appropriate timing.
  • FIG. 1 is a block diagram showing the minimum configuration of a product management device according to one embodiment of the present invention.
  • FIG. 2 is a block diagram showing the detailed configuration of the product management device according to one embodiment of the present invention.
  • FIG. 3 is a layout plan of a store adopting a product allocation management system according to one embodiment of the present invention.
  • FIG. 4A shows an example of a product table registered in a product database stored on a storage unit of the product management device.
  • FIG. 4B shows an example of a map information table registered in a map information database stored on the storage unit of the product management device.
  • FIG. 4C shows an example of a product allocation table registered in a shelf space allocation database stored on the storage unit of the product management device.
  • FIG. 5 is a flowchart showing a first procedure for the product management device.
  • FIG. 6 is a flowchart showing a second procedure for the product management device.
  • FIG. 7 is a flowchart showing a third procedure for the product management device.
  • FIG. 8 is a flowchart showing a fourth procedure for the product management device.
  • FIG. 9A shows a product inspection frequency table stored on the storage unit of the product management device.
  • FIG. 9B shows a shelf inspection frequency table stored on the storage unit of the product management device.
  • FIG. 10 is a flowchart showing a fifth procedure for the product management device.
  • FIG. 11 is a flowchart showing a sixth procedure for the product management device.
  • FIG. 1 is a block diagram showing the minimum configuration of a shelf space allocation management device (hereinafter, referred to as a product management device) 10 according to one embodiment of the present invention.
  • the product management device 10 includes an image acquisition part 14 and an allocation status determination part 16 .
  • the image acquisition part 14 acquires an image including the position of any product assumed to be changed in its allocation status.
  • the image acquisition part 14 may capture moving images or still images.
  • the allocation status determination part 16 determines whether a product reflected in an image captured by the image acquisition part 14 is of a predetermined type or in a predetermined allocation status.
  • FIG. 2 is a block diagram showing the detailed configuration of the product management device 10 according to one embodiment of the present invention.
  • the product management device 10 checks appropriateness with respect to the type of each product, the allocation status for each product, the number of products, and the allocation method of products.
  • the allocation status represents the number of products and the allocation of products on shelves in rows and columns.
  • the operation for checking the type of each product and the allocation status for each product will be referred to as product allocation inspection.
  • the product management device 10 includes a position specifying part 11 , an imaging device controller 12 , a human detector 13 , an image acquisition part 14 , a product recognition part 15 , an allocation status determination part 16 , a product purchase information acquisition part 17 , an execution determination part 18 , an output part 19 , a communication part 20 , and a storage unit 21 .
  • the image acquisition part 14 and the allocation status determination part 16 are included in the minimum configuration shown in FIG. 1 .
  • the position specifying part 11 specifies the position of any product assumed to be changed in its allocation status.
  • the imaging device controller 12 controls the movement of an imaging device (not shown) and the timing of starting or stopping imaging.
  • the human detector 13 detects whether each product is assumed to be changed in its allocation status by a person reflected in an image captured by an imaging device.
  • the product recognition part 15 recognizes which product is regarded as an object reflected in an image captured by an imaging device with reference to images of products registered in a product database (not shown) prepared in advance.
  • the product purchase information acquisition part 17 acquires the information of each product purchased by a customer. For example, the product information represents any product selected by a customer and the number of products purchased by a customer.
  • the output part 19 outputs a message to a display device connected to the product management device 10 .
  • the output part 19 outputs a message concerning the current allocation status of products, urging a clerk to display products.
  • the communication part 20 communicates with an external device.
  • the storage unit 21 stores various pieces of information such as a product database registering images of products, a map information database representing the location of each shelf in a store, and a shelf space allocation database representing which product should be displayed on a specified shelf and the number of products displayed thereon.
  • FIG. 3 shows a layout plan of a store adopting a shelf space allocation management system according to one embodiment of the present invention.
  • shelves 101 through 104 and a surveillance imaging device 105 are arranged on a floor 100 of a store.
  • a mobile imaging device 108 is arranged on the floor 100 .
  • FIG. 3 shows that a clerk 106 walks around the floor 100 so as to visually recognize the shelves 101 through 104 while a customer 107 is selecting any product displayed on the shelf 103 .
  • another clerk 109 may receive payment and settle the account for any product purchased by a customer by use of a register 110 .
  • FIG. 3 shows a shelf space allocation management system including the product management device 10 and the mobile imaging device 108 .
  • the mobile imaging device 108 moves on the floor 100 so as to capture an image in a predetermined scope of imaging based on the allocated positions of products subjected to inspection and displayed on the shelves 101 through 104 .
  • the predetermined scope of imaging may entirely cover the allocated position of each product to reflect an explanatory tag (e.g. POP advertisement) for each product. Images captured by the mobile imaging device 108 are subjected to product allocation inspection by the product management device 10 .
  • it is possible to use an unmanned air vehicle (or a drone) having an automatic tracking function and an imaging device as the mobile imaging device 108 .
  • the unmanned air vehicle may wirelessly communicate with a mobile terminal device worn by the clerk 106 .
  • the unmanned air vehicle may acquire the positional information of a mobile terminal device so as to track the movement of the clerk 106 while holding a predetermined relative positional relationship with a mobile terminal device.
  • as an automatic tracking method, it is possible to track the clerk 106 by way of image recognition for identifying the clerk 106 based on an image captured by the mobile imaging device 108 .
  • as a moving method, it is possible for an unmanned air vehicle to move and follow the clerk 106 while holding a certain distance by way of an automatic tracking function.
  • the unmanned air vehicle can be equipped with a GPS device to transmit its current position to the product management device 10 .
  • the imaging device controller 12 transmits to the unmanned air vehicle via the communication part 20 a command signal indicating the direction of moving the unmanned air vehicle based on a predetermined target position and the positional information acquired from the unmanned air vehicle.
  • the imaging device controller 12 transmits to the unmanned air vehicle a command signal indicating that the unmanned air vehicle should stop when it arrives at a target position. In this connection, it is possible for the imaging device controller 12 to switch between an automatic tracking mode and a position control mode of the unmanned air vehicle.
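The mode switching described above can be illustrated by a minimal sketch; the enum values and the selection rule are assumptions for illustration, not taken from the patent:

```python
from enum import Enum

class UavMode(Enum):
    AUTOMATIC_TRACKING = "tracking"   # follow the clerk's mobile terminal
    POSITION_CONTROL = "position"     # fly toward a commanded target position

def select_mode(inspection_requested):
    """Hypothetical rule: position control takes over while a product
    allocation inspection is in progress; otherwise the vehicle tracks."""
    if inspection_requested:
        return UavMode.POSITION_CONTROL
    return UavMode.AUTOMATIC_TRACKING
```

In this sketch, the imaging device controller would call `select_mode` whenever an inspection starts or ends and transmit the resulting mode to the vehicle.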
  • an unmanned air vehicle may be equipped with an imaging device to capture an image in a traveling direction.
  • the imaging device may capture an image in a scope of 360° around an unmanned air vehicle.
  • according to an instruction to start imaging issued by the imaging device controller 12 , the mobile imaging device 108 starts capturing an image.
  • according to an instruction to stop imaging issued by the imaging device controller 12 , the mobile imaging device 108 stops capturing an image.
  • the mobile imaging device 108 transmits its captured image to the product management device 10 .
  • the image acquisition part 14 acquires images captured by the mobile imaging device 108 so as to send images to the product recognition part 15 .
  • the product recognition part 15 analyzes images so as to recognize which product corresponds to objects reflected in images by way of the known image recognition technology. For example, the product recognition part 15 compares multiple features in images of various products, which are registered in the product database stored on the storage unit 21 , with features in captured images of products so as to specify a product having the largest number of matched features as a product reflected in an image captured by the image acquisition part 14 . In addition, the product recognition part 15 specifies products reflected in an image so as to determine which product is being reflected in an image while counting the number of products.
  • the product recognition part 15 compares features such as patterns of wrapping and outlines of images of products, which are registered in the product database, with features such as outlines of objects reflected in images captured by the image acquisition part 14 , thus counting the number of specified products being reflected in images. Moreover, the product recognition part 15 determines the mutual positional relationship based on positional information for images of specified products so as to determine the number of rows and the number of columns for allocating products on each shelf.
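The feature-matching recognition and counting described above may be sketched as follows; the function names and the representation of features as plain sets are hypothetical, and real feature extraction (e.g. keypoint descriptors) is abstracted away:

```python
def recognize_product(captured_features, product_db):
    """Return the product ID whose registered features share the most
    matches with the captured object's features, or None if nothing matches."""
    best_id, best_score = None, 0
    for product_id, reference_features in product_db.items():
        score = len(captured_features & reference_features)
        if score > best_score:
            best_id, best_score = product_id, score
    return best_id

def count_products(detected_objects, product_db):
    """Count recognized products per product ID for all objects in an image."""
    counts = {}
    for features in detected_objects:
        product_id = recognize_product(features, product_db)
        if product_id is not None:
            counts[product_id] = counts.get(product_id, 0) + 1
    return counts
```

A real implementation would additionally use the positions of the matched objects to derive the row and column arrangement on each shelf, as the passage notes.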
  • the allocation status determination part 16 determines whether those products are subjected to an appropriate allocation of products. For example, the allocation status determination part 16 inspects whether products are allocated at proper positions on each shelf and whether the appropriate number of products are allocated on each shelf based on the allocated position of each product registered in the shelf space allocation database stored on the storage unit 21 as well as the minimum number of products. When an insufficient number of products are allocated on each shelf, for example, the allocation status determination part 16 outputs to a display device through the output part 19 a message of “X sets are insufficient in product A”.
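The shortage check and message described above can be sketched as follows, assuming a simple in-memory shelf space allocation database (the key and field names are invented):

```python
def check_allocation(shelf_id, product_id, current_count, allocation_db):
    """Compare the counted products against the registered minimum and
    return a shortage message, or None when the shelf is adequately stocked."""
    minimum = allocation_db[(shelf_id, product_id)]["minimum"]
    if current_count < minimum:
        shortage = minimum - current_count
        return f"{shortage} sets are insufficient in product {product_id}"
    return None
```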
  • the surveillance imaging device 105 transmits to the product management device 10 an image capturing the entire status of the floor 100 .
  • the status concerning the clerk 106 and the customer 107 is reflected in an image captured by the surveillance imaging device 105 .
  • although a single surveillance imaging device 105 is arranged in the store layout, it is possible to arrange multiple surveillance imaging devices 105 for imaging the entire status of the shelves 101 through 104 .
  • the image acquisition part 14 acquires an image captured by the surveillance imaging device 105 so as to transmit the image to the human detector 13 .
  • the human detector 13 analyzes images by way of the known image recognition technology so as to identify whether a person reflected in an image is a clerk or a customer and thereby recognize the behavior of a person reflected in an image.
  • for a clerk, for example, his/her facial image is stored on the storage unit 21 in advance; the human detector 13 compares the facial image of a person reflected in an image with the facial image of the clerk stored on the storage unit 21 , thus identifying whether the person matches a clerk.
  • the human detector 13 may identify a person as a clerk when the costume of a person reflected in an image matches the uniform for a clerk.
  • the human detector 13 identifies a person as a customer when it fails to identify a person reflected in an image as a clerk.
  • the human detector 13 is able to recognize the behavior of a person reflected in an image (e.g. activities concerning “a person who extends his/her arm toward a shelf so as to take a product in his/her hand” and “a person who looks around in front of a shelf”) by way of the known image recognition technology for detecting movements of persons' arms and variations of facial directions based on a plurality of time-series images.
  • the register 110 is connected to the product management device 10 .
  • the register 110 is equipped with a barcode reader.
  • the barcode reader acquires and stores a product's identification on a storage unit of the register 110 .
  • when the clerk 109 completes reading a barcode on a product and the customer 107 completes paying the price of the product, the clerk 109 makes a completion operation on the register 110 , which in turn transmits the identification of the product purchased by the customer 107 to the product management device 10 .
  • the product management device 10 of the present embodiment carries out a product allocation inspection when it is assumed that any change may occur in the status of allocating products displayed on each shelf.
  • FIGS. 4A to 4C show examples of data tables stored on the storage unit 21 of the product management device 10 of the present embodiment.
  • FIG. 4A shows an example of a product table registered in a product database stored on the storage unit 21 .
  • FIG. 4B shows an example of a map information table registered in a map information database stored on the storage unit 21 .
  • FIG. 4C shows an example of a shelf space allocation table registered in a shelf space allocation database stored on the storage unit 21 .
  • the product table of FIG. 4A has items such as “Product ID” and “Product Image 1” through “Product Image N”.
  • the item of “Product ID” has records concerning identifications of products.
  • the items of “Product Image 1” through “Product Image N” have records concerning images capturing products at various angles.
  • for each product, an image capturing its front face may differ from an image capturing its rear face.
  • the product management device 10 cannot properly recognize the product if the product table registers an image capturing the front face of the product alone. Therefore, the product table is designed to register multiple images for each product. In this connection, it is unnecessary to register N images for each product.
  • the map information table of FIG. 4B has items such as “Shelf ID” and “Position Information”.
  • the item of “Shelf ID” has records concerning identifications of shelves.
  • the item of “Position information” has records concerning positional information for each shelf.
  • the positional information represents three-dimensional coordinates for each shelf based on an origin corresponding to a corner of the floor 100 .
  • the shelf ID “001” relates to positional information which represents a position measuring x meters in an x-axis direction, y meters in a y-axis direction, and z meters in a z-axis direction (i.e. a height direction). That is, a shelf having the shelf ID “001” is located at a position representing the positional information (x,y,z) on the floor 100 .
  • the shelf space allocation table of FIG. 4C has items such as “Shelf ID”, “Product ID”, “Minimum Number”, “Current Number”, and “Last Check Time”.
  • the item of “Shelf ID” has records concerning identifications of shelves while the item of “Product ID” has records concerning identifications of products.
  • the item of “Minimum Number” has records concerning the minimum number of products having “Product ID” allocated on a shelf having “Shelf ID”.
  • the item of “Current Number” has records concerning the number of products having “Product ID” currently allocated on a shelf having “Shelf ID”.
  • the item of “Last Check Time” has records concerning the last time of making a product allocation inspection with respect to products having “Product ID” allocated on a shelf having “Shelf ID”. In this connection, it is possible to register in the shelf space allocation table the information representing which row and which column in a shelf each product should be allocated to.
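The three tables of FIGS. 4A to 4C may be illustrated as simple in-memory structures; the concrete values and key names below are assumptions for illustration, not data from the patent:

```python
# FIG. 4A: Product ID -> registered product images 1..N (multiple angles)
product_table = {
    "P001": ["img_front.png", "img_rear.png"],
}

# FIG. 4B: Shelf ID -> (x, y, z) position in meters from a floor corner
map_information_table = {
    "001": (2.0, 3.5, 1.2),
}

# FIG. 4C: (Shelf ID, Product ID) -> minimum number, current number,
# and the last time a product allocation inspection was made
shelf_space_allocation_table = {
    ("001", "P001"): {
        "minimum_number": 5,
        "current_number": 7,
        "last_check_time": "2018-09-07T10:00:00",
    },
}
```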
  • FIG. 5 is a flowchart showing a first procedure for the product management device 10 according to the present embodiment. Specifically, it shows a procedure for the product management device 10 making a product inspection.
  • the execution determination part 18 determines whether or not any one shelf should undergo a product allocation inspection (step S 11 ). Detailed examples of this determination process will be discussed later. The following description refers to the situation that the product management device 10 is scheduled to conduct a product allocation inspection every day at a predetermined time.
  • the execution determination part 18 compares the current time with the start time of a product allocation inspection stored on the storage unit 21 in advance, and therefore it determines that any shelf should undergo a product allocation inspection when the current time matches the start time.
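The scheduled-start comparison described above can be sketched as follows; matching at minute resolution is an assumption made for illustration:

```python
from datetime import datetime, time

def should_inspect(now, start_time):
    """Trigger a product allocation inspection when the current time
    reaches the scheduled start time (compared at minute resolution)."""
    return (now.hour, now.minute) == (start_time.hour, start_time.minute)
```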
  • the execution determination part 18 exits the flowchart of FIG. 5 when it is determined that no shelf should undergo a product allocation inspection (step S 11 : NO). In contrast, the execution determination part 18 indicates the commencement of a product allocation inspection with the position specifying part 11 when it is determined that any shelf should undergo a product allocation inspection (step S 11 : YES).
  • the position specifying part 11 obtains the positional information of each shelf undergoing a product allocation inspection (step S 12 ). To carry out a product allocation inspection for all shelves, the position specifying part 11 reads from the storage unit 21 identifications (i.e. shelf IDs) of shelves undergoing a product allocation inspection according to an order of inspecting shelves while reading positional information corresponding to each shelf ID from the map information table. The position specifying part 11 sends the positional information corresponding to each shelf ID to the imaging device controller 12 and the allocation status determination part 16 .
  • the imaging device controller 12 moves the mobile imaging device 108 toward a shelf undergoing a product allocation inspection (step S 13 ).
  • the imaging device controller 12 obtains the current positional information of the mobile imaging device 108 by communicating with the mobile imaging device 108 via the communication part 20 .
  • the imaging device controller 12 determines a direction of moving the mobile imaging device 108 based on the current positional information of the mobile imaging device 108 and the positional information of a shelf obtained by the position specifying part 11 .
  • the imaging device controller 12 generates a command signal for moving the mobile imaging device 108 in its moving direction so as to transmit the command signal to the mobile imaging device 108 via the communication part 20 .
  • the imaging device controller 12 repeats the above process until the mobile imaging device 108 reaches an image-capture position close to a shelf undergoing a product allocation inspection.
  • the imaging device controller 12 adjusts the direction of the mobile imaging device 108. For example, it is determined to capture an image of a shelf serving as an image-capture object from its front side.
  • the imaging device controller 12 controls the mobile imaging device 108 to turn its direction by 90° when the position of the mobile imaging device 108 matches the position of a shelf serving as an image-capture object in the x-axis direction and the y-axis direction.
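  • The movement control described above (step S 13 through the 90° turn) can be sketched as follows. This is a minimal illustration in Python; the grid coordinates, the one-unit movement steps, and the function names (`step_toward`, `move_to_shelf`) are assumptions made for illustration, not taken from the embodiment.

```python
# Hypothetical sketch of the imaging device controller's movement loop.
# Positions are (x, y) grid coordinates; each loop iteration corresponds to
# one command signal sent to the mobile imaging device.

def step_toward(current, target):
    """Return the next (x, y) position, moving one unit toward the target
    along one axis at a time, mimicking the repeated command signals."""
    x, y = current
    tx, ty = target
    if x != tx:
        x += 1 if tx > x else -1
    elif y != ty:
        y += 1 if ty > y else -1
    return (x, y)

def move_to_shelf(current, shelf_pos):
    """Repeat the move process until the device reaches the image-capture
    position; once the positions match in both axes, the device should
    turn its camera toward the shelf front."""
    path = [current]
    while current != shelf_pos:
        current = step_toward(current, shelf_pos)
        path.append(current)
    return path, "turn 90 degrees toward shelf front"
```

In this simplified model the controller would re-read the device's current position on every iteration, which matches the repeated obtain-position/send-command cycle described above.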
  • the imaging device controller 12 instructs the mobile imaging device 108 to capture an image of a shelf (step S 14 ).
  • the mobile imaging device 108 transmits an image of a shelf to the product management device 10 .
  • the image acquisition part 14 obtains an image captured by the mobile imaging device 108 via the communication part 20.
  • the product recognition part 15 calculates product allocation information (step S 15 ).
  • the product allocation information represents which product is displayed on a shelf and the number of products allocated on a shelf.
  • the product recognition part 15 compares the image obtained by the image acquisition part 14 with images of various products registered in the product table, thus specifying the registered product whose image features are closest to the captured image as the product reflected in the captured image.
  • the product recognition part 15 detects the number of specified products being reflected in the captured image by way of image recognition.
  • the product recognition part 15 may detect which row and which column in a shelf the specified product is allocated to by way of image recognition.
  • the product recognition part 15 sends to the allocation status determination part 16 the identification of the specified product (i.e. a product ID) and product allocation information of products (e.g. the number of products) reflected in the captured image.
  • the product recognition part 15 may estimate an allocated orientation of the specified product (e.g. a vertical/horizontal orientation of each product, a backward orientation of each product), thus sending to the allocation status determination part 16 the information concerning an allocated orientation of each product in addition to product IDs, the number of products, and the number of arrays for products.
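  • The recognition of step S 15 can be illustrated with a simplified sketch. The embodiment does not prescribe a particular recognition algorithm; the feature vectors, the squared-distance metric, and the function names below are purely illustrative assumptions.

```python
# Illustrative sketch of step S15: specify each detected product by comparing
# its image features against the registered product table, then count the
# number of instances per product ID.

def closest_product(captured_feature, product_table):
    """Return the product ID whose registered feature vector is closest
    (smallest squared distance) to the captured feature vector."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(product_table, key=lambda pid: dist(captured_feature, product_table[pid]))

def recognize(detected_features, product_table):
    """Specify a product for each detected region in the captured image and
    tally the number of products per product ID."""
    counts = {}
    for feat in detected_features:
        pid = closest_product(feat, product_table)
        counts[pid] = counts.get(pid, 0) + 1
    return counts
```

The resulting mapping of product IDs to counts corresponds to the product allocation information sent to the allocation status determination part 16.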
  • the allocation status determination part 16 determines whether or not products are arranged on each shelf in a proper status of allocation (step S 16 ). For example, the allocation status determination part 16 refers to the shelf space allocation table based on a product ID obtained from the product recognition part 15 and a shelf ID obtained from the position specifying part 11 , thus reading the minimum number of products which is determined for each combination of the product ID and the shelf ID. Subsequently, the allocation status determination part 16 compares the minimum number of products with the number of products that the product recognition part 15 obtains via image recognition. The allocation status determination part 16 determines that products are arranged in a proper status of allocation when the number of products is equal to or greater than the minimum number of products.
  • the product recognition part 15 may detect multiple types of products via image recognition (i.e. it may detect multiple product IDs).
  • when the allocation status determination part 16 refers to the shelf space allocation table based on a shelf ID and a certain product ID among multiple product IDs but fails to find the corresponding record, it is assumed that products having that product ID should not be arranged on the shelf having that shelf ID. In this case, the allocation status determination part 16 determines that products are arranged in an improper status of allocation.
  • the above shelf space allocation table may prescribe numeric values representing the number of rows and the number of columns for allocating products on each shelf.
  • the allocation status determination part 16 may determine that products are arranged in an improper status of allocation when the number of rows and the number of columns for allocating products on each shelf obtained via image recognition differ from the numeric values registered in the shelf space allocation table.
  • the allocation status determination part 16 may determine whether a tag of each product is placed at an appropriate position in addition to the allocation status of each product. For example, the product recognition part 15 calculates the position and the inclination of each product tag via image recognition, and then the allocation status determination part 16 compares them with the positional information for placing each product tag so as to determine whether each product tag is placed in a proper manner.
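  • The determination of step S 16, including the minimum-number check, the missing-record case, and the optional row/column check, can be sketched as follows. The table layout keyed by (shelf ID, product ID) and all field names are assumptions for illustration.

```python
# Hedged sketch of the allocation-status check (step S16). `allocation_table`
# maps (shelf_id, product_id) to a record with "min" (Minimum Number) and,
# optionally, "rows"/"cols" prescribing the product layout.

def check_allocation(shelf_id, recognized, allocation_table):
    """recognized: {product_id: {"count": n, "rows": r, "cols": c}}.
    Returns (is_proper, causes), where causes is an error list like the
    one the output part produces."""
    causes = []
    for pid, info in recognized.items():
        record = allocation_table.get((shelf_id, pid))
        if record is None:
            # No record: this product should not be arranged on this shelf.
            causes.append(f"Shelf ID: {shelf_id}, Product ID: {pid}, Not Allowed")
            continue
        if info["count"] < record["min"]:
            shortage = record["min"] - info["count"]
            causes.append(f"Shelf ID: {shelf_id}, Product ID: {pid}, "
                          f"Insufficient Number: {shortage}")
        if ("rows" in record and
                (info.get("rows") != record["rows"] or info.get("cols") != record["cols"])):
            causes.append(f"Shelf ID: {shelf_id}, Product ID: {pid}, Wrong Layout")
    return (not causes, causes)
```

A proper status simply corresponds to an empty error list; otherwise each cause string resembles the example "Shelf ID: 001, Product ID: A, Insufficient Number: Two" given in the text.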
  • Upon determining an improper status of allocating products (step S 16 : NO), the allocation status determination part 16 sends information concerning the cause of the improperness determination to the output part 19.
  • the information concerning the cause of improperness determination may refer to “Shelf ID: 001, Product ID: A, Insufficient Number: Two”.
  • the output part 19 outputs an error list describing a shelf having a problem in its product allocation status based on the information concerning the cause of the improperness determination.
  • the clerk 106 may approach a shelf described on an error list so as to supplement products or appropriately reorganize products.
  • When the allocation status determination part 16 determines that products are arranged in a proper status of allocation (step S 16 : YES), or when the output part 19 outputs an error list, the allocation status determination part 16 updates the numeric value assigned to the item "Current Number" for the record relating to the product ID and the shelf ID in the shelf space allocation table with the number of products obtained from the product recognition part 15. In addition, the allocation status determination part 16 updates the numeric value assigned to the item "Last Check Time" for the record of the shelf space allocation table with the current time (step S 18). Thereafter, the flow returns to step S 11, and the aforementioned processes are repeated as long as any shelf remains to undergo a product allocation inspection.
  • the shelf space allocation management system of the present embodiment is able to automatically carry out a product allocation inspection without spending time and effort by store clerks manually capturing images of products displayed on shelves and attaching RFID tags to products.
  • the present embodiment can be realized using a commercially available unmanned helicopter equipped with an imaging device. In this connection, multiple types of products can be subjected to recognition for product allocation in step S 15.
  • The procedure of FIG. 5 has been described such that shelves are subjected to product allocation inspection at a predetermined time in a predetermined order.
  • the shelf space allocation management system of the present embodiment is able to specify the position of a shelf at which a change in product allocation status is assumed to have occurred, and to carry out a product allocation inspection for the shelf located at the specified position.
  • processing realizing this function will be described in detail with reference to FIGS. 6 to 11 .
  • FIG. 6 is a flowchart showing a second procedure for the product management device 10 .
  • the execution determination part 18 determines whether to carry out a product allocation inspection when a customer purchases a product so as to change the status of allocating products on each shelf.
  • when the customer 107 purchases a product and pays the price thereof, the clerk 109 reads product information with a barcode reader so as to carry out an input process, and then the register 110 transmits to the product management device 10 a product ID of the purchased product and the number of products being purchased.
  • the product purchase information acquisition part 17 acquires the product ID via the communication part 20 (step S 21 ).
  • the product purchase information acquisition part 17 refers to the shelf space allocation table based on the product ID so as to update the numeric value registered as a record of the item “Current Number” concerning the product ID with a value subtracting the number of products being purchased (step S 22 ).
  • the product purchase information acquisition part 17 carries out an update process for subtracting the number of purchased products from the numeric value registered in the item “Current Number” with respect to all the records concerning the same “Product ID” in the shelf space allocation table.
  • the allocation status determination part 16 determines the properness of the allocation status of each product (step S 23). Specifically, the allocation status determination part 16 compares the numeric value of the item "Minimum Number" with the numeric value of the item "Current Number", and it determines that the product allocation status is improper when the numeric value of the item "Current Number" is less than the numeric value of the item "Minimum Number".
  • the allocation status determination part 16 determines the properness of the allocation status of each product with respect to all records concerning the same “Product ID” in the shelf space allocation table.
  • the allocation status determination part 16 determines that the allocation status of each product is proper when the numeric value of the item “Current Number” after subtracting the number of purchased products in all records concerning the same “Product ID” is equal to or greater than the numeric value of the item “Minimum Number”.
  • the allocation status determination part 16 determines that the allocation status of each product is improper when one of all records concerning the same “Product ID” is deemed to be improper in its allocation status.
  • the allocation status determination part 16 notifies the determination result of the allocation status of each product to the execution determination part 18 .
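  • Steps S 21 to S 23 can be condensed into a short sketch. Because the purchase information does not identify which shelf the product was taken from, the purchased quantity is subtracted from every record having the same product ID, as described above; the record layout and function name are assumptions.

```python
# Hedged sketch of the purchase-triggered update and check (steps S21-S23).
# `table` is a list of shelf space allocation records, each with a shelf ID,
# a product ID, "Current Number" ("current"), and "Minimum Number" ("min").

def on_purchase(table, product_id, quantity):
    """Subtract the purchased quantity from "Current Number" in every record
    with the given product ID (step S22), then return True when the
    allocation status remains proper, i.e. no record falls below
    "Minimum Number" (step S23)."""
    proper = True
    for record in table:
        if record["product_id"] != product_id:
            continue
        record["current"] -= quantity      # update "Current Number"
        if record["current"] < record["min"]:
            proper = False                 # at least one record is improper
    return proper
```

When `on_purchase` returns `False`, the execution determination part would proceed to step S 24 and trigger an actual image-based inspection of the affected shelves.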
  • Upon determining that the allocation status of each product is proper (step S 23 : YES), the execution determination part 18 exits the procedure of FIG. 6 without carrying out a product allocation inspection.
  • Upon determining that the allocation status of any product is improper (step S 23 : NO), the execution determination part 18 executes a product allocation inspection (step S 24).
  • After completion of step S 24, the present embodiment proceeds to a series of steps from step S 12 onwards in FIG. 5.
  • the execution determination part 18 instructs the position specifying part 11 to commence a product allocation inspection.
  • the allocation status determination part 16 reads from the shelf space allocation table a shelf ID of a shelf displaying the product purchased by the customer so as to notify it to the position specifying part 11.
  • the position specifying part 11 reads positional information of the shelf having the shelf ID from the map information table so as to send it to the imaging device controller 12 and the allocation status determination part 16 .
  • the imaging device controller 12 instructs the mobile imaging device 108 to move toward the shelf having the shelf ID so as to capture an image.
  • when the purchased product is displayed on multiple shelves, the mobile imaging device 108 captures images for all of those shelves.
  • the product recognition part 15 specifies each product via image recognition so as to count the number of products displayed on each shelf.
  • the allocation status determination part 16 determines the status of allocating products so as to update the shelf space allocation table.
  • the product management device 10 carries out a product allocation inspection solely for a shelf displaying the purchased product, upon assuming the occurrence of a change in allocation status when a customer purchases a product. Thus, it is possible to carry out a product allocation inspection in real time.
  • FIG. 7 is a flowchart showing a third procedure for the product management device 10 .
  • the execution determination part 18 carries out a product allocation inspection when any change occurs in allocation status of each product due to the clerk 106 displaying products on shelves.
  • the procedure of FIG. 7 is based on the precondition that the mobile imaging device 108 having an automatic tracking mode moves behind the clerk 106 who displays products on shelves.
  • the mobile imaging device 108 is continuously capturing images in its moving direction under an image-capture instruction from the imaging device controller 12 of the product management device 10 .
  • the imaging device controller 12 is able to acquire positional information of the mobile imaging device 108 at a predetermined interval of time.
  • the image acquisition part 14 of the product management device 10 acquires images captured by the mobile imaging device 108 (step S 31 ).
  • the image acquisition part 14 sends images to the human detector 13 .
  • the human detector 13 detects the behavior of the clerk 106 reflected in multiple images captured by the image acquisition part 14 in time series. For example, the human detector 13 detects whether the clerk 106 is moving along a path between shelves in a store, whether the clerk 106 visually recognizes products displayed on shelves, and whether the clerk 106 extends his/her arm towards shelves.
  • features of the clerk 106 (e.g. facial images and features of clothes) may be registered in advance so that the human detector 13 can recognize the clerk 106 in captured images.
  • the human detector 13 sends the behavior of the clerk 106 to the execution determination part 18 .
  • the execution determination part 18 determines whether the clerk 106 displays products on shelves (step S 32 ). For example, the execution determination part 18 determines that the clerk 106 displays products based on multiple images captured by the image acquisition part 14 in time series when the clerk 106 repeatedly extends his/her arms a predetermined number of times or more in a predetermined period of time.
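  • The decision of step S 32 can be illustrated as a sliding-window count of arm-extension events. The event representation (a list of timestamps) and the function name are assumptions; the embodiment only requires that the events repeat a predetermined number of times within a predetermined period.

```python
# Illustrative sketch of the step S32 decision: the clerk is deemed to be
# displaying products when "arm extended toward shelf" events occur at least
# `threshold` times within any `window`-second period.

def is_displaying(events, threshold, window):
    """events: sorted list of timestamps (seconds) at which the human
    detector observed the clerk extending an arm toward a shelf."""
    for i in range(len(events)):
        # Count events inside the window starting at events[i].
        in_window = [t for t in events if events[i] <= t <= events[i] + window]
        if len(in_window) >= threshold:
            return True
    return False
```

For example, three arm extensions within one minute would satisfy a threshold of three with a 60-second window, whereas sparse isolated extensions would not.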
  • the product management device 10 exits the procedure of FIG. 7 (step S 33) when it is determined that the clerk 106 does not display products (step S 32 : NO).
  • Upon determining that the clerk 106 displays products (step S 32 : YES), the execution determination part 18 instructs the position specifying part 11 to commence a product allocation inspection.
  • the position specifying part 11 acquires the positional information of the mobile imaging device 108 from the imaging device controller 12 .
  • the position specifying part 11 sends the positional information to the allocation status determination part 16 .
  • the procedure of FIG. 7 is based on the precondition that the mobile imaging device 108 is following behind the clerk 106, and therefore the mobile imaging device 108 should have already moved close to a shelf to be inspected. Accordingly, the imaging device controller 12 changes the mode of the mobile imaging device 108 from an automatic tracking mode to a position control mode, thus controlling the positioning and the image-capture direction of the mobile imaging device 108 so as to capture images of the shelves whose product allocation status should be checked.
  • the product management device 10 carries out a series of steps similar to the foregoing steps from step S 12 onwards in FIG. 5 .
  • the execution determination part 18 carries out a product allocation inspection solely for the shelf on which products have been arranged. Thus, it is possible to confirm whether a clerk appropriately displays products on shelves right after the clerk completes displaying them.
  • FIG. 8 is a flowchart showing a fourth procedure for the product management device 10 .
  • the execution determination part 18 determines to execute a product allocation inspection when a clerk or a customer moves products on shelves so as to change the status of allocating products.
  • the third procedure of FIG. 7 is based on the precondition that the mobile imaging device 108 follows behind the clerk 106 who displays products on shelves.
  • the fourth procedure of FIG. 8 is carried out upon detecting any change in allocation status of each product due to the clerk 106 or the customer 107 based on images captured by the surveillance imaging device 105 instead of images captured by the mobile imaging device 108 .
  • the image acquisition part 14 of the product management device 10 acquires images captured by the surveillance imaging device 105 (step S 41 ).
  • the image acquisition part 14 sends images to the human detector 13 .
  • the human detector 13 recognizes a person reflected in images so as to identify whether the person is the clerk 106 or the customer 107 (step S 42).
  • an image of the uniform worn by the clerk 106 is stored on the storage unit 21 in advance, and the human detector 13 determines whether the clothes of a person reflected in an image match the uniform of the clerk 106 via pattern matching.
  • the human detector 13 identifies the person as the clerk 106 when the person's clothes reflected in an image match the uniform.
  • the human detector 13 identifies the person as the customer 107 when the person's clothes reflected in an image do not match the uniform.
  • the human detector 13 may identify whether a person reflected in an image is the clerk 106 by way of facial recognition technology.
  • the human detector 13 detects the behavior of the clerk 106 from multiple images captured by the image acquisition part 14 in time series so as to notify it to the execution determination part 18 . Similar to step S 32 of FIG. 7 , the execution determination part 18 determines whether or not the clerk 106 displays products on shelves (step S 43 ).
  • Upon determining that the clerk 106 displays products on shelves (step S 43 : YES), the execution determination part 18 executes a product allocation inspection (step S 44). Upon determining that the clerk 106 does not display products on shelves (step S 43 : NO), the product management device 10 exits the procedure of FIG. 8.
  • when the person is identified as the customer 107, the execution determination part 18 determines whether the customer 107 takes a product in his/her hand (step S 45). For example, the execution determination part 18 determines that the customer 107 takes a product in his/her hand when the human detector 13 notifies it of such an operation. Upon determining that the customer 107 takes a product in his/her hand (step S 45 : YES), the execution determination part 18 executes a product allocation inspection (step S 44).
  • Otherwise, the execution determination part 18 determines whether the customer 107 looks around in front of a shelf a predetermined number of times or more (step S 46). For example, the execution determination part 18 makes this determination based on notifications from the human detector 13 that the customer 107 repeatedly looks around in front of a shelf. Upon determining that the customer 107 does not look around his/her surroundings a predetermined number of times or more (step S 46 : NO), the execution determination part 18 exits the procedure of FIG. 8.
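  • The customer branch of FIG. 8 (steps S 45 and S 46) can be condensed into a small decision sketch; the observation dictionary, its keys, and the default threshold are illustrative assumptions.

```python
# Sketch of the customer-side inspection trigger: inspect when the customer
# takes a product in hand (step S45), or when the customer looks around a
# threshold number of times or more (step S46, a possible shoplifting cue).

def should_inspect(observation, look_threshold=3):
    """observation: hypothetical summary produced by the human detector,
    e.g. {"took_product": bool, "look_around_count": int}."""
    if observation.get("took_product"):                     # step S45: YES
        return True
    if observation.get("look_around_count", 0) >= look_threshold:  # step S46: YES
        return True
    return False
```

Either condition leads to step S 44; when neither holds, the procedure exits without an inspection.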
  • Upon determining that the customer 107 looks around a predetermined number of times or more (step S 46 : YES), the execution determination part 18 executes a product allocation inspection. Specifically, the execution determination part 18 instructs the position specifying part 11 to commence a product allocation inspection.
  • the position specifying part 11 obtains information detected by the human detector 13 .
  • the position specifying part 11 specifies a shelf ID of a shelf subjected to product allocation inspection based on the detected information of the human detector 13 so as to read positional information concerning the shelf ID from the map information table.
  • the position specifying part 11 sends the positional information to the imaging device controller 12 and the allocation status determination part 16.
  • the product management device 10 carries out a series of steps from step S 12 onwards in FIG. 5 .
  • the customer 107 may have a possibility of purchasing a product when the operation of the customer 107 taking the product in his/her hand is reflected in an image captured by the surveillance imaging device 105 .
  • Even when the product management device 10 does not cooperate with the register 110 and the product purchase information acquisition part 17 thus fails to acquire product purchase information, it is possible to execute a product allocation inspection triggered by the customer 107 purchasing a product.
  • a clerk may have a possibility of displaying products on shelves when the operation of a clerk repeatedly extending his/her arms towards shelves is reflected in images captured by the surveillance imaging device 105 .
  • Even when a clerk other than the clerk 106 followed by the mobile imaging device 108 displays products on shelves at another location, it is possible to carry out a product allocation inspection triggered by detecting the other clerk's displaying of products on shelves.
  • The procedure of FIG. 8 includes step S 46 denoting a decision as to "whether a customer looks around a predetermined number of times or more". That is, it is possible to carry out the product allocation inspection of step S 44 upon assuming a possibility of shoplifting only when a customer extends his/her arm toward a shelf after looking around his/her surroundings.
  • In relation to step S 46, it is possible to carry out collation of facial features when facial images of persons who conducted shoplifting in the past have been registered in a database in advance. In this case, it is possible to carry out the process of step S 46 only when a person's facial image matches one of the facial images registered in the database.
  • FIGS. 9A and 9B show examples of data tables stored on the storage unit 21 and referred to by the product management device 10.
  • FIG. 9A shows a product inspection frequency table while FIG. 9B shows a shelf inspection frequency table.
  • the product inspection frequency table of FIG. 9A has items of “Product ID” and “Frequency”.
  • the item “Product ID” has records concerning identifications of products.
  • the item “Frequency” has records concerning periods for performing a product allocation inspection for each product.
  • the product inspection frequency table describes that a product allocation inspection is carried out at the frequency of every hour with respect to the well-sold product A.
  • the product inspection frequency table describes that a product allocation inspection is carried out at the frequency of every twenty-four hours or every seventy-two hours depending on sales conditions (or sales) with respect to products B and C.
  • the shelf inspection frequency table of FIG. 9B has items of “Shelf ID” and “Frequency”.
  • the item “Shelf ID” has records concerning identifications of shelves.
  • the item “Frequency” has records concerning periods for making a product allocation inspection for each shelf.
  • the shelf inspection frequency table describes that a product allocation inspection is carried out at the frequency of every hour with respect to a shelf having a shelf ID of “001” displaying well-sold products.
  • the shelf inspection frequency table describes that a product allocation inspection is carried out at the frequency of every twenty-four hours or every seventy-two hours, depending on sales conditions of products displayed on shelves having shelf IDs “002” and “003”.
  • FIG. 10 is a flowchart showing a fifth procedure for the product management device 10 .
  • the product management device 10 carries out a product allocation inspection based on an inspection frequency set for each product or each shelf depending on sales conditions of products or of products displayed on shelves.
  • the execution determination part 18 reads one record from the product inspection frequency table of the storage unit 21 at a predetermined interval of time (step S 51 ).
  • the execution determination part 18 reads a last check time, concerning a product ID for the read record, from the product allocation table.
  • the execution determination part 18 adds to the last check time concerning the product ID the numerical value assigned to the item “Frequency” read from the product inspection frequency table.
  • when the item "Frequency" of the product inspection frequency table describes "Every Hour", for example, one hour is added to the last check time.
  • the execution determination part 18 determines whether the current time matches the time for carrying out a product allocation inspection (step S 52). Specifically, the execution determination part 18 compares the current time with the time (i.e. the addition time) that is produced by adding the numeric value of the item "Frequency" to the last check time. When the current time passes the addition time, the execution determination part 18 determines that it is time to carry out a product allocation inspection. Upon determining that it is time to carry out a product allocation inspection (step S 52 : YES), the execution determination part 18 carries out a product allocation inspection (step S 53).
  • the position specifying part 11 reads a shelf ID for executing a product allocation inspection with reference to the shelf space allocation table based on the product ID, and then it reads positional information concerning the shelf ID from the map information table. After completion of step S 53 , the product management device 10 carries out a series of steps from step S 12 onwards in FIG. 5 .
  • the execution determination part 18 determines whether any unread record is found in the product inspection frequency table (step S 54 ). When it is determined that any unread record is found in the product inspection frequency table (step S 54 : YES), the execution determination part 18 returns to step S 51 so as to read a new record. Thereafter, the execution determination part 18 repeatedly carries out a series of steps from step S 52 onwards. When no unread record is found in the product inspection frequency table (step S 54 : NO), the execution determination part 18 exits the procedure of FIG. 10 .
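  • The scheduling of FIG. 10 can be sketched as a scan over the product inspection frequency table; representing times as plain hour values and the function name `due_products` are illustrative simplifications.

```python
# Minimal sketch of the fifth procedure (FIG. 10): for each record in the
# product inspection frequency table, an inspection is due when the current
# time has passed "last check time + frequency" (the addition time).

def due_products(frequency_table, last_check, now):
    """frequency_table: {product_id: frequency in hours};
    last_check: {product_id: last check time in hours}.
    Returns the product IDs due for a product allocation inspection."""
    due = []
    for pid, freq in frequency_table.items():
        if now >= last_check[pid] + freq:  # current time passes the addition time
            due.append(pid)
    return due
```

The same scan applied to a shelf inspection frequency table, with shelf IDs in place of product IDs, would realize the shelf-based variant mentioned below.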
  • In the procedure of FIG. 10, the execution determination part 18 determines whether to carry out a product allocation inspection with reference to the product inspection frequency table; but this is not a restriction. To carry out a process for allocating products depending on the sales performance of each shelf, it is possible to carry out a product allocation inspection by executing the same procedure of FIG. 10 with reference to the shelf inspection frequency table instead of the product inspection frequency table.
  • FIG. 11 is a flowchart showing a sixth procedure for the product management device 10 .
  • FIG. 11 shows conditions that are additionally considered when the execution determination part 18 determines whether to execute a product allocation inspection.
  • it is assumed here that the execution determination part 18 has already determined to carry out a product allocation inspection for a certain shelf.
  • the image acquisition part 14 acquires images captured by surveillance imaging device 105 (step S 61 ).
  • the image acquisition part 14 sends images to the human detector 13 .
  • the human detector 13 detects all the positions indicating possible existence of customers in images via image recognition.
  • the human detector 13 sends positional information to the execution determination part 18 .
  • the execution determination part 18 compares the positional information acquired from the human detector 13 with the positional information concerning a shelf ID of a shelf specified by the position specifying part 11 so as to determine whether any customer is found in proximity to a shelf subjected to product allocation inspection (step S 62 ). When any customer is found in proximity to a shelf (step S 62 : YES), the execution determination part 18 proceeds to step S 65 . When no customer is found in proximity to the shelf (step S 62 : NO), the execution determination part 18 determines whether a predetermined time or more has elapsed after the previous timing of executing a product allocation inspection with respect to the shelf subjected to product allocation inspection (step S 63 ).
  • the execution determination part 18 compares the current time with the time (i.e. the addition time) that is produced by adding the predetermined time to the last check time for the record concerning the shelf ID in the shelf space allocation table, and it determines that the predetermined time or more has elapsed since the previous timing of executing a product allocation inspection when the current time is equal to or greater than the addition time (step S 63 : YES). Thereafter, the execution determination part 18 executes a product allocation inspection (step S 64).
  • In contrast, when the execution determination part 18 determines that the predetermined time has not yet elapsed since the previous timing of executing a product allocation inspection, i.e. the current time is smaller than the addition time (step S 63 : NO), the execution determination part 18 stops executing a product allocation inspection (step S 65).
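  • The decision flow of FIG. 11 (steps S 62 through S 65) can be condensed as follows; the coordinate representation, the distance threshold for "in proximity", and the function name are assumptions for illustration.

```python
# Hedged sketch of the sixth procedure (FIG. 11): postpone the inspection
# while a customer is near the target shelf (step S62: YES -> step S65);
# otherwise inspect only when the predetermined interval has elapsed since
# the last check (step S63).

def decide(customer_positions, shelf_pos, last_check, now, min_interval, near=1.5):
    """Return "inspect" or "postpone" for the shelf under consideration."""
    def close(p):
        return ((p[0] - shelf_pos[0]) ** 2 + (p[1] - shelf_pos[1]) ** 2) ** 0.5 <= near
    if any(close(p) for p in customer_positions):
        return "postpone"                   # step S62: YES -> step S65
    if now - last_check >= min_interval:    # step S63: YES
        return "inspect"                    # step S64
    return "postpone"                       # step S63: NO -> step S65
```

This reflects the intent of FIG. 11: avoid flying the mobile imaging device at shelves while customers stand in front of them, without letting any shelf go unchecked for too long between inspections.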
  • the foregoing embodiment is explained using an unmanned air vehicle as the mobile imaging device 108 ; but this is not a restriction.
  • the aforementioned product management device 10 includes a computer system therein.
  • the product management device 10 implements processes using programs stored on computer-readable storage media. That is, the computer system loads and executes programs to achieve the foregoing processes.
  • the computer-readable storage media refer to magnetic disks, magneto-optical disks, CD-ROM, DVD-ROM, semiconductor memory and the like.
  • the foregoing programs may achieve part of the foregoing functions.
  • the foregoing programs may be differential files (or differential programs) that achieve the foregoing functions in combination with programs pre-installed in a computer system.
  • the floor 100 is an example of an area in which shelves are arranged in a store, while an image captured by the surveillance imaging device 105 is an example of an image used to monitor the internal state of the store.
  • the present invention provides a shelf space allocation management system that automatically executes product allocation inspection for products displayed on shelves in a store at an appropriate timing; however, its applications should not be limited to products.
  • the present invention is applicable to any types of systems that may manage electronic parts and materials allocated on multiple shelves.

Abstract

A shelf space allocation management device manages products allocated on shelves arranged in a store by use of an image captured by an imaging device. The shelf space allocation management device acquires an image including a position assumed to be changed in the allocation status of each product on each shelf; it determines whether the type and the allocation status of each product reflected in the image match a predetermined type and a predetermined allocation status; then, it determines whether to execute a product allocation inspection based on the determination result. Herein, the shelf space allocation management device specifies a position at which a person conducts a behavior that may cause a change in the allocation status of each product on each shelf, and it may therefore control the imaging device to capture an image including the position. It is possible to carry out a product allocation inspection at each period determined in advance depending on the type of each product, or to carry out a product allocation inspection triggered by a customer purchasing a product.

Description

    TECHNICAL FIELD
  • The present invention relates to a shelf space allocation management device and a shelf space allocation management method for managing products allocated on shelves (i.e. shelf space allocation of products).
  • The present application claims the benefit of priority on Japanese Patent Application No. 2015-9978 filed on Jan. 22, 2015, the subject matter of which is hereby incorporated herein by reference.
  • BACKGROUND ART
  • Conventionally, clerks have managed products by allocating products on shelves (i.e. shelf space allocation of products) in distribution-retail stores. They may visually recognize products so as to manually count and check the number of products displayed on shelves. To automatically check shelves by way of image recognition, clerks or marketing research operators may have recognized objects or products from images captured by imaging devices. In addition, they may have determined the presence/absence of products by use of electronic readers for RFID (Radio Frequency Identification) tags attached to products.
  • Various documents have been known with respect to image recognition of products. For example, Patent Literature 1 discloses a product monitoring system that processes images representing statuses of displaying products so as to determine the timings of supplementing products based on time-series variations of images. Patent Literature 2 discloses an inventory management method using image recognition of products displayed on shelves and captured by multiple cameras. This method uses mobile cameras in stores, and therefore cameras are moved under a predetermined procedure so as to capture images of products and thereby check the number of products by way of image recognition. Patent Literature 3 discloses a technology of recognizing the type of products being imaged by way of comparison between the images captured by cameras and the features of products stored on a database.
  • CITATION LIST Patent Literature
  • Patent Literature 1: Japanese Patent Application No. H05-81552
  • Patent Literature 2: Japanese Patent Application No. 2001-88912
  • Patent Literature 3: Japanese Patent Application No. 2014-164594
  • SUMMARY OF INVENTION Technical Problem
  • Originally, it is preferable to provide a wide range of products on shelves without losing customers' chances of purchasing products. For this reason, it is necessary to check the number of products at an appropriate timing. However, clerks need to manually capture images of products even when shelf inspection is automated using image recognition, and therefore they may fail to capture images of products in busy periods of duties. The product management using RFID tags attached to products may need operation costs for attaching RFID tags to products. In addition, it is necessary to reexamine manufacturing and shipping processes. The inventory management method of Patent Literature 2 needs to capture images of products under a predetermined procedure, and therefore it is difficult to efficiently carry out image recognition and thereby grasp the status of products displayed on shelves in real time.
  • The present invention aims to provide a shelf space allocation management device and a shelf space allocation management method, which can efficiently carry out image recognition of products so as to check the allocation of products and the number of products on shelves.
  • Solution to Problem
  • A first aspect of the present invention relates to a shelf space allocation management device for managing products allocated on a shelf. The shelf space allocation management device includes an image acquisition part configured to acquire an image including a position assumed to be changed in an allocation status of each product on the shelf; an allocation status determination part configured to determine whether a type and an allocation status of each product reflected in the image match a predetermined type and a predetermined allocation status of each product; and an execution determination part configured to execute a product allocation inspection based on the determination result of the allocation status determination part.
  • A second aspect of the present invention relates to a shelf space allocation management system for managing products allocated on a shelf. The shelf space allocation management system includes an imaging device configured to capture an image of each product while moving along the shelf; and the aforementioned shelf space allocation management device.
  • A third aspect of the present invention relates to a shelf space allocation management method for managing products allocated on a shelf. The shelf space allocation management method includes the steps of: acquiring an image including a position assumed to be changed in allocation status of each product on the shelf; determining whether a type and an allocation status of each product reflected in the image match a predetermined type and a predetermined allocation status of each product; and determining whether to execute a product allocation inspection based on the determination result. In addition, it is possible to provide a program that causes a computer to implement the aforementioned shelf space allocation management method.
  • Advantageous Effects of Invention
  • According to the present invention, it is possible to automatically check the status of allocating products on shelves arranged in a store by way of image recognition. In addition, it is possible to efficiently check the allocation of products at an appropriate timing.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 is a block diagram showing the minimum configuration of a product management device according to one embodiment of the present invention.
  • FIG. 2 is a block diagram showing the detailed configuration of the product management device according to one embodiment of the present invention.
  • FIG. 3 is a layout plan of a store adopting a shelf space allocation management system according to one embodiment of the present invention.
  • FIG. 4A shows an example of a product table registered in a product database stored on a storage unit of the product management device.
  • FIG. 4B shows an example of a map information table registered in a map information database stored on the storage unit of the product management device.
  • FIG. 4C shows an example of a product allocation table registered in a shelf space allocation database stored on the storage unit of the product management device.
  • FIG. 5 is a flowchart showing a first procedure for the product management device.
  • FIG. 6 is a flowchart showing a second procedure for the product management device.
  • FIG. 7 is a flowchart showing a third procedure for the product management device.
  • FIG. 8 is a flowchart showing a fourth procedure for the product management device.
  • FIG. 9A shows a product inspection frequency table stored on the storage unit of the product management device.
  • FIG. 9B shows a shelf inspection frequency table stored on the storage unit of the product management device.
  • FIG. 10 is a flowchart showing a fifth procedure for the product management device.
  • FIG. 11 is a flowchart showing a sixth procedure for the product management device.
  • DESCRIPTION OF EMBODIMENTS
  • A shelf space allocation management device and a shelf space allocation management method according to the present invention will be described in detail with reference to the accompanying drawings.
  • FIG. 1 is a block diagram showing the minimum configuration of a shelf space allocation management device (hereinafter, referred to as a product management device) 10 according to one embodiment of the present invention. The product management device 10 includes an image acquisition part 14 and an allocation status determination part 16. Herein, it is possible to adopt a computer having a CPU as the product management device 10. When any change is assumed to occur in the status of allocating products, the image acquisition part 14 acquires an image including the position of any product assumed to be changed in its allocation status. The image acquisition part 14 may acquire moving images or still images. The allocation status determination part 16 determines whether a product reflected in an image acquired by the image acquisition part 14 is of a predetermined type or in a predetermined allocation status.
  • FIG. 2 is a block diagram showing the detailed configuration of the product management device 10 according to one embodiment of the present invention. The product management device 10 checks appropriateness with respect to the type of each product, the allocation status for each product, the number of products, and the allocation method of products. For example, the allocation status represents the number of products and the allocation of products on shelves in rows and columns. Hereinafter, the operation for checking the type of each product and the allocation status for each product will be referred to as product allocation inspection.
  • The product management device 10 includes a position specifying part 11, an imaging device controller 12, a human detector 13, an image acquisition part 14, a product recognition part 15, an allocation status determination part 16, a product purchase information acquisition part 17, an execution determination part 18, an output part 19, a communication part 20, and a storage unit 21. Herein, the image acquisition part 14 and the allocation status determination part 16 are included in the minimum configuration shown in FIG. 1.
  • The position specifying part 11 specifies the position of any product assumed to be changed in its allocation status. The imaging device controller 12 controls the movement of an imaging device (not shown) and the timing of starting or stopping imaging. The human detector 13 detects whether each product is assumed to be changed in its allocation status by a person reflected in an image captured by an imaging device. The product recognition part 15 recognizes which product is regarded as an object reflected in an image captured by an imaging device with reference to images of products registered in a product database (not shown) prepared in advance. The product purchase information acquisition part 17 acquires the information of each product purchased by a customer. For example, the product information represents any product selected by a customer and the number of products purchased by a customer. The output part 19 outputs a message to a display device connected to the product management device 10. For example, the output part 19 outputs a message concerning the current allocation status of products and urging a clerk displaying products. The communication part 20 communicates with an external device. The storage unit 21 stores various pieces of information such as a product database registering images of products, a map information database representing the location of each shelf in a store, and a shelf space allocation database representing which product should be displayed on a specified shelf and the number of products displayed thereon.
  • FIG. 3 shows a layout plan of a store adopting a shelf space allocation management system according to one embodiment of the present invention. Herein, shelves 101 through 104 and a surveillance imaging device 105 are arranged on a floor 100 of a store. In addition, a mobile imaging device 108 is arranged on the floor 100. FIG. 3 shows that a clerk 106 walks around the floor 100 so as to visually recognize the shelves 101 through 104 while a customer 107 is selecting a product displayed on the shelf 103. In addition, another clerk 109 may receive payment and settle the account for any product purchased by a customer by use of a register 110. FIG. 3 shows a shelf space allocation management system including the product management device 10 and the mobile imaging device 108.
  • The mobile imaging device 108 moves on the floor 100 so as to capture an image in a predetermined scope of imaging based on the allocated positions of products subjected to inspection and displayed on the shelves 101 through 104. For example, the predetermined scope of imaging may entirely cover the allocated position of each product so as to reflect an explanatory tag (e.g. POP advertisement) for each product. Images captured by the mobile imaging device 108 are used by the product management device 10 for product allocation inspection.
  • For example, it is possible to use an unmanned air vehicle (or a drone) having an automatic tracking function and an imaging device as the mobile imaging device 108. The unmanned air vehicle may wirelessly communicate with a mobile terminal device worn by the clerk 106. In addition, the unmanned air vehicle may acquire the positional information of a mobile terminal device so as to track the movement of the clerk 106 while holding a predetermined relative positional relationship with a mobile terminal device. As an automatic tracking method, it is possible to track the clerk 106 by way of image recognition for identifying the clerk 106 based on an image captured by the mobile imaging device 108.
  • As a moving method, it is possible for an unmanned air vehicle to move and follow after the clerk 106 while holding a certain distance by way of an automatic tracking function. In addition, it is possible to control the position of an unmanned air vehicle in response to a command signal from the imaging device controller 12 of the product management device 10. In this case, for example, the unmanned air vehicle can be equipped with a GPS device to transmit its current position to the product management device 10. The imaging device controller 12 transmits to the unmanned air vehicle via the communication part 20 a command signal indicating the direction of moving the unmanned air vehicle based on a predetermined target position and the positional information acquired from the unmanned air vehicle. In addition, the imaging device controller 12 transmits to the unmanned air vehicle a command signal indicating that the unmanned air vehicle should stop when it arrives at the target position. In this connection, it is possible for the imaging device controller 12 to switch between an automatic tracking mode and a position control mode of the unmanned air vehicle.
  • For example, an unmanned air vehicle may be equipped with an imaging device to capture an image in a traveling direction. Alternatively, the imaging device may capture an image in a scope of 360° around an unmanned air vehicle. According to an instruction to start imaging by the imaging device controller 12, the mobile imaging device 108 starts capturing an image. According to an instruction to stop imaging by the imaging device controller 12, the mobile imaging device 108 stops capturing an image. The mobile imaging device 108 transmits its captured image to the product management device 10.
  • In the product management device 10, the image acquisition part 14 acquires images captured by the mobile imaging device 108 so as to send images to the product recognition part 15. The product recognition part 15 analyzes images so as to recognize which product corresponds to objects reflected in images by way of the known image recognition technology. For example, the product recognition part 15 compares multiple features in images of various products, which are registered in the product database stored on the storage unit 21, with features in captured images of products so as to specify a product having the largest number of matched features as a product reflected in an image captured by the image acquisition part 14. In addition, the product recognition part 15 specifies products reflected in an image so as to determine which product is being reflected in an image while counting the number of products. For example, the product recognition part 15 compares features such as patterns of wrapping and outlines of images of products, which are registered in the product database, with features such as outlines of objects reflected in images captured by the image acquisition part 14, thus counting the number of specified products being reflected in images. Moreover, the product recognition part 15 determines the mutual positional relationship based on positional information for images of specified products so as to determine the number of rows and the number of columns for allocating products on each shelf.
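  • The feature-matching and counting described above can be summarized in a simplified form. Herein, products and features are represented as plain sets, which is an assumption of this sketch rather than the actual image recognition technology:

```python
# Simplified sketch of the recognition step: the product with the largest
# number of matched features is specified; feature sets are illustrative.
def recognize_product(captured_features, product_db):
    """Return the product ID whose registered features overlap the features
    extracted from the captured image the most."""
    return max(product_db, key=lambda pid: len(captured_features & product_db[pid]))

def count_grid(detections):
    """Derive the numbers of rows and columns from the positions of specified
    products, given as (row_index, column_index) pairs from image analysis."""
    return len({r for r, _ in detections}), len({c for _, c in detections})
```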
  • After the product recognition part 15 determines the type of products and the number of products displayed on each shelf, the allocation status determination part 16 determines whether those products are allocated appropriately. For example, the allocation status determination part 16 inspects whether products are allocated at proper positions on each shelf and whether the appropriate number of products are allocated on each shelf based on the allocated position of each product registered in the shelf space allocation database stored on the storage unit 21 as well as the minimum number of products. When an insufficient number of products are allocated on a shelf, for example, the allocation status determination part 16 outputs to a display device through the output part 19 a message of "X sets are insufficient in product A".
  • In the shelf space allocation management system shown in FIG. 3, the surveillance imaging device 105 transmits to the product management device 10 an image capturing the entire status of the floor 100. The status concerning the clerk 106 and the customer 107 is reflected in an image captured by the surveillance imaging device 105. Although a single set of the surveillance imaging device 105 is arranged in a store layout, it is possible to arrange multiple sets of the surveillance imaging device 105 for imaging the entire status of the shelves 101 through 104.
  • In the product management device 10, the image acquisition part 14 acquires an image captured by the surveillance imaging device 105 so as to transmit the image to the human detector 13. The human detector 13 analyzes images by way of the known image recognition technology so as to identify whether a person reflected in an image is a clerk or a customer and thereby recognize the behavior of a person reflected in an image. In the case of a clerk, for example, his/her facial image is stored on the storage unit 21 in advance, and therefore the human detector 13 compares the facial image of a person reflected in an image with the facial image of a clerk stored on the storage unit 21, thus identifying whether the person matches a clerk. In addition, the human detector 13 may identify a person as a clerk when the costume of a person reflected in an image matches the uniform for a clerk. The human detector 13 identifies a person as a customer when it fails to identify a person reflected in an image as a clerk. Moreover, the human detector 13 is able to recognize the behavior of a person reflected in an image (e.g. activities concerning “a person who extends his/her arm toward a shelf so as to take a product in his/her hand” and “a person who looks around in front of a shelf”) by way of the known image recognition technology for detecting movements of persons' arms and variations of facial directions based on a plurality of time-series images.
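  • The identification of a person reflected in an image can be reduced to the following decision. The two match predicates stand in for the known image recognition techniques (face matching and uniform matching) and are assumptions of this sketch:

```python
# Simplified sketch: a person is identified as a clerk when either the face
# or the costume matches; otherwise the detector falls back to "customer".
def classify_person(face_matches_clerk: bool, wears_clerk_uniform: bool) -> str:
    if face_matches_clerk or wears_clerk_uniform:
        return "clerk"
    return "customer"
```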
  • In the shelf space allocation management system shown in FIG. 3, the register 110 is connected to the product management device 10. The register 110 is equipped with a barcode reader. When the clerk 109 uses a barcode reader to read a barcode printed on a product purchased by the customer 107, the barcode reader acquires and stores a product's identification on a storage unit of the register 110. Herein, the clerk 109 completes reading a barcode on a product and then the customer 107 completes paying the price of a product, and therefore the clerk 109 makes a completion operation with the register 110, which in turn transmits the identification of a product purchased by the customer 107 to the product management device 10.
  • Triggered by the operation of the clerk 106 and/or the customer 107 reflected in an image captured by the mobile imaging device 108 and/or the surveillance imaging device 105 as well as the identification of a purchased product transmitted from the register 110, the product management device 10 of the present embodiment carries out a product allocation inspection when it is assumed that any change may occur in the status of allocating products displayed on each shelf.
  • FIGS. 4A to 4C show examples of data tables stored on the storage unit 21 of the product management device 10 of the present embodiment. Specifically, FIG. 4A shows an example of a product table registered in a product database stored on the storage unit 21. FIG. 4B shows an example of a map information table registered in a map information database stored on the storage unit 21. FIG. 4C shows an example of a shelf space allocation table registered in a shelf space allocation database stored on the storage unit 21.
  • The product table of FIG. 4A has items such as "Product ID" and "Product Image 1" through "Product Image N". The item of "Product ID" has records concerning identifications of products. The items of "Product Image 1" through "Product Image N" have records concerning images capturing products at various angles. As to the same product, an image capturing its front face may differ from an image capturing its rear face. When a customer takes a product in his/her hand but mistakenly returns the product to a shelf with its rear face turning to its front side, for example, the product management device 10 cannot properly recognize the product if the product table registers an image capturing the front face of the product alone. Therefore, the product table is designed to register multiple images for each product. In this connection, it is unnecessary to register N images for each product.
  • The map information table of FIG. 4B has items such as “Shelf ID” and “Position Information”. The item of “Shelf ID” has records concerning identifications of shelves. The item of “Position information” has records concerning positional information for each shelf. For example, the positional information represents three-dimensional coordinates for each shelf based on an origin corresponding to a corner of the floor 100. In FIG. 4B, the shelf ID “001” relates to positional information which represents a position measuring x meters in an x-axis direction, y meters in a y-axis direction, and z meters in a z-axis direction (i.e. a height direction). That is, a shelf having the shelf ID “001” is located at a position representing the positional information (x,y,z) on the floor 100.
  • The shelf space allocation table of FIG. 4C has items such as “Shelf ID”, “Product ID”, “Minimum Number”, “Current Number”, and “Last Check Time”. The item of “Shelf ID” has records concerning identifications of shelves while the item of “Product ID” has records concerning identifications of products. The item of “Minimum Number” has records concerning the minimum number of products having “Product ID” allocated on a shelf having “Shelf ID”. The item of “Current Number” has records concerning the number of products having “Product ID” currently allocated on a shelf having “Shelf ID”. The item of “Last Check Time” has records concerning the last time of making a product allocation inspection with respect to products having “Product ID” allocated on a shelf having “Shelf ID”. In this connection, it is possible to register in the shelf space allocation table the information representing which row and which column in a shelf each product should be allocated to.
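  • The three tables of FIGS. 4A through 4C may be rendered in memory as follows; the concrete records are illustrative examples, not data of the embodiment:

```python
# Illustrative in-memory rendering of the tables of FIGS. 4A-4C.
product_table = {
    # Product ID -> images capturing the product at various angles
    "A": ["front.png", "rear.png"],
}
map_information_table = {
    # Shelf ID -> (x, y, z) position on the floor, origin at a floor corner
    "001": (3.0, 5.0, 0.0),
}
shelf_space_allocation_table = {
    # (Shelf ID, Product ID) -> minimum number, current number, last check time
    ("001", "A"): {"minimum": 5, "current": 7, "last_check": "2015-01-22T09:00"},
}
```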
  • FIG. 5 is a flowchart showing a first procedure for the product management device 10 according to the present embodiment. Specifically, it shows a procedure for the product management device 10 making a product inspection.
  • First, the execution determination part 18 determines whether or not any one shelf should undergo a product allocation inspection (step S11). Detailed examples of this determination process will be discussed later. The following description refers to the situation in which the product management device 10 is scheduled to conduct a product allocation inspection every day at a predetermined time. The execution determination part 18 compares the current time with the start time of a product allocation inspection stored on the storage unit 21 in advance, and therefore it determines that a shelf should undergo a product allocation inspection when the current time matches the start time. The execution determination part 18 exits the flowchart of FIG. 5 when it is determined that no shelf should undergo a product allocation inspection (step S11: NO). In contrast, the execution determination part 18 instructs the position specifying part 11 to commence a product allocation inspection when it is determined that any shelf should undergo a product allocation inspection (step S11: YES).
  • The position specifying part 11 obtains the positional information of each shelf undergoing a product allocation inspection (step S12). To carry out a product allocation inspection for all shelves, the position specifying part 11 reads from the storage unit 21 identifications (i.e. shelf IDs) of shelves undergoing a product allocation inspection according to an order of inspecting shelves while reading positional information corresponding to each shelf ID from the map information table. The position specifying part 11 sends the positional information corresponding to each shelf ID to the imaging device controller 12 and the allocation status determination part 16.
  • The imaging device controller 12 moves the mobile imaging device 108 toward a shelf undergoing a product allocation inspection (step S13). The imaging device controller 12 obtains the current positional information of the mobile imaging device 108 by communicating with the mobile imaging device 108 via the communication part 20. The imaging device controller 12 determines a direction of moving the mobile imaging device 108 based on the current positional information of the mobile imaging device 108 and the positional information of a shelf obtained by the position specifying part 11. Next, the imaging device controller 12 generates a command signal for moving the mobile imaging device 108 in its moving direction so as to transmit the command signal to the mobile imaging device 108 via the communication part 20. The imaging device controller 12 repeats the above process until the mobile imaging device 108 reaches an image-capture position close to a shelf undergoing a product allocation inspection.
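  • The repeated movement of step S13 can be sketched as a simple stepping loop on the floor plane; the step size, the arrival tolerance, and the two-dimensional positions are assumptions of this sketch:

```python
# Illustrative sketch of step S13: repeatedly derive a moving direction from
# the drone's current position and the target shelf position, then advance
# one commanded step until the image-capture position is reached.
def step_toward(current, target, step=0.5):
    dx, dy = target[0] - current[0], target[1] - current[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= step:
        return target                  # arrived: a stop command would follow
    return (current[0] + step * dx / dist, current[1] + step * dy / dist)

def move_to_shelf(start, shelf_pos, step=0.5, max_commands=1000):
    pos = start
    for _ in range(max_commands):      # repeat until the image-capture position
        if pos == shelf_pos:
            break
        pos = step_toward(pos, shelf_pos, step)
    return pos
```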
  • Next, upon determining that the mobile imaging device 108 reaches an image-capture position, the imaging device controller 12 adjusts the direction of the mobile imaging device 108. For example, it is determined to capture an image of a shelf serving as an image-capture object from its front side. The imaging device controller 12 controls the mobile imaging device 108 to turn its direction by 90° when the position of the mobile imaging device 108 matches the position of a shelf serving as an image-capture object in the x-axis direction and the y-axis direction. Next, the imaging device controller 12 instructs the mobile imaging device 108 to capture an image of a shelf (step S14). The mobile imaging device 108 transmits an image of a shelf to the product management device 10. In the product management device 10, the image acquisition part 14 obtains an image captured by the mobile imaging device 108 via the communication part 20. Next, the product recognition part 15 calculates product allocation information (step S15). The product allocation information represents which product is displayed on a shelf and the number of products allocated on a shelf. For example, the product recognition part 15 compares an image captured by the image acquisition part 14 with images of various products registered in the product table so as to specify an image of a product having features closest to the captured image as a product reflected in the captured image. In addition, the product recognition part 15 detects the number of specified products being reflected in the captured image by way of image recognition. Moreover, the product recognition part 15 may detect which row and which column in a shelf the specified product is allocated to by way of image recognition.
  • Next, the product recognition part 15 sends to the allocation status determination part 16 the identification of the specified product (i.e. a product ID) and product allocation information of products (e.g. the number of products) reflected in the captured image. Herein, it is possible to register images representing various orientations of each product in the product table in advance, thus conducting pattern matching between features of each product image and features of each captured image. In this case, the product recognition part 15 may estimate an allocated orientation of the specified product (e.g. a vertical/horizontal orientation of each product, a backward orientation of each product), thus sending to the allocation status determination part 16 the information concerning an allocated orientation of each product in addition to product IDs, the number of products, and the number of arrays for products.
  • The allocation status determination part 16 determines whether or not products are arranged on each shelf in a proper status of allocation (step S16). For example, the allocation status determination part 16 refers to the shelf space allocation table based on a product ID obtained from the product recognition part 15 and a shelf ID obtained from the position specifying part 11, thus reading the minimum number of products which is determined for each combination of the product ID and the shelf ID. Subsequently, the allocation status determination part 16 compares the minimum number of products with the number of products that the product recognition part 15 obtains via image recognition. The allocation status determination part 16 determines that products are arranged in a proper status of allocation when the number of products is equal to or greater than the minimum number of products. In contrast, it determines that products are arranged in an improper status of allocation when the number of products is less than the minimum number of products. In some situations, the product recognition part 15 may detect multiple types of products via image recognition (i.e. it may detect multiple product IDs). When the allocation status determination part 16 refers to the shelf space allocation table based on a shelf ID and a certain product ID among multiple product IDs but fails to find the corresponding record in the shelf space allocation table, it is assumed that products representing the certain product ID should not be arranged on a shelf representing the shelf ID. In this case, the allocation status determination part 16 determines that products are arranged in an improper status of allocation.
  • The above shelf space allocation table may prescribe numeric values representing the number of rows and the number of columns for allocating products on each shelf. Herein, the allocation status determination part 16 may determine that products are arranged in an improper status of allocation when the number of rows and the number of columns for allocating products on each shelf obtained via image recognition differ from the numeric values registered in the shelf space allocation table. When the product recognition part 15 obtains an allocated orientation of each product, for example, when each product is allocated in a backward orientation, the allocation status determination part 16 may determine that products are arranged in an improper status of allocation. In addition, the allocation status determination part 16 may determine whether a tag of each product is placed at an appropriate position in addition to the allocation status of each product. For example, the product recognition part 15 calculates the position and the inclination of each product tag via image recognition, and then the allocation status determination part 16 compares them with the positional information for placing each product tag so as to determine whether each product tag is placed in a proper manner.
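  • The checks of step S16 can be summarized in a minimal sketch, assuming an illustrative table layout keyed by (shelf ID, product ID); the field names (`min_number`, `rows`, `cols`) and the observed-status dictionary are assumptions for explanation, not the patented implementation.

```python
# Hypothetical sketch of the step S16 checks performed by the
# allocation status determination part 16. The table layout and
# field names are illustrative assumptions.

# Shelf space allocation table keyed by (shelf_id, product_id).
SHELF_SPACE_ALLOCATION = {
    ("001", "A"): {"min_number": 5, "rows": 1, "cols": 5},
}

def check_allocation(shelf_id, product_id, observed):
    """Return (is_proper, reason) for one recognized product."""
    record = SHELF_SPACE_ALLOCATION.get((shelf_id, product_id))
    if record is None:
        # No record found: this product should not be on this shelf.
        return False, "product not registered for this shelf"
    if observed["count"] < record["min_number"]:
        shortfall = record["min_number"] - observed["count"]
        return False, f"insufficient number: {shortfall}"
    if (observed.get("rows"), observed.get("cols")) != (record["rows"], record["cols"]):
        return False, "row/column layout differs from plan"
    if observed.get("orientation") == "backward":
        return False, "product faces backward"
    return True, "proper"
```

When the first check fails, the reason string corresponds to the error-list entry described below (e.g. an insufficient-number message for a given shelf and product).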
  • Upon determining an improper status of allocating products (step S16: NO), the allocation status determination part 16 sends information concerning the cause of the improperness determination to the output part 19. For example, the information concerning the cause of the improperness determination may refer to “Shelf ID: 001, Product ID: A, Insufficient Number: Two”. The output part 19 outputs an error list describing a shelf having a problem in a product allocation status based on the information concerning the cause of the improperness determination. The clerk 106 may approach a shelf described on an error list so as to supplement products or appropriately reorganize products.
  • When the allocation status determination part 16 determines that products are arranged in a proper status of allocation (step S16: YES), or when the output part 19 outputs an error list, the allocation status determination part 16 updates the numeric value assigned to the item of “Current Number” for a record relating to the product ID and the shelf ID described in the shelf space allocation table with the number of products obtained from the product recognition part 15. In addition, the allocation status determination part 16 updates the numeric value assigned to the item of “Last Check Time” for a record of the shelf space allocation table with the current time (step S18). Thereafter, the flow returns to step S11, and therefore the aforementioned processes are repeated as long as any shelf remains to undergo a product allocation inspection.
  • The shelf space allocation management system of the present embodiment is able to automatically carry out a product allocation inspection without requiring store clerks to spend time and effort manually capturing images of products displayed on shelves or attaching RFID tags to products. In addition, the present embodiment can be realized using a commercially available unmanned helicopter having an imaging device. In this connection, it is possible to provide multiple types of products subjected to recognition for product allocation in step S15.
  • The procedure of FIG. 5 has been described such that shelves are subjected to product allocation inspection at a predetermined time in a predetermined order. However, the shelf space allocation management system of the present embodiment is able to specify the position of each shelf undergoing the occurrence of any change upon assuming the occurrence of any change in product allocation status so as to carry out a product allocation inspection for the shelf located at the specified position. Next, processing realizing this function will be described in detail with reference to FIGS. 6 to 11.
  • FIG. 6 is a flowchart showing a second procedure for the product management device 10. In the second procedure, the execution determination part 18 determines whether to carry out a product allocation inspection when a customer purchases a product so as to change the status of allocating products on each shelf.
  • In FIG. 3, the customer 107 purchases any product and pays the price thereof, whereupon the clerk 109 reads product information with a barcode reader so as to carry out an input process, and then the register 110 transmits to the product management device 10 a product ID of the purchased product and the number of products being purchased. In the product management device 10, the product purchase information acquisition part 17 acquires the product ID via the communication part 20 (step S21). The product purchase information acquisition part 17 refers to the shelf space allocation table based on the product ID so as to update the numeric value registered as a record of the item “Current Number” concerning the product ID with a value obtained by subtracting the number of purchased products (step S22). When plenty of articles corresponding to the same product are distributed and displayed on a plurality of shelves, the product purchase information acquisition part 17 carries out an update process for subtracting the number of purchased products from the numeric value registered in the item “Current Number” with respect to all the records concerning the same “Product ID” in the shelf space allocation table.
  • After the product purchase information acquisition part 17 updates the numeric value of the item “Current Number” concerning the “Product ID” of the purchased product in the shelf space allocation table, the allocation status determination part 16 determines the properness of the allocation status of each product (step S23). Specifically, the allocation status determination part 16 compares the numeric value of the item “Minimum Number” with the numeric value of the item “Current Number”, and therefore it determines that the product allocation status is improper when the numeric value of the item “Current Number” is less than the numeric value of the item “Minimum Number”. When plenty of articles corresponding to the same product are distributed and displayed on a plurality of shelves, the allocation status determination part 16 determines the properness of the allocation status of each product with respect to all records concerning the same “Product ID” in the shelf space allocation table. The allocation status determination part 16 determines that the allocation status of each product is proper when the numeric value of the item “Current Number” after subtracting the number of purchased products in all records concerning the same “Product ID” is equal to or greater than the numeric value of the item “Minimum Number”. On the other hand, the allocation status determination part 16 determines that the allocation status of each product is improper when one of all records concerning the same “Product ID” is deemed to be improper in its allocation status. The allocation status determination part 16 notifies the determination result of the allocation status of each product to the execution determination part 18.
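  • Steps S21 through S23 can be sketched as follows; the list-of-records representation of the shelf space allocation table and its field names are illustrative assumptions.

```python
# Illustrative sketch of steps S21-S23: on a purchase, subtract the
# purchased quantity from every shelf record of that product, then
# judge the allocation status improper if any record drops below its
# minimum number. Record fields are assumptions for explanation.

def on_purchase(table, product_id, quantity):
    """Update "Current Number" and return True while allocation stays proper."""
    proper = True
    for rec in table:
        if rec["product_id"] != product_id:
            continue
        rec["current"] -= quantity
        if rec["current"] < rec["minimum"]:
            proper = False  # at least one shelf needs replenishment
    return proper

# Example table: the same product A displayed on two shelves.
records = [
    {"shelf_id": "001", "product_id": "A", "current": 6, "minimum": 5},
    {"shelf_id": "002", "product_id": "A", "current": 7, "minimum": 5},
]
```

A return value of False corresponds to step S23: NO, i.e. the trigger for the product allocation inspection of step S24.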
  • Upon determining the properness of the allocation status of each product (step S23: YES), the execution determination part 18 exits the procedure of FIG. 6 without carrying out a product allocation inspection. Upon determining the improperness of the allocation status of each product (step S23: NO), the execution determination part 18 executes a product allocation inspection (step S24). After completion of step S24, the present embodiment proceeds to a series of steps from step S12 in FIG. 5. Specifically, the execution determination part 18 instructs the position specifying part 11 to commence a product allocation inspection. The allocation status determination part 16 reads a shelf ID of a shelf displaying the product purchased by a customer from the shelf space allocation table so as to notify it to the position specifying part 11. The position specifying part 11 reads positional information of the shelf having the shelf ID from the map information table so as to send it to the imaging device controller 12 and the allocation status determination part 16. Next, the imaging device controller 12 instructs the mobile imaging device 108 to move toward the shelf having the shelf ID so as to capture an image. When plenty of articles corresponding to the product purchased by a customer are distributed and displayed on a plurality of shelves, the mobile imaging device 108 captures images of all the multiple shelves. Next, the product recognition part 15 specifies each product via image recognition so as to count the number of products displayed on each shelf. In addition, the allocation status determination part 16 determines the status of allocating products so as to update the shelf space allocation table.
  • According to the procedure of FIG. 6, the product management device 10 carries out a product allocation inspection solely for a shelf displaying the purchased product upon assuming the occurrence of any change in allocation status of each product when a customer purchases each product. Thus, it is possible to efficiently carry out a product allocation inspection in real time.
  • FIG. 7 is a flowchart showing a third procedure for the product management device 10. In FIG. 7, the execution determination part 18 carries out a product allocation inspection when any change occurs in allocation status of each product due to the clerk 106 displaying products on shelves. The procedure of FIG. 7 is based on the precondition that the mobile imaging device 108 having an automatic tracking mode moves behind the clerk 106 who displays products on shelves. In addition, the mobile imaging device 108 is continuously capturing images in its moving direction under an image-capture instruction from the imaging device controller 12 of the product management device 10. Moreover, the imaging device controller 12 is able to acquire positional information of the mobile imaging device 108 at a predetermined interval of time.
  • First, the image acquisition part 14 of the product management device 10 acquires images captured by the mobile imaging device 108 (step S31). The image acquisition part 14 sends images to the human detector 13. The human detector 13 detects the behavior of the clerk 106 reflected in multiple images captured by the image acquisition part 14 in time series. For example, the human detector 13 detects whether the clerk 106 is moving along a path between shelves in a store, whether the clerk 106 visually recognizes products displayed on shelves, and whether the clerk 106 extends his/her arm towards shelves. In this connection, features of the clerk 106 (e.g. facial images and features of clothes) are stored on the storage unit 21 in advance, and therefore the human detector 13 is able to identify the clerk 106 and the customer 107. The human detector 13 sends the behavior of the clerk 106 to the execution determination part 18.
  • The execution determination part 18 determines whether the clerk 106 displays products on shelves (step S32). For example, the execution determination part 18 determines that the clerk 106 displays products based on multiple images captured by the image acquisition part 14 in time series when the clerk 106 repeatedly extends his/her arms a predetermined number of times or more in a predetermined period of time. The product management device 10 exits the procedure of FIG. 7 when it is determined that the clerk 106 does not display products (step S32: NO). When it is determined that the clerk 106 displays products (step S32: YES), the execution determination part 18 executes a product allocation inspection (step S33). Specifically, the execution determination part 18 instructs the position specifying part 11 to commence a product allocation inspection. The position specifying part 11 acquires the positional information of the mobile imaging device 108 from the imaging device controller 12. The position specifying part 11 sends the positional information to the allocation status determination part 16. The procedure of FIG. 7 is based on the precondition that the mobile imaging device 108 is following behind the clerk 106, and therefore the mobile imaging device 108 should have already moved close to a shelf to be inspected. Therefore, the imaging device controller 12 changes the mode of the mobile imaging device 108 from an automatic tracking mode to a position control mode, thus controlling the positioning and the image-capture direction of the mobile imaging device 108 so as to capture images of shelves whose product allocation status should be checked. After completion of step S33, the product management device 10 carries out a series of steps similar to the foregoing steps from step S12 onwards in FIG. 5.
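  • The step S32 decision, i.e. judging that a clerk is displaying products when arm extensions recur a predetermined number of times within a predetermined period, can be sketched as a sliding-window count over detected event timestamps; the event representation, threshold, and window length are assumptions for illustration.

```python
# Minimal sketch of the step S32 decision: the clerk is judged to be
# displaying products when arm-extension events are detected at least
# `threshold` times within any `window`-second period. Timestamps and
# parameter values are illustrative assumptions.

def is_displaying(arm_extension_times, threshold=3, window=60.0):
    """True when some window of `window` seconds holds >= threshold events."""
    times = sorted(arm_extension_times)
    start = 0
    for end in range(len(times)):
        # Shrink the window from the left until it spans <= `window` seconds.
        while times[end] - times[start] > window:
            start += 1
        if end - start + 1 >= threshold:
            return True
    return False
```

The same predicate could equally serve step S43 of FIG. 8, which reuses the step S32 criterion.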
  • According to the third procedure, when it is assumed that any change occurs in the status of allocating products on a shelf due to a clerk displaying products on the shelf in a store, the execution determination part 18 carries out a product allocation inspection solely for the shelf on which products have been arranged. Thus, it is possible to confirm whether a clerk appropriately displays products on shelves after the clerk completes displaying products on shelves.
  • FIG. 8 is a flowchart showing a fourth procedure for the product management device 10. In FIG. 8, the execution determination part 18 determines to execute a product allocation inspection when a clerk or a customer moves products on shelves so as to change the status of allocating products. The third procedure of FIG. 7 is based on the precondition that the mobile imaging device 108 follows behind the clerk 106 who displays products on shelves. On the other hand, the fourth procedure of FIG. 8 is carried out upon detecting any change in allocation status of each product due to the clerk 106 or the customer 107 based on images captured by the surveillance imaging device 105 instead of images captured by the mobile imaging device 108.
  • First, the image acquisition part 14 of the product management device 10 acquires images captured by the surveillance imaging device 105 (step S41). The image acquisition part 14 sends images to the human detector 13. Next, the human detector 13 recognizes a person reflected in images so as to identify whether the person is the clerk 106 or the customer 107 (step S42). For example, an image of the uniform worn by the clerk 106 is stored on the storage unit 21 in advance, and therefore the human detector 13 determines whether the clothes of a person reflected in an image match the uniform of the clerk 106 via pattern matching. The human detector 13 identifies the person as the clerk 106 when the person's clothes reflected in an image match the uniform. On the other hand, the human detector 13 identifies the person as the customer 107 when the person's clothes reflected in an image do not match the uniform. When a facial image of the clerk 106 is stored on the storage unit 21 in advance, the human detector 13 may identify whether a person reflected in an image is the clerk 106 by way of facial recognition technology. Upon identifying a person reflected in an image as the clerk 106 (step S42: YES), the human detector 13 detects the behavior of the clerk 106 from multiple images captured by the image acquisition part 14 in time series so as to notify it to the execution determination part 18. Similar to step S32 of FIG. 7, the execution determination part 18 determines whether or not the clerk 106 displays products on shelves (step S43). Upon determining that the clerk 106 displays products on shelves (step S43: YES), the execution determination part 18 executes a product allocation inspection (step S44). Upon determining that the clerk 106 does not display products on shelves (step S43: NO), the product management device 10 exits the procedure of FIG. 8.
  • Upon identifying a person reflected in an image as the customer 107 in step S42 (step S42: NO), the human detector 13 detects the behavior of the customer 107 via image recognition. For example, the human detector 13 detects any change in behavior or attitude of the customer 107 reflected in multiple time-series images obtained from the image acquisition part 14. That is, the human detector 13 detects whether the customer 107 is moving along a path between shelves, whether the customer 107 turns to a shelf, whether the customer 107 takes a product in his/her hand, or whether the customer 107 looks around in front of shelves. The human detector 13 detects the operation of the customer 107 so as to notify it to the execution determination part 18. In addition, the human detector 13 detects the position of the clerk 106 or the customer 107 reflected in an image. For example, the human detector 13 detects the positional information of the customer 107 as “Customer 107, before Shelf ID=001”.
  • Next, the execution determination part 18 determines whether the customer 107 takes a product in his/her hand (step S45). For example, the execution determination part 18 determines that the customer 107 takes a product in his/her hand when the human detector 13 notifies the execution determination part 18 of the operation of the customer 107 taking a product in his/her hand. Upon determining that the customer 107 takes a product in his/her hand (step S45: YES), the execution determination part 18 executes a product allocation inspection (step S44). Upon determining that the customer 107 does not take any product in his/her hand (step S45: NO), the execution determination part 18 determines whether the customer 107 looks around in front of a shelf a predetermined number of times or more (step S46). For example, when the human detector 13 notifies that the customer 107 looks around in front of a shelf while repeatedly making such movements a predetermined number of times or more, the execution determination part 18 determines that the customer 107 looks around his/her surroundings in front of a shelf a predetermined number of times or more. Upon determining that the customer 107 does not look around his/her surroundings (step S46: NO), the execution determination part 18 exits the procedure of FIG. 8. On the other hand, upon determining that the customer 107 looks around his/her surroundings (step S46: YES), the execution determination part 18 executes a product allocation inspection. Specifically, the execution determination part 18 instructs the position specifying part 11 to commence a product allocation inspection. The position specifying part 11 obtains information detected by the human detector 13.
The position specifying part 11 specifies a shelf ID of a shelf subjected to product allocation inspection based on the detected information of the human detector 13 so as to read positional information concerning the shelf ID from the map information table. Next, the position specifying part 11 sends the positional information to the imaging device controller 12 and the allocation status determination part 16. After completion of step S44, the product management device 10 carries out a series of steps from step S12 onwards in FIG. 5.
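  • The decision cascade of steps S45 and S46 can be sketched as follows; the string-event representation of detected customer actions and the look-around threshold are assumptions for illustration.

```python
# Sketch of the steps S45-S46 cascade: inspect when the customer takes
# a product in hand (step S45), or otherwise when the customer looks
# around in front of a shelf a predetermined number of times or more
# (step S46). The event encoding is an illustrative assumption.

def should_inspect(events, look_threshold=3):
    """events: detected customer actions, e.g. "take_product", "look_around"."""
    if "take_product" in events:          # step S45: YES
        return True
    looks = events.count("look_around")   # step S46
    return looks >= look_threshold
```

The optional refinement mentioned below (also requiring an arm extension toward the shelf) would add one more condition before the look-around count is consulted.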
  • It is assumed that the customer 107 may have a possibility of purchasing a product when the operation of the customer 107 taking the product in his/her hand is reflected in an image captured by the surveillance imaging device 105. According to the procedure of FIG. 8 in which the product management device 10 does not cooperate with the register 110, it is possible to execute a product allocation inspection being triggered by the customer 107 purchasing any product even when the product purchase information acquisition part 17 fails to acquire product purchase information.
  • It is assumed that a clerk may have a possibility of displaying products on shelves when the operation of a clerk repeatedly extending his/her arms towards shelves is reflected in images captured by the surveillance imaging device 105. Even when a clerk other than the clerk 106 followed by the mobile imaging device 108 displays products on shelves at another location, it is possible to carry out a product allocation inspection triggered by detecting the other clerk's displaying of products on shelves.
  • Even when a customer taking a product in his/her hand is not reflected in images captured by the surveillance imaging device 105 depending on its installation position in a store, it is assumed that the customer has a possibility of conducting shoplifting when the customer frequently looks around his/her surroundings in front of a shelf. According to the procedure of FIG. 8, it is possible to confirm whether any product has been actually taken away from shelves by carrying out a product allocation inspection upon detecting a questionable behavior of a customer.
  • In this connection, it is possible to add a decision as to “whether a customer extends his/her arm towards a shelf” after step S46 denoting a decision as to “whether a customer looks around a predetermined number of times or more”. That is, it is possible to carry out a product allocation inspection of step S44 upon assuming a possibility of shoplifting only when a customer extends his/her arm toward a shelf after looking around his/her surroundings. In addition, it is possible to collate a person's face against facial images of persons who conducted shoplifting in the past when such facial images have been registered in a database in advance. In this case, it is possible to carry out the process of step S46 only when the person's facial image matches any one of the facial images registered in the database.
  • Next, other methods for assuming the occurrence of any change in the status of allocating products on shelves will be described with reference to FIGS. 9A, 9B, and FIG. 10. FIGS. 9A and 9B show examples of data tables stored on the storage unit 21 and referred to by the product management device 10. Specifically, FIG. 9A shows a product inspection frequency table while FIG. 9B shows a shelf inspection frequency table.
  • The product inspection frequency table of FIG. 9A has items of “Product ID” and “Frequency”. The item “Product ID” has records concerning identifications of products. The item “Frequency” has records concerning periods for performing a product allocation inspection for each product. For example, the product inspection frequency table describes that a product allocation inspection is carried out at the frequency of every hour with respect to the well-sold product A. In addition, the product inspection frequency table describes that a product allocation inspection is carried out at the frequency of every twenty-four hours or every seventy-two hours depending on sales conditions (or sales) with respect to products B and C.
  • The shelf inspection frequency table of FIG. 9B has items of “Shelf ID” and “Frequency”. The item “Shelf ID” has records concerning identifications of shelves. The item “Frequency” has records concerning periods for making a product allocation inspection for each shelf. For example, the shelf inspection frequency table describes that a product allocation inspection is carried out at the frequency of every hour with respect to a shelf having a shelf ID of “001” displaying well-sold products. In addition, the shelf inspection frequency table describes that a product allocation inspection is carried out at the frequency of every twenty-four hours or every seventy-two hours, depending on sales conditions of products displayed on shelves having shelf IDs “002” and “003”.
  • FIG. 10 is a flowchart showing a fifth procedure for the product management device 10. In FIG. 10, the product management device 10 carries out a product allocation inspection based on the frequency for checking products or shelves, which depends on the sales conditions of products or of the products displayed on each shelf.
  • First, the execution determination part 18 reads one record from the product inspection frequency table of the storage unit 21 at a predetermined interval of time (step S51). The execution determination part 18 reads the last check time concerning the product ID of the read record from the shelf space allocation table. The execution determination part 18 adds the numeric value assigned to the item “Frequency” read from the product inspection frequency table to the last check time concerning the product ID. When the item “Frequency” of the product inspection frequency table describes “Every Hour”, for example, one hour is added to the last check time.
  • Next, the execution determination part 18 determines whether the current time matches the time for carrying out a product allocation inspection (step S52). Specifically, the execution determination part 18 compares the current time with the time (i.e. the addition time) that is produced by adding the numeric value of the item “Frequency” to the last check time. When the current time passes the addition time, the execution determination part 18 determines that it has come to the time for carrying out a product allocation inspection. Upon determining the time for carrying out a product allocation inspection (step S52: YES), the execution determination part 18 carries out a product allocation inspection (step S53). The position specifying part 11 reads a shelf ID for executing a product allocation inspection with reference to the shelf space allocation table based on the product ID, and then it reads positional information concerning the shelf ID from the map information table. After completion of step S53, the product management device 10 carries out a series of steps from step S12 onwards in FIG. 5.
  • Next, the execution determination part 18 determines whether any unread record is found in the product inspection frequency table (step S54). When it is determined that any unread record is found in the product inspection frequency table (step S54: YES), the execution determination part 18 returns to step S51 so as to read a new record. Thereafter, the execution determination part 18 repeatedly carries out a series of steps from step S52 onwards. When no unread record is found in the product inspection frequency table (step S54: NO), the execution determination part 18 exits the procedure of FIG. 10.
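  • The due-time test of steps S51 and S52 amounts to comparing the current time against last check time plus frequency for each record; the following sketch assumes an illustrative frequency table keyed by product ID, with times expressed as seconds since the epoch purely for explanation.

```python
# Sketch of steps S51-S52: a product is due for inspection when the
# current time has reached last_check + frequency. The table contents
# and the use of epoch seconds are illustrative assumptions.

PRODUCT_INSPECTION_FREQUENCY = {"A": 3600, "B": 86400}  # seconds

def due_products(last_check, now):
    """Return product IDs whose inspection interval has elapsed."""
    due = []
    for product_id, freq in PRODUCT_INSPECTION_FREQUENCY.items():
        if now >= last_check[product_id] + freq:
            due.append(product_id)
    return due
```

Replacing the product-keyed table with a shelf-keyed one yields the shelf-inspection-frequency variant mentioned below.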
  • According to the procedure of FIG. 10, it is possible to determine the timing causing any change in allocation status of each product depending on sales performance of products in advance, thus carrying out a product allocation inspection at the timing. This makes it possible to efficiently confirm an allocation of products. In FIG. 10, the execution determination part 18 determines whether to carry out a product allocation inspection with reference to the product inspection frequency table; but this is not a restriction. To carry out a process for allocating products depending on sales performance of products, it is possible to carry out a product allocation inspection by executing the same procedure of FIG. 10 with reference to the shelf inspection frequency table instead of the product inspection frequency table.
  • FIG. 11 is a flowchart showing a sixth procedure for the product management device 10. FIG. 11 shows conditions that are incidentally considered before determining whether to execute a product allocation inspection with the execution determination part 18. In the precondition for the explanation of FIG. 11, the execution determination part 18 has already determined to carry out a product allocation inspection for a certain shelf.
  • First, the image acquisition part 14 acquires images captured by the surveillance imaging device 105 (step S61). The image acquisition part 14 sends images to the human detector 13. The human detector 13 detects all the positions indicating possible existence of customers in images via image recognition. The human detector 13 sends the positional information to the execution determination part 18.
  • The execution determination part 18 compares the positional information acquired from the human detector 13 with the positional information concerning a shelf ID of a shelf specified by the position specifying part 11 so as to determine whether any customer is found in proximity to a shelf subjected to product allocation inspection (step S62). When any customer is found in proximity to the shelf (step S62: YES), the execution determination part 18 proceeds to step S65. When no customer is found in proximity to the shelf (step S62: NO), the execution determination part 18 determines whether a predetermined time or more has elapsed since the previous timing of executing a product allocation inspection with respect to the shelf subjected to product allocation inspection (step S63). Specifically, the execution determination part 18 compares the current time with the time (i.e. the addition time) that is produced by adding the predetermined time to the last check time for a record concerning the shelf ID in the shelf space allocation table, and it determines that the predetermined time or more has elapsed since the previous timing of executing a product allocation inspection when the current time is equal to or greater than the addition time (step S63: YES). Thereafter, the execution determination part 18 executes a product allocation inspection (step S64). On the other hand, when the current time is smaller than the addition time, the execution determination part 18 determines that the predetermined time has not yet elapsed since the previous timing of executing a product allocation inspection (step S63: NO), and therefore it stops executing a product allocation inspection (step S65).
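  • The two gates of steps S62 and S63 can be sketched as one predicate; the 2-D positions, proximity radius, and cooldown length are assumptions for illustration.

```python
# Sketch of the steps S62-S63 gating: skip the inspection when any
# customer is within `proximity` of the target shelf, or when the shelf
# was inspected less than `cooldown` seconds ago. Units (meters,
# seconds) and parameter values are illustrative assumptions.

def may_inspect(shelf_pos, customer_positions, last_check, now,
                proximity=2.0, cooldown=600.0):
    # Step S62: is any customer within `proximity` of the shelf?
    for cx, cy in customer_positions:
        dist = ((cx - shelf_pos[0]) ** 2 + (cy - shelf_pos[1]) ** 2) ** 0.5
        if dist <= proximity:
            return False  # customer nearby: stop the inspection
    # Step S63: has the cooldown elapsed since the previous inspection?
    return now >= last_check + cooldown
```

A False result corresponds to step S65 (inspection stopped), a True result to step S64 (inspection executed).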
  • According to the procedure of FIG. 11, it is possible to stop executing a product allocation inspection when any customer is found in proximity to a shelf, and therefore it is possible to prevent the mobile imaging device 108 from moving around customers and disturbing them in selecting products, or from invading a customer's privacy by mistakenly capturing his/her image. In addition, it is possible for the present embodiment to suspend a further product allocation inspection for a predetermined time with respect to a shelf that has already been subjected to product allocation inspection once. For this reason, it is possible to prevent a product allocation inspection from being repeatedly executed at more than the necessary frequency with respect to the same shelf. In FIG. 11, it is unnecessary to execute both the decision of step S62 and the decision of step S63; hence, it is possible to execute only one of those steps.
  • The foregoing embodiment is explained using an unmanned air vehicle as the mobile imaging device 108; but this is not a restriction. For example, it is possible to install rails on the ceiling or the floor 100 of a store so that the mobile imaging device 108 can move along the rails. That is, it is possible to install in the shelf space allocation management system any transportation means that enables the mobile imaging device 108 to move in a store.
  • The aforementioned product management device 10 includes a computer system therein. In addition, the product management device 10 implements processes using programs stored on computer-readable storage media. That is, the computer system loads and executes programs to achieve the foregoing processes. Herein, the computer-readable storage media refer to magnetic disks, magneto-optical disks, CD-ROM, DVD-ROM, semiconductor memory, and the like. In addition, it is possible to deliver programs to computers through communication lines so that computers can execute the programs.
  • The foregoing programs may achieve part of the foregoing functions. Alternatively, the foregoing programs may be differential files (or differential programs) that can achieve the foregoing functions by being combined with programs pre-installed in a computer system. In FIG. 3, the floor 100 shows an example of an area arranging shelves in a store while an image captured by the surveillance imaging device 105 shows an example of an image monitoring the internal state of a store.
  • INDUSTRIAL APPLICABILITY
  • The present invention provides a shelf space allocation management system that automatically executes product allocation inspections, at appropriate timings, for products displayed on shelves in a store; however, its applications are not limited to products. For example, the present invention is applicable to any type of system that manages electronic parts and materials allocated on multiple shelves.
  • REFERENCE SIGNS LIST
    • 10 product management device
    • 11 position specifying part
    • 12 imaging device controller
    • 13 human detector
    • 14 image acquisition part
    • 15 product recognition part
    • 16 allocation status determination part
    • 17 product purchase information acquisition part
    • 18 execution determination part
    • 19 output part
    • 20 communication part
    • 21 storage unit
    • 100 floor
    • 101-104 shelf
    • 105 surveillance imaging device
    • 106, 109 clerk
    • 107 customer
    • 108 mobile imaging device
    • 110 register

Claims (16)

1-11. (canceled)
12. A shelf space allocation management device for managing products allocated on a shelf, the shelf space allocation management device comprising:
a memory storing instructions; and
one or more processors coupled to the memory, wherein the one or more processors are configured to execute the instructions to:
acquire a first image taken by a surveillance imaging device;
determine whether a person reflected in the first image has performed a predetermined action on the shelf;
specify position information of the shelf for a product allocation inspection based on a determination result of an action of the person;
move a mobile imaging device to an image-capture position based on the specified position information;
adjust a direction of the mobile imaging device upon determining that the mobile imaging device reaches the image-capture position;
control the mobile imaging device to capture an image of the shelf at the image-capture position;
determine whether a type and an allocation status of a plurality of products reflected in a second image match a predetermined type and a predetermined allocation status of the plurality of products respectively, wherein the second image is taken by the mobile imaging device; and
execute a product allocation inspection based on a determination result of the type and the allocation status of the product reflected in the second image.
13. The shelf space allocation management device according to claim 12, wherein
the one or more processors are further configured to execute the instructions to:
identify whether the person reflected in the first image is a clerk or a customer.
14. The shelf space allocation management device according to claim 13, wherein
the predetermined action is an action in which the clerk has displayed the product on the shelf.
15. The shelf space allocation management device according to claim 13, wherein
the predetermined action is an action in which the customer has taken the product from the shelf.
16. The shelf space allocation management device according to claim 13, wherein
the predetermined action is an action in which the customer has looked around in front of a shelf a predetermined number of times or more.
17. A shelf space allocation management method for managing products allocated on a shelf, the shelf space allocation management method comprising:
acquiring a first image taken by a surveillance imaging device;
determining whether a person reflected in the first image has performed a predetermined action on the shelf;
specifying position information of the shelf for product allocation inspection based on a determination result of an action of the person;
moving a mobile imaging device to an image-capture position based on the specified position information;
adjusting a direction of the mobile imaging device upon determining that the mobile imaging device reaches the image-capture position;
controlling the mobile imaging device to capture an image of the shelf at the image-capture position;
determining whether a type and an allocation status of a plurality of products reflected in a second image match a predetermined type and a predetermined allocation status of the plurality of products respectively, wherein the second image is taken by the mobile imaging device; and
executing a product allocation inspection based on a determination result of the type and the allocation status of the product reflected in the second image.
18. The shelf space allocation management method according to claim 17, comprising:
identifying whether the person reflected in the first image is a clerk or a customer.
19. The shelf space allocation management method according to claim 18, wherein
the predetermined action is an action in which the clerk has displayed the product on the shelf.
20. The shelf space allocation management method according to claim 18, wherein
the predetermined action is an action in which the customer has taken the product from the shelf.
21. The shelf space allocation management method according to claim 18, wherein
the predetermined action is an action in which the customer has looked around in front of a shelf a predetermined number of times or more.
22. A non-transitory computer readable medium storing a shelf space allocation management program causing a computer to perform a shelf space allocation management process for managing products allocated on a shelf, the shelf space allocation management process comprising:
acquiring a first image taken by a surveillance imaging device;
determining whether a person reflected in the first image has performed a predetermined action on the shelf;
specifying position information of the shelf for product allocation inspection based on a determination result of an action of the person;
moving a mobile imaging device to an image-capture position based on the specified position information;
adjusting a direction of the mobile imaging device upon determining that the mobile imaging device reaches the image-capture position;
controlling the mobile imaging device to capture an image of the shelf at the image-capture position;
determining whether a type and an allocation status of a plurality of products reflected in a second image match a predetermined type and a predetermined allocation status of the plurality of products respectively, wherein the second image is taken by the mobile imaging device; and
executing a product allocation inspection based on a determination result of the type and the allocation status of the product reflected in the second image.
23. The non-transitory computer readable medium according to claim 22, the shelf space allocation management process comprising:
identifying whether the person reflected in the first image is a clerk or a customer.
24. The non-transitory computer readable medium according to claim 23, wherein
the predetermined action is an action in which the clerk has displayed the product on the shelf.
25. The non-transitory computer readable medium according to claim 23, wherein
the predetermined action is an action in which the customer has taken the product from the shelf.
26. The non-transitory computer readable medium according to claim 23, wherein
the predetermined action is an action in which the customer has looked around in front of a shelf a predetermined number of times or more.
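The end-to-end flow recited in the claims (acquire a surveillance image, determine the triggering action, specify the shelf position, dispatch the mobile imaging device, and match the captured shelf contents against the expected allocation) can be sketched as follows. Every name here (`SHELF_POSITIONS`, `PLANOGRAM`, `run_inspection`, the action labels) is an illustrative assumption for exposition; the patent does not specify an API.

```python
# Assumed lookup tables: shelf id -> image-capture position, and
# shelf id -> expected (product type, allocation status) pairs.
SHELF_POSITIONS = {"shelf-101": (3.0, 7.5)}
PLANOGRAM = {"shelf-101": [("cola", "front-faced"), ("tea", "front-faced")]}

def run_inspection(first_image_action, second_image_contents, shelf_id):
    """Execute one cycle of the claimed method.

    first_image_action: action detected in the surveillance image.
    second_image_contents: (type, allocation status) pairs recognized in the
    image taken by the mobile imaging device.
    Returns None when no inspection is triggered.
    """
    # Determine whether a predetermined action occurred in the first image.
    if first_image_action not in ("clerk_displayed", "customer_took"):
        return None
    # Specify the position of the shelf to inspect. Moving the mobile
    # imaging device there and adjusting its direction is hardware-specific
    # and elided in this sketch.
    position = SHELF_POSITIONS[shelf_id]
    # Match the types and allocation statuses in the second image against
    # the predetermined planogram.
    matches = second_image_contents == PLANOGRAM[shelf_id]
    return {"shelf": shelf_id, "position": position, "matches": matches}
```

A non-matching result would feed the inspection's output step (e.g., notifying a clerk), which the claims leave to the output part 19 of the device.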
US16/125,345 2015-01-22 2018-09-07 Shelf space allocation management device and shelf space allocation management method Abandoned US20190009987A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/125,345 US20190009987A1 (en) 2015-01-22 2018-09-07 Shelf space allocation management device and shelf space allocation management method

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2015-009978 2015-01-22
JP2015009978 2015-01-22
PCT/JP2016/051568 WO2016117600A1 (en) 2015-01-22 2016-01-20 Product shelf allocation management device and product shelf allocation management method
US201715544750A 2017-07-19 2017-07-19
US16/125,345 US20190009987A1 (en) 2015-01-22 2018-09-07 Shelf space allocation management device and shelf space allocation management method

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2016/051568 Continuation WO2016117600A1 (en) 2015-01-22 2016-01-20 Product shelf allocation management device and product shelf allocation management method
US15/544,750 Continuation US20180002109A1 (en) 2015-01-22 2016-01-20 Shelf space allocation management device and shelf space allocation management method

Publications (1)

Publication Number Publication Date
US20190009987A1 true US20190009987A1 (en) 2019-01-10

Family

ID=56417140

Family Applications (4)

Application Number Title Priority Date Filing Date
US15/544,750 Abandoned US20180002109A1 (en) 2015-01-22 2016-01-20 Shelf space allocation management device and shelf space allocation management method
US16/125,345 Abandoned US20190009987A1 (en) 2015-01-22 2018-09-07 Shelf space allocation management device and shelf space allocation management method
US16/125,383 Active US10872264B2 (en) 2015-01-22 2018-09-07 Shelf space allocation management device and shelf space allocation management method
US16/125,308 Active US10891470B2 (en) 2015-01-22 2018-09-07 Shelf space allocation management device and shelf space allocation management method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/544,750 Abandoned US20180002109A1 (en) 2015-01-22 2016-01-20 Shelf space allocation management device and shelf space allocation management method

Family Applications After (2)

Application Number Title Priority Date Filing Date
US16/125,383 Active US10872264B2 (en) 2015-01-22 2018-09-07 Shelf space allocation management device and shelf space allocation management method
US16/125,308 Active US10891470B2 (en) 2015-01-22 2018-09-07 Shelf space allocation management device and shelf space allocation management method

Country Status (3)

Country Link
US (4) US20180002109A1 (en)
JP (1) JP6791534B2 (en)
WO (1) WO2016117600A1 (en)

Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6197952B2 (en) * 2014-05-12 2017-09-20 富士通株式会社 Product information output method, product information output program and control device
WO2017208720A1 (en) * 2016-06-03 2017-12-07 裕之 本地川 Information collection device, article management system using same, and winding device
JP6329225B2 (en) * 2016-09-12 2018-05-23 ユーピーアール株式会社 Luggage monitoring system in warehouse using drone
CN110383315B (en) * 2017-03-03 2022-06-03 日本电气株式会社 Information processing system, information processing apparatus, information processing method, and information processing program
CN110709868A (en) * 2017-04-07 2020-01-17 思比机器人公司 Method for tracking inventory levels within a store
US20180293596A1 (en) * 2017-04-10 2018-10-11 International Business Machines Corporation Shelf image recognition analytics
JP7019357B2 (en) * 2017-09-19 2022-02-15 東芝テック株式会社 Shelf information estimation device and information processing program
JP7243627B2 (en) * 2017-09-29 2023-03-22 日本電気株式会社 Information processing device, information processing system, control method, and program
WO2019171572A1 (en) * 2018-03-09 2019-09-12 日本電気株式会社 Self-checkout system, purchased product management method, and purchased product management program
US11049279B2 (en) * 2018-03-27 2021-06-29 Denso Wave Incorporated Device for detecting positional relationship among objects
CN108647242B (en) * 2018-04-10 2022-04-29 北京天正聚合科技有限公司 Generation method and system of thermodynamic diagram
JP2020009216A (en) * 2018-07-10 2020-01-16 富士ゼロックス株式会社 Bulletin object management system, and program
KR102200579B1 (en) * 2018-10-25 2021-01-11 한국로봇융합연구원 Goods auto display system and method thereof
JP7387981B2 (en) 2018-11-15 2023-11-29 日本電気株式会社 Image processing device, image processing method, program
JP7287015B2 (en) * 2019-03-14 2023-06-06 富士電機株式会社 Merchandise management system and merchandise management method
WO2020202318A1 (en) * 2019-03-29 2020-10-08 日本電気株式会社 Sales management system, store device, sales management method, and program
JP6710876B1 (en) * 2019-04-01 2020-06-17 株式会社テクムズ Automatic payment management method for products
JP7366651B2 (en) * 2019-09-03 2023-10-23 東芝テック株式会社 Shelf imaging device and information processing device
CN110675106A (en) * 2019-09-12 2020-01-10 创新奇智(合肥)科技有限公司 Unmanned container commodity identification method based on dynamic commodity inventory information
CA3173972A1 (en) * 2020-04-29 2021-11-04 Ivar Fjeldheim Method for monitoring a storage system with a flying drone
JP7400962B2 (en) * 2020-05-14 2023-12-19 日本電気株式会社 Product identification device, product identification method, and program
JP2021196885A (en) 2020-06-15 2021-12-27 パナソニックIpマネジメント株式会社 Monitoring device, monitoring method, and computer program
WO2022065282A1 (en) * 2020-09-28 2022-03-31 日本電気株式会社 Information processing device, system, information processing method, and recording medium

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0581552A (en) 1991-07-12 1993-04-02 Omron Corp Merchandise monitoring system
JP2001088912A (en) * 1999-09-20 2001-04-03 Fujitsu General Ltd Stocktaking managing method and stocktaking system by image recognition
JP3908047B2 (en) * 2002-02-04 2007-04-25 富士通株式会社 Display state monitoring method and display state monitoring program
JP2003323539A (en) * 2002-04-30 2003-11-14 Fujitsu Ltd Store space management system and information processing device
JP2005286619A (en) * 2004-03-29 2005-10-13 Matsushita Electric Ind Co Ltd Monitoring camera system
JP2006113711A (en) * 2004-10-13 2006-04-27 Matsushita Electric Ind Co Ltd Marketing information providing system
US8429004B2 (en) * 2005-04-13 2013-04-23 Store Eyes, Inc. Method and system for automatically measuring retail store display compliance
JP2006309280A (en) * 2005-04-26 2006-11-09 Hitachi Software Eng Co Ltd System for analyzing purchase behavior of customer in store using noncontact ic tag
US7778744B2 (en) 2006-04-20 2010-08-17 Honeywell International Inc. Avionics framework
JP2008015577A (en) * 2006-07-03 2008-01-24 Matsushita Electric Ind Co Ltd Consumer's act analyzing device and consumer's act analyzing method
JP2009003701A (en) * 2007-06-21 2009-01-08 Denso Corp Information system and information processing apparatus
US20100138281A1 (en) * 2008-11-12 2010-06-03 Yinying Zhang System and method for retail store shelf stock monitoring, predicting, and reporting
US9940525B2 (en) * 2012-11-19 2018-04-10 Mace Wolf Image capture with privacy protection
JP2014164594A (en) 2013-02-26 2014-09-08 Nec Corp Portable terminal, information system, information collection method, and program
JP6108159B2 (en) * 2013-03-04 2017-04-05 日本電気株式会社 Information processing system, information processing apparatus, control method thereof, and control program
EP2973295A4 (en) * 2013-03-15 2016-11-16 Proximity Concepts Llc Systems and methods involving proximity, mapping, indexing, mobile, advertising and/or other features
US8989922B2 (en) 2013-03-15 2015-03-24 Azure Sky Group, LLC. Modular drone and methods for use
BR112015020989A2 (en) * 2013-03-29 2017-07-18 Nec Corp target object identification device, target object identification method, and target object identification program
JP2014222374A (en) * 2013-05-13 2014-11-27 大日本印刷株式会社 Planogram information generation device, planogram information generation method, program, planogram reproduction system and planogram reproduction device
US9280757B2 (en) * 2013-05-14 2016-03-08 DecisionGPS, LLC Automated inventory management
US20150321758A1 (en) 2013-08-31 2015-11-12 II Peter Christopher Sarna UAV deployment and control system
US9928531B2 (en) * 2014-02-24 2018-03-27 Intelligrated Headquarters Llc In store voice picking system
US9858615B2 (en) * 2014-04-10 2018-01-02 Point Inside, Inc. Location assignment system and method
WO2015179797A1 (en) * 2014-05-23 2015-11-26 Lily Robotics, Inc. Unmanned aerial copter for photography and/or videography
US10180321B2 (en) * 2014-05-31 2019-01-15 3Vr Security, Inc. Calculating duration time in a confined space
US10402777B2 (en) * 2014-06-18 2019-09-03 Trax Technology Solutions Pte Ltd. Method and a system for object recognition
JP5673888B1 (en) * 2014-10-20 2015-02-18 富士ゼロックス株式会社 Information notification program and information processing apparatus
US10373116B2 (en) * 2014-10-24 2019-08-06 Fellow, Inc. Intelligent inventory management and related systems and methods
US9305216B1 (en) * 2014-12-15 2016-04-05 Amazon Technologies, Inc. Context-based detection and classification of actions

Patent Citations (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6304855B1 (en) * 1993-11-30 2001-10-16 Raymond R. Burke Computer system for allowing a consumer to purchase packaged goods at home
US20060149634A1 (en) * 2004-12-30 2006-07-06 Kimberly-Clark Worldwide, Inc. Method and system for determining product assortment for retail placement
US20100171826A1 (en) * 2006-04-12 2010-07-08 Store Eyes, Inc. Method for measuring retail display and compliance
US20070288296A1 (en) * 2006-05-05 2007-12-13 Graham Lewis System and method for automatic placement of products within shelving areas using a planogram with two-dimensional sequencing
US20080159634A1 (en) * 2006-12-30 2008-07-03 Rajeev Sharma Method and system for automatically analyzing categories in a physical space based on the visual characterization of people
US8189926B2 (en) * 2006-12-30 2012-05-29 Videomining Corporation Method and system for automatically analyzing categories in a physical space based on the visual characterization of people
US20080215462A1 (en) * 2007-02-12 2008-09-04 Sorensen Associates Inc Still image shopping event monitoring and analysis system and method
US20080208719A1 (en) * 2007-02-28 2008-08-28 Fair Isaac Corporation Expert system for optimization of retail shelf space
US20090059270A1 (en) * 2007-08-31 2009-03-05 Agata Opalach Planogram Extraction Based On Image Processing
US20090063306A1 (en) * 2007-08-31 2009-03-05 Andrew Fano Determination Of Product Display Parameters Based On Image Processing
US8630924B2 (en) * 2007-08-31 2014-01-14 Accenture Global Services Limited Detection of stock out conditions based on image processing
US20110025461A1 (en) * 2008-03-25 2011-02-03 Ishida Co., Ltd. Electronic shelf label system
US8922163B2 (en) * 2009-04-24 2014-12-30 Murray MacDonald Automated battery and data delivery system
US20150178565A1 (en) * 2010-03-12 2015-06-25 Google Inc. System and method for determining position of a device
US20130051667A1 (en) * 2011-08-31 2013-02-28 Kevin Keqiang Deng Image recognition to support shelf auditing for consumer research
US20130235206A1 (en) * 2012-03-12 2013-09-12 Numerex Corp. System and Method of On-Shelf Inventory Management
US20140006229A1 (en) * 2012-04-05 2014-01-02 Thomas A. Birch Method and Apparatus for Managing Product Placement on Store Shelf
US9659272B2 (en) * 2012-04-05 2017-05-23 Intel Corporation Method and apparatus for managing product placement on store shelf
US20150379366A1 (en) * 2013-03-04 2015-12-31 Nec Corporation Article management system, information processing apparatus, and control method and control program of information processing apparatus
US20160210829A1 (en) * 2013-09-06 2016-07-21 Nec Corporation Security system, security method, and non-transitory computer readable medium
US20150088701A1 (en) * 2013-09-23 2015-03-26 Daniel Norwood Desmarais System and method for improved planogram generation
US20150088641A1 (en) * 2013-09-26 2015-03-26 Panasonic Corporation Method for providing information and information providing system
US20150213498A1 (en) * 2014-01-29 2015-07-30 Fujitsu Limited Method and apparatus for providing product information
US9636825B2 (en) * 2014-06-26 2017-05-02 Robotex Inc. Robotic logistics system

Also Published As

Publication number Publication date
JP6791534B2 (en) 2020-11-25
JPWO2016117600A1 (en) 2017-11-09
US20190009986A1 (en) 2019-01-10
US20180002109A1 (en) 2018-01-04
US10872264B2 (en) 2020-12-22
WO2016117600A1 (en) 2016-07-28
US20190002201A1 (en) 2019-01-03
US10891470B2 (en) 2021-01-12

Similar Documents

Publication Publication Date Title
US10891470B2 (en) Shelf space allocation management device and shelf space allocation management method
US20210304176A1 (en) Information processing system
US11408965B2 (en) Methods and apparatus for locating RFID tags
US20150066550A1 (en) Flow line data analysis device, system, non-transitory computer readable medium and method
US20150066551A1 (en) Flow line data analysis device, system, program and method
US11328513B1 (en) Agent re-verification and resolution using imaging
US20210248889A1 (en) Article display system
EP3689791A1 (en) Projection instruction device, package sorting system, and projection instruction method
US20220122125A1 (en) Information processing device, information processing system, display control method, and recording medium
EP3689792B1 (en) Package recognition device, package sorting system and package recognition method
US20240037776A1 (en) Analysis system, analysis apparatus, and analysis program
US20240036635A1 (en) Display system, control apparatus, and control program
US20220300989A1 (en) Store system and method
WO2023026277A1 (en) Context-based moniitoring of hand actions
JP2021084772A (en) Management device
JP2021189691A (en) Management device and commodity shelf
WO2024064163A1 (en) Customized retail environments
CN117121056A (en) Position detection system

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION