WO2018005369A1 - Multiple camera system for inventory tracking - Google Patents

Multiple camera system for inventory tracking

Info

Publication number
WO2018005369A1
Authority
WO
WIPO (PCT)
Prior art keywords
inventory
multiple cameras
camera
shelf
product
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2017/039304
Other languages
English (en)
French (fr)
Inventor
Stephen Williams
Juan Pablo GONZALEZ
Sarjoun Skaff
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bossa Nova Robotics IP Inc
Original Assignee
Bossa Nova Robotics IP Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bossa Nova Robotics IP Inc filed Critical Bossa Nova Robotics IP Inc
Priority to CA3028156A priority Critical patent/CA3028156C/en
Priority to EP17821015.9A priority patent/EP3479298A4/en
Priority to CN201780038500.6A priority patent/CN109328359A/zh
Priority to JP2018566442A priority patent/JP2019530035A/ja
Publication of WO2018005369A1 publication Critical patent/WO2018005369A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q10/087Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/60Intended control result
    • G05D1/656Interaction with payloads or external entities
    • G05D1/689Pointing payloads towards fixed or moving targets
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/08Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/10Image acquisition
    • G06V10/12Details of acquisition arrangements; Constructional details thereof
    • G06V10/14Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/147Details of sensors, e.g. sensor lenses
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2105/00Specific applications of the controlled vehicles
    • G05D2105/80Specific applications of the controlled vehicles for information gathering, e.g. for academic research
    • G05D2105/93Specific applications of the controlled vehicles for information gathering, e.g. for academic research for inventory
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2107/00Specific environments of the controlled vehicles
    • G05D2107/70Industrial sites, e.g. warehouses or factories
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2109/00Types of controlled vehicles
    • G05D2109/10Land vehicles
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D2111/00Details of signals used for control of position, course, altitude or attitude of land, water, air or space vehicles
    • G05D2111/10Optical signals

Definitions

  • the present disclosure relates generally to a multiple camera sensor suite capable of accurately monitoring retail or warehouse product inventory.
  • the multiple camera sensor suite can include onboard processing to provide near real time product tracking and reduce data transfer requirements.
  • BACKGROUND Retail stores or warehouses can have thousands of distinct products that are often sold, removed, added, or repositioned. Even with frequent restocking schedules, products assumed to be in stock may actually be out of stock, decreasing both sales and customer satisfaction. Point of sale data can be used to roughly estimate product availability, but it does not help with identifying misplaced, stolen, or damaged products, all of which can reduce product availability. However, manually monitoring product inventory and tracking product position is expensive and time consuming.
  • Machine vision can be used to assist in shelf space monitoring. For example, large numbers of fixed position cameras can be used throughout a store to monitor aisles, with large gaps in shelf space being flagged. Alternatively, a smaller number of movable cameras can be used to scan a store aisle. Even with such systems, human intervention is usually required to determine product identification number, product count, and to search for misplaced product inventory.
  • a low cost, accurate, and scalable camera system for product or other inventory monitoring can include a movable base. Multiple cameras supported by the movable base are directable toward shelves or other systems for holding products or inventory.
  • a processing module is connected to the multiple cameras and able to construct from the camera derived images an updateable map of product or inventory position. Because it can be updated in real or near real time, this map is known as a "realogram" to distinguish from conventional "planograms" that take the form of 3D models, cartoons, diagrams or lists that show how and where specific retail products and signage should be placed on shelves or displays. Realograms can be locally stored with a data storage module connected to the processing module.
  • a communication module can be connected to the processing module to transfer realogram data to remote locations, including store servers or other supported camera systems, and additionally receive inventory information including planograms to aid in realogram construction.
  • this system can be used to detect out of stock products, estimate depleted products, estimate the amount of product, including in stacked piles, estimate product heights, lengths, and widths, build 3D models of products, determine product positions and orientations, determine whether one or more products are in a disorganized on-shelf presentation that requires corrective action such as facing or zoning operations, estimate freshness of products such as produce, estimate quality of products including packaging integrity, locate products, including at home locations, secondary locations, top stock, bottom stock, and in the backroom, detect a misplaced product event (also known as a plug), identify misplaced products, estimate or count the number of product facings, compare the number of product facings to the planogram, estimate label locations, detect label type, read label content, including product name, barcode, UPC code, and pricing, detect missing labels, and compare label locations to the planogram.
  • the movable base can be a manually pushed or guidable cart.
  • the movable base can be a tele-operated robot, or in preferred embodiments, an autonomous robot capable of guiding itself through a store or warehouse.
  • multiple autonomous robots can be used. Aisles can be regularly inspected to create realograms, with aisles having high product movement being inspected more often.
  • the multiple cameras are typically positioned a set distance from the shelves during the inspection process.
  • the shelves can be lit with ambient lighting, or in some embodiments, by an array of LED or other directable light sources positioned near the cameras.
  • the multiple cameras can be linearly mounted in vertical, horizontal, or other suitable orientation on a camera support.
  • multiple cameras are fixedly mounted on a camera support.
  • Such cameras can be arranged to point upward, downward, forward, backward, or level with respect to the camera support and the shelves. This advantageously permits a reduction in glare from products and shelving fixtures having highly reflective surfaces by orienting cameras out of the way of reflected light paths.
  • multiple cameras with overlapping fields of view can result in at least one image with little or no glare.
  • the cameras can include one or more movable cameras, zoom cameras, focusable cameras, wide-field cameras, infrared cameras, or other specialty cameras to aid in product identification or image construction, reduce power consumption and motion blur, and relax the requirement of positioning the cameras at a set distance from shelves.
  • a wide-field camera can be used to create a template into which data from higher resolution cameras with a narrow field of view are mapped.
  • a tilt controllable, high resolution camera positioned on the camera support can be used to detect shelf labels and their content, including the price and product name, and decode their barcodes.
  • an inventory monitoring method includes the steps of allowing an autonomous robot to move along an aisle that is lined with shelves capable of holding inventory or products, with the autonomous robot acting as a movable base for multiple cameras. Multiple cameras are directed toward inventory on the shelf lined aisle, with data derived at least in part from these cameras being used to construct a realogram of inventory using a processing module contained in the autonomous robot. Realogram data created by the processing module can be transferred to remote locations using a communication module, and inventory information received via the communication module can be used to aid in realogram construction.
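The monitoring loop described above can be sketched in code. This is an illustrative sketch only, not the patent's implementation; the `Realogram` class, the `monitor_aisle` function, and the (aisle, shelf, slot) keying are all assumed names introduced for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Realogram:
    """Updateable map of product positions, keyed by (aisle, shelf, slot)."""
    positions: dict = field(default_factory=dict)

    def update(self, aisle, shelf, slot, product_id):
        self.positions[(aisle, shelf, slot)] = product_id

def monitor_aisle(aisle, captures, realogram, planogram=None):
    """One inspection pass: fold camera-derived detections into the realogram.

    `captures` is an iterable of (shelf, slot, product_id) detections derived
    from the multiple cameras; `planogram`, if provided, maps the same keys to
    expected products so mismatches (e.g. plugs) can be flagged.
    """
    mismatches = []
    for shelf, slot, product_id in captures:
        realogram.update(aisle, shelf, slot, product_id)
        if planogram is not None:
            expected = planogram.get((aisle, shelf, slot))
            if expected is not None and expected != product_id:
                mismatches.append((aisle, shelf, slot, expected, product_id))
    return mismatches
```

In this sketch the realogram is the locally stored state, and the returned mismatch list is the kind of data the communication module would forward to remote servers.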
  • an inventory monitoring method includes the steps of allowing an autonomous robot to move along a shelf lined aisle holding inventory, with the autonomous robot acting as a movable base for multiple cameras.
  • the autonomous robot can maintain a substantially constant distance from the shelf lined aisle holding inventory while moving in a forward or reverse direction.
  • at least part of a realogram of inventory positioned along a shelf lined aisle holding inventory can be constructed.
  • the realogram is created and updated with a locally sited data storage and a processing module contained in the autonomous robot.
  • the autonomous robot can pause, reverse, or mark for further multiple camera inspection if realogram creation for a portion of the shelf lined aisle is incomplete.
  • common issues associated with taking pictures from a moving base can be reduced by orientation of one or more of the multiple cameras in such a way as to take advantage of the rolling shutter effects and the direction of travel of the autonomous robot.
  • aligning a camera in such a way as to take advantage of the "rasterized" delay of the rolling shutter reduces the artifacts (elongation/shortening) that could occur while the robot is traveling in its path.
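The elongation/shortening artifact can be illustrated with a simple first-order model: the rolling shutter's readout line sweeps the scene at one speed while the robot's travel shifts the image at another, so a feature's apparent length scales by the ratio of the two. The function below is a hedged sketch of that geometry, not taken from the patent; all names are illustrative.

```python
def apparent_length(true_length, scan_speed, robot_speed, same_direction=True):
    """Apparent feature length along the travel axis under a rolling shutter.

    The readout line sweeps the scene at `scan_speed` while robot motion
    shifts the image at `robot_speed` (same units). Sweeping in the same
    direction as travel elongates features (factor s / (s - v) > 1); sweeping
    against travel shortens them. Illustrative first-order model only.
    """
    v = robot_speed if same_direction else -robot_speed
    if scan_speed <= abs(robot_speed):
        raise ValueError("readout sweep must outrun the robot for a usable image")
    return true_length * scan_speed / (scan_speed - v)
```

Under this model, aligning the readout direction against travel trades elongation for mild shortening, which is the kind of artifact-reduction choice the text describes.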
  • FIG. 1 is an illustration of a camera system mounted on a movable base to track product changes in aisle shelves or other suitable targets;
  • FIG. 2 is a cartoon illustrating two autonomous robots inspecting opposite shelves in an aisle;
  • FIG. 3 is an illustration of various systems and electronic modules connected to inventory cameras;
  • FIG. 4 is an illustration of steps in one embodiment of operation;
  • FIG. 5A and 5B are respectively examples in side view and cross section of an autonomous robot capable of acting as a mobile base for a camera system; and
  • FIG. 6 is a top view, looking down, of various possible camera support sites on a mobile base.
  • FIG. 1 is an illustration of an inventory monitoring camera system 100 mounted on a movable base 110 (with drive wheels 114) to track product changes in aisle shelves or other targets 102.
  • the movable base 110 is an autonomous robot having a navigation and object sensing suite 130 that is capable of independently navigating and moving throughout a building.
  • the autonomous robot has multiple cameras 140 attached to movable base 110 by a vertically extending camera support 140.
  • Lights 150 are positioned to direct light toward target 102.
  • the object sensing suite includes forward (133), side (134 and 135), top (132) and rear (not shown) image and depth sensors to aid in object detection, localization, and navigation.
  • image sensors can be depth sensors that infer depth from stereo images, project an infrared mesh overlay that allows rough determination of object distance in an image, or that infer depth from the time of flight of light reflecting off the target.
  • simple cameras and various image processing algorithms for identifying object position and location can be used.
  • ultrasonic sensors, radar systems, magnetometers or the like can be used to aid in navigation.
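For the stereo case mentioned above, depth follows the standard pinhole-stereo relation Z = f·B/d. A minimal sketch (the patent does not give this formula; the function and argument names are illustrative):

```python
def depth_from_stereo(focal_px, baseline_m, disparity_px):
    """Depth of a matched feature from a calibrated stereo pair: Z = f * B / d.

    `focal_px` is the focal length in pixels, `baseline_m` the separation
    between the two cameras in meters, and `disparity_px` the horizontal
    pixel offset of the same feature between the left and right images.
    Standard pinhole-stereo geometry; a sketch, not the patent's method.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px
```

Small disparities correspond to distant objects, which is why such systems give only rough distance estimates at long range.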
  • In FIG. 1, various representative camera types useful for constructing an updateable realogram are shown.
  • a realogram can use camera derived images to produce an updateable map of product or inventory position.
  • camera derived images can provide other useful inventory related information such as out of stocks, low stock, label location and content, shelves height and depth, section boundaries, or other operations or marketing/sales relevant data that can be extracted, utilized, and potentially delivered as a service to customers.
  • one or more shelf units (e.g. target 102) would be imaged by a diverse set of camera types, including downwardly (142 and 144) or upwardly (143 and 148) fixed focal length cameras that cover a defined field less than the whole of a target shelf unit; a wide field camera 145 to provide greater photographic coverage than the fixed focal length cameras; and a narrow field, zoomable telephoto camera 146 to capture bar codes, product identification numbers, and shelf labels.
  • a high resolution, tilt controllable camera can be used to identify shelf labels.
  • These camera-derived images can be stitched together, with products in the images identified and their positions determined.
  • the multiple cameras are typically positioned a set distance from the shelves during the inspection process.
  • the shelves can be illuminated with LED or other directable lights 150 positioned on or near the cameras.
  • the multiple cameras can be linearly mounted in vertical, horizontal, or other suitable orientation on a camera support.
  • multiple cameras are fixedly mounted on a camera support. Such cameras can be arranged to point upward, downward, or level with respect to the camera support and the shelves. This advantageously permits a reduction in glare from products having highly reflective surfaces, since multiple cameras pointed in slightly different directions can result in at least one image with little or no glare.
  • Electronic control unit 120 contains an autonomous robot sensing and navigation control module 124 that manages robot responses.
  • Robot position localization may utilize external markers and fiducials, or rely solely on localization information provided by robot-mounted sensors.
  • Sensors for position determination include previously noted imaging, optical, ultrasonic sonar, radar, Lidar, Time of Flight, structured light, or other means of measuring distance between the robot and the environment, or incremental distance traveled by the mobile base, using techniques that include but are not limited to triangulation, visual flow, visual odometry and wheel odometry.
  • Electronic control unit 120 also provides image processing using a camera control and data processing module 122.
  • Autonomous robot sensing and navigation control module 124 manages robot responses, and communication module 126 manages data input and output.
  • the camera control and data processing module 122 can include a separate data storage module 123 (e.g. solid state hard drives) connected to a processing module 125.
  • the communication module 126 is connected to the processing module 125 to transfer realogram data to remote locations, including store servers or other supported camera systems, and additionally receive inventory information to aid in realogram construction.
  • realogram data is primarily stored and images are processed within the autonomous robot.
  • this reduces data transfer requirements, and permits operation even when local or cloud servers are not available.
  • FIG. 2 is a cartoon 200 illustrating two autonomous robots 230 and 232, similar to that discussed with respect to FIG. 1, inspecting opposite shelves 202 in an aisle. As shown, each robot follows path 205 along the length of an aisle, with multiple cameras capturing images of the shelves 202.
  • the robots 230 and 232 support at least one range finding sensor to measure distance between the multiple cameras and the shelves and products on shelves, with an accuracy of less than 5 cm, and with a typical accuracy range between about 5 cm and 1 mm.
  • LIDAR or other instruments with sub-millimeter accuracy can also be used in selected applications.
  • the robots 230 and 232 can move along a path generally parallel to the shelves 202. As the robots move, vertically positioned cameras are synchronized to simultaneously capture images of the shelves 202.
  • a depth map of the shelves and products is created by measuring distances from the shelf cameras to the shelves and products over the length of the shelving unit using image depth sensors and/or laser ranging instrumentation. Using available information, consecutive images can be stitched together to create a panorama that spans an entire shelving unit. The images can be first stitched vertically among all the cameras, and then horizontally and incrementally stitched with each new consecutive set of vertical images as the robots 230 and 232 move along an aisle.
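The vertical-then-horizontal stitch order can be sketched with toy "images" represented as lists of pixel rows. A production system would use feature-based registration (e.g. via OpenCV) rather than plain concatenation; this sketch only illustrates the ordering, and all names are assumptions.

```python
def stitch_vertical(column_images):
    """Concatenate one capture position's camera images top-to-bottom."""
    rows = []
    for img in column_images:   # each img is a list of pixel rows
        rows.extend(img)
    return rows

def stitch_horizontal(panorama, new_strip):
    """Append a freshly stitched vertical strip to the growing panorama."""
    if panorama is None:
        return [row[:] for row in new_strip]
    assert len(panorama) == len(new_strip), "strips must share height"
    return [p + n for p, n in zip(panorama, new_strip)]

def build_panorama(capture_positions):
    """Stitch vertically per capture position, then horizontally along the aisle."""
    panorama = None
    for column_images in capture_positions:
        strip = stitch_vertical(column_images)
        panorama = stitch_horizontal(panorama, strip)
    return panorama
```

The incremental horizontal step matters in practice: the panorama can grow as the robot moves, rather than waiting for a full aisle pass before processing begins.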
  • FIG. 3 is an illustration of various systems and electronic modules 300 supported by an autonomous robot having robot navigation and sensing 310.
  • Inventory cameras 340 are moved into a desired position with the aid of robot navigation and sensing module 310.
  • Lights 350 are directed toward product inventory and inventory camera control and image reconstruction 312 takes a series of inventory photos (and optional depth measurements) that can be stitched together to help form or update a realogram.
  • Realogram data is handled by an inventory data and local update module 314, which can transmit or receive realogram relevant information via communication system 316. Data can be communicated to a server local to the store, or transmitted by suitable internet or networking devices to remote company servers or cloud accessible data sites.
  • Inventory cameras 340 can include one or more movable cameras, zoom cameras, focusable cameras, wide-field cameras, infrared cameras, or other specialty cameras to aid in product identification or image construction.
  • a wide-field camera can be used to create an image organizing template into which data from higher resolution cameras with a narrow field of view are mapped.
  • a tilt controllable, high resolution camera positioned on the camera support roughly at a height of a shelf lip can be used to read shelf attached bar codes, identifying numbers, or labels.
  • conventional RGB CMOS or CCD sensors can be used, alone or in combination with spectral filters that may include narrowband, wideband, or polarization filters.
  • Embodiments can also include sensors capable of detecting infrared, ultraviolet, or other wavelengths to allow for hyperspectral image processing. This can allow, for example, monitoring and tracking of markers, labels, or guides that are not visible to people, or using flashing lights in the invisible spectrum that do not induce discomfort or health risk while reducing energy consumption and motion blur.
  • Lights may be mounted along with, or separately from, the sensors, and can include monochromatic or near monochromatic light sources such as lasers, light emitting diodes (LEDs), or organic light emitting diodes (OLEDs). Broadband light sources may be provided by multiple LEDs of varying wavelength (including infrared or ultraviolet LEDs), halogen lamps, or other suitable conventional light sources.
  • both cameras 340 and lights 350 can be movably mounted.
  • hinged, rail, electromagnetic piston, or other suitable actuating mechanisms can be used to rotate, elevate, depress, oscillate, or laterally or vertically reposition cameras or lights.
  • one or more of the cameras can be mounted in such a way as to take advantage of the rolling shutter effects and direction of travel of the autonomous robot.
  • Inventory data 314 can include but is not limited to an inventory database capable of storing data on a plurality of products, each product associated with a product type, product dimensions, a product 3D model, a product image and a current product shelf inventory count and number of facings.
  • Realograms captured and created at different times can be stored, and data analysis used to improve estimates of product availability. In certain embodiments, the frequency of realogram creation can be increased or reduced, and changes to robot navigation determined.
  • the communication system 316 can include a wired or wireless connection subsystem for interaction with devices such as servers, desktop computers, laptops, tablets, or smart phones. Data and control signals can be received, generated, or transported between varieties of external data sources, including wireless networks, personal area networks, cellular networks, the Internet, or cloud mediated data sources.
  • sources of local data (e.g. a hard drive, solid state drive, flash memory, or any other suitable memory, including dynamic memory such as SRAM or DRAM) can also be included.
  • multiple communication systems can be provided. For example, a direct Wi-Fi connection (802.11 b/g/n) can be used as well as a separate 4G cellular connection.
  • Remote server 318 can include, but is not limited to, servers, desktop computers, laptops, tablets, or smart phones.
  • Cloud computing may be defined as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned via virtualization and released with minimal management effort or service provider interaction, and then scaled accordingly.
  • a cloud model can be composed of various characteristics (e.g., on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, etc.), service models (e.g., Software as a Service ("SaaS"), Platform as a Service ("PaaS"), and Infrastructure as a Service ("IaaS")), and deployment models (e.g., private cloud, community cloud, public cloud, hybrid cloud, etc.).
  • FIG. 4 is an illustration of realogram updating steps in one embodiment of operation.
  • a robot moves to an identified position and proceeds along an aisle path at a predetermined distance (step 410). If the path is blocked by people or objects, the robot can wait until the path is unobstructed, begin movement and slow down or wait as it nears the obstruction, move along the path until required to divert around the object before reacquiring the path, or simply select an alternative aisle.
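The obstruction handling options amount to a small decision policy. A sketch under assumed inputs (all names and the slow-down threshold are illustrative, not from the patent):

```python
def plan_around_obstruction(path_blocked, distance_to_obstacle,
                            can_divert, alternative_aisle_free,
                            slow_down_radius=2.0):
    """Choose a response to a blocked or partially blocked aisle path.

    Mirrors the options in the text: proceed, slow near an upcoming
    obstruction, divert around it and reacquire the path, select another
    aisle, or simply wait. All thresholds and labels are illustrative.
    """
    if not path_blocked:
        if distance_to_obstacle is not None and distance_to_obstacle < slow_down_radius:
            return "slow_down"
        return "proceed"
    if can_divert:
        return "divert_and_reacquire_path"
    if alternative_aisle_free:
        return "select_alternative_aisle"
    return "wait"
```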
  • in step 412, multiple images are captured and stitched together. These stitched images, along with depth information created by distance ranging systems (e.g. infrared depth sensors, including but not limited to LIDAR or time-of-flight systems; ultrasonic systems; systems that infer depth from stereo images; systems that project an infrared mesh overlay that allows rough determination of object distance in an image; or any other suitable system capable of distinguishing depth at about a ten centimeter or less scale, including but not limited to centimeter, sub-centimeter, or millimeter scales), are used to construct or update the realogram.
  • the realogram uses shelf labels, bar codes, and product identification databases to identify products, localize product placement, estimate product count, count the number of product facings, or even identify or locate missing products.
  • This information is communicated to a remote server (step 416) for use by, for example, store managers, stocking employees, or customer assistant representatives.
  • a segmented image can include multiple product bounding boxes, typically ranging from dozens to hundreds of outlined or distinct image areas. The bounding boxes can surround either product facings, groups of products, or gaps between products.
  • the product bounding box, with suitable identifiers, can be registered to a simple or panoramic stitched image of the shelf, and image descriptors extracted for the portion of the image contained in the bounding box.
  • Methods for generating image descriptors include but are not limited to: image templates, Histogram of Gradients, Histogram of Colors, the Scale Invariant Feature Transform, Binary Robust Independent Elementary Features, Maximally Stable Extremal Regions, Binary Robust Invariant Scalable Keypoints, Fast Retina Keypoints, Kaze features, and variations thereof.
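Of the descriptors listed, a histogram of colors is the simplest to make concrete. The sketch below bins RGB pixels into a fixed-length, order-invariant descriptor; the binning scheme and names are illustrative assumptions, not the patent's implementation.

```python
def color_histogram(pixels, bins_per_channel=4):
    """Normalized color histogram descriptor for a bounding-box crop.

    `pixels` is an iterable of (r, g, b) tuples with 0-255 channels.
    Each channel is quantized into `bins_per_channel` bins, giving a
    bins**3-dimensional descriptor that is invariant to pixel ordering,
    so it tolerates small shifts of the product within the crop.
    """
    n_bins = bins_per_channel ** 3
    hist = [0.0] * n_bins
    step = 256 // bins_per_channel
    count = 0
    for r, g, b in pixels:
        idx = ((r // step) * bins_per_channel ** 2
               + (g // step) * bins_per_channel
               + (b // step))
        hist[idx] += 1.0
        count += 1
    return [h / count for h in hist] if count else hist
```

Descriptors like this are compared between a library entry and a shelf crop; more discriminative choices (SIFT, BRISK, learned CNN features) follow the same extract-then-compare pattern.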
  • Classifiers may include those based on deep structured learning.
  • a deep learning based classifier can automatically learn image descriptors based on annotated training data.
  • deep learning based image descriptors can be hierarchical, corresponding to multiple layers in deep convolutional neural networks.
  • the final layer of a convolutional neural network outputs the confidence values of the product being in one of the designated image categories.
  • the image descriptor generator part and the classification part are integrated in a convolutional neural network, and these two parts are trained together using a training set.
  • embodiments that use both deep learning based image descriptors and conventional image descriptors can be combined in a hybrid system.
  • the image descriptors can be classified and labelled with the identifier.
  • Classification algorithms can include, but are not limited to, support vector machines. This process can be repeated for every image of the bounding box associated with the same identifier, whether the image is captured in the same store at different times, or in different stores. In time, this allows automatically building a product library (i.e. the "Library of Products"), without requiring an initial planogram or storage of specific product databases.
  • products within product bounding boxes can be manually identified, identified using crowdsourced or paid reviewer image identification systems, identified with or without the aid of an initial planogram or realogram, or automatically identified using various image classifiers discussed herein. Gaps between products are useful for identifying shelf spacings, product separation, or missing/absent inventory. Automatic identification can be performed using an autonomous robot, alone or in combination with an external image classifier system.
  • a product bounding box can be defined as the horizontal space on the shelf occupied by one or more copies (facings) of the same product, along with the vertical space spanning the distance between the current shelf and the shelf above it. When the current shelf is the top shelf, the vertical space is a number generally corresponding to the distance to the top of the fixture. The vertical space can alternatively be the top of the product as sensed by depth sensors.
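The vertical-extent rule just described can be written directly. Argument names and the coordinate convention (heights increase upward) are illustrative assumptions:

```python
def bounding_box_height(shelf_y, shelf_above_y=None, fixture_top_y=None,
                        product_top_y=None):
    """Vertical span of a product bounding box per the rule in the text.

    Uses the gap to the shelf above when one exists; on the top shelf,
    falls back to the fixture top, or alternatively to a depth-sensed
    product top. All coordinates are heights in the same units.
    """
    if shelf_above_y is not None:
        top = shelf_above_y           # ordinary shelf: up to the shelf above
    elif fixture_top_y is not None:
        top = fixture_top_y           # top shelf: up to the fixture top
    elif product_top_y is not None:
        top = product_top_y           # or the product top from depth sensors
    else:
        raise ValueError("top shelf needs a fixture top or sensed product top")
    return top - shelf_y
```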
  • Image segmentation to automatically assist in creation of product bounding boxes and product identification can rely on use of image templates in some embodiments.
  • each image template is compared with the image captured by a camera system mounted on an autonomous robot. If a match is positive, the matched section of the image is used as the image segmentation for that product.
  • image segmentation can be supported by machine learning systems, including but not limited to deep learning methods.
  • classifiers such as convolutional neural networks or other deep learning methods, template matching, or HAAR cascades can be used to aid in detection of each shelf label.
  • Each shelf label is analyzed to obtain one or more product identifiers. Analysis may include but is not limited to optical character recognition, bar code scanning, QR code scanning, AR code scanning, or hologram code scanning.
  • Product identifiers may be UPC code, the product name, or a coded collection of letters, numbers, or other symbols. If more than one identifier is available, a preferred identifier such as the UPC code can be selected.
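Selecting a preferred identifier when a label yields several is a simple priority lookup; a sketch with an assumed preference ordering (the ordering beyond "UPC first" and all names are illustrative):

```python
PREFERENCE = ("upc", "name", "coded_symbols")  # illustrative ordering; UPC preferred

def preferred_identifier(identifiers):
    """Pick one product identifier when a label yields several.

    `identifiers` maps identifier type to value, e.g.
    {"upc": "012345", "name": "Cola 12oz"}. Returns the highest-priority
    (kind, value) pair present, or None if the label yielded nothing.
    """
    for kind in PREFERENCE:
        value = identifiers.get(kind)
        if value:
            return kind, value
    return None
```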
  • infrared or ultraviolet detectable product identifiers embedded on product packaging or shelf labels can be used, as well as any other suitable tag, marker, or detectable identifying indicia such as a visible UPC code or serial number on the product packaging.
  • the library can be searched for realogram related information. For example, product objects with a large number of similar features can be used to assist in developing the product bounding box. For each potential product object match, the geometric consistency of the feature locations in the library can be compared with the features in a shelf image. Some methods further include indexing the sets of descriptors within the library for improved searching performance and/or reduced storage requirements. Indexing methods include but are not limited to: hashing techniques, tree representations, and bag-of-words encodings. Alternatively, planograms, realograms, other product information, or product location information from the product library can be used to reduce the number of products that must be searched to just those products contained within the imaged shelf.
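The bag-of-words indexing idea can be sketched as an inverted index from quantized descriptor "words" to product identifiers. The quantization step and class names are illustrative assumptions, not the patent's method:

```python
from collections import defaultdict

def quantize(descriptor, step=0.25):
    """Map a real-valued descriptor vector to a hashable visual 'word'."""
    return tuple(int(x / step) for x in descriptor)

class ProductLibraryIndex:
    """Inverted index from visual words to product ids, bag-of-words style.

    Querying touches only products that share at least one word with the
    query, avoiding a scan of the full product library.
    """

    def __init__(self):
        self.index = defaultdict(set)

    def add(self, product_id, descriptors):
        for d in descriptors:
            self.index[quantize(d)].add(product_id)

    def query(self, descriptors):
        """Rank candidate products by how many query words they share."""
        votes = defaultdict(int)
        for d in descriptors:
            for pid in self.index.get(quantize(d), ()):
                votes[pid] += 1
        return sorted(votes.items(), key=lambda kv: -kv[1])
```

A geometric-consistency check, as the text notes, would then verify the top-ranked candidates against feature positions in the shelf image.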
  • FIG. 5A and 5B are respectively examples in side view and cross section of an autonomous robot 500 capable of acting as a mobile base for a camera system in accordance with this disclosure.
  • the robot navigation and sensing unit includes a top mount sensor module 510 with a number of forward, side, rear, and top mounted cameras.
  • a vertically aligned array of lights 520 is sited next to a vertically arranged line of cameras 530, and both are supported by a drive base 540 that includes control electronics, power, and docking interconnects.
  • Mobility is provided by drive wheels 560, and stability is improved by caster wheels 550.
  • FIG. 6 is a top-down view of camera platform 600, showing various possible camera support sites on a mobile base 610.
  • The mobile base 610 has a top-mounted camera and sensor suite 620 (optionally viewing over 360 degrees) to aid in positioning and navigating the mobile base 610 with respect to shelves in a retail store or warehouse aisle, and to capture 360-degree or spherical images of the environment.
  • Fixedly mounted cameras 630 and 640 can be positioned to point perpendicular to the direction of mobile base motion (640) or angled slightly forward (630).
  • A controllable gimbal or tilt mount (650) can be used to point a camera in a desired direction.
  • A boom 670, extending horizontally from the mobile base 610, can be used to support multiple linearly arranged cameras directed to simultaneously capture images from each side of an aisle.
  • High-resolution cameras can be pointed toward one side of the aisle to read barcodes and identify labels, while low-resolution cameras are pointed toward the other side.
  • The low-resolution cameras simply detect labels and match them to previously identified labels.
  • The respective high-resolution and low-resolution cameras supported by the robot can then scan opposite sides.
  • Two-dimensional arrays of cameras, or 360-degree cameras, mounted at various positions on the mobile base 610 or camera platform 600 can also be used.
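As a concrete illustration of the preferred-identifier selection described above (choosing a UPC code when several identifiers are decoded from one shelf label), the following Python sketch applies a simple preference order. The function name, field names, and ordering are hypothetical, not taken from the disclosure.

```python
# Preference order for identifier types; a UPC code is preferred when
# more than one identifier is available (per the description above).
PREFERENCE = ["upc", "product_name", "symbol_code"]

def select_identifier(decoded):
    """Return the most preferred identifier present in `decoded`,
    a mapping of identifier type -> decoded value."""
    for kind in PREFERENCE:
        if decoded.get(kind):
            return decoded[kind]
    raise ValueError("no usable identifier decoded from label")

label = {"product_name": "Cola 12oz", "upc": "012345678905"}
print(select_identifier(label))  # the UPC wins over the product name
```

In practice the `decoded` mapping would be populated by whatever OCR or barcode-scanning step processed the label image.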
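The geometric-consistency check on matched feature locations can be sketched as follows. This is a deliberately simplified stand-in for the library search described above: descriptors are matched by equality, and consistency is measured as agreement of pairwise distances up to a single global scale, rather than a full geometric verification. All names are illustrative.

```python
import math

def match_features(library_feats, shelf_feats):
    """Pair up features with identical descriptors.
    Each feature is a (descriptor, (x, y)) tuple."""
    shelf_by_desc = {d: p for d, p in shelf_feats}
    return [(p, shelf_by_desc[d]) for d, p in library_feats if d in shelf_by_desc]

def geometric_consistency(matches, tol=0.1):
    """Fraction of matched point pairs whose inter-point distances agree
    up to a single global scale -- a crude proxy for geometric verification."""
    if len(matches) < 2:
        return 0.0
    ratios = []
    for i in range(len(matches)):
        for j in range(i + 1, len(matches)):
            (a1, b1), (a2, b2) = matches[i], matches[j]
            da = math.dist(a1, a2)   # distance in the library image
            db = math.dist(b1, b2)   # distance in the shelf image
            if da > 0:
                ratios.append(db / da)
    if not ratios:
        return 0.0
    scale = sorted(ratios)[len(ratios) // 2]  # median scale estimate
    ok = sum(1 for r in ratios if abs(r - scale) <= tol * scale)
    return ok / len(ratios)

# A library product seen in the shelf image at twice the size, translated:
lib = [("a", (0.0, 0.0)), ("b", (10.0, 0.0)), ("c", (0.0, 10.0))]
shelf = [("a", (5.0, 5.0)), ("b", (25.0, 5.0)), ("c", (5.0, 25.0))]
print(geometric_consistency(match_features(lib, shelf)))  # 1.0
```

A real system would use robust descriptor matching and a RANSAC-style model fit; this sketch only shows the shape of the consistency test.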
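As a rough illustration of sizing the vertically arranged line of cameras, the sketch below estimates how many cameras of a given vertical field of view are needed to cover a shelf of a given height from a given standoff distance, with some image overlap for stitching. The formula and parameter values are back-of-the-envelope assumptions, not figures from the disclosure.

```python
import math

def cameras_needed(shelf_height_m, standoff_m, vfov_deg, overlap=0.2):
    """Number of cameras in a vertical array covering the shelf height.

    Each camera images a vertical span of 2 * d * tan(vfov / 2) metres;
    consecutive cameras overlap by `overlap` of that span so the images
    can be stitched into one shelf view.
    """
    span = 2 * standoff_m * math.tan(math.radians(vfov_deg) / 2)
    effective = span * (1 - overlap)  # new coverage added per extra camera
    return max(1, math.ceil((shelf_height_m - span) / effective) + 1)

# e.g. a 2 m shelf imaged from 1 m away with 40-degree vertical-FOV cameras
print(cameras_needed(2.0, 1.0, 40))  # 4
```

The same arithmetic applies in reverse when choosing the standoff distance for a fixed camera count.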
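The two-pass aisle scan implied by the bullets above (high-resolution cameras read one side while low-resolution cameras detect labels on the other, then the sides swap) can be sketched as a simple pass plan. The function and field names are hypothetical, invented for illustration.

```python
def plan_aisle_passes(aisle_id):
    """Return per-pass camera-side assignments for one aisle: the
    high-resolution side is barcode-read, the low-resolution side only
    has its labels detected for matching against prior identifications."""
    return [
        {"aisle": aisle_id, "pass": 1, "high_res_side": "left",  "low_res_side": "right"},
        {"aisle": aisle_id, "pass": 2, "high_res_side": "right", "low_res_side": "left"},
    ]

def sides_fully_read(passes):
    """Both sides are barcode-read once each has had a high-res pass."""
    return {p["high_res_side"] for p in passes} == {"left", "right"}

passes = plan_aisle_passes("A7")
print(sides_fully_read(passes))  # True: two passes cover both sides
```

Swapping roles between passes lets a single robot carry one set of high-resolution optics yet still barcode-read both sides of every aisle.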

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Economics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Operations Research (AREA)
  • Marketing (AREA)
  • Human Resources & Organizations (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Development Economics (AREA)
  • Multimedia (AREA)
  • Finance (AREA)
  • Accounting & Taxation (AREA)
  • Health & Medical Sciences (AREA)
  • Vascular Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Automation & Control Theory (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Image Analysis (AREA)
  • Warehouses Or Storage Devices (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • General Factory Administration (AREA)
  • Studio Devices (AREA)
PCT/US2017/039304 2016-06-30 2017-06-26 Multiple camera system for inventory tracking Ceased WO2018005369A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CA3028156A CA3028156C (en) 2016-06-30 2017-06-26 MULTI-CAMERA SYSTEM FOR INVENTORY TRACKING
EP17821015.9A EP3479298A4 (en) 2016-06-30 2017-06-26 MULTIPLE CAMERA SYSTEM FOR INVENTORY TRACKING
CN201780038500.6A CN109328359A (zh) 2016-06-30 2017-06-26 Multiple camera system for inventory tracking
JP2018566442A JP2019530035A (ja) 2016-06-30 2017-06-26 Multiple camera system for inventory tracking

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662357124P 2016-06-30 2016-06-30
US62/357,124 2016-06-30

Publications (1)

Publication Number Publication Date
WO2018005369A1 true WO2018005369A1 (en) 2018-01-04

Family

ID=60785281

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/039304 Ceased WO2018005369A1 (en) 2016-06-30 2017-06-26 Multiple camera system for inventory tracking

Country Status (6)

Country Link
US (1) US10769582B2 (en)
EP (1) EP3479298A4 (en)
JP (1) JP2019530035A (ja)
CN (1) CN109328359A (zh)
CA (1) CA3028156C (en)
WO (1) WO2018005369A1 (en)


Families Citing this family (124)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10453046B2 (en) * 2014-06-13 2019-10-22 Conduent Business Services, Llc Store shelf imaging system
WO2016103285A1 (en) * 2014-12-24 2016-06-30 Datalogic Ip Tech S.R.L. System and method for reading direct part marking (dpm) codes on objects
US10552933B1 (en) 2015-05-20 2020-02-04 Digimarc Corporation Image processing methods and arrangements useful in automated store shelf inspections
US10909710B2 (en) * 2016-03-23 2021-02-02 Akcelita, LLC System and method for tracking product stock in a store shelf
US11681980B2 (en) * 2016-03-23 2023-06-20 Akcelita, LLC System and method for tracking product stock in a store shelf
KR20190031431A (ko) * 2016-03-29 2019-03-26 Bossa Nova Robotics IP, Inc. Method and system for locating, identifying and counting items
US10089599B2 (en) * 2016-04-07 2018-10-02 Walmart Apollo, Llc Systems and methods for locating containers with low inventory
US10949797B2 (en) * 2016-07-01 2021-03-16 Invia Robotics, Inc. Inventory management robots
WO2018016494A1 (ja) * 2016-07-20 2018-01-25 株式会社サンクレエ Inventory management server, inventory management system, inventory management program, and inventory management method
US10679275B2 (en) * 2016-08-19 2020-06-09 Jacob Kaufman System and method for locating in-store products
US20180101813A1 (en) * 2016-10-12 2018-04-12 Bossa Nova Robotics Ip, Inc. Method and System for Product Data Review
US11042161B2 (en) 2016-11-16 2021-06-22 Symbol Technologies, Llc Navigation control method and apparatus in a mobile automation system
US10438165B2 (en) * 2017-03-07 2019-10-08 Ricoh Company, Ltd. Planogram generation
US11978011B2 (en) 2017-05-01 2024-05-07 Symbol Technologies, Llc Method and apparatus for object status detection
US10949798B2 (en) 2017-05-01 2021-03-16 Symbol Technologies, Llc Multimodal localization and mapping for a mobile automation apparatus
US11449059B2 (en) 2017-05-01 2022-09-20 Symbol Technologies, Llc Obstacle detection for a mobile automation apparatus
US11367092B2 (en) 2017-05-01 2022-06-21 Symbol Technologies, Llc Method and apparatus for extracting and processing price text from an image set
US10663590B2 (en) 2017-05-01 2020-05-26 Symbol Technologies, Llc Device and method for merging lidar data
US10505057B2 (en) 2017-05-01 2019-12-10 Symbol Technologies, Llc Device and method for operating cameras and light sources wherein parasitic reflections from a paired light source are not reflected into the paired camera
US10726273B2 (en) 2017-05-01 2020-07-28 Symbol Technologies, Llc Method and apparatus for shelf feature and object placement detection from shelf images
US11093896B2 (en) 2017-05-01 2021-08-17 Symbol Technologies, Llc Product status detection system
US10591918B2 (en) 2017-05-01 2020-03-17 Symbol Technologies, Llc Fixed segmented lattice planning for a mobile automation apparatus
WO2018201423A1 (en) 2017-05-05 2018-11-08 Symbol Technologies, Llc Method and apparatus for detecting and interpreting price label text
US20190057367A1 (en) * 2017-08-17 2019-02-21 Walmart Apollo, Llc Method and apparatus for handling mis-ringing of products
US10572763B2 (en) 2017-09-07 2020-02-25 Symbol Technologies, Llc Method and apparatus for support surface edge detection
US10489677B2 (en) 2017-09-07 2019-11-26 Symbol Technologies, Llc Method and apparatus for shelf edge detection
US10521914B2 (en) * 2017-09-07 2019-12-31 Symbol Technologies, Llc Multi-sensor object recognition system and method
EP3483780A1 (en) * 2017-11-10 2019-05-15 Skidata Ag Classification and identification systems and methods
NZ765310A (en) 2017-11-14 2022-04-29 Hai Robotics Co Ltd Automated guided vehicle designed for warehouse
US11501522B2 (en) * 2017-12-06 2022-11-15 Nec Corporation Image recognition model generating device, image recognition model generating method, and image recognition model generating program storing medium
US10780930B1 (en) * 2018-02-20 2020-09-22 Zoox, Inc. Worm gear drive unit interface and assembly methods
US10960939B1 (en) 2018-02-20 2021-03-30 Zoox, Inc. Worm gear drive unit interface and assembly methods
US10823572B2 (en) 2018-04-05 2020-11-03 Symbol Technologies, Llc Method, system and apparatus for generating navigational data
US10809078B2 (en) 2018-04-05 2020-10-20 Symbol Technologies, Llc Method, system and apparatus for dynamic path generation
US11327504B2 (en) 2018-04-05 2022-05-10 Symbol Technologies, Llc Method, system and apparatus for mobile automation apparatus localization
US10740911B2 (en) 2018-04-05 2020-08-11 Symbol Technologies, Llc Method, system and apparatus for correcting translucency artifacts in data representing a support structure
US10832436B2 (en) 2018-04-05 2020-11-10 Symbol Technologies, Llc Method, system and apparatus for recovering label positions
US10630384B2 (en) 2018-06-13 2020-04-21 Infineon Technologies Ag Dual-mode optical devices for time-of-flight sensing and information transfer, and apparatus, systems, and methods utilizing same
CN109064395B (zh) * 2018-06-19 2023-06-16 广东数相智能科技有限公司 Bookshelf image stitching method based on book inventory, electronic device, and storage medium
CN108960202B (zh) * 2018-08-01 2022-05-10 京东方科技集团股份有限公司 Smart shelf, system, and method for determining product stacking
WO2020041734A1 (en) * 2018-08-24 2020-02-27 Bossa Nova Robotics Ip, Inc. Shelf-viewing camera with multiple focus depths
WO2020051469A1 (en) * 2018-09-06 2020-03-12 Apple Inc. Ultrasonic sensor
US11506483B2 (en) * 2018-10-05 2022-11-22 Zebra Technologies Corporation Method, system and apparatus for support structure depth determination
US11010920B2 (en) 2018-10-05 2021-05-18 Zebra Technologies Corporation Method, system and apparatus for object detection in point clouds
US11090811B2 (en) 2018-11-13 2021-08-17 Zebra Technologies Corporation Method and apparatus for labeling of support structures
US11003188B2 (en) 2018-11-13 2021-05-11 Zebra Technologies Corporation Method, system and apparatus for obstacle handling in navigational path generation
US11416000B2 (en) 2018-12-07 2022-08-16 Zebra Technologies Corporation Method and apparatus for navigational ray tracing
US11079240B2 (en) 2018-12-07 2021-08-03 Zebra Technologies Corporation Method, system and apparatus for adaptive particle filter localization
US11100303B2 (en) 2018-12-10 2021-08-24 Zebra Technologies Corporation Method, system and apparatus for auxiliary label detection and association
US11015938B2 (en) 2018-12-12 2021-05-25 Zebra Technologies Corporation Method, system and apparatus for navigational assistance
US10731970B2 (en) 2018-12-13 2020-08-04 Zebra Technologies Corporation Method, system and apparatus for support structure detection
US11126861B1 (en) 2018-12-14 2021-09-21 Digimarc Corporation Ambient inventorying arrangements
CN109669375B (zh) * 2018-12-27 2021-05-14 格讯科技(深圳)有限公司 Centralized control method and system for a food preparation room
CA3028708A1 (en) 2018-12-28 2020-06-28 Zih Corp. Method, system and apparatus for dynamic loop closure in mapping trajectories
US11321655B2 (en) * 2019-11-26 2022-05-03 Ncr Corporation Frictionless and autonomous control processing
US11037225B2 (en) * 2019-04-25 2021-06-15 Capital One Services, Llc Generating augmented reality vehicle information for a vehicle captured by cameras in a vehicle lot
CN110175483A (zh) * 2019-05-16 2019-08-27 王志伟 Label-based identification method
US11662739B2 (en) 2019-06-03 2023-05-30 Zebra Technologies Corporation Method, system and apparatus for adaptive ceiling-based localization
US11200677B2 (en) 2019-06-03 2021-12-14 Zebra Technologies Corporation Method, system and apparatus for shelf edge detection
US11341663B2 (en) 2019-06-03 2022-05-24 Zebra Technologies Corporation Method, system and apparatus for detecting support structure obstructions
US11402846B2 (en) 2019-06-03 2022-08-02 Zebra Technologies Corporation Method, system and apparatus for mitigating data capture light leakage
US11080566B2 (en) 2019-06-03 2021-08-03 Zebra Technologies Corporation Method, system and apparatus for gap detection in support structures with peg regions
US11151743B2 (en) 2019-06-03 2021-10-19 Zebra Technologies Corporation Method, system and apparatus for end of aisle detection
US11960286B2 (en) 2019-06-03 2024-04-16 Zebra Technologies Corporation Method, system and apparatus for dynamic task sequencing
US11093785B1 (en) * 2019-06-27 2021-08-17 Amazon Technologies, Inc. Inferring facility planograms
DE102019118046B4 (de) 2019-07-04 2025-03-20 Hänel & Co. Storage rack for storing at least one storage goods carrier, and method for detecting the inventory of a storage rack
US11501326B1 (en) 2019-07-23 2022-11-15 Inmar Clearing, Inc. Store low-stock item reporting and promotion system and related methods
US11069073B2 (en) 2019-07-23 2021-07-20 Advanced New Technologies Co., Ltd. On-shelf commodity detection method and system
US11562500B2 (en) 2019-07-24 2023-01-24 Squadle, Inc. Status monitoring using machine learning and machine vision
US11915192B2 (en) 2019-08-12 2024-02-27 Walmart Apollo, Llc Systems, devices, and methods for scanning a shopping space
WO2021034681A1 (en) 2019-08-16 2021-02-25 Bossa Nova Robotics Ip, Inc. Systems and methods for image capture and shelf content detection
US11354910B2 (en) * 2019-09-27 2022-06-07 Ncr Corporation Frictionless authentication and monitoring
CN111191974B (zh) * 2019-11-28 2023-07-04 泰康保险集团股份有限公司 Method and apparatus for drug inventory counting
US11507103B2 (en) 2019-12-04 2022-11-22 Zebra Technologies Corporation Method, system and apparatus for localization-based historical obstacle handling
US11107238B2 (en) 2019-12-13 2021-08-31 Zebra Technologies Corporation Method, system and apparatus for detecting item facings
FR3107977B1 (fr) * 2020-03-03 2022-03-18 Thales Sa Method for assisting in the detection of elements, and associated device and platform
US12288187B2 (en) * 2020-03-09 2025-04-29 Nec Corporation Product detection apparatus, product detection method, and non-transitory storage medium
JP7638628B2 (ja) * 2020-03-26 2025-03-04 東芝テック株式会社 Imaging apparatus and imaging method
US11822333B2 (en) 2020-03-30 2023-11-21 Zebra Technologies Corporation Method, system and apparatus for data capture illumination control
US20210374659A1 (en) * 2020-05-27 2021-12-02 Vimaan Robotics, Inc. Real Time Event Tracking and Digitization for Warehouse Inventory Management
CA3177901C (en) 2020-06-01 2024-01-02 Ido Merkado Systems and methods for retail environments
US11450024B2 (en) 2020-07-17 2022-09-20 Zebra Technologies Corporation Mixed depth object detection
CN112001669A (zh) * 2020-07-22 2020-11-27 国网内蒙古东部电力有限公司赤峰供电公司 Engineering material management and control method, system, and computer storage medium
US12067527B2 (en) 2020-08-12 2024-08-20 Carnegie Mellon University System and method for identifying misplaced products in a shelf management system
US12437258B2 (en) 2020-08-12 2025-10-07 Carnegie Mellon University System and method for identifying products in a shelf management system
US11341456B2 (en) 2020-08-25 2022-05-24 Datalogic Usa, Inc. Compact and low-power shelf monitoring system
CN112215142B (zh) * 2020-10-12 2021-08-13 上海汉时信息科技有限公司 Method, apparatus and device for detecting shelf out-of-stock rate based on depth image information
US11593915B2 (en) 2020-10-21 2023-02-28 Zebra Technologies Corporation Parallax-tolerant panoramic image generation
US11392891B2 (en) 2020-11-03 2022-07-19 Zebra Technologies Corporation Item placement detection and optimization in material handling systems
US11704787B2 (en) 2020-11-06 2023-07-18 Wipro Limited Method and system for determining stock in an inventory
US11847832B2 (en) 2020-11-11 2023-12-19 Zebra Technologies Corporation Object classification for autonomous navigation systems
US20230419702A1 (en) * 2020-11-20 2023-12-28 Omni Consumer Products, Llc System, method and apparatus for price label modeling tool
US20220187844A1 (en) * 2020-12-14 2022-06-16 Google Llc Autonomous 3D Datacenter Mapping System
CN112581618B (zh) * 2020-12-23 2024-05-24 深圳前海贾维斯数据咨询有限公司 Method and system for comparing three-dimensional building models with real scenes in the construction industry
US12469002B2 (en) * 2020-12-28 2025-11-11 International Business Machines Corporation Apparatus for automating inventory and automatic inventory system and method
US12406225B1 (en) * 2021-02-17 2025-09-02 Amazon Technologies, Inc. Devices for monitoring inventory within enclosures
CN115086539B (zh) * 2021-03-15 2024-02-02 虫极科技(北京)有限公司 Method and system for locating shooting points
US11842321B1 (en) * 2021-03-17 2023-12-12 Amazon Technologies, Inc. Image-based detection of planogram product spaces
CN115147748A (zh) * 2021-03-30 2022-10-04 上海聚均科技有限公司 Method and system for remote intelligent detection of goods in a storage space
CN115236082A (zh) * 2021-04-25 2022-10-25 泰科电子(上海)有限公司 Portable visual inspection device and method of inspecting articles using the same
US11954882B2 (en) 2021-06-17 2024-04-09 Zebra Technologies Corporation Feature-based georegistration for mobile computing devices
CN115983758A (zh) * 2021-10-14 2023-04-18 元气森林(北京)食品科技集团有限公司 Area inspection method, medium, and product
US12238422B2 (en) 2022-01-14 2025-02-25 Samsung Electronics Co., Ltd. Method of constructing front panorama of shelving from arbitrary series of frames based on shelving 3D model
US12189915B2 (en) 2022-06-24 2025-01-07 Lowe's Companies, Inc. Simulated environment for presenting virtual objects and virtual resets
US12211161B2 (en) 2022-06-24 2025-01-28 Lowe's Companies, Inc. Reset modeling based on reset and object properties
EP4332710B1 (en) 2022-08-30 2024-10-02 Ovh Methods and autonomous robots for taking inventory in a structure
JP2025535680A (ja) 2022-09-29 2025-10-28 Nomad Go, Inc. Method and apparatus for machine learning systems for edge computer vision and active reality
US12450558B2 (en) 2022-10-11 2025-10-21 Walmart Apollo, Llc Systems and methods of selecting an image from a group of images of a retail product storage area
US12288408B2 (en) 2022-10-11 2025-04-29 Walmart Apollo, Llc Systems and methods of identifying individual retail products in a product storage area based on an image of the product storage area
US12430608B2 (en) 2022-10-11 2025-09-30 Walmart Apollo, Llc Clustering of items with heterogeneous data points
US12380400B2 (en) 2022-10-14 2025-08-05 Walmart Apollo, Llc Systems and methods of mapping an interior space of a product storage facility
US12333488B2 (en) 2022-10-21 2025-06-17 Walmart Apollo, Llc Systems and methods of detecting price tags and associating the price tags with products
US12367457B2 (en) 2022-11-09 2025-07-22 Walmart Apollo, Llc Systems and methods of verifying price tag label-product pairings
US12430856B2 (en) 2022-12-16 2025-09-30 Lowe's Companies, Inc. Compact augmented reality view experience
US12374115B2 (en) 2023-01-24 2025-07-29 Walmart Apollo, Llc Systems and methods of using cached images to determine product counts on product storage structures of a product storage facility
US12469005B2 (en) 2023-01-24 2025-11-11 Walmart Apollo, Llc Methods and systems for creating reference image templates for identification of products on product storage structures of a product storage facility
US12450883B2 (en) 2023-01-24 2025-10-21 Walmart Apollo, Llc Systems and methods for processing images captured at a product storage facility
US12361375B2 (en) 2023-01-30 2025-07-15 Walmart Apollo, Llc Systems and methods of updating model templates associated with images of retail products at product storage facilities
US12412149B2 (en) 2023-01-30 2025-09-09 Walmart Apollo, Llc Systems and methods for analyzing and labeling images in a retail facility
US12469255B2 (en) 2023-02-13 2025-11-11 Walmart Apollo, Llc Systems and methods for identifying different product identifiers that correspond to the same product
US20240338658A1 (en) * 2023-04-07 2024-10-10 Walmart Apollo, Llc Linking items to digital shelf labels using modular image data
US12437263B2 (en) 2023-05-30 2025-10-07 Walmart Apollo, Llc Systems and methods of monitoring location labels of product storage structures of a product storage facility
CN117706572B (zh) * 2023-11-03 2025-06-20 中国外运股份有限公司 Cargo tracking method and apparatus, electronic device, and storage medium
KR102837746B1 (ko) * 2024-11-06 2025-07-25 변성안 Apparatus and method for obtaining inventory information of pharmaceuticals

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006063314A2 (en) * 2004-12-09 2006-06-15 Harvey Koselka Agricultural robot system and method
US8091782B2 (en) * 2007-11-08 2012-01-10 International Business Machines Corporation Using cameras to monitor actual inventory
US9120322B2 (en) * 2012-08-07 2015-09-01 Hitachi Industrial Equipment Systems Co., Ltd. Ink jet recording device
US20160119540A1 (en) * 2014-10-23 2016-04-28 Xerox Corporation Model-based plane-like panorama and retail applications

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7693757B2 (en) * 2006-09-21 2010-04-06 International Business Machines Corporation System and method for performing inventory using a mobile inventory robot
US9135491B2 (en) * 2007-08-31 2015-09-15 Accenture Global Services Limited Digital point-of-sale analyzer
JP2012215959A (ja) * 2011-03-31 2012-11-08 Fuji Security Systems Co Ltd Security robot
US8965561B2 (en) * 2013-03-15 2015-02-24 Cybernet Systems Corporation Automated warehousing using robotic forklifts
US9582516B2 (en) * 2013-10-17 2017-02-28 Nant Holdings Ip, Llc Wide area augmented reality location-based services
US9796093B2 (en) * 2014-10-24 2017-10-24 Fellow, Inc. Customer service robot and related systems and methods
US9536167B2 (en) * 2014-12-10 2017-01-03 Ricoh Co., Ltd. Realogram scene analysis of images: multiples for scene analysis
US9120622B1 (en) * 2015-04-16 2015-09-01 inVia Robotics, LLC Autonomous order fulfillment and inventory control robots
KR20190031431A (ko) * 2016-03-29 2019-03-26 Bossa Nova Robotics IP, Inc. Method and system for locating, identifying and counting items


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3479298A4 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190213534A1 (en) * 2018-01-10 2019-07-11 Trax Technology Solutions Pte Ltd. Withholding notifications due to temporary misplaced products
US10628660B2 (en) * 2018-01-10 2020-04-21 Trax Technology Solutions Pte Ltd. Withholding notifications due to temporary misplaced products
US12079771B2 (en) 2018-01-10 2024-09-03 Trax Technology Solutions Pte Ltd. Withholding notifications due to temporary misplaced products
EP3750114A4 (en) * 2018-02-06 2021-10-27 Adroit Worldwide Media, Inc. AUTOMATIC INVENTORY INTELLIGENCE SYSTEMS AND METHODS
EP3621000A1 (en) * 2018-09-10 2020-03-11 Aioi Systems Co., Ltd Operation assist device
CN110889317A (zh) * 2018-09-10 2020-03-17 爱鸥自动化系统有限公司 操作辅助装置
CN110889317B (zh) * 2018-09-10 2023-12-19 爱鸥自动化系统有限公司 操作辅助装置
WO2020147996A1 (en) * 2019-01-14 2020-07-23 Siemens Schweiz Ag Method and system for detecting building objects installed within a building
CN113544693A (zh) * 2019-01-14 2021-10-22 西门子瑞士有限公司 用于检测安装在建筑物内的建筑物对象的方法和系统
US12026595B2 (en) 2019-01-14 2024-07-02 Siemens Schweiz Ag Method and system for detecting building objects installed within a building
EP3953879A4 (en) * 2019-04-11 2022-11-23 Carnegie Mellon University SYSTEM AND PROCEDURE FOR ASSOCIATION OF PRODUCTS AND PRODUCT LABELS
US12254671B2 (en) 2021-06-30 2025-03-18 ARpalus LTD. Using SLAM 3D information to optimize training and use of deep neural networks for recognition and tracking of 3D object

Also Published As

Publication number Publication date
CA3028156A1 (en) 2018-01-04
US20180005176A1 (en) 2018-01-04
CA3028156C (en) 2025-05-27
US10769582B2 (en) 2020-09-08
EP3479298A4 (en) 2020-01-22
JP2019530035A (ja) 2019-10-17
EP3479298A1 (en) 2019-05-08
CN109328359A (zh) 2019-02-12

Similar Documents

Publication Publication Date Title
US10769582B2 (en) Multiple camera system for inventory tracking
US11087272B2 (en) System and method for locating, identifying and counting items
US20190034864A1 (en) Data Reduction in a Bar Code Reading Robot Shelf Monitoring System
US11774842B2 (en) Systems and methods for image capture and shelf content detection
US20190180150A1 (en) Color Haar Classifier for Retail Shelf Label Detection
US12437258B2 (en) System and method for identifying products in a shelf management system
US11915463B2 (en) System and method for the automatic enrollment of object images into a gallery
US12118506B2 (en) System and method for associating products and product labels
US11587195B2 (en) Image processing methods and arrangements useful in automated store shelf inspections
US10785418B2 (en) Glare reduction method and system
US12067527B2 (en) System and method for identifying misplaced products in a shelf management system
US20180101813A1 (en) Method and System for Product Data Review
US20200068126A1 (en) Shelf-Viewing Camera With Multiple Focus Depths
AU2019396253B2 (en) Method, system and apparatus for auxiliary label detection and association
JP2019523924A (ja) Method for automatically generating a planogram assigning products to shelf structures within a store
US20200097892A1 (en) System and method for automatic product enrollment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17821015

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 3028156

Country of ref document: CA

ENP Entry into the national phase

Ref document number: 2018566442

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017821015

Country of ref document: EP

Effective date: 20190130