US20230129852A1 - Apparatus and method to facilitate identification of items - Google Patents

Apparatus and method to facilitate identification of items

Info

Publication number
US20230129852A1
Authority
US
United States
Prior art keywords
identified
clustered
identifying label
item
group
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US17/914,745
Other versions
US12333813B2
Inventor
Joshua M. Horowitz
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Walmart Apollo LLC
Original Assignee
Walmart Apollo LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Walmart Apollo LLC
Priority to US17/914,745
Assigned to WALMART APOLLO, LLC. Assignment of assignors' interest (see document for details). Assignor: HOROWITZ, Joshua M.
Publication of US20230129852A1
Application granted
Publication of US12333813B2
Legal status: Active (current)
Adjusted expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/50: Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F 16/53: Querying
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/23: Clustering techniques
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/762: Arrangements for image or video recognition or understanding using pattern recognition or machine learning, using clustering, e.g. of similar faces in social networks
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/70: Labelling scene content, e.g. deriving syntactic or semantic representations
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00: Administration; Management
    • G06Q 10/08: Logistics, e.g. warehousing, loading or distribution; Inventory or stock management
    • G06Q 10/087: Inventory or stock management, e.g. order filling, procurement or balancing against orders

Definitions

  • So configured, clustered groups can receive a human-based identifying label when such labeling will be useful enough to be worth the time and effort.
  • Users will not be bothered with clustered groups that are too small from a practical standpoint and that may not benefit from an identifying label. Accordingly, these teachings strike a useful balance that permits a strong leveraging of automation while also preserving measured human intervention for appropriate times.
  • The aforementioned predetermined condition can be static, in which case the same predetermined condition applies over time.
  • Alternatively, the predetermined condition can be dynamic and hence change over time.
  • For example, when many new items are introduced, the number representing the predetermined condition can be temporarily reduced in order to facilitate those new items being more quickly provided with an identifying label.
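The dynamic variant of the predetermined condition described above can be sketched as a threshold that is temporarily lowered, for example during a period when many new items are being introduced. This is a minimal illustration: the function name, the base and reduced values, and the `new_item_influx` trigger are assumptions, not details from the patent.

```python
def labeling_threshold(base=10, reduced=3, new_item_influx=False):
    """Return the member-count threshold at which a user is alerted
    to assign an identifying label to an unlabeled clustered group.

    During a period of new-item influx (e.g. a seasonal assortment
    change), the threshold is temporarily reduced so that new items
    receive identifying labels more quickly.
    """
    return reduced if new_item_influx else base

print(labeling_threshold())                      # 10 (static case)
print(labeling_threshold(new_item_influx=True))  # 3 (temporarily reduced)
```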

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Computing Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Image Analysis (AREA)

Abstract

A control circuit accesses a digitized image of a particular item and then processes that digitized image to thereby assign various aspects of the digitized image to various dimensions in N-dimensional Euclidean space to thereby provide an N-dimensional representation. The control circuit then accesses a database containing a plurality of various clustered groups of N-dimensional representations. At least some of the clustered groups have a corresponding identifying label. Conducting a nearest neighbor search serves to identify a clustered group to which the particular item most likely belongs to thereby provide an identified cluster group. When the identified cluster group does not have a corresponding identifying label, the control circuit can then further determine whether a predetermined condition has also been met. When true, the control circuit alerts a user via a user interface that the identified cluster group should be assigned an identifying label.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • This application claims the benefit of U.S. Provisional Application Number 63/000,029, filed Mar. 26, 2020, which is incorporated herein by reference in its entirety.
  • TECHNICAL FIELD
  • These teachings relate generally to identifying items and more particularly to selectively facilitating the identification of previously unidentified items.
  • BACKGROUND
  • Many application settings include a large number of discrete items. Inventory for a vehicular repair facility or for a modern retailer can include tens of thousands or even hundreds of thousands of individual differentiated items as well as many duplicated items. Proper identification of such items is necessary to facilitate the overall management of such items.
  • Humans are generally capable of identifying such items. As items come and go, and especially as new items are introduced, however, the scale of the identification task can exceed practical human capability. Some prior art approaches suggest using artificial intelligence to handle the identification of a large number of items. Unfortunately, even the best and most sophisticated of present artificial intelligence approaches can be error-prone when new items or modified items are presented for consideration.
  • A simple approach to resolve the foregoing problem is to alert a human to assess every such instance of uncertainty regarding the identification of a particular item by an automated-identification platform. Unfortunately, such an approach can lead to a considerable waste of time for the corresponding human(s). In particular, such uncertainty can arise through relatively rare occurrences (on a per item basis) that are nevertheless relatively voluminous in the aggregate across a large number of items needing identification.
  • Requiring a human to oversee each and every such instance of uncertainty can be counterproductive at best and of little value in the overall scheme of things.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above needs are at least partially met through provision of the apparatus and method to facilitate identification of items described in the following detailed description, particularly when studied in conjunction with the drawings, wherein:
  • FIG. 1 comprises a block diagram as configured in accordance with various embodiments of these teachings;
  • FIG. 2 comprises a flow diagram as configured in accordance with various embodiments of these teachings; and
  • FIG. 3 comprises a view of an N-dimensional sampled vector space as projected down to three dimensions as configured in accordance with various embodiments of these teachings.
  • Elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions and/or relative positioning of some of the elements in the figures may be exaggerated relative to other elements to help to improve understanding of various embodiments of the present teachings. Also, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present teachings. Certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art will understand that such specificity with respect to sequence is not actually required. The terms and expressions used herein have the ordinary technical meaning as is accorded to such terms and expressions by persons skilled in the technical field as set forth above except where different specific meanings have otherwise been set forth herein. The word “or” when used herein shall be interpreted as having a disjunctive construction rather than a conjunctive construction unless otherwise specifically indicated.
  • DETAILED DESCRIPTION
  • Generally speaking, pursuant to many of these various embodiments a control circuit accesses a digitized image of a particular item and then processes that digitized image to thereby assign various aspects of the digitized image to various dimensions in N-dimensional Euclidean space to thereby provide an N-dimensional representation. The control circuit then accesses a database containing a plurality of various clustered groups of N-dimensional representations. By one approach, at least some of the clustered groups have a corresponding identifying label. Conducting a nearest neighbor search serves to identify a clustered group to which the particular item most likely belongs to thereby provide an identified cluster group. When the identified cluster group does not have a corresponding identifying label, the control circuit can then further determine whether a predetermined condition has also been met. When true, the control circuit alerts a user via a user interface that the identified cluster group should be assigned an identifying label.
  • By one approach, when the identified cluster group does have a corresponding identifying label, the control circuit identifies the item as being one that is identified by that identifying label.
  • By one approach, N is an integer. Generally speaking, it can be computationally useful for N to be an even number. For many application settings N can equal 512 with very useful results.
  • By one approach, the aforementioned predetermined condition can comprise a particular number of members of the identified cluster group, for example 3 or 10. So configured, these teachings will not generate an alert to a human when only one or a very few items have not been previously identified. Instead, the system must see at least a predetermined number of items that are both unidentified and similarly clustered before such an alert is provided. This approach avoids expending unnecessary human oversight on items that are too rare in the application setting.
  • These and other benefits may become clearer upon making a thorough review and study of the following detailed description. Referring now to the drawings, and in particular to FIG. 1 , an illustrative apparatus 100 that is compatible with many of these teachings will now be presented.
  • In this particular example, the enabling apparatus 100 includes a control circuit 101. Being a “circuit,” the control circuit 101 therefore comprises structure that includes at least one (and typically many) electrically-conductive paths (such as paths comprised of a conductive metal such as copper or silver) that convey electricity in an ordered manner, which path(s) will also typically include corresponding electrical components (both passive (such as resistors and capacitors) and active (such as any of a variety of semiconductor-based devices) as appropriate) to permit the circuit to effect the control aspect of these teachings.
  • Such a control circuit 101 can comprise a fixed-purpose hard-wired hardware platform (including but not limited to an application-specific integrated circuit (ASIC) (which is an integrated circuit that is customized by design for a particular use, rather than intended for general-purpose use), a field-programmable gate array (FPGA), and the like) or can comprise a partially or wholly-programmable hardware platform (including but not limited to microcontrollers, microprocessors, and the like). These architectural options for such structures are well known and understood in the art and require no further description here. This control circuit 101 is configured (for example, by using corresponding programming as will be well understood by those skilled in the art) to carry out one or more of the steps, actions, and/or functions described herein.
  • In this example the control circuit 101 operably couples to a memory 102. This memory 102 may be integral to the control circuit 101 or can be physically discrete (in whole or in part) from the control circuit 101 as desired. This memory 102 can also be local with respect to the control circuit 101 (where, for example, both share a common circuit board, chassis, power supply, and/or housing) or can be partially or wholly remote with respect to the control circuit 101 (where, for example, the memory 102 is physically located in another facility, metropolitan area, or even country as compared to the control circuit 101).
  • In addition to the aforementioned database information, this memory 102 can serve, for example, to non-transitorily store the computer instructions that, when executed by the control circuit 101, cause the control circuit 101 to behave as described herein. (As used herein, this reference to “non-transitorily” will be understood to refer to a non-ephemeral state for the stored contents (and hence excludes when the stored contents merely constitute signals or waves) rather than volatility of the storage media itself, and hence includes both non-volatile memory (such as read-only memory (ROM)) and volatile memory (such as dynamic random access memory (DRAM)).)
  • In this illustrative example the control circuit 101 also operably couples to a user interface 103. This user interface 103 can comprise any of a variety of user-input mechanisms (such as, but not limited to, keyboards and keypads, cursor-control devices, touch-sensitive displays, speech-recognition interfaces, gesture-recognition interfaces, and so forth) and/or user-output mechanisms (such as, but not limited to, visual displays, audio transducers, printers, and so forth) to facilitate receiving information and/or instructions from a user and/or providing information to a user. It will be understood that the user interface 103 may communicatively couple to the control circuit 101 via a wireless or non-wireless mechanism/network.
  • The aforementioned memory 102 has stored therein a digitized image of a particular item 104. That particular item 104 may be, for example, a replacement part at an aviation repair facility, munitions at a military supply depot, an item being offered for sale at retail at a retail sales facility 107, or essentially any other non-human item (including both naturally occurring as well as man-made items). The digitized image can be provided by one or more image capture apparatuses 105. Digital still image and video cameras are examples in these regards. That digitized image of the particular item 104 may be captured as a singular event or, depending upon the application setting, may be captured in a series of images that are attained when capturing a number of digitized images of a plurality of items (represented in FIG. 1 by the aforementioned particular item 104 through an Nth item 106).
  • Referring now to FIG. 2 , in this illustrative example it is presumed that the above-mentioned control circuit 101 carries out the illustrated process 200.
  • This process 200, at block 201, provides for accessing a digitized image of a particular item 104. In this example the control circuit 101 accesses the aforementioned memory 102 to thereby access that digitized image.
  • At block 202 the control circuit 101 then processes that digitized image to thereby assign various aspects of the digitized image to various dimensions in N-dimensional Euclidean space to thereby provide an N-dimensional representation (where N is an integer). For many application settings the applicant has determined that having N be an even number is beneficial. Generally speaking, for many application settings N can be selected from the range of about 64 to about 512. In this example it is presumed that N equals 512. There are various approaches by which the foregoing may be specifically accomplished. As one salient example in these regards, see “FaceNet: A Unified Embedding for Face Recognition and Clustering” by Schroff et al. in the Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, published Jun. 17, 2015 (hereinafter the “FaceNet publication”), the contents of which are hereby incorporated in their entirety by this reference.
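The mapping performed at block 202 can be illustrated with a minimal sketch. The patent does not specify a model; the FaceNet publication it cites uses a deep convolutional network, whereas the stand-in below is just a seeded random linear projection so the example is self-contained. The L2-normalization step mirrors FaceNet's practice of constraining embeddings to a unit hypersphere; every name and number here is an illustrative assumption.

```python
import numpy as np

N = 512  # embedding dimensionality; the text favors an even N, often 512

def embed(image, weights):
    """Map a digitized image to a point in N-dimensional Euclidean space.

    `weights` stands in for a trained model (a deep CNN in the FaceNet
    publication); here it is a fixed random linear projection so the
    sketch runs end to end.
    """
    features = image.astype(np.float64).ravel()  # flatten pixels to a vector
    raw = weights @ features                     # project to N dimensions
    return raw / np.linalg.norm(raw)             # L2-normalize (unit hypersphere)

rng = np.random.default_rng(0)
image = rng.random((8, 8))                       # toy 8x8 grayscale "image"
weights = rng.standard_normal((N, image.size))
vector = embed(image, weights)
print(vector.shape)                              # (512,)
```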
  • At block 203 the control circuit 101 then accesses a database (which, if desired, can also be stored by the aforementioned memory 102) that contains a plurality of clustered groups of other N-dimensional representations, wherein at least some of the clustered groups have a corresponding identifying label, and then conducts a nearest neighbor search to identify the clustered group to which the particular item most likely belongs, to thereby provide an identified clustered group. FIG. 3 presents an illustrative example of a plurality of clustered groups for retail products at a particular retail facility, where a 128-dimensional sampled vector space has been projected down to three dimensions for the sake of simplicity and clarity. Again, there are various approaches by which the foregoing may be accomplished; the aforementioned FaceNet publication provides a salient example.
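One simple way to realize the nearest neighbor search of block 203 is a nearest-centroid lookup over the stored clustered groups, sketched below with toy 3-dimensional vectors. Representing each group by its mean vector and scanning groups linearly are assumptions made for brevity; a production system would more likely query individual member vectors through an approximate-nearest-neighbor index.

```python
import numpy as np

def nearest_clustered_group(vector, groups):
    """Return the key of the clustered group whose centroid lies closest
    (in Euclidean distance) to `vector`.

    `groups` maps a group key to an (m, N) array of member vectors.
    """
    best_key, best_dist = None, float("inf")
    for key, members in groups.items():
        dist = np.linalg.norm(vector - members.mean(axis=0))
        if dist < best_dist:
            best_key, best_dist = key, dist
    return best_key

# Two toy clustered groups in a 3-dimensional space (N = 3 for readability)
groups = {
    "bananas": np.array([[1.0, 0.0, 0.0], [0.9, 0.1, 0.0]]),
    "motor oil": np.array([[0.0, 1.0, 0.0], [0.0, 0.9, 0.1]]),
}
print(nearest_clustered_group(np.array([0.95, 0.05, 0.0]), groups))  # bananas
```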
  • Having identified the clustered group to which the particular item 104 likely belongs, at block 204 the control circuit 101 determines whether that identified clustered group has a corresponding identifying label. As one example, when the identified clustered group is for bananas, the corresponding identifying label might be “bananas.” As another example, when the identified clustered group is for a particular brand and size of motor oil, the corresponding identifying label can represent that group accordingly. By one approach, if desired, the identifying label can constitute or at least comprise a stock keeping unit (SKU) number, such SKU numbers being well known in the art.
  • When the identified clustered group does have a corresponding identifying label, by one optional approach and as shown at block 205, this process 200 can provide for identifying the particular item 104 as being an item that corresponds to the identifying label for the identified clustered group. As a simple example, when the identifying label is “banana,” the particular item 104 can be accordingly identified as being a banana.
  • When the identified clustered group does not have a corresponding identifying label, then at block 206 this process provides for determining whether a predetermined condition has also been met. By one approach this predetermined condition comprises a particular number of members of the identified clustered group. Such a number is typically an integer greater than zero and might range from, say, 3 to 5 or 10. In such a case, this process 200 can provide for determining whether this current instance of the particular item 104 represents (in the aggregate along with other members of the identified clustered group) a sum total that equals or exceeds (as desired) the foregoing number.
  • When not true, meaning that the sum total of all members of the identified clustered group (including the particular item 104) does not equal or exceed (as desired) the foregoing threshold number, this process 200 can accommodate any of a variety of responses. One such example can include temporal multitasking (pursuant to which the control circuit 101 conducts other tasks before returning to process a new digitized image of a subsequent item).
  • When true, however, meaning that the sum total of all members of the identified clustered group (including the particular item 104) does equal or exceed (as desired) the foregoing threshold number, this process 200 can then, at block 207, alert a user via the above-described user interface 103 that the identified clustered group should be assigned an identifying label.
  • An authorized person can then assess the item and assign the item/identified clustered group with an appropriate corresponding identifying label.
  • So configured, clustered groups can receive a human-based identifying label when such labeling will be useful enough to be worth the time and effort. At the same time, users will not be bothered with needing to pay attention to clustered groups that are too small from a practical standpoint and which may not benefit from an identifying label. Accordingly, these teachings strike a useful balance that permits a strong leveraging of automation while also preserving for appropriate times the need for measured human intervention.
  • By one approach the aforementioned predetermined condition can be static over time, in which case the same predetermined condition applies over time. By another approach the predetermined condition can be more dynamic over time and hence change. When one or more new inventory items are introduced into a particular system, for example, the number representing the predetermined condition can be temporarily reduced in order to facilitate those new items being more quickly provided with an identifying label.
  • Those skilled in the art will recognize that a wide variety of modifications, alterations, and combinations can be made with respect to the above described embodiments without departing from the scope of the invention, and that such modifications, alterations, and combinations are to be viewed as being within the ambit of the inventive concept.
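The per-item flow described above (blocks 202 through 207) can be sketched in code. The following is an illustrative sketch only, not an implementation drawn from the patent: `embed_image` is a stand-in for a trained FaceNet-style embedding network, and names such as `ClusteredGroup`, `nearest_group`, and `process_item` are hypothetical.

```python
import math
import random

N = 512  # dimensionality of the Euclidean embedding space (block 202)

def embed_image(image_bytes: bytes) -> list[float]:
    """Stand-in for a FaceNet-style network that maps a digitized image to an
    N-dimensional representation; a real system would run a trained model."""
    rng = random.Random(image_bytes)
    return [rng.gauss(0.0, 1.0) for _ in range(N)]

class ClusteredGroup:
    """One clustered group of N-dimensional representations (block 203)."""
    def __init__(self, centroid, label=None):
        self.centroid = centroid  # representative N-dimensional point
        self.label = label        # e.g. "bananas" or a SKU; None if unlabeled
        self.members = []         # representations assigned to this group

def nearest_group(representation, groups):
    """Brute-force nearest neighbor search over group centroids (block 203)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(groups, key=lambda g: dist(representation, g.centroid))

def process_item(image_bytes, groups, threshold=3):
    """Blocks 202-207: identify the item when its group is labeled; otherwise
    alert a user once the unlabeled group reaches `threshold` members."""
    representation = embed_image(image_bytes)      # block 202
    group = nearest_group(representation, groups)  # block 203
    group.members.append(representation)
    if group.label is not None:                    # block 204
        return ("identified", group.label)         # block 205
    if len(group.members) >= threshold:            # block 206
        return ("alert", "assign an identifying label to this group")  # block 207
    return ("deferred", None)  # e.g. temporal multitasking until more members accrue
```

At production scale the brute-force search would typically be replaced by an approximate nearest neighbor index, and the static `threshold` value could be lowered temporarily for newly introduced inventory items, per the dynamic-condition approach described above.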

Claims (16)

What is claimed is:
1. A method comprising:
by a control circuit:
accessing a digitized image of a particular item;
processing the digitized image to thereby assign various aspects of the digitized image to various dimensions in N-dimensional Euclidean space to thereby provide an N-dimensional representation;
accessing a database containing a plurality of clustered groups of other N-dimensional representations, wherein at least some of the clustered groups have a corresponding identifying label, and conducting a nearest neighbor search to identify a clustered group to which the particular item most likely belongs to thereby provide an identified clustered group;
when the identified clustered group does not have a corresponding identifying label, determining whether a predetermined condition has also been met;
when the predetermined condition has also been met, alerting a user via a user interface that the identified clustered group should be assigned an identifying label.
2. The method of claim 1 wherein the particular item is an item being offered for sale at retail.
3. The method of claim 1 wherein N equals an integer.
4. The method of claim 3 wherein N is an even number.
5. The method of claim 4 wherein N equals 512.
6. The method of claim 1 wherein the corresponding identifying label for any given one of the plurality of clustered groups identifies any item that corresponds to that clustered group.
7. The method of claim 6 further comprising:
when the identified clustered group does have a corresponding identifying label, identifying the particular item as being an item that is identified by the identifying label for the identified clustered group.
8. The method of claim 1 wherein the predetermined condition comprises a particular number of members of the identified clustered group.
9. An apparatus comprising:
a memory having stored therein a database containing a plurality of clustered groups of N-dimensional image representations, wherein at least some of the clustered groups have a corresponding identifying label;
a user interface;
a control circuit operably coupled to the memory and the user interface and configured to:
access a digitized image of a particular item;
process the digitized image to thereby assign various aspects of the digitized image to various dimensions in N-dimensional Euclidean space to thereby provide an N-dimensional image representation;
access the database and conduct a nearest neighbor search to identify a clustered group to which the particular item most likely belongs to thereby provide an identified clustered group;
when the identified clustered group does not have a corresponding identifying label, determine whether a predetermined condition has also been met;
when the predetermined condition has also been met, alert a user via the user interface that the identified clustered group should be assigned an identifying label.
10. The apparatus of claim 9 wherein the particular item is an item being offered for sale at retail.
11. The apparatus of claim 9 wherein N equals an integer.
12. The apparatus of claim 11 wherein N is an even number.
13. The apparatus of claim 12 wherein N equals 512.
14. The apparatus of claim 9 wherein the corresponding identifying label for any given one of the plurality of clustered groups identifies any item that corresponds to that clustered group.
15. The apparatus of claim 14 wherein the control circuit is further configured to:
when the identified clustered group does have a corresponding identifying label, identify the particular item as being an item that is identified by the identifying label for the identified clustered group.
16. The apparatus of claim 9 wherein the predetermined condition comprises a particular number of members of the identified clustered group.
US17/914,745 2020-03-26 2021-03-25 Apparatus and method to facilitate identification of items Active 2042-04-02 US12333813B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/914,745 US12333813B2 (en) 2020-03-26 2021-03-25 Apparatus and method to facilitate identification of items

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202063000029P 2020-03-26 2020-03-26
US17/914,745 US12333813B2 (en) 2020-03-26 2021-03-25 Apparatus and method to facilitate identification of items
PCT/US2021/024052 WO2021195314A1 (en) 2020-03-26 2021-03-25 Apparatus and method to facilitate identification of items

Publications (2)

Publication Number Publication Date
US20230129852A1 true US20230129852A1 (en) 2023-04-27
US12333813B2 US12333813B2 (en) 2025-06-17

Family

ID=77892674

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/914,745 Active 2042-04-02 US12333813B2 (en) 2020-03-26 2021-03-25 Apparatus and method to facilitate identification of items

Country Status (4)

Country Link
US (1) US12333813B2 (en)
CA (1) CA3173331A1 (en)
MX (1) MX2022011775A (en)
WO (1) WO2021195314A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US12299629B2 (en) 2020-04-22 2025-05-13 Walmart Apollo, Llc Systems and methods of defining and identifying product display areas on product display shelves

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021195314A1 (en) 2020-03-26 2021-09-30 Walmart Apollo, Llc Apparatus and method to facilitate identification of items

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050021512A1 (en) * 2003-07-23 2005-01-27 Helmut Koenig Automatic indexing of digital image archives for content-based, context-sensitive searching
WO2009061434A1 (en) * 2007-11-07 2009-05-14 Viewdle, Inc. System and method for processing digital media
US20100023497A1 (en) * 2008-07-25 2010-01-28 Microsoft Corporation Using an ID Domain to Improve Searching
US20100067745A1 (en) * 2008-09-16 2010-03-18 Ivan Kovtun System and method for object clustering and identification in video
US20130066592A1 (en) * 2009-10-23 2013-03-14 Commissariat A L'energie Atomique Et Aux Energies Alternatives Method and system for evaluating the resemblance of a query object to reference objects
US8582802B2 (en) * 2009-10-09 2013-11-12 Edgenet, Inc. Automatic method to generate product attributes based solely on product images
US10963692B1 (en) * 2018-11-30 2021-03-30 Automation Anywhere, Inc. Deep learning based document image embeddings for layout classification and retrieval


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Schroff, F., "FaceNet: A Unified Embedding for Face Recognition and Clustering," CVPR 2015, pp. 815-823 (Year: 2015) *


Also Published As

Publication number Publication date
CA3173331A1 (en) 2021-09-30
WO2021195314A1 (en) 2021-09-30
MX2022011775A (en) 2023-02-09
US12333813B2 (en) 2025-06-17

