US20200202163A1 - Target positioning system and target positioning method


Info

Publication number
US20200202163A1
US20200202163A1
Authority
US
United States
Prior art keywords
goods
user
image
coordinates
rack
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/812,041
Other languages
English (en)
Inventor
Linan FENG
Ding XIA
Jieyu MA
Tingtao Li
Wenyao WU
Yimei ZHANG
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Cloudpick Smart Technology Co Ltd
Original Assignee
Shanghai Cloudpick Smart Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Cloudpick Smart Technology Co Ltd filed Critical Shanghai Cloudpick Smart Technology Co Ltd
Assigned to SHANGHAI CLOUDPICK SMART TECHNOLOGY CO., LTD. Assignment of assignors interest (see document for details). Assignors: FENG, Linan; LI, Tingtao; MA, Jieyu; WU, Wenyao; XIA, Ding; ZHANG, Yimei
Publication of US20200202163A1



Classifications

    • G06K 9/6211
    • G01G 19/00 Weighing apparatus or methods adapted for special purposes not provided for in the preceding groups
    • G06Q 30/0639 Item locations
    • A47F 5/0025 Display racks with shelves or receptacles having separate display containers or trays on shelves or on racks
    • G01G 19/52 Weighing apparatus combined with other objects, e.g. furniture
    • G01V 7/00 Measuring gravitational fields or waves; Gravimetric prospecting or detecting
    • G06F 16/22 Indexing; Data structures therefor; Storage structures
    • G06F 16/2379 Updates performed during online database operations; commit processing
    • G06F 16/587 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually, using geographical or spatial information, e.g. location
    • G06F 18/24 Classification techniques
    • G06K 9/00771
    • G06K 9/3241
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G06N 3/08 Learning methods
    • G06Q 10/087 Inventory or stock management, e.g. order filling, procurement or balancing against orders
    • G06Q 10/0875 Itemisation or classification of parts, supplies or services, e.g. bill of materials
    • G06Q 20/206 Point-of-sale [POS] network systems comprising security or operator identification provisions, e.g. password entry
    • G06Q 20/208 Input by product or record sensing, e.g. weighing or scanner processing
    • G06Q 30/02 Marketing; Price estimation or determination; Fundraising
    • G06Q 30/06 Buying, selling or leasing transactions
    • G06V 10/22 Image preprocessing by selection of a specific region containing or referencing a pattern; Locating or processing of specific regions to guide the detection or recognition
    • G06V 10/255 Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V 20/64 Three-dimensional objects
    • G06V 40/103 Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • G07C 9/32 Individual registration on entry or exit not involving the use of a pass, in combination with an identity check
    • G07C 9/38 Individual registration on entry or exit not involving the use of a pass, with central registration
    • G07G 1/0009 Details of the software in the checkout register, electronic cash register [ECR] or point of sale terminal [POS]
    • G07G 1/0018 Constructional details, e.g. of drawer, printing means, input means
    • G07G 1/0036 Checkout procedures
    • G07G 1/0063 Checkout procedures with a code reader for reading of an identifying code of the article to be registered, e.g. barcode reader or radio-frequency identity [RFID] reader, with control of supplementary check-parameters, e.g. weight or number of articles, with means for detecting the geometric dimensions of the article of which the code is read, such as its size or height, for the verification of the registration
    • G07G 1/009 Checkout procedures with a code reader for reading of an identifying code of the article to be registered, the reader being an RFID reader
    • G07G 1/12 Cash registers electronically operated
    • G07G 1/14 Systems including one or more distant stations co-operating with a central processing unit
    • H04W 4/33 Services specially adapted for particular environments, situations or purposes, for indoor environments, e.g. buildings
    • H04W 4/35 Services specially adapted for particular environments, situations or purposes, for the management of goods or merchandise
    • A47F 2010/025 Furniture or installations specially adapted to self-service type systems, e.g. supermarkets, using stock management systems
    • G01G 19/4144 Weighing apparatus with provisions for indicating, recording, or computing price or other quantities dependent on the weight, using electronic computing means only, for controlling weight of goods in commercial establishments, e.g. supermarket, P.O.S. systems
    • G01G 19/42 Weighing apparatus with provisions for indicating, recording, or computing price or other quantities dependent on the weight, for counting by weighing
    • G06V 2201/10 Recognition assisted with metadata
    • G06V 2201/12 Acquisition of 3D measurements of objects

Definitions

  • The disclosure relates to real-time user tracking technology for the retail industry, and in particular to a target positioning system and a target positioning method.
  • In one related-art scheme, human face recognition technology is adopted to identify user identities, and the type and quantity of commodities purchased by a user are determined according to the RFID tags of the goods carried on the user's body when the user leaves the supermarket; an RFID tag therefore needs to be set on each item, and an RFID reader-writer is arranged at the entrance guard.
  • This scheme does not need to track the real-time position of the user, but it has several defects. First, the hardware cost is high: each RFID tag costs about 0.5-1 yuan, which raises the cost of every commodity and reduces the supermarket's competitiveness; for goods costing 5 yuan, adding an RFID tag raises their cost by 10-20%. Second, a tag may be shielded during goods perception, e.g., by the user's body, so that the RFID reader fails to sense it, resulting in loss of goods. Third, settlement is possible only at the supermarket entrance guard; if the user eats edible goods before leaving the store and leaves the package in the supermarket, the RFID reader cannot sense or determine the user's real consumption. The scheme thus depends heavily on the self-discipline and morality of users rather than on technical constraints, so the business risk of such an unmanned supermarket is high.
  • the disclosure provides a target positioning system, a target positioning method and a target positioning apparatus.
  • a target positioning system including: a 3D image acquisition device that acquires at least one frame of 3D image in real time, where the 3D image comprises all or part of images of at least one target object; and a target coordinate generator in a data processing equipment, that establishes a 3D coordinate system in a closed space, and acquires coordinates or a group of coordinates of the target object in the 3D coordinate system in real time according to the at least one frame of 3D image.
  • a target positioning method including: setting a closed space; acquiring at least one frame of 3D image in real time, where the 3D image comprises all or part of images of at least one target object; and establishing a 3D coordinate system in the closed space, and acquiring coordinates or a group of coordinates of the target object in real time according to the at least one frame of 3D image.
  • a target positioning apparatus including: a processor; and a memory configured to store instructions executable by the processor; where the processor, upon execution of the instructions, is configured to: acquire at least one frame of 3D image in real time, where the 3D image comprises all or part of images of at least one target object; and establish a 3D coordinate system in a closed space, and acquire coordinates or a group of coordinates of the target object in real time according to the at least one frame of 3D image.
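The coordinate acquisition described above can be illustrated with a minimal sketch, not taken from the patent itself: given a pixel of a depth (3D) image and its measured depth, back-project it into camera-frame 3D coordinates using the standard pinhole camera model. The function name and the intrinsic parameters (fx, fy, cx, cy) are illustrative assumptions.

```python
def pixel_to_world(u, v, depth, fx, fy, cx, cy):
    """Back-project a depth-image pixel (u, v) with depth in metres into
    camera-frame 3D coordinates via the pinhole camera model (illustrative).

    fx, fy -- focal lengths in pixels; cx, cy -- principal point.
    """
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)
```

In a full system, such camera-frame points would still need to be transformed into the shared 3D coordinate system of the closed space using each acquisition device's known pose.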
  • FIG. 1 is a top view of an unmanned supermarket according to an example of the disclosure.
  • FIG. 2 is a schematic structural view of a tray and a rack according to an example of the disclosure.
  • FIG. 3 is a schematic view of the overall structure of the shelf according to an example of the disclosure.
  • FIG. 4 is a block diagram of a user identification system according to an example of the disclosure.
  • FIG. 5 is a block diagram of a target locating system according to an example of the disclosure.
  • FIG. 6 is a diagram illustrating the distribution of image sensors in a closed space according to an example of the disclosure.
  • FIG. 7 is a block diagram of a goods perception system based on weight monitoring according to an example of the disclosure.
  • FIG. 8 is a block diagram of the goods perception system based on image monitoring according to an example of the disclosure.
  • FIG. 9 is a diagram illustrating a positional relationship between the second camera and the shelf according to an example of the disclosure.
  • FIG. 10 is a block diagram of a shopping user determination system according to an example of the disclosure.
  • FIG. 11 is a block diagram of a shopping database system according to an example of the disclosure.
  • FIG. 12 is a block diagram of a settlement system according to an example of the disclosure.
  • FIG. 13 is a flowchart of a target locating method according to an example of the disclosure.
  • FIG. 14 is a flowchart illustrating a 3D image capturing process according to an example of the disclosure.
  • FIG. 15 is a flowchart illustrating a target coordinate acquiring step according to an example of the disclosure.
  • FIG. 16 is a flowchart illustrating a position parameter obtaining step according to an example of the disclosure.
  • FIG. 17 is a flowchart of another method for locating a target object according to an example of the disclosure.
  • FIG. 18 is a flowchart illustrating a shopping user determination step according to an example of the disclosure.
  • FIG. 19 is a flowchart illustrating a shopping information recording step according to an example of the disclosure.
  • When an element is referred to as being “on” another element, it can be directly on the other element, or an intermediate element may be present, with the element disposed on the intermediate element and the intermediate element disposed on the other element.
  • When an element is referred to as being “mounted to” or “connected to” another element, it is to be understood that the element is directly mounted or connected to the other element, or is indirectly mounted or connected to the other element through an intermediate element.
  • The example of the disclosure relates to a target positioning system which is part of an unmanned vending system of an unmanned supermarket; a plurality of image acquisition devices are arranged at the top of the space of the unmanned supermarket, so that the real-time position of a target object (e.g., a shopping user) in the space is acquired and effective tracking is realized.
  • The unmanned vending system includes a closed space 1, in which a plurality of shelves 2 are disposed; each shelf 2 includes a support 3 and a plurality of trays 4 detachably mounted on the support 3, and the trays 4 are parallel to each other at different heights or flush with each other at the same height.
  • Each tray 4 is provided with a plurality of racks 5 set in parallel, and at least one kind of goods is placed on each rack 5 .
  • The goods placed on each rack 5 must be easy for the user to take out or put back; thus the end of the rack 5 facing the user serves as the front end of the rack 5.
  • A weight sensing device 6, e.g., a rectangular weight sensor, is set between each rack 5 and its tray 4, with the lower surface of one end connected to the tray 4 and the upper surface of the other end connected to the rack 5.
  • Each rack 5 is an open box body on which one or more kinds of goods can be placed; the goods are standard goods, and goods of the same kind have the same or similar appearance and weight.
  • Goods of the same kind placed on the same rack 5 have the same weight value, different kinds of goods have different weight values, and each weight value corresponds to only one kind of goods.
  • The weight sensing device 6 can accurately acquire the real-time weight sensing value of each rack 5 and the goods on its upper surface, and accurately sense every variation of each rack's weight value, whether an increment or a decrement.
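Because each weight value corresponds to only one kind of goods, a rack's weight change identifies both the kind and the count of items moved. The logic can be sketched as follows; the per-item weight argument and the 5% tolerance are illustrative assumptions, not values from the patent.

```python
def classify_weight_change(delta_grams, item_weight, tolerance=0.05):
    """Infer the event type and item count from a rack's weight change.

    delta_grams -- signed change in the rack's weight sensing value
                   (negative: items taken; positive: items put back).
    item_weight -- per-item weight of the one kind of goods on this rack.
    Returns ("take"/"return", count), or None if the reading is not close
    to a whole number of items (e.g. sensor noise).
    """
    count = round(abs(delta_grams) / item_weight)
    if count == 0:
        return None
    # reject deltas that deviate too far from an integer multiple
    if abs(abs(delta_grams) - count * item_weight) > tolerance * item_weight * count:
        return None
    return ("take" if delta_grams < 0 else "return", count)
```

A real implementation would also debounce transient readings while the user's hand is still on the rack.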
  • The example further includes data processing equipment 7, such as a server or a computer; the data processing equipment 7 is provided with several pieces of data processing software, has a plurality of functional modules, and can be connected to various hardware through data lines, so as to implement a plurality of functions by combining software and hardware.
  • the example further includes a user identification system 100 for identifying the identity information of each user.
  • the user identification system 100 includes an access control device 101 and an identification device 102 .
  • The closed space 1 is not an absolutely sealed space but a relatively closed one, provided with accesses, e.g., a user entrance 103 and a user exit 104; all users enter the closed space 1 through the user entrance 103 and leave it through the user exit 104.
  • the closed space 1 may be formed by a housing, a plurality of walls, etc.
  • Each entrance and exit of the closed space 1 is provided with an access control device 101, e.g., an automatic door.
  • the identification device 102 is used for acquiring the identity information of the user, and comprises a scanning device 1021 connected to the data processing equipment 7 and an identity acquisition unit 1022 in the data processing equipment 7 .
  • the scanning device 1021 is disposed inside or outside the access control device 101 at the user entrance 103 , e.g., disposed on the outer surface of the automatic gate, and is used for scanning the identification code, e.g., a two-dimensional code;
  • the identity acquisition unit 1022 is a functional module in the data processing equipment 7 , and can acquire the identity information of the user according to the identity identification code.
  • the access control device 101 at the exit 104 does not need to be provided with the identification device 102 .
  • Each user downloads dedicated application software (APP) matching the unmanned supermarket to a mobile communication terminal (a mobile phone, a tablet computer, or the like), registers an account in the APP, and associates the account with payment software; alternatively, each user downloads payment software (such as WeChat Pay/Alipay) into the mobile communication terminal, an applet matching the unmanned supermarket is embedded in the payment software, and the user registers an account in the payment software. User registration information and electronic payment information, including user identity information, bank account information, and payment passwords, are set in the dedicated APP or the payment software.
  • the user identity information will be stored in the user database of the data processing equipment 7 .
  • Application software (APP) in the mobile communication terminal may generate a two-dimensional code, the two-dimensional code stores the identity information of the user, etc.
  • The two-dimensional code generated by the application software is presented to the scanning end of the scanning device 1021; after scanning, the scanning device 1021 decodes the two-dimensional code and transmits the decoded result to the data processing equipment 7. If the two-dimensional code is identifiable and the identified identity information matches identity information stored in advance in the user database, the user's identity is determined to be legal and the access control device 101 opens to allow the user to enter the closed space 1.
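The gate-opening decision above can be sketched as follows; the payload shape, the user records, and the `payment_bound` flag are hypothetical stand-ins for the decoded two-dimensional code and the user database, not structures defined in the patent.

```python
# Hypothetical user database: user ID -> registration record
# (a stand-in for the user database held by the data processing equipment).
registered_users = {
    "u1001": {"name": "Alice", "payment_bound": True},
}

def try_open_gate(decoded_qr_payload):
    """Return True (open the entrance access control device) only when the
    decoded two-dimensional code carries a user ID that is registered and
    has payment information bound; otherwise keep the gate closed."""
    user_id = decoded_qr_payload.get("user_id")
    user = registered_users.get(user_id)
    return user is not None and user["payment_bound"]
```

An unidentifiable code simply yields an empty or unknown payload, so the same check keeps the gate closed.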
  • the access control device 101 at the user entrance 103 is provided with a sensing device, such as an infrared sensor, and when the user enters the closed space 1 , the access control device 101 senses that someone has passed the access control and then automatically closes.
  • when the access control device 101 at the exit 104 senses that someone approaches it from the inside of the closed space 1, it opens automatically; after the user leaves the closed space 1, the access control device 101 senses that someone has passed through and then closes automatically.
  • the data processing equipment 7 may generate a shopping database of the user, and obtain shopping information to update the shopping database according to each shopping behavior of the user during the shopping process of the user. Because the mobile communication terminal carried by the user carries out real-time data exchange with the data processing equipment 7 through the application software (APP), the shopping database of the user can also be displayed in the application software (APP) in the mobile communication terminal to generate a shopping cart interface, so that the user can know the shopping records including shopping receipts and subsequent settlement details of each transaction.
  • the target positioning system 200 of the present example includes a closed space 1 , a 3D image acquisition device 201 , and a target coordinate acquisition unit or a target coordinate generator 202 , which are used to acquire the real-time position of each target in the closed space 1 .
  • the target object in this example is the whole or part of the user and the extension part thereof; the target positioning system 200 is thus a user positioning system, used to obtain the position of the whole or part (such as the head, a hand, etc.) of the user, that is, a group of coordinates (e.g., coordinates of a number of points representing the position of the user) in a 3D coordinate system.
  • the “target coordinate acquisition unit 202” may also be referred to as “target coordinate acquirer” or “target coordinate generator.” These terms may be used interchangeably throughout this disclosure, and may be implemented as a hardware device, a software module, or a combination of hardware and software, that acquires and/or stores target coordinates.
  • the 3D image acquisition device 201 includes at least one image sensor 2011 for capturing at least one frame of 3D image in real time.
  • the plurality of the image sensors 2011 are evenly distributed at the top of the closed space 1 , the lenses of the image sensors 2011 face downwards, and the central axes of the lenses can be perpendicular to the horizontal plane or have a certain inclination angle.
  • the field of view of the lenses of the image sensors 2011 covers the entire bottom surface of the closed space 1 .
  • the 3D image acquired by the image sensor comprises a user image, where the user image refers to a picture of the whole or part of the body of the user and the extension part thereof. If no one is in the closed space, the 3D image at each moment is the same as that at the previous moment, and that 3D image can be determined to be the background, containing no user image.
  • Each image sensor 2011 comprises a depth image sensor 2012 and an RGB image sensor 2013 which are set in parallel, and a 3D image integration unit or a 3D image integrator 2014. The depth image sensor 2012 continuously acquires a plurality of frames of depth images, the RGB image sensor 2013 continuously acquires a plurality of frames of RGB images, and the 3D image integration unit 2014 combines a frame of depth image and a frame of RGB image acquired by the same image sensor 2011 at the same time into a frame of 3D image.
  • the depth image sensor and the RGB image sensor acquire synchronously (simultaneously, and the acquisition frequency is the same), the image sensor 2011 can acquire RGB images and depth images with the same number of frames per second, and the 3D image integration unit 2014 can continuously acquire a plurality of frames of 3D images per second and transmit the 3D images to the target coordinate acquisition unit 202 of the data processing equipment 7 .
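As an illustrative sketch (not the patented implementation), the per-pixel combination performed by the 3D image integration unit 2014 can be modeled as pairing each depth pixel with the RGB pixel acquired at the same instant; the function name and the nested-list frame layout below are assumptions:

```python
def integrate_3d_frame(depth, rgb):
    """Combine one depth frame and one RGB frame captured by the same
    image sensor at the same moment into a single 3D image.  Sketch
    only: frames are H x W nested lists, depth[i][j] being a distance
    and rgb[i][j] an (r, g, b) tuple; the result stores a (z, r, g, b)
    tuple per pixel, from which later stages derive the position
    parameters x, y, z and the color parameters r, g, b."""
    assert len(depth) == len(rgb) and len(depth[0]) == len(rgb[0])
    return [[(depth[i][j],) + tuple(rgb[i][j])
             for j in range(len(depth[0]))]
            for i in range(len(depth))]
```

Because the two sensors acquire at the same frequency, one such merged frame exists for every synchronized pair of captures.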
  • the target coordinate acquisition unit 202 is a functional module in the data processing equipment 7 , establishes a 3D coordinate system in the closed space, and obtains coordinates or a group of coordinates of the user in the 3D coordinate system in real time according to the continuous multi-frame 3D images including the user images.
  • the target coordinate acquisition unit 202 includes a coordinate system building unit or a coordinate system builder 2021 , a parameter acquisition unit or a parameter acquirer 2022 , a background removal unit or a background remover 2023 , and a target coordinate calculation unit or a target coordinate calculator 2024 . As shown in FIG.
  • the “coordinate system building unit 2021” may also be referred to as the “coordinate system builder.” These terms may be used interchangeably throughout this disclosure, and may be implemented as a hardware device, a software module, or a combination of hardware and software, that performs the corresponding function described herein. Similarly, the terms “parameter acquisition unit” and “parameter acquirer,” “background removal unit” and “background remover,” “target coordinate calculation unit” and “target coordinate calculator,” respectively, may also be used interchangeably throughout this disclosure.
  • the coordinate system building unit 2021 establishes the 3D coordinate system in the closed space 1 , e.g., by selecting a center point of a bottom surface (a ground surface of an unmanned supermarket) of the closed space as an origin of the coordinate system, and setting an X axis and a Y axis in a horizontal direction and a Z axis in a vertical direction.
  • the position of the user can be represented by a group of coordinates (i.e., coordinates of a number of points).
  • the position of the user may also be represented by the coordinates of a specific point in the group of coordinates, for example, the position of the user can be represented by the coordinates of the highest point (the point having the largest Z-axis numerical value) in the group of coordinates of the user.
  • the parameter acquisition unit 2022 processes the continuous multi-frame 3D images including the user images, and obtains the position parameters and the color parameters of each pixel point of each frame of 3D image;
  • the position parameters are x, y and z, and represent the coordinates (the coordinates of the position) of the pixel point under the 3D coordinate system;
  • the color parameters are r, g and b, and respectively represent the intensities of the three primary colors of the pixel point.
  • the data processing equipment 7 may acquire a plurality of frames of 3D images, each frame of 3D image includes a user image and a background image, and each pixel may be a part of the user or a part of the background.
  • in different 3D images, pixel points which represent the same position on the user's body and its extension part have the same color parameters r, g and b. Because the distances between the image sensors at different positions and the user differ, the primary position parameters directly acquired by each image sensor are the position coordinates of a point on the user's body and its extension part relative to that image sensor; a coordinate transformation is therefore required to convert the primary position parameters acquired by image sensors at different positions into position parameters under the 3D coordinate system established in the closed space.
  • the parameter acquisition unit 2022 includes a sensor coordinate acquisition unit or a sensor coordinate acquirer 20221 , a relative coordinate acquisition unit or a relative coordinate acquirer 20222 , and a coordinate correction unit or a coordinate corrector 20223 .
  • the sensor coordinate acquisition unit 20221 acquires coordinates of a center point of an image sensor that acquires the frame of 3D image (e.g., a midpoint of a connecting line between the center points of the lenses of the depth image sensor 2012 and the RGB image sensor 2013 arranged in parallel) in the 3D coordinate system established in the closed space;
  • the relative coordinate acquisition unit 20222 establishes a second 3D coordinate system by taking the central point of the image sensor as a second origin, wherein the directions of the X axis, the Y axis and the Z axis of the second 3D coordinate system are the same as the 3D coordinate system, and acquires the coordinates of each pixel point in the second 3D coordinate system from the 3D image;
  • the coordinate correction unit 20223 is configured to calculate the position parameters of each pixel point under the 3D coordinate system established in the closed space, according to the coordinates of the center point of the image sensor and the coordinates of the pixel point under the second 3D coordinate system, thereby converting the primary position parameters of each pixel point into corrected position parameters.
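A minimal sketch of this correction, assuming (as stated above) that the second 3D coordinate system has the same axis directions as the global one, so the conversion reduces to a translation by the sensor center's global coordinates; names are illustrative:

```python
def correct_coordinates(sensor_center, relative_points):
    """Sketch of the coordinate correction of unit 20223: since the
    second (sensor-centred) 3D coordinate system shares axis
    directions with the global 3D coordinate system of the closed
    space, each pixel point's global coordinates are its
    sensor-relative coordinates shifted by the sensor centre's
    global coordinates."""
    cx, cy, cz = sensor_center
    return [(x + cx, y + cy, z + cz) for (x, y, z) in relative_points]
```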
  • in M frames of continuously acquired 3D images, each frame comprises, and comprises only, the image of one user. If the color parameters of N pixel points which belong to different 3D images but have the same position parameters are all the same, where N is greater than 0.9*M and less than or equal to M, the background removal unit 2023 determines that these N pixel points are background pixel points and removes the N background pixel points from the M frames of 3D images, obtaining M frames of background-free 3D images, namely the image of the user.
  • the position of the pixel point can be determined as the background, so that the pixel point can be removed from the corresponding 3D image.
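The N-out-of-M rule above can be sketched as follows, with each frame represented as a dict from position parameters (x, y, z) to color parameters (r, g, b); the data layout and the threshold argument are assumptions for illustration:

```python
def remove_background(frames, threshold=0.9):
    """Background removal over M frames: a pixel position whose color
    parameters are identical in more than threshold*M of the M frames
    is treated as background and removed from every frame."""
    m = len(frames)
    positions = set().union(*(f.keys() for f in frames))
    background = set()
    for pos in positions:
        colors = [f[pos] for f in frames if pos in f]
        # how many frames agree on the most common color at this position
        top = max(colors.count(c) for c in set(colors))
        if top > threshold * m:
            background.add(pos)
    return [{p: c for p, c in f.items() if p not in background}
            for f in frames]
```

A position whose color never changes across the M frames is therefore stripped out, while moving (user) pixels survive.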
  • in the target coordinate calculation unit 2024, if the target object is the whole of the user and the extension portion thereof, the set of the position parameters of all pixel points in the M frames of background-free 3D images is the group of coordinates of the whole of the user and the extension portion thereof; within this group of coordinates, the position parameters of the pixel point with the largest parameter z are defined as the coordinates of the user.
  • the remaining pixel points can represent the whole travel track of the user.
  • the travel track may include footprints or other tracks of the user. If each frame of 3D image in the continuously acquired M frames of 3D images comprises images of a plurality of users, a 3D image which only comprises all or part of one user needs to be extracted from each of the M frames of 3D images.
  • similarly, a group of coordinates of a part of the user, such as the head, shoulder, elbow, wrist or hand, can be acquired.
  • the depth image sensor 2012 and the RGB image sensor 2013 are respectively provided with a lens, the lens of the depth image sensor 2012 and the lens of the RGB image sensor 2013 are set in parallel and adjacent to each other, and if the central axes of the two lenses are set to be perpendicular to the horizontal plane, the two lenses overlook goods and users in the closed space.
  • the two lenses can capture the group of coordinates of the head and the shoulder of the user, and when the user stretches out the hand, the group of coordinates of the arm, the elbow, the wrist and the hand of the user can be captured.
  • the head, the shoulder, the elbow, the wrist and the hand of the user at a certain moment can all be connected into a fold line or a curve, so that a correspondence between a hand and a head position can be established; that is, the position of a certain hand can be acquired in real time, and at the same time it can be determined to which user the hand belongs.
  • the field of view of the image sensor 2011 may cover a partial space outside the doorway, and when the user is outside the doorway, the image of the user may be acquired by the image sensor 2011 .
  • All processes of using the unmanned vending system by the user include an identity identification process at an entrance and an exit, a process of entering the closed space 1 , a process of walking or staying in the closed space 1 and a process of leaving the closed space 1 , and all the processes are under the monitoring of the image sensor 2011 , so that the real-time position of a certain user with a known identity and a part of the body of the user in the closed space 1 can be monitored in real time.
  • the data processing equipment 7 can obtain the identity information of the user, and the image sensor 2011 starts to track the position of the user in real time from the time when the scanning device 1021 reads the code, so as to monitor whether the user matches a certain shelf.
  • when the image sensor 2011 can no longer acquire the real-time 3D image of the user, it can be concluded that the user has finished shopping, and settlement is then performed.
  • the example further comprises a goods perception system for sensing the picking and placing states of each goods in real time, and when any kind of goods is taken away or placed back, the type and the quantity of the taken away or placed back goods are obtained.
  • the taking and placing states comprise a goods standing state, a taken-away state and a placed-back state.
  • the goods perception system may use two different methods.
  • the example further includes a goods perception system 300 based on weight monitoring for sensing the picking and placing states of each kind of goods in real time, wherein the picking and placing states include a goods standing state, a taken-away state and a put-back state.
  • the goods perception system based on weight monitoring 300 includes a goods database generation unit 301 , a weight acquisition unit 302 , a pick or place status judgment unit 303 , and a goods database updating unit 304 .
  • the four units are functional modules in the data processing equipment 7 , and work together with the shelf 2 provided with the weight sensing device 6 , so that the real-time weight sensing value of each rack 5 can be monitored, and whether goods are taken away or put back can be determined.
  • the goods perception system based on weight monitoring 300 obtains the type and the quantity of the removed or replaced goods.
  • the goods database generation unit 301 is used for generating a goods database; the goods database comprises goods information of each item (each goods) and a weight sensing value of each rack for placing the goods, wherein the goods information comprises the type of the goods, the weight value of a single product, the rack number and the shelf number corresponding to the goods, the serial number of the goods, the name, the model, the net content, the unit price and the like.
  • the item database generation unit 301 includes an initialization unit 3011 , an information input unit 3012 , and a value initialization unit 3013 .
  • the initialization unit 3011 is configured to perform initialization processing on the goods database, and establish the goods database in the memory of the data processing equipment 7 .
  • the information input unit 3012 is used for entering the weight value and the goods information of each item, storing the weight value and the goods information in the goods database, and the weight value of each item on the shelf of the unmanned supermarket is entered in the goods database by using a keyboard or a code scanner.
  • the value initializing unit 3013 is configured to collect the weight sensing value of each rack after the goods are placed thereon, and store the weight sensing value in the goods database.
  • the goods information is entered into the data processing equipment 7 and stored in the goods database.
  • For example, for a certain brand of beverage, the weight of the rack is 100 g and the weight of each bottle is 200 g; after eight bottles are placed on the rack, the weight sensing value of the rack in the goods database after initialization is 1700 g.
  • Information such as the product name (a certain herbal tea), net content (195 ml), place of origin (Guangdong), unit price (5 yuan), single-product weight value (200 g), shelf number (1), rack number (1-12) and goods number (025) corresponding to the brand of beverage is also stored in the goods database.
  • the weight acquisition unit 302 is respectively connected to the weight sensing device 6 in each rack 5 through data lines to collect the real-time weight sensing value of each rack 5 in real time, for example, the collecting time interval is 0.1-0.2 seconds.
  • the real-time weight sensing value is the sensing value of the weight sensor; before goods are placed on a rack 5 it represents the weight of the rack itself, and after goods are placed on the rack 5 it represents the total weight of the rack and the goods on it; the real-time weight sensing value changes whenever goods are taken away from or put back onto the rack 5.
  • the sensor value collected by the weight sensing device 6 (weight sensor) in real time each time is combined with the numerical values of the parameters k and b, so that the total weight of the existing goods on each rack can be calculated.
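The role of the parameters k and b can be sketched as a linear calibration, an assumption consistent with the text but not spelled out there: each rack's raw sensor reading maps to a weight via weight = k × sensor_value + b, with k and b determined per rack against known reference weights:

```python
def rack_weight(sensor_value, k, b):
    """Sketch of the sensor-to-weight conversion hinted at by the
    parameters k and b: assuming the weight sensing device 6 responds
    linearly, the total weight currently on a rack is recovered from
    the raw sensor reading as k * sensor_value + b."""
    return k * sensor_value + b
```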
  • the pick or place status judgment unit 303 is configured to determine whether the weight sensing value of each rack changes. If the weight sensing value of a rack becomes smaller, it determines that goods are taken away from the rack; if the weight sensing value becomes larger, it determines that goods are placed on the rack; if the weight sensing value does not change, the goods on the rack are unchanged, and the weight acquisition unit 302 performs real-time acquisition again.
  • the pick or place status judgment unit 303 includes a weight difference calculation unit 3031 , a weight difference judgment unit 3032 , and a rack information recording unit 3033 .
  • the difference value calculation unit 3031 calculates a difference value between the real-time weight sensing value of each rack acquired in real time and the weight sensing value of the same rack stored in the goods database, and records the difference value as the weight difference value of each rack. For example, in the above example, if the weight of the rack on which the beverage of a certain brand is placed changes to 1300 g or 1900 g, the weight difference will be recorded as ⁇ 400 g or 200 g, respectively.
  • the difference value judgment unit 3032 compares the weight difference value of at least one rack with 0; when the weight difference value of a rack is less than 0, it determines that goods are taken away from the rack; when the weight difference value of the rack is greater than 0, it determines that goods are placed on the rack, but it cannot be determined whether those goods are ones the user previously took from the shelf; they may also be personal belongings of the user or other goods.
  • a weight difference of ⁇ 400 g can be considered as a removal of the goods; and in the case where the weight difference is 200 g, the goods are determined to be placed on the shelf.
  • the rack information recording unit 3033 records the rack number of the rack and the weight difference of the rack. For example, if the weight difference in the previous example is −400 g, the weight of the rack is known to have decreased, and the rack number (1-12) is recorded.
  • If the weight difference value in the previous example is 200 g, the weight of the rack is known to have increased; however, the goods placed on the rack are not necessarily the goods originally on that rack. They may be goods originally belonging to another rack or goods carried by the user, so an alarm signal can be selectively generated to remind the manager or the user, and if necessary the rack number can be displayed on a display for the manager or the user to handle in time.
  • the pick or place status judgment unit 303 further includes a goods type determining unit 3034 and a goods number calculating unit 3035 .
  • the goods type judgment unit 3034 determines the type of the taken goods according to the rack number and the goods information corresponding to the rack stored in the goods database. For example, if the rack number is known to be (1-12) and only one kind of goods is placed on each rack, the kind of goods can be determined to be a certain herbal tea, and other goods information, such as the single-product weight value (200 g), net content (195 ml), place of origin (Guangdong) and unit price (5 yuan), can be found accordingly. If the rack holds several kinds of goods, only the possible types and quantities of the taken goods can be determined preliminarily according to the weight difference value.
  • the goods quantity calculation unit 3035 calculates the ratio of the absolute value of the weight difference of one rack to the weight value of a single goods on the rack stored in the goods database, and the ratio is rounded by using a rounding method, and the obtained integer is the quantity of the taken goods.
  • the weight difference in the previous example is −400 g, with an absolute value of 400 g; the ratio of this value to the single-product weight (200 g) is 2, and thus 2 is the quantity of the goods taken away.
  • the directly calculated ratio is not necessarily an integer but will generally be close to one, so the ratio needs to be rounded to the nearest integer, after which the type and quantity of the taken goods can be determined.
  • the goods type judgment unit 3034 determines the type of the goods to be returned according to the rack number and the goods information corresponding to the rack.
  • the goods quantity calculating unit 3035 calculates the ratio of the absolute value of the weight difference of the rack to the weight value of the goods corresponding to the rack, and performs a rounding processing on the ratio by using a rounding method, so that the obtained integer is the number of the goods to be placed back.
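Putting the weight-difference judgment and the rounding step together, the logic above can be sketched as follows; the function name and return format are assumptions for illustration:

```python
def judge_pick_or_place(stored_value, realtime_value, unit_weight):
    """Sketch of the pick/place judgment and quantity calculation:
    a negative weight difference means goods were taken away, a
    positive one means goods were placed back, and rounding
    |difference| / unit_weight to the nearest integer gives the
    quantity.  unit_weight is the single-product weight of the goods
    assigned to the rack."""
    diff = realtime_value - stored_value
    if diff == 0:
        return ("unchanged", 0)
    status = "taken" if diff < 0 else "placed"
    return (status, round(abs(diff) / unit_weight))
```

With the herbal-tea example above (stored value 1700 g, unit weight 200 g), a real-time value of 1300 g yields two items taken, and the rounding absorbs small sensor noise.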
  • the goods database updating unit 304 is configured to store the real-time weight sensing value in the goods database, and form a new weight sensing value to update the weight sensing value of each rack in the goods database for next calling and determination.
  • the goods perception system based on weight monitoring 300 of this example provides a weight-based goods perception scheme: it monitors the real-time weight sensing value of the goods on the shelf, senses the weight change of each rack in real time, infers from the weight changes of all racks on the shelf which kind of goods has been taken away or put back, and determines the type and quantity of the taken-away or put-back goods.
  • the example further includes a goods perception system based on image monitoring 400 , which includes a sample acquisition unit 401 , a model training unit 402 , a real-time image acquisition unit 403 , and a goods category acquisition unit 404 , wherein the four units are function modules in the data processing equipment 7 , and the goods perception system based on image monitoring 400 can monitor the real-time image of the front area of the shelf, and determine the category of the goods that are taken away or replaced.
  • the number of the second cameras 406 is two or four, and the second cameras 406 are disposed outside the shelf 2 , and each of the second cameras 406 faces one corner of the shelf 2 .
  • the foremost ends of the plurality of racks 5 of the shelf 2 are located on the same plane, the plane is called a shelf plane, the second camera 406 is provided with a lens, and the field of view of the lens covers the space in front of the shelf; when the goods are taken down from the shelf or placed on the shelf, the images of the taking down process or the putting back process of the goods are shot by the second camera.
  • the space in front of the shelf is a space area corresponding to the plane of the shelf in front of the shelf, the space in front of the shelf generally refers to an area of a range 30-50 cm wide in front of the shelf, and the lens of each second camera 406 faces the central area of the space in front of the shelf.
  • the sample acquisition unit 401 is used for acquiring at least one group of picture samples, wherein each group of picture samples comprises a plurality of sample pictures of one kind of goods at multiple angles; a group of picture samples of the same kind of goods is provided with the same group identification, and the group identification represents the kind of goods corresponding to that group of picture samples.
  • the first camera 405 needs to take 3000 to 5000 pictures at different angles and different distances for each kind of goods on the shelf 2 , and transmits the pictures to the sample collection unit 401 of the data processing equipment 7 .
  • the model training unit 402 is configured to train a Convolutional Neural Network (CNN) model according to each sample picture in the plurality of sets of picture samples and the group identifier of each sample picture, and obtain a goods identification model.
  • the convolutional neural network model in this example is a Faster R-CNN network model with a small computation amount and a fast response speed; its fastest response time is only about 0.2 seconds, so the type and quantity of the goods in a picture can be accurately identified in a very short time.
  • the real-time picture acquisition unit 403 is connected to the plurality of second cameras 406 , and is configured to continuously capture at least one real-time picture of the space in front of the shelf, where each real-time picture includes part or all of one or more goods pictures.
  • the second camera 406 may take a whole or partial picture of the goods in front of the shelf from different angles and display the shape, pattern and color of the goods.
  • the goods category acquisition unit 404 is configured to obtain a type of the goods displayed in the real-time picture according to the real-time picture and the goods identification model.
  • the real-time picture acquisition unit 403 pre-processes a plurality of real-time pictures collected in a certain period and inputs the processed pictures into the goods identification model, so as to determine the group identifier corresponding to the pictures in the period, and determine the type of the goods shot in the period according to the group identifier.
  • the target coordinate acquisition unit 202 may obtain a real-time group of coordinates of each user's hand, and when there is an intersection between the group of coordinates of the rack space above a rack 5 and the group of coordinates of a user's hand, the shelf-to-user matching determining unit 503 determines that the rack 5 matches the user; it may be considered that the user has stretched a hand into the rack space above the rack 5 and may take away or put back goods.
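The intersection test described above can be sketched as a point-in-box check, approximating the rack space as an axis-aligned box in the 3D coordinate system; the box representation and function name are assumptions for illustration:

```python
def rack_matches_user(rack_space, hand_coordinates):
    """Sketch of the shelf-to-user matching test: rack_space is an
    axis-aligned box ((x0, x1), (y0, y1), (z0, z1)) above a rack, and
    hand_coordinates is the group of coordinates of one user's hand.
    The rack matches the user as soon as any hand point lies inside
    the box."""
    (x0, x1), (y0, y1), (z0, z1) = rack_space
    return any(x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1
               for (x, y, z) in hand_coordinates)
```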
  • the example further includes a shopping information recording unit or a shopping information recorder 600 , which is a functional module in the data processing equipment 7 , for generating at least one shopping database according to the identity information of each user, so as to record the category and quantity of at least one item taken by each user.
  • the shopping information recording unit 600 includes a shopping database generation unit or a shopping database generator 601 and a shopping database update unit or a shopping database updater 602 .
  • the goods type determining unit 3034 preliminarily determines the possible types and quantities of the taken goods according to the goods information stored in the goods database and the rack number whose weight difference value is less than 0. If goods of other kinds have been misplaced on the rack in addition to the original goods, the specific type and quantity of the taken goods cannot be determined from the weight difference value alone; only the possible types and quantities can be determined preliminarily.
  • the shopping database updating unit 602 generates a set of shopping information including the type and quantity of the goods removed at the moment and the goods information of the goods, such as the name, model, net content, unit price, etc., according to the type and the quantity of the goods removed and the identity information of the user who removes the item, and stores the set of shopping information in the shopping database of the user.
  • a plurality of groups of shopping information are included in the shopping database, and the mobile communication terminal carried by the user is connected with the data processing equipment 7 in a wireless communication mode and carries out data exchange, so the shopping information in the shopping database can also be displayed on an APP interface of the mobile communication terminal of the user.
  • if the weight difference value of the rack is greater than 0, which indicates that goods have been placed on the rack, it must then be determined whether those goods are goods the user previously purchased.
  • the shopping information in the shopping database of the user is inquired, and whether the weight value of the purchased goods match the weight difference value of the rack is determined, that is, determining whether the total weight of one or more purchased goods is the same as the weight difference value of the rack. If so, the possible types and the quantity of the goods can be preliminarily determined. For example, if the weight difference of the rack is 200 g, and there are two goods A of 100 g and four goods B of 50 g in the purchased goods, it can be determined primarily that the goods placed back to the rack are two pieces of goods A, or one piece of the goods A and two pieces of goods B, or four pieces of goods B.
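The preliminary determination above is effectively a small subset-sum search over the user's purchased goods. A sketch, using the stated example of two 100 g goods A and four 50 g goods B against a 200 g weight difference; the list layout and function name are assumptions:

```python
from itertools import combinations

def candidate_returns(purchased, weight_difference):
    """Enumerate every sub-multiset of the user's purchased goods
    whose total weight equals the (positive) weight difference of
    the rack.  purchased is a list of (name, weight) pairs, one
    entry per individual item; returns a set of sorted name tuples,
    one per distinct candidate combination."""
    matches = set()
    for n in range(1, len(purchased) + 1):
        for combo in combinations(purchased, n):
            if sum(w for _, w in combo) == weight_difference:
                matches.add(tuple(sorted(name for name, _ in combo)))
    return matches
```

For the example in the text, this yields exactly the three candidates named there: two of A, one A plus two B, or four B; the image-monitoring system then disambiguates among them.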
  • the goods perception system based on the image monitoring 400 monitors the real-time image of the user putting the goods back to the rack, determines the type of the put-back goods again, and if the determination result is in accordance with the preliminary determination result, confirms the kind and the quantity of the put-back goods; the shopping database updating unit 602 generates return information including the kind and the quantity of the put-back goods and the goods information of the goods, and deletes the shopping information corresponding to the return information from the shopping database of the user, so that the type and the quantity of the goods in the shopping database are in accordance with the type and the quantity of the goods purchased by the user.
  • the goods perception system based on weight monitoring 300 and the goods perception system based on image monitoring 400 can also determine the type and the quantity of the goods to be put back, further, the goods perception system can also determine whether the type of the goods to be put back is consistent with the type of the original goods on the rack with the increased real-time weight sensing value, and if not, an alarm signal can be selectively generated to remind a manager or a user.
  • the goods perception system records the number of the rack and records the type and weight information of the goods which are misplaced, then, if the weight sensing value of the rack is monitored to be reduced by the goods perception system based on weight monitoring 300 , the possible type and quantity of the goods which are misplaced are preliminarily determined according to the weight difference value, the type and weight information of the goods which are misplaced and the type and weight information of the original goods on the rack; and the goods perception system based on image monitoring 400 determines again by using a real-time image, so that the type and quantity of the goods which are misplaced can be confirmed.
  • the example further includes a settlement system 700, a functional module in the data processing equipment 7, for settling a fee according to the type and quantity of all goods in the shopping database of the user.
  • the user can leave the closed space 1 through the entrance guard device at the entrance.
  • the settlement system 700 settles the fee for the user.
  • the target positioning method includes the following steps S 301 ) to S 303 ).
  • a plurality of image sensors are used to monitor moving images of a user throughout the closed space: the position of the target (the user) in the closed space is determined in real time, the movement track of the user in the closed space is tracked, and a group of 3D coordinates of the user's body and of its extensions (such as the head or a hand) is acquired.
  • Step S 301 a space setting step, setting one or more access ports in the closed space, wherein the user must enter and exit the unmanned supermarket through the access ports.
  • Step S 302 a 3D image acquisition step, which is used for acquiring at least one frame of 3D image in real time, wherein the 3D image comprises all or part of images of at least one target object.
  • the step S 302 ) of acquiring the 3D image includes the following steps: step S 3021 ) an image sensor setting step, setting a plurality of image sensors at the top of the closed space, wherein the lenses of the image sensors face downward and the fields of view of the plurality of image sensors together cover the entire bottom surface of the closed space.
  • Each of the image sensors 2011 includes a depth image sensor 2012 and an RGB image sensor 2013 set in parallel.
  • Step S 3022 an original image acquisition step, for synchronously acquiring at least one frame of depth image and at least one frame of RGB image in real time.
  • Step S 3023 a 3D image integration step, combining the depth image and the RGB image acquired by the same image sensor at the same time into a frame of 3D image; repeating the step of collecting the original image and the step of integrating the 3D images in step S 3022 , and continuously integrating the multi-frame 3D images.
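The integration of a depth frame and an RGB frame captured by the same sensor at the same instant (step S 3023) can be sketched as stacking the two arrays into one four-channel "3D image". This is a minimal illustration under the assumption that both frames share the same resolution; the function name is not from the patent.

```python
import numpy as np

def integrate_rgbd(depth, rgb):
    """Combine a depth frame (H x W, e.g. millimeters) and an RGB frame
    (H x W x 3) captured at the same instant into one '3D image'
    (H x W x 4), where each pixel carries R, G, B and depth."""
    if depth.shape != rgb.shape[:2]:
        raise ValueError("depth and RGB frames must share resolution")
    return np.dstack([rgb, depth])
```

Repeating this per frame, as step S 3022 repeats acquisition, yields the continuous multi-frame 3D images mentioned next.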
  • the image sensor 2011 can acquire the multi-frame 3D images of the whole of the user and the motion track of the user in the closed space.
  • Step S 3031 the coordinate system building step, for establishing a 3D coordinate system in the closed space.
  • a center point of the bottom surface of the closed space (the ground surface of the unmanned supermarket) is selected as the origin of the coordinate system.
  • An X axis and a Y axis are set in the horizontal direction, and a Z axis is set in the vertical direction.
  • step S 3032 the parameter acquisition step, which is used for obtaining the position parameters of each pixel of a frame of 3D image, specifically includes the following steps S 30321 ) to S 30323 ).
  • Step S 30321 a sensor coordinate acquisition step, for acquiring, in the 3D coordinate system, the coordinates of the center point of the image sensor that acquired the frame of 3D image (the midpoint of the line connecting the lens center points of the depth sensor and the RGB sensor, which are arranged in parallel).
  • step S 30322 a relative coordinate acquisition step, which is used for establishing a second 3D coordinate system by taking the central point of the image sensor as a second origin, and acquiring the coordinates of each pixel point under the second 3D coordinate system from the 3D image.
  • step S 30323 a coordinate correction step, for calculating and correcting the coordinates of each pixel point of the 3D image in the 3D coordinate system, according to the coordinates of the center point of the image sensor in the 3D coordinate system and the coordinates of each pixel point of the 3D image in the second 3D coordinate system, so as to obtain the position parameters of each pixel point.
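A minimal sketch of the coordinate correction in step S 30323, assuming the sensor's optical axis points straight down so that depth increases along the room's negative Z axis while X and Y align; a real deployment would calibrate a full rotation for each sensor. Names here are illustrative.

```python
import numpy as np

def correct_coordinates(sensor_center, local_points):
    """Map pixel-point coordinates from the sensor-centered (second)
    coordinate system into the room's (first) 3D coordinate system.

    sensor_center: (x, y, z) of the sensor's center point in the room.
    local_points: (N, 3) array of (x, y, depth) in the sensor frame.
    Assumes a downward-facing sensor: depth grows toward the floor.
    """
    local = np.asarray(local_points, dtype=float)
    flip = np.array([1.0, 1.0, -1.0])  # depth increases downward
    return np.asarray(sensor_center, dtype=float) + local * flip
```

For example, a point 1 m below a sensor mounted at height 3 m lands at z = 2 m in the room frame.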
  • a pixel point whose position parameters remain unchanged across consecutive frames can be determined to be background, so that such pixel points can be removed from the corresponding 3D images.
  • the remaining set of the pixel points can represent the whole travel track of the user.
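The background removal described above can be sketched as counting how often each (rounded) position recurs across frames and dropping positions that stay fixed, leaving the moving user's track. The threshold, rounding precision, and function name are illustrative assumptions.

```python
from collections import Counter

def remove_background(frames, min_frames=3, precision=1):
    """Remove static background points from a sequence of point sets.

    A point (rounded to `precision` decimals) that appears at the same
    position in at least `min_frames` frames is treated as background;
    all other points are kept as the moving target's travel track.
    frames: list of lists of (x, y, z) tuples, one list per 3D frame.
    """
    key = lambda p: tuple(round(c, precision) for c in p)
    counts = Counter(key(p) for frame in frames for p in frame)
    return [[p for p in frame if counts[key(p)] < min_frames]
            for frame in frames]
```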
  • Step S 3034 the target coordinate calculation step, wherein the set of the position parameters of all pixel points in the M frames of background-free 3D images is the group of coordinates of the whole user; within this group, the position parameters of the pixel point with the largest position parameter z are defined as the coordinates of the target object. If the target object is further defined as the hand of the user, a real-time group of coordinates of the user's hand can be acquired.
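Step S 3034's rule of taking the pixel point with the largest position parameter z can be sketched directly; the function name is illustrative.

```python
def target_coordinates(user_points):
    """From the user's group of 3D coordinates (a list of (x, y, z)
    position parameters), return the pixel point with the largest z,
    which the method defines as the coordinates of the target object,
    e.g. a hand raised toward a shelf."""
    return max(user_points, key=lambda p: p[2])
```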
  • Step S 401 ), step S 402 ) are executed after step S 301 ) and before step S 302 ).
  • Step S 403 the goods perception step, for sensing the picking and placing states of each goods item in real time, and, when any goods are taken away or placed back, acquiring the type and quantity of the taken or placed-back goods.
  • Step S 403 ), step S 302 ) and step S 303 ) are executed independently of one another, without mutual interference.
  • the real-time goods picking and placing states and the real-time coordinates of the target object are sent to the data processing equipment 7 for further shopping user determination.
  • Step S 404 the shopping user judgment step, when any goods are taken away or put back, the shopping user judgment step is used for determining the identity of the user who takes away or puts back the goods according to the identity information of the user and the real-time position of the user;
  • step S 405 the shopping information recording step, generating at least one shopping database according to the identity information of each user, for recording the type and quantity of at least one item taken by each user.
  • step S 404 includes step S 4041 ) a goods information storage step, step S 4042 ) a rack coordinate storage step, step S 4043 ) a rack-to-user matching judgment step, and step S 4044 ) a goods-to-user matching judgment step.
  • the group of coordinates of the shelf space above each shelf for placing goods may be determined under the 3D coordinate system established in the closed space, each shelf number corresponding to the group of coordinates of one shelf space.
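Matching a hand's real-time coordinates against the stored group of coordinates of each shelf space (the rack coordinate storage and rack-to-user matching steps) can be sketched as a bounding-box test. Representing each shelf space as an axis-aligned box, and the function name itself, are assumptions for illustration.

```python
def matching_shelf(hand_xyz, shelf_spaces):
    """Return the shelf number whose space contains the hand coordinates.

    shelf_spaces: dict mapping shelf number -> ((xmin, ymin, zmin),
    (xmax, ymax, zmax)), the bounds of the space above that shelf in
    the room's 3D coordinate system. Returns None when no shelf matches.
    """
    for number, (lo, hi) in shelf_spaces.items():
        if all(l <= c <= h for c, l, h in zip(hand_xyz, lo, hi)):
            return number
    return None
```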
  • step S 405 includes the step S 4051 ) a shopping database generation step and the step S 4052 ) a shopping database updating step.
  • step S 4052 ) the shopping database updating step: when goods are taken away, shopping information is generated according to the type and quantity of the taken goods and the identity information of the user who took them, and is stored in the shopping database of that user; when goods are put back, return information is generated according to the type and quantity of the put-back goods and the identity information of the user who put them back, and the shopping information corresponding to the return information is deleted from the shopping database of that user.
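The shopping database generation and updating steps can be sketched as a small per-user record keyed by identity; the class and method names below are illustrative, not from the patent.

```python
from collections import defaultdict

class ShoppingDatabase:
    """Per-user shopping records, updated when goods are taken from or
    returned to a rack (a simplified sketch of steps S 4051 / S 4052)."""

    def __init__(self):
        # user identity -> {goods type -> quantity}
        self.records = defaultdict(lambda: defaultdict(int))

    def take(self, user_id, goods_type, quantity):
        """Record shopping information when goods are taken away."""
        self.records[user_id][goods_type] += quantity

    def put_back(self, user_id, goods_type, quantity):
        """Delete shopping information matching the return information."""
        left = self.records[user_id][goods_type] - quantity
        if left > 0:
            self.records[user_id][goods_type] = left
        else:
            self.records[user_id].pop(goods_type, None)

    def settle(self, user_id, unit_prices):
        """Total fee for all goods in the user's shopping database."""
        return sum(unit_prices[g] * q
                   for g, q in self.records[user_id].items())
```

This mirrors the settlement described later: when the user leaves, `settle` is driven by whatever remains in the record.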
  • step S 301 ) to step S 303 ) can acquire the whole-body and local real-time positions of the user, so that the identity of the user who takes away or puts back goods can be determined whenever goods on the shelf are taken or put back, and the shopping record in that user's shopping database is updated in time. If the goods perception system finds that goods are taken from a rack by a certain user and determines the type and quantity of the taken goods, the goods information, quantity, unit price and other information of the taken goods can be written into the shopping database of the user. If the goods perception system finds that goods are placed on a rack by a certain user and determines the type and quantity of the placed goods, the shopping information corresponding to the placed goods can be deleted from the shopping database of the user.
  • When a user shops in the closed space, multiple goods-taking events and multiple returning events may occur. After each event, the shopping database of the user changes correspondingly, so that the shopping information recorded in the shopping database stays consistent with the actual shopping content of the user; when the user leaves the closed space, the settlement system can automatically complete settlement for the user.
US16/812,041 2017-12-18 2020-03-06 Target positioning system and target positioning method Abandoned US20200202163A1 (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
CN201711366390.4 2017-12-18
CN201711366390 2017-12-18
CN201810079535.0A CN108551658B (zh) 2017-12-18 2018-01-26 目标物定位系统及定位方法
CN201810079535.0 2018-01-26
PCT/CN2018/117325 WO2019120040A1 (zh) 2017-12-18 2018-11-23 目标物定位系统及定位方法

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/117325 Continuation WO2019120040A1 (zh) 2017-12-18 2018-11-23 目标物定位系统及定位方法

Publications (1)

Publication Number Publication Date
US20200202163A1 true US20200202163A1 (en) 2020-06-25

Family

ID=62926467

Family Applications (3)

Application Number Title Priority Date Filing Date
US16/812,041 Abandoned US20200202163A1 (en) 2017-12-18 2020-03-06 Target positioning system and target positioning method
US16/812,032 Active US11501523B2 (en) 2017-12-18 2020-03-06 Goods sensing system and method for goods sensing based on image monitoring
US16/862,519 Abandoned US20200258069A1 (en) 2017-12-18 2020-04-29 Weight monitoring-based sensing system and sensing method

Family Applications After (2)

Application Number Title Priority Date Filing Date
US16/812,032 Active US11501523B2 (en) 2017-12-18 2020-03-06 Goods sensing system and method for goods sensing based on image monitoring
US16/862,519 Abandoned US20200258069A1 (en) 2017-12-18 2020-04-29 Weight monitoring-based sensing system and sensing method

Country Status (8)

Country Link
US (3) US20200202163A1 (de)
EP (3) EP3745100A4 (de)
JP (4) JP7016187B2 (de)
KR (3) KR102454854B1 (de)
CN (10) CN108497839B (de)
AU (5) AU2018405072A1 (de)
SG (3) SG11202003732YA (de)
WO (4) WO2019120040A1 (de)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111127174A (zh) * 2020-01-06 2020-05-08 鄂尔多斯市东驿科技有限公司 智能化无人超市控制系统
CN111738184A (zh) * 2020-06-28 2020-10-02 杭州海康威视数字技术股份有限公司 一种商品取放识别方法、装置、系统及设备
CN112084940A (zh) * 2020-09-08 2020-12-15 南京和瑞供应链管理有限公司 物资盘点管理系统及方法
CN112801055A (zh) * 2021-04-01 2021-05-14 湖南云星人工智能研究院有限公司 一种基于薄膜压力传感器阵列的无人超市定位跟踪方法
US20210148751A1 (en) * 2018-06-28 2021-05-20 Shekel Scales (2008) Ltd. Systems and methods for weighing products on a shelf
CN113067847A (zh) * 2021-02-02 2021-07-02 绍兴晨璞网络科技有限公司 一种匹配式超宽带定位系统架构设计方法
CN113524194A (zh) * 2021-04-28 2021-10-22 重庆理工大学 基于多模特征深度学习的机器人视觉抓取系统的目标抓取方法
US20220114617A1 (en) * 2020-03-20 2022-04-14 Boe Technology Group Co., Ltd. Shelf interaction methods and shelves
EP4030369A1 (de) * 2021-01-19 2022-07-20 Toshiba TEC Kabushiki Kaisha Benachrichtigungsvorrichtung und benachrichtigungsverfahren
US11694501B2 (en) 2020-02-17 2023-07-04 True Manufacturing Co., Inc. Refrigerated vending system and method

Families Citing this family (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108497839B (zh) * 2017-12-18 2021-04-09 上海云拿智能科技有限公司 可感知货品的货架
US11707338B2 (en) * 2018-01-30 2023-07-25 PAR Excellence Systems, Inc. Storage system including at least one container containing medical supplies
CN108652332A (zh) 2018-04-19 2018-10-16 上海云拿智能科技有限公司 悬挂式货架
CN108921540A (zh) * 2018-07-09 2018-11-30 南宁市安普康商贸有限公司 基于购买者位置定位的开放式自助销售方法及系统
EP3613638A1 (de) 2018-08-10 2020-02-26 Lg Electronics Inc. Fahrzeuganzeigesystem für fahrzeug
CN109448237B (zh) * 2018-08-29 2019-10-11 拉卡拉汇积天下技术服务(北京)有限公司 无人售货装置及其上货方法和控制系统
CN109741533A (zh) * 2018-09-12 2019-05-10 盈奇科技(深圳)有限公司 一种重力感应自动售货机固定结构
CN109658596A (zh) * 2018-09-12 2019-04-19 盈奇科技(深圳)有限公司 一种重力感应自动售货机
CN109859377A (zh) * 2018-09-12 2019-06-07 盈奇科技(深圳)有限公司 一种重力感应自动售货机重力感应部分连接结构
CN109461255A (zh) * 2018-09-12 2019-03-12 湖南金码智能设备制造有限公司 一种用于售货机的自适应检测接取商品状态的装置和方法
CN110895747B (zh) * 2018-09-13 2024-04-02 阿里巴巴集团控股有限公司 商品信息识别、显示、信息关联、结算方法及系统
CN109214484B (zh) * 2018-09-20 2021-07-13 深圳蓝胖子机器智能有限公司 无人便利店控制方法、装置和计算机可读存储介质
CN109330284B (zh) 2018-09-21 2020-08-11 京东方科技集团股份有限公司 一种货架系统
CN109512217A (zh) * 2018-09-27 2019-03-26 安徽国药医疗科技有限公司 一种基于智能货架的物品管理系统
CN109341763B (zh) * 2018-10-10 2020-02-04 广东长盈科技股份有限公司 一种基于物联网的运输数据采集系统及方法
CN109409291B (zh) * 2018-10-26 2020-10-30 虫极科技(北京)有限公司 智能货柜的商品识别方法和系统及购物订单的生成方法
CN109685979A (zh) * 2018-11-01 2019-04-26 深圳市友族科技有限公司 一种无人售货柜及无人售货方法
CN109581532A (zh) * 2018-11-12 2019-04-05 百度在线网络技术(北京)有限公司 用于确定被取走或放置的物品的位置的设备、方法及货架
CN109741519B (zh) * 2018-12-10 2021-01-19 深圳市思拓通信系统有限公司 一种无人超市货架监控系统及其控制方法
CN111325049A (zh) * 2018-12-13 2020-06-23 北京京东尚科信息技术有限公司 商品识别方法、装置、电子设备及可读介质
CN109598874A (zh) * 2018-12-18 2019-04-09 北京缤歌网络科技有限公司 一种显示控制方法
US11017641B2 (en) * 2018-12-21 2021-05-25 Sbot Technologies Inc. Visual recognition and sensor fusion weight detection system and method
CN109726759B (zh) * 2018-12-28 2021-08-17 北京旷视科技有限公司 无人售货方法、装置、系统、电子设备及计算机可读介质
CN109711360B (zh) * 2018-12-29 2021-03-30 北京沃东天骏信息技术有限公司 售货机风险控制方法、装置和控制系统
CN111222389B (zh) * 2019-01-10 2023-08-29 图灵通诺(北京)科技有限公司 商超货架上商品的分析方法和系统
CN109872448A (zh) * 2019-01-19 2019-06-11 创新奇智(广州)科技有限公司 一种异物检测方法、计算机可读存储介质及检测系统
CN111222870B (zh) * 2019-01-24 2024-02-27 图灵通诺(北京)科技有限公司 结算方法、装置和系统
CN109858446A (zh) * 2019-01-31 2019-06-07 深兰科技(上海)有限公司 一种新零售场景下物品注册方法及装置
CN111523348B (zh) * 2019-02-01 2024-01-05 百度(美国)有限责任公司 信息生成方法和装置、用于人机交互的设备
CN109840504B (zh) * 2019-02-01 2022-11-25 腾讯科技(深圳)有限公司 物品取放行为识别方法、装置、存储介质及设备
CN109919040B (zh) * 2019-02-15 2022-04-19 北京清瞳时代科技有限公司 货物的品规信息识别方法及装置
CN109829348A (zh) * 2019-02-25 2019-05-31 浙江工业大学 基于stm32单片机的食品信息采集系统
CN111507702A (zh) * 2019-03-07 2020-08-07 河源市联腾实业有限公司 无人超市自助购物方法、计算机可读存储介质及系统
CN109977825B (zh) * 2019-03-15 2021-06-25 百度在线网络技术(北京)有限公司 物品识别方法及装置
CN109977826B (zh) * 2019-03-15 2021-11-02 百度在线网络技术(北京)有限公司 物体的类别识别方法和装置
CN110007755A (zh) * 2019-03-15 2019-07-12 百度在线网络技术(北京)有限公司 基于动作识别的物体事件触发方法、装置及其相关设备
CN110009836A (zh) * 2019-03-29 2019-07-12 江西理工大学 基于高光谱摄像技术的深度学习的系统及方法
CN110147723B (zh) * 2019-04-11 2022-08-19 苏宁云计算有限公司 一种无人店中顾客异常行为的处理方法及系统
CN110175590A (zh) * 2019-05-31 2019-08-27 北京华捷艾米科技有限公司 一种商品识别方法及装置
CN110321797A (zh) * 2019-05-31 2019-10-11 苏宁云计算有限公司 商品识别方法和装置
CN110260796A (zh) * 2019-06-04 2019-09-20 上海追月科技有限公司 货品感知系统、货品感知方法及电子设备
CN110150906A (zh) * 2019-06-04 2019-08-23 上海追月科技有限公司 托盘及货架
CN110197561A (zh) * 2019-06-10 2019-09-03 北京华捷艾米科技有限公司 一种商品识别方法、装置及系统
CN112115745A (zh) * 2019-06-21 2020-12-22 杭州海康威视数字技术股份有限公司 一种商品漏扫码行为识别方法、装置及系统
CN110298961A (zh) * 2019-06-28 2019-10-01 北京云迹科技有限公司 用于机器人的计费方法及装置
CN110403400B (zh) * 2019-07-16 2020-09-08 合肥美的智能科技有限公司 货柜
US20210027104A1 (en) * 2019-07-25 2021-01-28 Microsoft Technology Licensing, Llc Eyes-off annotated data collection framework for electronic messaging platforms
CN110837824B (zh) * 2019-08-09 2022-12-16 达闼科技(北京)有限公司 用于售货装置中的商品识别方法、售货装置及存储介质
CN111783509A (zh) * 2019-08-29 2020-10-16 北京京东尚科信息技术有限公司 自动结算方法、装置、系统和存储介质
JP7368982B2 (ja) 2019-09-05 2023-10-25 東芝テック株式会社 販売管理システム及び販売管理方法
CN112466035B (zh) * 2019-09-06 2022-08-12 图灵通诺(北京)科技有限公司 基于视觉和重力感应的商品识别方法、装置和系统
CN110717593B (zh) * 2019-10-14 2022-04-19 上海商汤临港智能科技有限公司 神经网络训练、移动信息测量、关键帧检测的方法及装置
CN111753614A (zh) * 2019-11-01 2020-10-09 北京京东尚科信息技术有限公司 一种商品货架的监控方法和装置
CN111783513A (zh) * 2019-11-18 2020-10-16 北京沃东天骏信息技术有限公司 货物补充方法、装置和系统
CN111109947A (zh) * 2019-11-28 2020-05-08 上海追月科技有限公司 货架、货品识别方法及电子设备
CN111025417B (zh) * 2019-12-17 2022-04-29 万翼科技有限公司 建材存放区域异常检测方法及相关产品
CN111145409A (zh) * 2020-01-06 2020-05-12 鄂尔多斯市东驿科技有限公司 人体识别定位跟踪系统
CN111207815A (zh) * 2020-01-16 2020-05-29 首钢京唐钢铁联合有限责任公司 一种汽车衡混装计量系统
CN111243166A (zh) * 2020-01-20 2020-06-05 广东优信无限网络股份有限公司 一种用于餐饮售卖机的提示方法
KR102177852B1 (ko) * 2020-01-31 2020-11-11 임시원 정신건강의학의 병원용 자산 관리를 위한 방법 및 장치
CN111429655A (zh) * 2020-02-28 2020-07-17 上海追月科技有限公司 货品识别方法及货品识别系统、存储介质及电子设备
CN111476609A (zh) * 2020-04-10 2020-07-31 广西中烟工业有限责任公司 零售数据获取方法、系统、设备及存储介质
CN111540106A (zh) * 2020-04-14 2020-08-14 合肥工业大学 一种无人超市系统
CN111553914B (zh) * 2020-05-08 2021-11-12 深圳前海微众银行股份有限公司 基于视觉的货物检测方法、装置、终端及可读存储介质
CN111667639A (zh) * 2020-05-28 2020-09-15 北京每日优鲜电子商务有限公司 图书归还服务的实现方法、装置及智能图书柜
CN111831673B (zh) * 2020-06-10 2024-04-12 上海追月科技有限公司 货品识别系统、货品识别方法及电子设备
CN111680657B (zh) * 2020-06-15 2023-05-05 杭州海康威视数字技术股份有限公司 一种物品取放事件的触发人员确定方法、装置及设备
CN111672774B (zh) * 2020-07-30 2021-07-06 梅卡曼德(北京)机器人科技有限公司 一种货品分拣系统及分拣方法
CN113808342B (zh) * 2020-08-19 2023-09-01 北京京东乾石科技有限公司 物品支付方法、装置、计算机可读介质及电子设备
CN111731669A (zh) * 2020-08-26 2020-10-02 江苏神彩科技股份有限公司 危险物品智能监管贮存箱及其监管系统
KR102253385B1 (ko) * 2020-09-14 2021-05-18 주식회사농심 불량 멀티포장제품 검출시스템
CN112101867A (zh) * 2020-09-15 2020-12-18 四川精益达工程检测有限责任公司 一种用于货架上物品的信息生成方法及系统
CN112309031B (zh) * 2020-09-17 2022-06-07 北京京东乾石科技有限公司 无人货柜混放检测方法、装置、设备及可读存储介质
CN112215167B (zh) * 2020-10-14 2022-12-20 上海爱购智能科技有限公司 一种基于图像识别的智能商店控制方法和系统
CN112515428B (zh) * 2020-11-17 2023-11-03 上海追月科技有限公司 一种货架
JP2022097195A (ja) * 2020-12-18 2022-06-30 トヨタ自動車株式会社 情報処理装置、及び、情報処理方法
CN112419015B (zh) * 2021-01-25 2021-05-25 北京每日优鲜电子商务有限公司 物品信息推送方法、装置、电子设备和计算机可读介质
CN112985570B (zh) * 2021-02-04 2023-09-29 支付宝(杭州)信息技术有限公司 称重货架的传感器的标定方法和系统
CN113554808B (zh) * 2021-04-22 2023-09-29 浙江星星冷链集成股份有限公司 一种上货管理装置及无人售货系统
CN113554809A (zh) * 2021-04-22 2021-10-26 浙江星星冷链集成股份有限公司 一种商品信息更新装置及无人售货系统
CN113229686A (zh) * 2021-05-21 2021-08-10 上海追月科技有限公司 一种货架
CN113407571B (zh) * 2021-06-30 2023-04-07 重庆博尔德医疗科技股份有限公司 一种基于称重原理的计数方法
CN113558441A (zh) * 2021-07-07 2021-10-29 合肥美的智能科技有限公司 无人售货柜
CN113537773A (zh) * 2021-07-15 2021-10-22 深圳医智联科技有限公司 一种针对检验科培养瓶的管理方法及系统
CN113479534A (zh) * 2021-07-31 2021-10-08 深圳市坤同智能仓储科技有限公司 称重货架的模块化层及货架系统
CN113592339A (zh) * 2021-08-09 2021-11-02 四川长虹智能制造技术有限公司 一种货物信息校准系统、方法及电子设备
CN113421376B (zh) * 2021-08-23 2021-12-28 浙江星星冷链集成股份有限公司 一种无人售货系统
CN113869821A (zh) * 2021-09-23 2021-12-31 云南电力试验研究院(集团)有限公司 一种计量物资识别周转方法和系统
KR102417612B1 (ko) * 2021-10-28 2022-07-06 주식회사 케이엔케이 부품 검사 시스템 및 그 제어방법
CN113706227A (zh) * 2021-11-01 2021-11-26 微晟(武汉)技术有限公司 一种货架商品推荐方法及装置
CN113821674B (zh) * 2021-11-23 2022-02-25 北京中超伟业信息安全技术股份有限公司 一种基于孪生神经网络的智能货物监管方法及系统
CN114140696A (zh) * 2022-01-27 2022-03-04 深圳市慧为智能科技股份有限公司 商品识别系统优化方法、装置、设备及存储介质
CN114559431A (zh) * 2022-03-02 2022-05-31 上海擎朗智能科技有限公司 一种物品配送方法、装置、机器人及存储介质
ES2922762A1 (es) * 2022-05-27 2022-09-20 Ostirion S L U Procedimiento y equipo de localizacion e identificacion optica de instrumentos y aparatos
KR102597927B1 (ko) * 2022-08-16 2023-11-30 주식회사 포르망 화장품의 중량 선별 장치
KR102482821B1 (ko) * 2022-09-06 2022-12-29 주식회사 포이엔 무선 네트워크 기반의 실시간 계측 시스템
CN115510501B (zh) * 2022-11-22 2023-02-28 中汽信息科技(天津)有限公司 一种汽车数据防篡改方法和系统
CN116422602A (zh) * 2023-03-17 2023-07-14 广东铭钰科技股份有限公司 一种一体式产品称重标识检测系统及方法

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9911290B1 (en) * 2015-07-25 2018-03-06 Gary M. Zalewski Wireless coded communication (WCC) devices for tracking retail interactions with goods and association to user accounts
US20180089725A1 (en) * 2015-03-23 2018-03-29 Nec Corporation Product information management apparatus, product information management system, product information management method, and program
US20190025852A1 (en) * 2017-07-19 2019-01-24 Symbol Technologies, Llc Methods and apparatus to coordinate movement of automated vehicles and freight dimensioning components

Family Cites Families (108)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3384193A (en) * 1965-02-01 1968-05-21 Toledo Scale Corp Control circuitry
JPS6052366B2 (ja) * 1977-12-02 1985-11-19 オムロン株式会社 重量検知ユニツト
JPH1017121A (ja) * 1996-07-02 1998-01-20 Keiyo Syst Kk 自動棚卸し装置
JP4452786B2 (ja) * 1996-09-29 2010-04-21 雅信 鯨田 遠隔商品販売のための装置及び方法
JP3213696B2 (ja) * 1996-11-29 2001-10-02 東芝テック株式会社 買上商品登録装置及びこの登録装置を用いた買上商品決済システム
US6711293B1 (en) * 1999-03-08 2004-03-23 The University Of British Columbia Method and apparatus for identifying scale invariant features in an image and use of same for locating an object in an image
JP2002161942A (ja) * 2000-08-10 2002-06-07 Iau:Kk 免震装置、滑り支承また免震構造
KR20010067855A (ko) * 2001-04-03 2001-07-13 장성계 창고형 매장에서의 중량감지에 의한 상품 재고량 파악시스템
JP3837475B2 (ja) * 2001-07-19 2006-10-25 独立行政法人産業技術総合研究所 自動化ショッピングシステム
WO2003086664A2 (en) * 2002-04-12 2003-10-23 Tritek Technologies, Inc. Mail sorting processes and systems
US6747560B2 (en) * 2002-06-27 2004-06-08 Ncr Corporation System and method of detecting movement of an item
US7039222B2 (en) * 2003-02-28 2006-05-02 Eastman Kodak Company Method and system for enhancing portrait images that are processed in a batch mode
CN2634950Y (zh) * 2003-06-06 2004-08-25 何永强 货架柱
CN2788670Y (zh) * 2005-03-25 2006-06-21 李瑞铭 强化支撑架结构
US20080109375A1 (en) * 2006-11-08 2008-05-08 Ricci Christopher P Position-enhanced wireless transaction security
US7949568B2 (en) * 2007-08-31 2011-05-24 Accenture Global Services Limited Determination of product display parameters based on image processing
DE202007016804U1 (de) * 2007-12-01 2008-02-28 Golletz Gmbh Vorrichtung zur regalartigen Anordnung von Warenträgern
CN101604312A (zh) * 2007-12-07 2009-12-16 宗刚 信息的检索管理交流的方法和系统
US20090177089A1 (en) * 2008-01-04 2009-07-09 Assaf Govari Three-dimensional image reconstruction using doppler ultrasound
CN101281597B (zh) * 2008-04-25 2011-06-22 北京工业大学 一种产品包装图形标识信息在线实时采集识别装置及方法
KR100988754B1 (ko) * 2008-07-10 2010-10-25 박찬용 자동판매 기능을 갖는 무인 점포 시스템
KR101467509B1 (ko) * 2008-07-25 2014-12-01 삼성전자주식회사 이미지 센서 및 이미지 센서 동작 방법
US8165929B2 (en) * 2008-08-04 2012-04-24 Chudy Group, LLC Adaptive pharmaceutical product management methods and system
KR101051355B1 (ko) * 2009-06-09 2011-07-22 (주)이지스 3차원 공간 데이터를 이용한 카메라 영상의 3차원 좌표 획득방법 및 이를 이용한 카메라 연동 제어방법
KR20100136089A (ko) 2009-06-18 2010-12-28 주식회사 비즈모델라인 인덱스 교환을 통한 다중 코드 생성 방식 오티피 출력 방법 및 시스템과 이를 위한 휴대폰 및 기록매체
CN101989333A (zh) * 2009-07-31 2011-03-23 上海杉达学院 仓储控制系统
KR20100003343A (ko) * 2009-12-15 2010-01-08 (주)리테일테크 압력 센서를 이용한 소매 매장의 결품 방지 시스템
US20110174753A1 (en) * 2010-01-18 2011-07-21 Matthew Pinto Stackable transport system
KR101179108B1 (ko) * 2010-04-27 2012-09-07 서울시립대학교 산학협력단 중첩 전방위 영상을 이용하여 객체의 3차원 좌표를 결정하기 위한 시스템 및 그 방법
CN103189855B (zh) * 2010-06-14 2016-10-19 特鲁塔格科技公司 用于使用数据库验证包装中的物品的系统
KR20120051212A (ko) * 2010-11-12 2012-05-22 엘지전자 주식회사 멀티미디어 장치의 사용자 제스쳐 인식 방법 및 그에 따른 멀티미디어 장치
KR101223206B1 (ko) * 2010-11-18 2013-01-17 광주과학기술원 3차원 영상 생성 방법 및 시스템
US20130046635A1 (en) * 2011-08-19 2013-02-21 Bank Of America Corporation Triggering offers based on detected location of a mobile point of sale device
CN102374860B (zh) * 2011-09-23 2014-10-01 奇瑞汽车股份有限公司 三维视觉定位方法及系统
US20150120498A1 (en) * 2012-05-10 2015-04-30 Sca Hygience Products Ab Method for assisting in locating a desired item in a storage location
CN202807610U (zh) * 2012-08-31 2013-03-20 宁波市新光货架有限公司 一种中型仓储货架
CN202820536U (zh) * 2012-10-22 2013-03-27 山东固美思耐金属制品有限公司 无线称重超市货架托板
KR101931819B1 (ko) * 2012-12-31 2018-12-26 (주)지오투정보기술 영상 정보와 항공사진 데이터를 비교하여 대상 객체를 결정하고 카메라 획득정보를 이용하여 대상 객체의 3차원 좌표를 획득하는 수치지도 제작 시스템
US20160048798A1 (en) * 2013-01-11 2016-02-18 Tagnetics, Inc. Inventory sensor
RU2015133464A (ru) * 2013-01-11 2017-02-17 Тагнэтикс, Инк. Датчик отсутствия товаров
CN103192958B (zh) * 2013-02-04 2015-07-08 中国科学院自动化研究所北仑科学艺术实验中心 船舶姿态显示装置的控制方法
CN104021538B (zh) * 2013-02-28 2017-05-17 株式会社理光 物体定位方法和装置
KR20140136089A (ko) * 2013-05-16 2014-11-28 (주)에스모바일텍 진열 상품 재고 관리 솔루션을 구비한 전자 선반 라벨 시스템
US10984372B2 (en) * 2013-05-24 2021-04-20 Amazon Technologies, Inc. Inventory transitions
US10268983B2 (en) * 2013-06-26 2019-04-23 Amazon Technologies, Inc. Detecting item interaction and movement
US10176456B2 (en) * 2013-06-26 2019-01-08 Amazon Technologies, Inc. Transitioning items from a materials handling facility
US10290031B2 (en) * 2013-07-24 2019-05-14 Gregorio Reid Method and system for automated retail checkout using context recognition
JP6168917B2 (ja) * 2013-08-26 2017-07-26 株式会社日立システムズ 健康管理システム及び健康管理方法
CN103557841B (zh) * 2013-08-28 2016-08-10 陈天恩 一种提高多相机合成影像摄影测量精度的方法
US20160203499A1 (en) * 2013-09-06 2016-07-14 Nec Corporation Customer behavior analysis system, customer behavior analysis method, non-transitory computer readable medium, and shelf system
AU2013101592A4 (en) * 2013-10-25 2014-01-16 Housl Pty Ltd Property Leasing
US9916561B2 (en) * 2013-11-05 2018-03-13 At&T Intellectual Property I, L.P. Methods, devices and computer readable storage devices for tracking inventory
CN103559758B (zh) * 2013-11-06 2015-12-30 上海煦荣信息技术有限公司 一种智能化的售货系统及售货方法
CN203693111U (zh) * 2014-02-28 2014-07-09 苏州工业园区航成工业零件有限公司 一种新型台面式称重货架
CN104240007A (zh) * 2014-07-09 2014-12-24 深圳市茁壮网络股份有限公司 一种基于射频识别的物品管理方法及系统
CN203975446U (zh) * 2014-07-29 2014-12-03 晟通科技集团有限公司 全铝托盘
CN104217231A (zh) * 2014-08-29 2014-12-17 南京大学 一种基于非精确锚节点的rfid定位系统及定位方法
CN104576845A (zh) * 2014-12-16 2015-04-29 深圳市德上光电有限公司 一种图形化的蓝宝石衬底的制造方法
CN105989389A (zh) * 2015-02-11 2016-10-05 北京鼎九信息工程研究院有限公司 一种二维码
US10878364B2 (en) * 2015-02-18 2020-12-29 Fedex Corporate Services, Inc. Managing logistics information related to a logistics container using a container interface display apparatus
US10318917B1 (en) * 2015-03-31 2019-06-11 Amazon Technologies, Inc. Multiple sensor data fusion system
KR101807513B1 (ko) * 2015-05-13 2017-12-12 한국전자통신연구원 3차원 공간에서 영상정보를 이용한 사용자 의도 분석장치 및 분석방법
CN204883802U (zh) * 2015-06-24 2015-12-16 中山润涛智科技术有限公司 重量感应器及应用该感应器的物联网自动库存统计系统
JP6562077B2 (ja) * 2015-08-20 2019-08-21 日本電気株式会社 展示装置、表示制御装置および展示システム
US20170061258A1 (en) * 2015-08-25 2017-03-02 Nokia Technologies Oy Method, apparatus, and computer program product for precluding image capture of an image presented on a display
CN105314315B (zh) * 2015-10-30 2017-12-15 无锡职业技术学院 智能货架测试系统
CN205251049U (zh) * 2015-11-06 2016-05-25 江苏正贸仓储设备制造有限公司 带计量器货架
CN106875203A (zh) * 2015-12-14 2017-06-20 阿里巴巴集团控股有限公司 一种确定商品图片的款式信息的方法及装置
US10592854B2 (en) * 2015-12-18 2020-03-17 Ricoh Co., Ltd. Planogram matching
JP6319669B2 (ja) * 2016-01-20 2018-05-09 パナソニックIpマネジメント株式会社 商品モニタリング装置、商品モニタリングシステムおよび商品モニタリング方法
CN105719188B (zh) 2016-01-22 2017-12-26 平安科技(深圳)有限公司 基于多张图片一致性实现保险理赔反欺诈的方法及服务器
CN108780596B (zh) 2016-02-29 2022-01-25 路标株式会社 信息处理系统
CN105787490A (zh) * 2016-03-24 2016-07-20 南京新与力文化传播有限公司 基于深度学习的商品潮流识别方法及装置
CN105678981A (zh) * 2016-03-26 2016-06-15 陈功平 一种组合式智能预警储物架
US11087272B2 (en) 2016-03-29 2021-08-10 Bossa Nova Robotics Ip, Inc. System and method for locating, identifying and counting items
JP6736334B2 (ja) * 2016-04-07 2020-08-05 東芝テック株式会社 画像処理装置
CN105910379B (zh) * 2016-05-09 2018-07-27 海信(山东)冰箱有限公司 一种用于冰箱的称重装置及称重抽屉
TWI578272B (zh) * 2016-05-18 2017-04-11 Chunghwa Telecom Co Ltd Shelf detection system and method
CN105901987A (zh) * 2016-06-30 2016-08-31 常州市南飞机械有限公司 可称重的货架
CN106326852A (zh) * 2016-08-18 2017-01-11 无锡天脉聚源传媒科技有限公司 一种基于深度学习的商品识别方法及装置
CN106355368A (zh) * 2016-08-29 2017-01-25 苏州倾爱娱乐传媒有限公司 一种卖场货品的智能管理系统
CN106408374A (zh) * 2016-08-30 2017-02-15 佛山市明扬软件科技有限公司 一种安全性高的超市自动收费方法
CN106441537A (zh) * 2016-09-08 2017-02-22 蝶和科技(中国)有限公司 一种称重货架的称重方法以及使用该方法的货架
CN106556341B (zh) * 2016-10-08 2019-12-03 浙江国自机器人技术有限公司 一种基于特征信息图形的货架位姿偏差检测方法和系统
CN106454733A (zh) * 2016-11-03 2017-02-22 安徽百慕文化科技有限公司 一种基于无线传感网络的在线监测系统
CN106781121A (zh) * 2016-12-14 2017-05-31 朱明� 基于视觉分析的超市自助结账智能系统
CN106845962A (zh) * 2016-12-29 2017-06-13 广州途威慧信息科技有限公司 一种自动结算货款系统及方法
CN106910086A (zh) * 2017-01-18 2017-06-30 甄启源 一种超市智能购物系统
CN106875465B (zh) * 2017-01-20 2021-06-11 奥比中光科技集团股份有限公司 基于rgbd图像的三维操控空间的建立方法及设备
CN106845965A (zh) * 2017-02-13 2017-06-13 上海云剑信息技术有限公司 线下超市自助购物的手机结账系统
CN106934692B (zh) * 2017-03-03 2020-12-22 陈维龙 物品信息处理系统、方法及装置
CN106920152A (zh) * 2017-03-07 2017-07-04 深圳市楼通宝实业有限公司 自助售卖方法及系统
US9805360B1 (en) * 2017-03-17 2017-10-31 Philz Coffee, Inc. Location based device flagging and interface
CN107134053B (zh) * 2017-04-19 2019-08-06 石道松 智能售货门店
CN107045641B (zh) * 2017-04-26 2020-07-28 广州图匠数据科技有限公司 一种基于图像识别技术的货架识别方法
CN107093269A (zh) * 2017-06-08 2017-08-25 上海励识电子科技有限公司 一种智能无人售货系统及方法
CN107123006A (zh) * 2017-07-12 2017-09-01 杨智勇 一种智能购物系统
CN107463946B (zh) * 2017-07-12 2020-06-23 浙江大学 一种结合模板匹配与深度学习的商品种类检测方法
CN107341523A (zh) * 2017-07-13 2017-11-10 浙江捷尚视觉科技股份有限公司 基于深度学习的快递单信息识别方法和系统
CN107480735B (zh) * 2017-07-24 2018-08-14 葛代荣 一种自助验货方法和系统
CN107481000A (zh) * 2017-07-27 2017-12-15 惠州市伊涅科技有限公司 无人超市售货方法
CN107451776A (zh) * 2017-07-27 2017-12-08 惠州市伊涅科技有限公司 无人超市补货方法
CN107451891A (zh) * 2017-07-27 2017-12-08 惠州市伊涅科技有限公司 便利店自动售卖方法
US10445694B2 (en) * 2017-08-07 2019-10-15 Standard Cognition, Corp. Realtime inventory tracking using deep learning
US11023850B2 (en) * 2017-08-07 2021-06-01 Standard Cognition, Corp. Realtime inventory location management using deep learning
CN107481414A (zh) * 2017-08-21 2017-12-15 文彬 一种开放式无人售卖装置的实时导购方法及系统
WO2019107157A1 (ja) 2017-11-29 2019-06-06 株式会社Nttドコモ 棚割情報生成装置及び棚割情報生成プログラム
CN108497839B (zh) * 2017-12-18 2021-04-09 上海云拿智能科技有限公司 可感知货品的货架


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210148751A1 (en) * 2018-06-28 2021-05-20 Shekel Scales (2008) Ltd. Systems and methods for weighing products on a shelf
US11933661B2 (en) * 2018-06-28 2024-03-19 Shekel Scales (2008) Ltd. Systems and methods for weighing products on a shelf
CN111127174A (zh) * 2020-01-06 2020-05-08 鄂尔多斯市东驿科技有限公司 智能化无人超市控制系统
US11694501B2 (en) 2020-02-17 2023-07-04 True Manufacturing Co., Inc. Refrigerated vending system and method
US11880863B2 (en) * 2020-03-20 2024-01-23 Boe Technology Group Co., Ltd. Shelf interaction methods and shelves
US20220114617A1 (en) * 2020-03-20 2022-04-14 Boe Technology Group Co., Ltd. Shelf interaction methods and shelves
CN111738184A (zh) * 2020-06-28 2020-10-02 杭州海康威视数字技术股份有限公司 一种商品取放识别方法、装置、系统及设备
CN112084940A (zh) * 2020-09-08 2020-12-15 南京和瑞供应链管理有限公司 物资盘点管理系统及方法
EP4030369A1 (de) * 2021-01-19 2022-07-20 Toshiba TEC Kabushiki Kaisha Benachrichtigungsvorrichtung und benachrichtigungsverfahren
CN115311830A (zh) * 2021-01-19 2022-11-08 东芝泰格有限公司 告知装置及告知方法
CN113067847A (zh) * 2021-02-02 2021-07-02 绍兴晨璞网络科技有限公司 一种匹配式超宽带定位系统架构设计方法
CN112801055A (zh) * 2021-04-01 2021-05-14 湖南云星人工智能研究院有限公司 一种基于薄膜压力传感器阵列的无人超市定位跟踪方法
CN113524194A (zh) * 2021-04-28 2021-10-22 重庆理工大学 基于多模特征深度学习的机器人视觉抓取系统的目标抓取方法

Also Published As

Publication number Publication date
WO2019120040A1 (zh) 2019-06-27
EP3745296A4 (de) 2021-03-24
AU2018405073A1 (en) 2020-06-18
CN108509847B (zh) 2021-07-20
EP3731546A1 (de) 2020-10-28
CN208892110U (zh) 2019-05-24
JP7016187B2 (ja) 2022-02-04
AU2018102235A4 (en) 2022-03-17
US11501523B2 (en) 2022-11-15
EP3745100A1 (de) 2020-12-02
CN108492482B (zh) 2020-11-03
JP2021507203A (ja) 2021-02-22
WO2019144691A1 (zh) 2019-08-01
US20200258069A1 (en) 2020-08-13
EP3745100A4 (de) 2021-03-24
SG11202004369RA (en) 2020-06-29
CN108497839A (zh) 2018-09-07
WO2019120040A9 (zh) 2019-12-12
KR102510679B1 (ko) 2023-03-16
KR20200051705A (ko) 2020-05-13
CN108492157B (zh) 2023-04-18
SG11202003732YA (en) 2020-05-28
US20200202137A1 (en) 2020-06-25
EP3745296A1 (de) 2020-12-02
KR20200037833A (ko) 2020-04-09
CN108492157A (zh) 2018-09-04
CN108551658B (zh) 2021-04-09
CN108492482A (zh) 2018-09-04
KR20200023647A (ko) 2020-03-05
CN207783158U (zh) 2018-08-28
CN108520194A (zh) 2018-09-11
CN108509847A (zh) 2018-09-07
WO2019120039A1 (zh) 2019-06-27
CN208044756U (zh) 2018-11-02
AU2018386790A1 (en) 2020-04-23
JP2022043070A (ja) 2022-03-15
CN108551658A (zh) 2018-09-18
KR102378059B1 (ko) 2022-03-25
CN108497839B (zh) 2021-04-09
JP7078275B2 (ja) 2022-05-31
CN108332829A (zh) 2018-07-27
JP7229580B2 (ja) 2023-02-28
KR102454854B1 (ko) 2022-10-14
AU2022203376A1 (en) 2022-06-09
JP2021511554A (ja) 2021-05-06
SG11202003731PA (en) 2020-05-28
WO2019144690A1 (zh) 2019-08-01
JP7170355B2 (ja) 2022-11-14
AU2018405072A1 (en) 2020-04-30
EP3731546A4 (de) 2021-02-17
JP2022043067A (ja) 2022-03-15

Similar Documents

Publication Publication Date Title
US20200202163A1 (en) Target positioning system and target positioning method
US11790433B2 (en) Constructing shopper carts using video surveillance
CN111626681B (zh) Image recognition system for inventory management
US10290031B2 (en) Method and system for automated retail checkout using context recognition
EP4075399A1 (de) Information processing system
JP7264401B2 (ja) Checkout method, apparatus, and system

Legal Events

Date Code Title Description
AS Assignment

Owner name: SHANGHAI CLOUDPICK SMART TECHNOLOGY CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:FENG, LINAN;XIA, DING;MA, JIEYU;AND OTHERS;REEL/FRAME:052043/0503

Effective date: 20200306

STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION