CN111178860A - Settlement method, device, equipment and storage medium for unmanned convenience store - Google Patents

Settlement method, device, equipment and storage medium for unmanned convenience store

Info

Publication number
CN111178860A
Authority
CN
China
Prior art keywords
user
pedestrian
track
settlement
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201911313115.5A
Other languages
Chinese (zh)
Inventor
叶灵
陈志明
卢小丹
冯镇业
梁智聪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Weaving Point Intelligent Technology Co ltd
Original Assignee
Guangzhou Weaving Point Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Weaving Point Intelligent Technology Co ltd filed Critical Guangzhou Weaving Point Intelligent Technology Co ltd
Priority to CN201911313115.5A priority Critical patent/CN111178860A/en
Publication of CN111178860A publication Critical patent/CN111178860A/en
Withdrawn legal-status Critical Current

Classifications

    • G06Q 20/085 — Payment architectures involving remote charge determination or related payment systems
    • G06F 18/22 — Pattern recognition; matching criteria, e.g. proximity measures
    • G06Q 20/3276 — Short range or proximity payments by means of M-devices using a pictured code, e.g. barcode or QR-code, being read by the M-device
    • G06Q 20/40145 — Transaction verification; biometric identity checks
    • G06T 7/246 — Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06V 40/168 — Human faces; feature extraction; face representation
    • G06V 40/28 — Recognition of hand or arm movements, e.g. recognition of deaf sign language
    • G06T 2207/10016 — Image acquisition modality; video; image sequence
    • G06T 2207/30196 — Subject of image; human being; person
    • G06T 2207/30241 — Subject of image; trajectory

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Accounting & Taxation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Business, Economics & Management (AREA)
  • Strategic Management (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Finance (AREA)
  • Data Mining & Analysis (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Social Psychology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Psychiatry (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Security & Cryptography (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the application discloses a settlement method and device for an unmanned convenience store, an electronic device and a storage medium. In the technical scheme, a corresponding track ID is assigned to each user and bound to the user ID; user features are collected and bound to the corresponding track ID, and the real-time track ID coordinates of the user are determined. Then, when an order record is triggered, a pedestrian image is acquired at the trigger position, the spatial coordinate point corresponding to the pedestrian is determined, feature matching and distance matching are performed against the user features and the track ID coordinates respectively, the user ID corresponding to the pedestrian is determined, and the order is recorded under that user ID; the order record can then be used for billing and settlement of the goods the user purchased. With these technical means, the user's shopping process can be recorded clearly and unambiguously, and frictionless settlement (settlement requiring no action perceived by the user) is realized in the unmanned convenience store. The billing and settlement process is thereby simplified, and the user's shopping experience is optimized.

Description

Settlement method, device, equipment and storage medium for unmanned convenience store
Technical Field
The embodiment of the application relates to the technical field of computer vision, in particular to a settlement method and device for an unmanned convenience store, electronic equipment and a storage medium.
Background
At present, most convenience stores and supermarkets operate as follows: the customer enters the store, finds the needed goods, carries them to a designated checkout counter at payment time, and pays at a staffed or self-service checkout machine. This mode of operation easily wastes a great deal of the user's time and energy on unnecessary shopping steps such as queuing to pay, which hurts the shopping experience. With the development of science and technology, smart retail and unmanned retail are becoming widely used in daily life, and the unmanned convenience store is an emerging product in this field. An unmanned convenience store generally means that the user can complete the purchase of goods without a shop assistant participating in any part of the shopping process: after entering the store, the user independently selects goods, places the selected goods into a settlement device, and the system background automatically completes billing according to the settlement result.
However, the conventional unmanned convenience store still checks out items with a fixed settlement device during operation. After selecting goods, the user must carry them to the settlement device to be counted, and only after the goods list is produced is the purchase billed. The process is cumbersome, and the user's shopping experience is poor.
Disclosure of Invention
The embodiment of the application provides a settlement method and device for an unmanned convenience store, an electronic device and a storage medium, which realize frictionless settlement of the user's shopping and optimize the user's shopping experience.
In a first aspect, an embodiment of the present application provides a settlement method for an unmanned convenience store, including:
when a user enters the shopping area, assigning a corresponding trackID to the user, wherein the trackID is bound to the user ID and is used for tracking the user's track;
collecting user features, and binding the user features to the corresponding trackID;
tracking the motion track of the user in real time based on the trackID, and determining the real-time trackID coordinates of the user;
when an order record is triggered, acquiring a pedestrian image at the trigger position, determining the spatial coordinate point corresponding to the pedestrian, performing feature matching and distance matching against the user features and the trackID coordinates respectively, determining the user ID corresponding to the pedestrian, and recording the order under that user ID;
and when the user leaves the store, performing billing settlement according to the order records corresponding to the user ID.
Further, the assigning of a corresponding trackID to the user when the user enters the shopping area further includes:
determining the user ID of the corresponding user through two-dimensional-code scanning or face-scanning recognition.
Further, the collecting of user features includes:
collecting RGB images corresponding to the user's facial features, top-view features and limb features;
collecting a depth image of the user's top view to generate point cloud features;
and storing the RGB images and the point cloud features as the user features.
Further, the tracking of the motion track of the user in real time based on the trackID to determine the real-time trackID coordinates of the user includes:
extracting the user image area, and determining the spatial coordinate position of a designated body part of the user within the user image area as the trackID coordinate.
Further, the acquiring of a pedestrian image at the trigger position when an order record is triggered, and the determining of the spatial coordinate point corresponding to the pedestrian, include:
determining, through limb recognition based on the pedestrian image, the spatial position of the hand feature points when the pedestrian takes or puts back goods;
and determining, through posture estimation from the spatial position of the hand feature points, the spatial coordinate position of the pedestrian's designated body part, which is taken as the spatial coordinate point corresponding to the pedestrian.
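As a rough sketch of this step — all keypoint names and the selection heuristic are illustrative assumptions, not the patent's actual posture-estimation model — the pedestrian whose hand is nearest the handled goods could be resolved to a designated-part coordinate like this:

```python
import math

def pedestrian_coordinate(goods_xyz, pedestrians):
    """pedestrians: list of {name: (x, y, z)} keypoint dicts from a
    pose-estimation model (keypoint names such as 'left_hand' are assumed).
    Selects the pedestrian whose hand is nearest the picked-up goods,
    then returns that pedestrian's head coordinate as the spatial
    coordinate point corresponding to the pedestrian."""
    def hand_dist(keypoints):
        # distance from the goods to the pedestrian's nearest hand keypoint
        return min(math.dist(goods_xyz, keypoints[h])
                   for h in ("left_hand", "right_hand") if h in keypoints)
    best = min(pedestrians, key=hand_dist)
    return best["head"]
```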
Further, the performing of feature matching and distance matching against the user features and the trackID coordinates respectively to determine the user ID of the corresponding pedestrian includes:
performing distance matching between the spatial coordinate point corresponding to the pedestrian and the trackID coordinates, and selecting the trackID coordinate with the shortest Euclidean distance to obtain a distance matching result;
comparing the pedestrian image with the user features to obtain a feature matching result;
and determining the user ID of the corresponding pedestrian according to the distance matching result and the feature matching result.
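A minimal sketch of how the two matching results might be combined (the function name, similarity threshold and fallback rule are assumptions; the patent does not fix them):

```python
import math

def match_user(pedestrian_xyz, track_coords, feature_score):
    """Resolve a pedestrian at an order-trigger position to a user ID.

    pedestrian_xyz : (x, y, z) spatial coordinate point of the pedestrian
    track_coords   : {user_id: (x, y, z)} real-time trackID coordinates
    feature_score  : {user_id: similarity in [0, 1]} from feature matching
    Distance matching picks the trackID with the shortest Euclidean
    distance; feature matching is used to confirm the choice."""
    nearest = min(track_coords,
                  key=lambda uid: math.dist(pedestrian_xyz, track_coords[uid]))
    if feature_score.get(nearest, 0.0) >= 0.5:  # illustrative threshold
        return nearest
    # fall back to the best feature match when the two results disagree
    return max(feature_score, key=feature_score.get)
```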
Further, the performing of billing settlement according to the order records corresponding to the user ID when the user leaves the store includes:
when it is detected that the user leaves the store, performing automatic billing settlement based on the order records of the corresponding user ID and the payment account pre-bound to that user ID; or,
generating a payment bill based on the order records corresponding to the user ID, and sending the payment bill to the corresponding user ID.
In a second aspect, an embodiment of the present application provides a settlement apparatus for an unmanned convenience store, including:
an allocation module, configured to assign a corresponding trackID to a user when the user enters the shopping area, the trackID being bound to the user ID and used for tracking the user's track;
a collection module, configured to collect user features and bind them to the corresponding trackID;
a tracking module, configured to track the motion track of the user in real time based on the trackID and determine the real-time trackID coordinates of the user;
a determining module, configured to acquire a pedestrian image at the trigger position when an order record is triggered, determine the spatial coordinate point corresponding to the pedestrian, perform feature matching and distance matching against the user features and the trackID coordinates respectively, determine the user ID corresponding to the pedestrian, and record the order under that user ID;
and a settlement module, configured to perform billing settlement according to the order records corresponding to the user ID when the user leaves the store.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a memory and one or more processors;
the memory, for storing one or more programs;
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the settlement method for an unmanned convenience store according to the first aspect.
In a fourth aspect, embodiments of the present application provide a storage medium containing computer-executable instructions for performing the method of settlement of an unmanned convenience store according to the first aspect when executed by a computer processor.
According to the embodiment of the application, when the user enters the shopping area, a corresponding trackID is assigned to the user and bound to the user ID; user features are collected and bound to the corresponding trackID; and the motion track of the user is tracked in real time to determine the real-time trackID coordinates of the user. Then, when an order record is triggered, a pedestrian image is acquired at the trigger position, the spatial coordinate point corresponding to the pedestrian is determined, feature matching and distance matching are performed against the user features and the trackID coordinates respectively, the user ID corresponding to the pedestrian is determined, and the order is recorded under that user ID; the order record can then be used for billing and settlement of the goods the user purchased. With these technical means, the user's shopping process can be recorded clearly and unambiguously, billing settlement of the user's purchases is performed from the order records, and frictionless settlement of the unmanned convenience store is realized. The billing and settlement process is thereby simplified, and the user's shopping experience is optimized.
Drawings
FIG. 1 is a flow chart of a settlement method for an unmanned convenience store according to an embodiment of the present application;
FIG. 2 is a flowchart of a user feature collection in the first embodiment of the present application;
FIG. 3 is a flow chart of pedestrian spatial coordinate point determination in the first embodiment of the present application;
FIG. 4 is a flowchart of a pedestrian user ID determination in the first embodiment of the present application;
fig. 5 is a schematic structural diagram of a settlement apparatus for an unmanned convenience store according to a second embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device according to a third embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, specific embodiments of the present application will be described in detail with reference to the accompanying drawings. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some but not all of the relevant portions of the present application are shown in the drawings. Before discussing exemplary embodiments in more detail, it should be noted that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart may describe the operations (or steps) as a sequential process, many of the operations can be performed in parallel, concurrently or simultaneously. In addition, the order of the operations may be re-arranged. The process may be terminated when its operations are completed, but may have additional steps not included in the figure. The processes may correspond to methods, functions, procedures, subroutines, and the like.
In the settlement method for the unmanned convenience store, the user's features and trackID coordinates are determined in advance. During operation, when it is detected that someone triggers an order record, the image of the pedestrian at the trigger position is captured by a camera and the spatial coordinate point of that pedestrian is determined. Feature matching and distance matching are then performed against the predetermined user features and trackID coordinates to determine the user ID of the pedestrian, and the order is recorded under that user ID. Finally, when the user finishes shopping and leaves the store, billing settlement is performed according to the order records of the user ID. The user performs no settlement operation at any point, so frictionless settlement of the unmanned convenience store is realized. By contrast, in an existing store the user carries the selected goods to a designated checkout machine at payment time and pays at a staffed or self-service counter. If the in-store signage is clear and the checkout queue is short, the customer may still have a decent experience; but when the signage is unclear and many people are queuing to pay, the customer spends a great deal of time and effort on unnecessary shopping steps.
Such billing and settlement methods are relatively inefficient, give a poor shopping experience, and force operators to hire a large amount of labor for occasional passenger-flow peaks, with low labor value per unit time. The settlement method for the unmanned convenience store is therefore provided to solve the technical problem that the settlement process in the existing retail industry is cumbersome.
The first embodiment is as follows:
fig. 1 is a flowchart of a settlement method for an unmanned convenience store according to the first embodiment of the present application. The method may be performed by a settlement apparatus of the unmanned convenience store, which may be implemented in software and/or hardware and may be composed of one physical entity or of two or more physical entities. In general, the settlement apparatus may be a computer, a billing settlement terminal of the unmanned convenience store, a server host, or the like.
The following description will be given taking a settlement apparatus of an unmanned convenience store as an example of an apparatus for performing a settlement method of the unmanned convenience store. Referring to fig. 1, the settlement method of the unmanned convenience store specifically includes:
s110, when the user enters the shopping area, assigning a corresponding trackID to the user, the trackID being bound to the user ID and used for tracking the user's track.
Specifically, when the user enters the shopping area, the user's ID information is determined first. The user ID is the unique identification of the user, through which the user's identity can be determined; it is configured when the user registers. When the user ID is configured, the user's payment information can be bound to it, so that during subsequent billing the payment account can be determined from the payment information and the amount due can be deducted from the user's payment account automatically.
Further, the user ID may be confirmed by scanning a two-dimensional code or by face-scanning recognition. For code scanning, the user may scan a two-dimensional code with a terminal device (such as a mobile phone) to enter an authentication interface, where the user ID is determined through identity authentication; or the user may present the two-dimensional code corresponding to his or her user ID to a code-scanning device at the entrance of the store, which scans it and verifies the user ID. Face-scanning recognition requires that the user provide face image information in advance when registering the user ID, so that during subsequent recognition the user's face information is collected and compared to determine the user ID.
After the user ID is determined, a trackID is allocated to the user entering the shopping area; the trackID is bound to the user ID and used for tracking the user's track. The trackID is assigned by a target tracking algorithm, such as a Kalman-filter tracking algorithm. The trackID records the moving track of the corresponding pedestrian in real time, yielding continuous position data from which the moving track of the pedestrian inside the unmanned convenience store can be determined.
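A toy illustration of trackID allocation — a greedy nearest-neighbor association standing in for the Kalman-filter tracking named above; the class name, method names and distance threshold are all assumptions:

```python
import itertools
import math

class TrackAssigner:
    """Assigns a persistent trackID to each detected pedestrian position.

    Greedy nearest-neighbor association; a production tracker would add
    Kalman-filter prediction as the description suggests."""

    def __init__(self, max_match_dist=1.0):
        self._next_id = itertools.count(1)
        self.tracks = {}                      # trackID -> last known (x, y)
        self.max_match_dist = max_match_dist  # meters (assumed threshold)

    def update(self, detections):
        """detections: list of (x, y) floor positions from the cameras.
        Returns the trackID for each detection, in the same order."""
        used, assigned = set(), []
        for pos in detections:
            best_id, best_d = None, float("inf")
            for tid, last in self.tracks.items():
                if tid in used:
                    continue                  # one detection per track
                d = math.dist(pos, last)
                if d < best_d:
                    best_id, best_d = tid, d
            if best_id is None or best_d > self.max_match_dist:
                best_id = next(self._next_id)  # new pedestrian -> new trackID
            self.tracks[best_id] = pos
            used.add(best_id)
            assigned.append(best_id)
        return assigned
```

Feeding the tracker one frame of detections at a time keeps IDs stable while pedestrians move gradually, and mints a fresh trackID whenever a detection is too far from every existing track.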
S120, collecting user characteristics, and binding the user characteristics with corresponding trackIDs.
Illustratively, in the operating scene of the unmanned convenience store, several cameras are arranged at the entrance of the shopping area and are used to collect user features for the feature matching in the subsequent user-ID confirmation. Specifically, the cameras may be installed at the entrance gates: when the user enters the shopping area through a gate channel, the user's features are collected by the cameras arranged at that channel. Referring to fig. 2, the user feature collection process includes:
s121, collecting RGB images corresponding to facial features, top visual angle features and limb features of a user;
s122, collecting a depth image corresponding to a top visual angle of a user to generate point cloud characteristics;
and S123, storing the RGB image and the point cloud feature as a user feature.
The cameras are arranged at the top of the gate channel of the unmanned convenience store and collect the features of the user's top view. Specifically, when collecting image data for the top-view feature, RGB images of the user are acquired at the center, edge and diagonal positions of the top view. In this embodiment, the cameras are top cameras above the entrance gates; to obtain images with the user at the center, edge and diagonal positions of the picture, three cameras shoot simultaneously. "Center" means the center of the user's top view; "edge" means the positions corresponding to the edges of the center-shot image; "diagonal" means the positions corresponding to the four corners of the center-shot image. It can be understood that, during feature matching, occlusion often occurs, so that only part of the user's features can be extracted. By acquiring image data of each user at the center, edge and diagonal of the top view, this embodiment avoids as far as possible the matching failures caused by extracting too few user features. In addition, in order to extract features that describe the user as fully as possible, a certain number of features must be extracted for each user. In the current practice, the three cameras capture the user at the center, edge and diagonal positions respectively, and each camera takes 10 pictures, 30 in total; that is, 30 image features are extracted for each user.
In addition, the depth image of the user's top view is collected by the camera arranged at the top of the gate, and the depth information of the depth image is used as the user's point cloud features.
In addition, the user's face image and limb-feature images are collected, and the RGB images corresponding to the facial features, top-view features and limb features, together with the depth image of the top view, are stored as the user features for subsequent feature matching. It can be understood that the more user features are obtained, the more accurate the matching result of the subsequent feature matching. By acquiring the corresponding image features and point cloud features, this embodiment makes the feature matching as accurate as possible and avoids failures of feature-matching recognition caused by occlusion.
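One plausible way to exploit such a multi-image feature gallery during matching — a sketch under the assumption that each stored image has already been embedded into a feature vector; the max-over-gallery scoring is an illustrative choice, not the patent's stated method:

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two feature vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_features(query_vec, user_gallery):
    """user_gallery: {user_id: list of stored feature vectors (e.g. the
    30 per-user images described above, already embedded)}.
    Scores each user by the best similarity over their gallery, so a
    partial occlusion in any single stored view does not sink the match.
    Returns (best user_id, per-user scores)."""
    scores = {uid: max(cosine_sim(query_vec, f) for f in feats)
              for uid, feats in user_gallery.items()}
    return max(scores, key=scores.get), scores
```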
The determined user features are then bound to the user's trackID and user ID, so that the corresponding user ID can later be determined by feature matching.
S130, tracking the motion track of the user in real time based on the trackID, and determining the real-time trackID coordinate of the user.
Specifically, after the binding of the user features, the trackID and the user ID is completed, the real-time trackID coordinates of the user are determined and bound to the corresponding trackID. While the user moves in the shopping area of the unmanned convenience store, the cameras arranged in the area extract real-time image pictures of the user; the user is detected in the image data by means of a target detection algorithm, the user image area is extracted, and the spatial coordinate position of the user's designated body part within that area is taken as the trackID coordinate. When determining the spatial coordinate position of the designated part (the head, in this embodiment), the part is recognized in the user image area by a preset limb-recognition model, its spatial coordinate position inside the store is determined, and continuous records of the form <trackID, x, y, z, time> are generated, where "x, y, z" identifies the spatial coordinate position of the designated part and "time" is the corresponding time point. The trackID coordinates of the user are thus determined for the subsequent distance matching that determines the corresponding user ID.
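The <trackID, x, y, z, time> stream described above could be represented as follows (only the record layout comes from the description; the class and method names are assumed):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass(frozen=True)
class TrackPoint:
    """One <trackID, x, y, z, time> record."""
    track_id: int
    x: float      # spatial coordinates of the designated part (the head)
    y: float
    z: float
    time: float   # capture timestamp, seconds

class TrackLog:
    """Accumulates per-frame trackID coordinates for later distance matching."""

    def __init__(self):
        self.points: List[TrackPoint] = []

    def record(self, track_id, x, y, z, time):
        self.points.append(TrackPoint(track_id, x, y, z, time))

    def latest(self, track_id) -> Optional[TrackPoint]:
        """Most recent coordinate for a trackID, or None if never seen."""
        for p in reversed(self.points):
            if p.track_id == track_id:
                return p
        return None
```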
Note that, although the trackID is assigned when the pedestrian enters the shopping area of the unmanned convenience store, in the unmanned convenience store, a situation may occur in which trajectory tracking is interrupted due to an occurrence of a blocking situation, and a new trackID is allocated again when the pedestrian is identified again. Therefore, the moving track of the corresponding pedestrian is fitted through linear regression corresponding to the newly added trackID, the corresponding user closest to the sum of the trackID coordinate and the moving track distance is determined, and the user ID of the corresponding user is used as the user ID of the pedestrian. When the distance between the pedestrian and the trackID coordinate is determined, the corresponding pedestrian is framed according to a target detection algorithm, and the Euclidean distance from the central point of the pedestrian frame to the trackID coordinate is further calculated. The shorter the euclidean distance, the closer the pedestrian is to the trackID coordinates. Because the tracking algorithm generally needs to see the pedestrian in the whole course to track, the camera is fully covered under an ideal state, namely, a certain pedestrian can be found by the camera at any time. However, in consideration of the scene limitation of the unmanned convenience store, the user can be blocked in the moving process, if the user does not appear in any camera, the tracking is lost, so the trackID is easily lost after the blocking, and a new trackID can be generated when the pedestrian reappears in the camera. Specifically, by acquiring n images of the trackID coordinates of the new trackID, linear regression is performed on corresponding coordinate points of the trackids of all pedestrians in the n images, and the movement track of the pedestrian triggered to operate is fitted. The moving track is the moving track which is re-fitted by linear regression for tracking the lost pedestrian. 
A user is detected in each image, and the user with the smallest combined distance to the trackID coordinate and to the moving track is selected. That is, if the distance from a user to the trigger-position point is A and the distance from that user to the line obtained by linear fitting of the moving track is B, the user minimizing A + B is determined to correspond to the newly generated trackID, and the newly generated trackID is bound to that user's ID. In this way, pedestrian re-identification in the unmanned convenience store is realized.
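A minimal sketch of this re-identification criterion, assuming 2-D ground-plane coordinates, a hypothetical `lost_tracks` structure, an unweighted A + B sum, and a non-vertical track (so a least-squares line y = a*x + b exists):

```python
import math

def fit_line(points):
    """Least-squares fit of y = a*x + b through (x, y) track points."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sxx = sum(p[0] ** 2 for p in points); sxy = sum(p[0] * p[1] for p in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # assumes track is not vertical
    b = (sy - a * sx) / n
    return a, b

def point_to_line(pt, a, b):
    """Distance from (x0, y0) to the line a*x - y + b = 0."""
    x0, y0 = pt
    return abs(a * x0 - y0 + b) / math.hypot(a, -1.0)

def rebind_track(new_track_xy, lost_tracks):
    """Pick the lost user whose cost A + B is smallest.

    A: Euclidean distance from the user's last known position to the new
       trackID coordinate; B: distance from the new trackID coordinate to the
       line fitted to the user's old track. lost_tracks: {user_id: [(x, y), ...]}
    (a hypothetical structure).
    """
    best_uid, best_cost = None, float("inf")
    for uid, pts in lost_tracks.items():
        a, b = fit_line(pts)
        A = math.dist(pts[-1], new_track_xy)
        B = point_to_line(new_track_xy, a, b)
        if A + B < best_cost:
            best_uid, best_cost = uid, A + B
    return best_uid

# the lost user whose old track points toward the new detection wins
lost = {"u1": [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0)],
        "u2": [(0.0, 5.0), (1.0, 5.0), (2.0, 5.0)]}
assert rebind_track((3.0, 3.0), lost) == "u1"
```

The example rebinds the new trackID at (3, 3) to user "u1", whose fitted track (y = x) passes through that point, rather than to "u2", whose track runs along y = 5.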
In addition, the image features of the pedestrian corresponding to the newly generated trackID are acquired, specifically the pedestrian's facial features, top-view features, limb features, and point cloud features. These pedestrian image features are matched against the user features bound to the user ID determined by pedestrian re-identification, so that the re-identification result is verified and the trackID coordinates recorded in real time accurately correspond to the user ID.
S140, when order recording is triggered, acquiring a pedestrian image at a trigger position, determining a spatial coordinate point of a corresponding pedestrian, respectively performing feature matching and distance matching according to the user feature and the trackID coordinate, determining a user ID of the corresponding pedestrian, and recording an order according to the user ID.
Illustratively, when a user picks up or puts down goods in the unmanned convenience store, a weighing device arranged at the bottom of the goods detects weight changes in real time; when the weight of the goods changes, the settlement device of the unmanned convenience store is triggered to record an order. The image containing the pedestrian captured by the camera at the trigger position is collected, the pedestrian image is recognized and detected, and the spatial coordinate point of the pedestrian in the image is determined. Referring to fig. 3, the flow of determining the spatial coordinate point of the pedestrian includes:
S1401, determining the spatial position of the hand feature point when the pedestrian picks up or puts down goods, through limb recognition based on the pedestrian image;
S1402, determining the spatial coordinate position of the pedestrian-specified part through posture estimation according to the spatial position of the hand feature point, and taking that spatial coordinate position as the spatial coordinate point of the corresponding pedestrian.
Specifically, limb recognition is performed by a preset hand feature recognition model to determine the feature points of the user's hand at the trigger position when goods are picked up or put down. The hand feature recognition model is built in advance on a neural network; during feature point recognition, the pedestrian image is input to the model and the spatial coordinate position of the hand feature point in the pedestrian image is output. Limb recognition by such a model is a mature technique and not a main improvement of this embodiment, so it is not described further here.
Further, based on the determined hand feature points, a preset posture estimation model is used to determine, in association with them, the spatial coordinate position of a specified part of the pedestrian (the head, in this embodiment of the application) as the spatial coordinate point of the pedestrian. Posture estimation models are mature prior art and are not described in detail here.
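The weight-change trigger that starts this flow (the weighing device described at the beginning of step S140) can be sketched as a simple threshold check; the function name and the tolerance value are illustrative assumptions:

```python
def weight_event(prev_grams, curr_grams, tolerance=2.0):
    """Classify a shelf weight change; tolerance (grams) filters sensor noise.

    Returns 'pickup' when the weight drops, 'putback' when it rises,
    and None for changes within the noise tolerance.
    """
    delta = curr_grams - prev_grams
    if delta <= -tolerance:
        return "pickup"
    if delta >= tolerance:
        return "putback"
    return None

# a 250 g item taken off the shelf triggers order recording
assert weight_event(1000.0, 750.0) == "pickup"
```

Either event would then trigger the image capture and spatial-coordinate steps S1401 and S1402 described above.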
Based on the determined pedestrian image and pedestrian spatial coordinate point, feature matching and distance matching can be performed to determine the user ID of the pedestrian. Referring to fig. 4, determining the user ID of the pedestrian includes:
S1403, performing distance matching between the spatial coordinate point of the corresponding pedestrian and the trackID coordinates, and determining the trackID coordinate with the shortest Euclidean distance to obtain a distance matching result;
S1404, comparing the pedestrian image with the user features to obtain a feature matching result;
S1405, determining the user ID of the corresponding pedestrian according to the distance matching result and the feature matching result.
Based on the determined pedestrian image and spatial coordinate point, on one hand the pedestrian image is compared with the pre-stored user features: feature matching determines the user feature that matches the pedestrian image, and hence the user ID bound to that feature. On the other hand, distance matching is performed between the determined pedestrian coordinate point and each trackID coordinate stored by the system at the corresponding moment: the trackID coordinate with the shortest Euclidean distance to the pedestrian coordinate point is extracted, and the user ID bound to that trackID coordinate is determined. It will be understood that the two matching results may be inconsistent; in that case, the user ID determined by feature matching is used as the final user ID. In some embodiments, the user ID determined by distance matching may instead be used when the two results disagree. If the two results agree, the user ID is determined directly. With two matching modes, the results verify each other, ensuring that the finally determined user ID is accurate.
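The dual matching and mutual verification described above might be sketched as follows, using cosine similarity for the feature match and Euclidean distance for the distance match; all data structures and values here are assumed for illustration:

```python
import math

def match_user(ped_feature, ped_xyz, user_features, track_coords, track_owner):
    """Determine a pedestrian's user ID by feature and distance matching.

    Hypothetical structures: user_features {user_id: vector},
    track_coords {track_id: (x, y, z)}, track_owner {track_id: user_id}.
    Returns (user_id, verified), where verified means both matches agree;
    on disagreement this sketch keeps the feature-matching result.
    """
    def cos(u, v):  # cosine similarity between two feature vectors
        dot = sum(a * b for a, b in zip(u, v))
        return dot / (math.hypot(*u) * math.hypot(*v))

    # feature matching: most similar stored user feature
    feat_uid = max(user_features, key=lambda uid: cos(ped_feature, user_features[uid]))
    # distance matching: trackID coordinate with the shortest Euclidean distance
    nearest_tid = min(track_coords, key=lambda tid: math.dist(track_coords[tid], ped_xyz))
    dist_uid = track_owner[nearest_tid]
    return feat_uid, feat_uid == dist_uid

# illustrative data: pedestrian stands near track 101 with a feature close to alice's
user_features = {"alice": (1.0, 0.0), "bob": (0.0, 1.0)}
track_coords = {101: (0.0, 0.0, 0.0), 102: (5.0, 5.0, 0.0)}
track_owner = {101: "alice", 102: "bob"}
uid, verified = match_user((0.9, 0.1), (0.2, 0.1, 0.0), user_features, track_coords, track_owner)
```

In this example both matches point to the same user, so the result is verified; a production system would use learned feature embeddings rather than these toy vectors.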
Then, according to the determined user ID, if it is judged that the user picked up a commodity, the commodity is added to the order record bound to that user ID; similarly, if the user put down a commodity, the commodity is removed from the order record bound to that user ID. Real-time recording of the user's commodity purchase order is thus completed.
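The order bookkeeping step can be sketched as a small ledger keyed by user ID; the names, SKU, and event labels are illustrative assumptions:

```python
from collections import defaultdict

# user_id -> {sku: quantity}; a hypothetical in-memory order store
orders = defaultdict(lambda: defaultdict(int))

def record_order(user_id, sku, event):
    """Add a picked-up item to the user's order, or remove a put-back one."""
    if event == "pickup":
        orders[user_id][sku] += 1
    elif event == "putback" and orders[user_id][sku] > 0:
        orders[user_id][sku] -= 1

# two pickups and one putback leave one unit on the order
record_order("u42", "cola-330ml", "pickup")
record_order("u42", "cola-330ml", "pickup")
record_order("u42", "cola-330ml", "putback")
```

Guarding the putback branch against negative quantities keeps the ledger consistent if a put-down event is matched to the wrong user.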
S150, when the user leaves the store, carrying out charging settlement according to the order record corresponding to the user ID.
Finally, users leaving the store are detected in real time by a camera arranged at the exit of the unmanned convenience store. The user ID of the departing user may be determined by feature matching, distance matching, or direct re-authentication of identity. When a user is detected leaving the store, automatic charging settlement is performed based on the order record of the corresponding user ID and the payment account pre-bound to that user ID; alternatively, a payment bill is generated based on the order record of the corresponding user ID and sent to that user. By paying through automatic deduction or a pushed bill, the whole purchase is settled without the user's active involvement ("sense-free" payment): the user neither queues for settlement nor operates a checkout device, which greatly saves the user's time and simplifies the commodity settlement process.
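A minimal sketch of the exit settlement logic (automatic deduction when a payment account is pre-bound, otherwise a bill to push to the user); all data structures here are illustrative assumptions:

```python
def settle(user_id, user_orders, prices, accounts):
    """Settle on store exit: deduct from the pre-bound payment account when
    one exists, otherwise return a payment bill to send to the user."""
    items = user_orders.get(user_id, {})
    total = sum(prices[sku] * qty for sku, qty in items.items())
    if user_id in accounts:
        accounts[user_id] -= total  # automatic, queue-free deduction
        return {"user": user_id, "charged": total}
    return {"user": user_id, "bill": total}  # bill pushed to the user instead

# illustrative order and pre-bound account
prices = {"cola-330ml": 3.5}
user_orders = {"u42": {"cola-330ml": 2}}
accounts = {"u42": 100.0}
receipt = settle("u42", user_orders, prices, accounts)
```

A user with no pre-bound account would instead receive the `bill` variant of the receipt for manual payment.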
In this method, when the user enters the shopping area, a corresponding trackID is allocated to the user and bound to the user ID; user features are collected and bound to the corresponding trackID; and the user's motion track is tracked in real time to determine the real-time trackID coordinates. When order recording is triggered, a pedestrian image at the trigger position is collected, the spatial coordinate point of the corresponding pedestrian is determined, feature matching and distance matching are performed according to the user features and the trackID coordinates respectively, the user ID of the corresponding pedestrian is determined, and the order is recorded under that user ID; the order record can then be used to bill and settle the user's commodity purchases. With these technical means, the user's shopping process is clearly and unambiguously recorded, billing settlement of the purchases is performed from the order record, and sense-free settlement of the unmanned convenience store is realized, simplifying the user's billing and settlement process and improving the shopping experience.
Embodiment two:
On the basis of the above embodiment, fig. 5 is a schematic structural diagram of a settlement apparatus for an unmanned convenience store according to embodiment two of the present application. Referring to fig. 5, the settlement apparatus provided by this embodiment specifically includes: a distribution module 21, an acquisition module 22, a tracking module 23, a determination module 24, and a settlement module 25.
The distribution module 21 is configured to distribute a corresponding trackID to the user when the user enters the shopping area, where the trackID is bound to the user ID and is used to track a user track;
the acquisition module 22 is configured to acquire a user characteristic, and bind the user characteristic with a corresponding trackID;
the tracking module 23 is configured to track a motion trajectory of the user in real time based on the trackID, and determine a real-time trackID coordinate of the user;
the determining module 24 is configured to, when triggering order recording, acquire a pedestrian image at a triggering position, determine a spatial coordinate point of a corresponding pedestrian, perform feature matching and distance matching respectively according to the user feature and the trackID coordinate, determine a user ID of the corresponding pedestrian, and perform order recording according to the user ID;
the settlement module 25 is configured to perform billing settlement according to the order record corresponding to the user ID when the user leaves the store.
Specifically, the acquisition module 22 includes:
the first acquisition unit is used for acquiring RGB images corresponding to the facial features, top-view features and limb features of a user;
the second acquisition unit is used for acquiring a depth image corresponding to the top view of the user and generating point cloud features;
and the storage unit is used for storing the RGB images and the point cloud features as user features.
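Generating a point cloud from the top-view depth image, as the second acquisition unit does, typically amounts to back-projecting depth pixels through the camera intrinsics; a minimal sketch under an assumed pinhole camera model (fx, fy, cx, cy are hypothetical intrinsics, and the depth image is a row-major list of lists in metres):

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a top-view depth image into a 3-D point cloud.

    depth: rows of per-pixel depths in metres; fx, fy, cx, cy: pinhole
    intrinsics (focal lengths and principal point, in pixels).
    """
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:
                continue  # skip invalid depth samples
            points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points

# toy 2x2 depth image with two valid samples at 2 m
cloud = depth_to_points([[0.0, 2.0], [2.0, 0.0]], 1.0, 1.0, 0.5, 0.5)
```

Real depth cameras supply calibrated intrinsics; this sketch only illustrates the geometry behind the point cloud feature.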
Specifically, the determining module 24 includes:
the first determining unit is used for determining the spatial position of the hand feature point when the pedestrian picks up or puts down goods, through limb recognition based on the pedestrian image;
and the second determining unit is used for determining the spatial coordinate position of the pedestrian-specified part through posture estimation according to the spatial position of the hand feature point, and taking that spatial coordinate position as the spatial coordinate point of the corresponding pedestrian.
Specifically, the settlement module 25 includes:
the first matching unit is used for performing distance matching on the space coordinate points of the corresponding pedestrians and the trackID coordinates, determining the trackID coordinates with the shortest Euclidean distance and obtaining a distance matching result;
the second matching unit is used for comparing the pedestrian image with the user characteristics to obtain a characteristic matching result;
and the user ID determining unit is used for determining the user ID of the corresponding pedestrian according to the distance matching result and the feature matching result.
In this apparatus, when the user enters the shopping area, a corresponding trackID is allocated to the user and bound to the user ID; user features are collected and bound to the corresponding trackID; and the user's motion track is tracked in real time to determine the real-time trackID coordinates. When order recording is triggered, a pedestrian image at the trigger position is collected, the spatial coordinate point of the corresponding pedestrian is determined, feature matching and distance matching are performed according to the user features and the trackID coordinates respectively, the user ID of the corresponding pedestrian is determined, and the order is recorded under that user ID; the order record can then be used to bill and settle the user's commodity purchases. With these technical means, the user's shopping process is clearly and unambiguously recorded, billing settlement of the purchases is performed from the order record, and sense-free settlement of the unmanned convenience store is realized, simplifying the user's billing and settlement process and improving the shopping experience.
The settlement device for the unmanned convenience store provided by the second embodiment of the present application can be used for executing the settlement method for the unmanned convenience store provided by the first embodiment, and has corresponding functions and beneficial effects.
Embodiment three:
An embodiment of the present application provides an electronic device. Referring to fig. 6, the electronic device includes: a processor 31, a memory 32, a communication module 33, an input device 34, and an output device 35. The number of processors in the electronic device may be one or more, and the number of memories may be one or more. The processor, memory, communication module, input device, and output device of the electronic device may be connected by a bus or by other means.
The memory 32, as a computer-readable storage medium, is used to store software programs, computer-executable programs, and modules, such as the program instructions/modules corresponding to the settlement method for the unmanned convenience store according to any embodiment of the present application (e.g., the distribution module, acquisition module, tracking module, determination module, and settlement module in the settlement apparatus for the unmanned convenience store). The memory may mainly comprise a program storage area and a data storage area: the program storage area may store an operating system and the application program required by at least one function, and the data storage area may store data created according to the use of the device, and the like. Further, the memory may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. In some examples, the memory may further include memory located remotely from the processor, and such remote memories may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The communication module 33 is used for data transmission.
The processor 31 executes various functional applications of the device and data processing by running software programs, instructions, and modules stored in the memory, that is, implements the above-described settlement method for the unmanned convenience store.
The input device 34 may be used to receive entered numeric or character information and to generate key signal inputs relating to user settings and function controls of the apparatus. The output device 35 may include a display device such as a display screen.
The electronic device provided above can be used to execute the settlement method for the unmanned convenience store provided in embodiment one above, and has corresponding functions and beneficial effects.
Embodiment four:
Embodiments of the present application also provide a storage medium containing computer-executable instructions that, when executed by a computer processor, perform a settlement method for an unmanned convenience store, the method comprising: when a user enters a shopping area, distributing a corresponding trackID for the user, wherein the trackID is bound with the user ID and is used for tracking the track of the user; collecting user features, and binding the user features with corresponding trackIDs; tracking the motion track of the user in real time based on the trackID, and determining the real-time trackID coordinates of the user; when order recording is triggered, acquiring a pedestrian image at the trigger position, determining the spatial coordinate point of the corresponding pedestrian, performing feature matching and distance matching respectively according to the user features and the trackID coordinates, determining the user ID of the corresponding pedestrian, and recording an order according to the user ID; and when the user leaves the store, carrying out charging settlement according to the order record corresponding to the user ID.
Storage medium: any of various types of memory devices or storage devices. The term "storage medium" is intended to include: installation media such as CD-ROM, floppy disk, or tape devices; computer system memory or random access memory such as DRAM, DDR RAM, SRAM, EDO RAM, Rambus RAM, etc.; non-volatile memory such as flash memory or magnetic media (e.g., a hard disk or optical storage); registers or other similar types of memory elements, etc. The storage medium may also include other types of memory or combinations thereof. In addition, the storage medium may be located in a first computer system in which the program is executed, or in a different second computer system connected to the first computer system through a network (such as the internet); the second computer system may provide program instructions to the first computer for execution. The term "storage medium" may include two or more storage media residing in different locations, e.g., in different computer systems connected by a network. The storage medium may store program instructions (e.g., embodied as a computer program) executable by one or more processors.
Of course, the storage medium provided in the embodiments of the present application contains computer-executable instructions, and the computer-executable instructions are not limited to the settlement method for the unmanned convenience store as described above, and may also perform related operations in the settlement method for the unmanned convenience store provided in any embodiments of the present application.
The settlement apparatus, the storage medium, and the electronic device for the unmanned convenience store provided in the above embodiments can perform the settlement method for the unmanned convenience store provided in any embodiment of the present application; for technical details not described in detail above, reference may be made to the settlement method for the unmanned convenience store provided in any embodiment of the present application.
The foregoing is illustrative of the preferred embodiments of the present application and the technical principles employed. The present application is not limited to the particular embodiments described herein; various obvious changes, rearrangements and substitutions may be made by those skilled in the art without departing from the scope of protection of the present application. Therefore, although the present application has been described in some detail through the above embodiments, it is not limited to them and may include other equivalent embodiments without departing from the concept of the present application, and its scope is determined by the scope of the appended claims.

Claims (10)

1. A settlement method for an unmanned convenience store, comprising:
when a user enters a shopping area, distributing a corresponding track ID for the user, wherein the track ID is bound with the user ID and is used for tracking the track of the user;
collecting user characteristics, and binding the user characteristics with corresponding track IDs;
tracking the motion track of the user in real time based on the track ID, and determining the real-time track ID coordinate of the user;
when order recording is triggered, acquiring a pedestrian image at a triggering position, determining a spatial coordinate point corresponding to a pedestrian, respectively performing feature matching and distance matching according to the user feature and the track ID coordinate, determining a user ID corresponding to the pedestrian, and recording an order according to the user ID;
and when the user leaves the store, carrying out charging settlement according to the order record corresponding to the user ID.
2. The settlement method for an unmanned convenience store according to claim 1, wherein the assigning of the corresponding track ID to the user when the user enters the shopping area further comprises:
and determining the user ID of the corresponding user by scanning the two-dimensional code or brushing the face for identification.
3. The settlement method for an unmanned convenience store according to claim 1, wherein the collecting of the user characteristics comprises:
collecting RGB images corresponding to facial features, top visual angle features and limb features of a user;
collecting a depth image corresponding to a top visual angle of a user to generate point cloud characteristics;
and storing the RGB image and the point cloud characteristics as user characteristics.
4. The settlement method for the unmanned convenience store according to claim 1, wherein the tracking the motion track of the user in real time based on the track ID and determining the real-time track ID coordinates of the user comprises:
and extracting a user image area, and determining the space coordinate position of a user designated part in the user image area as a track ID coordinate.
5. The settlement method for an unmanned convenience store according to claim 1, wherein the collecting an image of a pedestrian at a trigger position and determining a spatial coordinate point of the corresponding pedestrian when triggering order recording comprises:
determining the spatial position of the hand characteristic point when the pedestrian takes and puts the goods through limb recognition based on the pedestrian image;
and determining the spatial coordinate position of the specified part of the pedestrian through posture estimation according to the spatial position of the hand feature point, and taking the spatial coordinate position of the specified part of the pedestrian as a spatial coordinate point corresponding to the pedestrian.
6. The settlement method for an unmanned convenience store according to claim 5, wherein the determining the user ID of the corresponding pedestrian by performing feature matching and distance matching respectively based on the user feature and the track ID coordinate comprises:
carrying out distance matching on the space coordinate points of the corresponding pedestrians and the track ID coordinates, and determining the track ID coordinates with the shortest Euclidean distance to obtain a distance matching result;
comparing the pedestrian image with the user characteristics to obtain a characteristic matching result;
and determining the user ID of the corresponding pedestrian according to the distance matching result and the feature matching result.
7. The settlement method for an unmanned convenience store according to claim 1, wherein the performing of the settlement of the charge based on the order record corresponding to the user ID when the user leaves the store comprises:
when the user is detected to leave the store, automatic charging settlement is carried out based on the order record of the corresponding user ID and according to the payment account pre-bound by the corresponding user ID; or,
and generating a payment bill based on the order record corresponding to the user ID, and sending the payment bill to the corresponding user ID.
8. An unmanned convenience store settlement apparatus, comprising:
the system comprises an allocation module, a tracking module and a tracking module, wherein the allocation module is used for allocating corresponding track IDs to users when the users enter a shopping area, and the track IDs are bound with the user IDs and used for tracking user tracks;
the acquisition module is used for acquiring user characteristics and binding the user characteristics with corresponding track IDs;
the tracking module is used for tracking the motion track of the user in real time based on the track ID and determining the real-time track ID coordinate of the user;
the determining module is used for acquiring a pedestrian image at a triggering position when order recording is triggered, determining a spatial coordinate point corresponding to a pedestrian, respectively performing feature matching and distance matching according to the user feature and the track ID coordinate, determining a user ID corresponding to the pedestrian, and recording an order according to the user ID;
and the settlement module is used for carrying out charging settlement according to the order record corresponding to the user ID when the user leaves the store.
9. An electronic device, comprising:
a memory and one or more processors;
the memory for storing one or more programs;
when executed by the one or more processors, the one or more programs cause the one or more processors to implement the settlement method for an unmanned convenience store according to any one of claims 1 to 7.
10. A storage medium containing computer-executable instructions for performing the method of settlement of an unmanned convenience store according to any one of claims 1 to 7 when executed by a computer processor.
CN201911313115.5A 2019-12-18 2019-12-18 Settlement method, device, equipment and storage medium for unmanned convenience store Withdrawn CN111178860A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911313115.5A CN111178860A (en) 2019-12-18 2019-12-18 Settlement method, device, equipment and storage medium for unmanned convenience store

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911313115.5A CN111178860A (en) 2019-12-18 2019-12-18 Settlement method, device, equipment and storage medium for unmanned convenience store

Publications (1)

Publication Number Publication Date
CN111178860A true CN111178860A (en) 2020-05-19

Family

ID=70655605

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911313115.5A Withdrawn CN111178860A (en) 2019-12-18 2019-12-18 Settlement method, device, equipment and storage medium for unmanned convenience store

Country Status (1)

Country Link
CN (1) CN111178860A (en)


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112528822A (en) * 2020-12-04 2021-03-19 湖北工业大学 Old and weak people path finding and guiding device and method based on face recognition technology
CN112906759A (en) * 2021-01-29 2021-06-04 哈尔滨工业大学 Pure vision-based entrance-guard-free unmanned store checkout method
CN113377192A (en) * 2021-05-20 2021-09-10 广州紫为云科技有限公司 Motion sensing game tracking method and device based on deep learning
CN113377192B (en) * 2021-05-20 2023-06-20 广州紫为云科技有限公司 Somatosensory game tracking method and device based on deep learning
CN117455595A (en) * 2023-11-07 2024-01-26 浙江云伙计科技有限公司 Visual AI-based unmanned intelligent on-duty method and system
CN117455595B (en) * 2023-11-07 2024-06-11 浙江云伙计科技有限公司 Visual AI-based unmanned intelligent on-duty method and system

Similar Documents

Publication Publication Date Title
US11501523B2 (en) Goods sensing system and method for goods sensing based on image monitoring
JP6646176B1 (en) Autonomous store tracking system
CN111178860A (en) Settlement method, device, equipment and storage medium for unmanned convenience store
CN110033293B (en) Method, device and system for acquiring user information
US20210406990A1 (en) Associating shoppers together
CN112464697B (en) Visual and gravity sensing based commodity and customer matching method and device
CN110009836A (en) The system and method for deep learning based on EO-1 hyperion photography technology
JP6261197B2 (en) Display control apparatus, display control method, and program
CN111263224B (en) Video processing method and device and electronic equipment
US11475657B2 (en) Machine learning algorithm trained to identify algorithmically populated shopping carts as candidates for verification
EP3901841A1 (en) Settlement method, apparatus, and system
CN111222870A (en) Settlement method, device and system
US20230419670A1 (en) Information processing apparatus, information processing method, and program
CN110490697A (en) Unmanned convenience store&#39;s settlement method, device, computer and storage medium
CN110647825A (en) Method, device and equipment for determining unmanned supermarket articles and storage medium
KR20220055168A (en) Automatic payment method in unmanned stores and unmanned store platform apparatus implementing the same method
CN110287867A (en) Unmanned convenience store enters recognition methods, device, equipment and storage medium
US20220309784A1 (en) System and method for populating a virtual shopping cart based on a verification of algorithmic determinations of items selected during a shopping session in a physical store
CN111429194B (en) User track determination system, method, device and server
US20230005348A1 (en) Fraud detection system and method
CN111783509A (en) Automatic settlement method, device, system and storage medium
CN111260685A (en) Video processing method and device and electronic equipment
US11475656B2 (en) System and method for selectively verifying algorithmically populated shopping carts
US11386647B2 (en) System and method for processing a refund request arising from a shopping session in a cashierless store
CN111444757A (en) Pedestrian re-identification method, device, equipment and storage medium for unmanned supermarket

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20200519