AU2017231602A1 - Method and system for visitor tracking at a POS area


Info

Publication number
AU2017231602A1
Authority
AU
Australia
Prior art keywords
visitor
data
pos
master user
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
AU2017231602A
Inventor
Ravichandran PRASHANTH
Lin Xiaoming
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Trakomatic Pte Ltd
Original Assignee
Trakomatic Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Trakomatic Pte Ltd filed Critical Trakomatic Pte Ltd
Publication of AU2017231602A1 publication Critical patent/AU2017231602A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/50 Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Human Computer Interaction (AREA)
  • Strategic Management (AREA)
  • Primary Health Care (AREA)
  • Marketing (AREA)
  • Human Resources & Organizations (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Tourism & Hospitality (AREA)
  • Economics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention relates to a method and a system for visitor tracking at a POS area (2) comprising the steps of: capturing an image of a visitor (5) by a facial detection camera (8) at the POS area (2); extracting a biometric face template from the image; and comparing the biometric face template with biometric face templates of a master user database, wherein the biometric face templates of the master user database are each assigned to a user account of an online shopping system and the master user database comprises online shopping behavior data for each user. If a match between the biometric face template of the visitor and one of the biometric face templates of the master user database is found, the content of the master user database is used for supporting a service at the POS area (2).

Description

Method and system for visitor tracking at a POS area
The invention relates to a method and a system for visitor tracking at a POS area. The invention also relates to a method for running a master user database on an online and offline shopping server.
C.P. Papageorgiou, M. Oren, and T. Poggio; A general framework for object detection; Sixth International Conference on Computer Vision, pages 555-562, 1998, is one of the first publications in which Haar wavelets are described for real-time object detection.
From Paul Viola and Michael Jones; Rapid object detection using a boosted cascade of simple features, Mitsubishi Electric Research Laboratories, Inc., 2004 (TR-2004-043), Cambridge, Massachusetts, USA (accepted Conference on Computer Vision and Pattern Recognition, 2001), and Paul Viola and Michael Jones; Robust real-time object detection; International Journal of Computer Vision, 57(2): 137-154, 2002, a method for automatically recognizing faces in images is known, wherein Haar wavelets are used for detecting Haar-like features. By using a so-called "Integral Image", which comprises grid point assigned sums, Haar wavelets can be applied to the image very quickly.
P.I. Wilson, J. Fernandez; Facial Feature Detection Using Haar Classifiers, JCSC 21, 4 (April 2006), CCSC: South Central Conference, describes a further method for recognizing faces in an image by means of Haar-like features. The area of the image being analyzed for a facial feature is regionalized to a location with the highest probability of containing the feature. By regionalizing the detection area, false positives are eliminated and the speed of detection is increased due to the reduction of the area examined.
In Sebastian Schmitt, Real-Time Object Detection With Haar-Like Features, June 22, 2010, s-schmitt.de/ressourcen/haar_like_features.pdf, several projects using Haar-like features for detecting objects in real time are described. In these projects rotated Haar-like features are used. In order to compute rotated features as fast as the axis-aligned ones, a rotated summed area table (RSAT), which corresponds to a rotated integral image, is used.
N. Dalal and B. Triggs; Histograms of oriented gradients for human detection, lear.inrialpes.fr/people/triggs/pubs/Dalal-cvpr05.pdf, published in Computer Vision and Pattern Recognition, 2005, CVPR 2005, IEEE Computer Society Conference on June 25, 2005 (Volume 1), pages 886-893, vol. 1, ISSN 1063-6919, Print ISBN 0-7695-2372-2, publisher IEEE, describes a method for detecting humans in images using histogram of oriented gradients (HOG) descriptors for discriminating humans in the images. This method is based on evaluating well-normalized local histograms of image gradient orientations in a dense grid. Local object appearance and shape can often be characterized rather well by the distribution of local intensity gradients or edge directions, even without precise knowledge of the corresponding gradient or edge positions. This is implemented by dividing the image window into small spatial regions, so-called cells, and accumulating for each cell a local 1-D histogram of gradient directions or edge orientations over the pixels of the cell. The combined histogram entries form the representation. Tiling the detection window with a dense (in fact an overlapping) grid of HOG descriptors results in a human detection chain.
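A minimal sketch of such a HOG-based detection chain, using the pretrained people detector that ships with OpenCV; the image file name is a placeholder:

    import cv2

    # OpenCV's default HOG people detector: a linear SVM trained on HOG
    # descriptors in the spirit of Dalal and Triggs.
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    image = cv2.imread("pos_area.jpg")  # placeholder path

    # detectMultiScale slides the detection window over the image at several
    # scales and returns bounding rectangles of detected humans.
    rects, weights = hog.detectMultiScale(image, winStride=(8, 8),
                                          padding=(8, 8), scale=1.05)
    for (x, y, w, h) in rects:
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)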
Ojala, T., Pietikainen, M., Harwood, D.; Performance evaluation of texture measures with classification based on Kullback discrimination of distributions, published in Pattern Recognition, 1994, vol. 1 - Conference A: Computer Vision & Image Processing, Proceedings of the 12th IAPR International Conference on 9-13 Oct 1994 (Volume 1), pages 582-585, Print ISBN 0-8186-6265-4, describes local binary patterns (LBP) used for texture classification in computer vision. LBP is based on so-called feature vectors. To derive a feature vector of an image, the image is divided into cells. Each pixel of each cell is then compared to each of its neighboring pixels, clockwise or counter-clockwise. Each comparison yields a binary digit depending on whether the neighboring pixel value exceeds that of the center pixel; the resulting binary numbers are combined in a histogram showing the frequency of each number occurring. Then the histograms of all cells are concatenated, which gives the feature vector of the image.
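As an illustration, a compact NumPy sketch of this LBP feature vector computation; the cell size, the 8-neighbor ordering and the grayscale uint8 input are assumptions:

    import numpy as np

    def lbp_histogram(cell):
        # One 256-bin histogram of LBP codes for a single grayscale cell.
        h, w = cell.shape
        center = cell[1:-1, 1:-1]
        codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
        # The 8 neighbors, visited clockwise starting at the top-left.
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
        for bit, (dy, dx) in enumerate(offsets):
            neighbor = cell[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
            # Each comparison contributes one binary digit of the LBP code.
            codes |= (neighbor >= center).astype(np.uint8) << bit
        hist, _ = np.histogram(codes, bins=256, range=(0, 256))
        return hist

    def lbp_feature_vector(image, cell_size=16):
        # Concatenating the per-cell histograms yields the feature vector.
        h, w = image.shape
        return np.concatenate([
            lbp_histogram(image[y:y + cell_size, x:x + cell_size])
            for y in range(0, h - cell_size + 1, cell_size)
            for x in range(0, w - cell_size + 1, cell_size)])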
Gray-level co-occurrence matrices (GLCM) are another approach for texture classification in computer vision (http://www.fp.ucalgary.ca/mhallbey/tutorial.htm). A GLCM considers the relation between two pixels at a time. Using so-called second-order texture calculations, the GLCM represents the spatial relationship between groups of two (usually neighboring) pixels, called the reference and the neighbor pixel. The neighbor pixel is often the pixel with a distance of one pixel to the reference pixel in a predetermined direction, but the distance between these pixels can also be any number other than one. The matrix is a tabulation of how often different combinations of pixel brightness values, which are grey levels in the case of a grey-level image, occur in the image. By using statistical analysis methods on the GLCM, textures can be measured and classified.
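A short NumPy sketch of this tabulation for one offset, together with one example second-order statistic; the offset direction, the uint8 grayscale input and the contrast measure are illustrative choices:

    import numpy as np

    def glcm(image, dy=0, dx=1, levels=256):
        # Count how often a reference pixel with grey level i has a
        # neighbor with grey level j at offset (dy, dx); dy, dx >= 0.
        h, w = image.shape
        ref = image[:h - dy, :w - dx]
        nbr = image[dy:, dx:]
        m = np.zeros((levels, levels), dtype=np.float64)
        np.add.at(m, (ref.ravel(), nbr.ravel()), 1)
        return m / m.sum()  # normalize to co-occurrence probabilities

    def contrast(m):
        # One of the classical second-order texture measures.
        i, j = np.indices(m.shape)
        return float(np.sum(m * (i - j) ** 2))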
In Costa, A.F., Humpire-Mamani, G., Traina, A.J.M.; An Efficient Algorithm for Fractal Analysis of Textures, published in Graphics, Patterns and Images (SIBGRAPI), 2012, 25th SIBGRAPI Conference on 22-25 Aug. 2012, pages 39-46, ISSN 1530-1834, Print ISBN 978-1-4673-2802-9, published by IEEE, a new and efficient texture feature extraction method, the so-called Segmentation-based Fractal Texture Analysis (SFTA), is described. SFTA is performed by extracting texture features, training a classifier and classifying a texture image. The extraction algorithm consists of decomposing the input image into a set of binary images from which the fractal dimensions of the resulting regions are computed in order to describe segmented texture patterns. The decomposition of the input image is achieved by the so-called Two-Threshold Binary Decomposition (TTBD) algorithm. SFTA can perform the tasks of content-based image retrieval (CBIR) and image classification.
solvePnP is a function of the OpenCV API (http://docs.opencv.org/2.4/modules/calib3d/doc/camera_calibration_and_3d_reconstruction.html). The function estimates an object pose given a set of object points, their corresponding image projections, as well as a camera matrix and distortion coefficients. First, object points (3D), image points (2D) and the camera matrix are input into the solvePnP function. Then, solvePnP returns a rotation vector (rvec) and a translation vector (tvec) with which points from the model coordinate system can be mapped to the camera coordinate system. The camera coordinate system is the camera's Cartesian coordinate system; it moves with the camera, and the camera is always at the origin.
US 6,711,293 B1 discloses a method and an apparatus for identifying scale invariant features in an image and use of the same for locating an object in the image, the method being called scale invariant feature transform (SIFT). The method is performed by locating pixel amplitude extrema in a plurality of difference images which are produced from the initial image. First, local maximal and minimal amplitude pixels in each difference image are identified. Then possible maximal and minimal amplitude pixels are identified, followed by identifying the actual maximal and minimal amplitude pixels. Then a plurality of component subregion descriptors for each subregion of a pixel region about the pixel amplitude extrema in the plurality of difference images is produced. For any object in an image, interesting points on the object can be extracted from the plurality of component subregion descriptors to provide a so-called "feature description" of the object. This feature description can then be used to identify the object when attempting to locate the object in another image.
EP 1 850 270 B1 describes a method for determining an interest point in an image having a plurality of pixels, suitable for working at different scales and/or rotations. The method produces a local feature detector and descriptor called Speeded Up Robust Features (SURF). The method comprises filtering an image using at least three digital filters and then selecting an interest point based on determining a measure resulting from application of the digital filters. The measure is a non-linear combination of the outputs of the digital filters and captures variations of an image parameter in more than one dimension or direction.
SURF's feature descriptor is based on the sum of the Haar wavelet responses around the point of interest. SURF descriptors can be used for object recognition, localization, tracking, registration and classification, 3D reconstruction and extracting points of interest in an image.
M. Turk and A. Pentland; Eigenfaces for Recognition, Journal of Cognitive Neuroscience, vol. 3, no. 1, pages 71-86, 1991, discloses a method for recognizing human faces with computer systems. A collection of face images is transformed by a mathematical process called Principal Component Analysis (PCA) into a collection of face images with a low-dimensional representation. From this transformed collection of face images a mean face image is calculated. Then the mean face image is subtracted from each face image of the transformed collection, resulting in a collection of difference images. From this collection of difference images a covariance matrix is derived, whose set of normalized eigenvectors is called the set of eigenfaces. This set of eigenfaces can be used to represent both existing and new faces: a new image of a face can be mean-subtracted and projected onto the set of eigenfaces, so that the differences between the new face and the mean face are represented by a vector. This vector can be used for face recognition.
Z. Kalal, K. Mikolajczyk, J. Matas; Tracking-Learning-Detection, published in IEEE Transactions on Pattern Analysis and Machine Intelligence (Vol. 34, Issue 7), pages 1409-1422, ISSN 0162-8828, December 2011, published by IEEE, describes long-term tracking of unknown objects in a video stream. The object is defined by its location and extent in a single frame. In every frame that follows, the object's location and extent are determined, or it is indicated that the object is not present. A tracking-learning-detection (TLD) framework is disclosed that decomposes the long-term tracking task into tracking, detection and learning. A tracker follows the object from frame to frame. A detector localizes all appearances that have been observed so far and corrects the tracker if necessary. The learning component estimates the detector's errors by identifying them and updates the detector to avoid those errors in the future.
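A minimal NumPy sketch of the eigenfaces construction of Turk and Pentland described above, using SVD to perform the principal component analysis; the number of components and the flattened-image layout are assumptions:

    import numpy as np

    def train_eigenfaces(faces, n_components=32):
        # `faces` is an (n_images, n_pixels) array of flattened face images.
        mean_face = faces.mean(axis=0)
        diffs = faces - mean_face  # mean-subtracted images
        # The rows of vt are the normalized eigenvectors of the covariance
        # matrix of the difference images, i.e. the eigenfaces.
        _, _, vt = np.linalg.svd(diffs, full_matrices=False)
        return mean_face, vt[:n_components]

    def project(face, mean_face, eigenfaces):
        # Represent a (new) face as a vector of eigenface coefficients;
        # recognition reduces to comparing such vectors, e.g. by distance.
        return eigenfaces @ (face - mean_face)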
Babenko, B., Ming-Hsuan Yang, Belongie, S.; Visual tracking with online Multiple Instance Learning, published in Computer Vision and Pattern Recognition, 2009, CVPR 2009, IEEE Conference on 20-25 June 2009, pages 983-990, ISSN 1063-6919, Print ISBN 978-1-4244-3992-8, published by IEEE, discloses a solution to the problem of learning an adaptive appearance model for object tracking. A class of tracking techniques called "tracking by detection" is described, which results in real-time tracking. These methods train a discriminative classifier in an online manner to separate an object from the background. This classifier bootstraps itself by using the current tracker state to extract positive and negative examples from the current frame. Slight inaccuracies in the tracker can therefore lead to incorrectly labeled training examples, which degrades the classifier and can cause further drift. Using Multiple Instance Learning (MIL) instead of traditional supervised learning avoids these problems and can therefore lead to a more robust tracker with fewer parameter tweaks. Further, a novel online MIL algorithm for object tracking is disclosed.
Fukunaga, K., Hostetler, L.; The estimation of the gradient of a density function, with applications in pattern recognition, published in IEEE Transactions on Information Theory, vol. 21, issue 1, pages 32-40, ISSN 0018-9448, issued Jan. 1975, published by IEEE, describes a nonparametric density gradient estimation using a generalized kernel approach. Conditions on the kernel functions are derived to guarantee asymptotic unbiasedness, consistency, and uniform consistency of the estimates. The results are generalized to obtain a simple mean-shift estimate that can be extended in a k-nearest-neighbor approach. Applications of gradient estimation to pattern recognition are presented using clustering and intrinsic dimensionality problems, with the ultimate goal of providing further understanding of these problems in terms of density gradients.
An object of the present invention is to provide a method and a system for automatically tracking a visitor in a POS area, wherein the quality of services based on data obtained by tracking a visitor in the POS area is improved. A further object of the present invention is to improve the data content of a master user database of an online shopping system.
One or more of the objects mentioned above are solved by the subject matter of the independent claims. Preferred embodiments are defined in the corresponding subclaims.
A first aspect of the present invention relates to a method for visitor tracking at a POS area comprising the steps of capturing an image of a visitor by a facial detection camera in the POS area, extracting a biometric face template from the image, and comparing the biometric face template with biometric face templates of a master user database, wherein the biometric face templates of the master user database are each assigned to a user account of an online shopping system and the master user database comprises online shopping behavior data for each user.
If a match between the biometric face template of the visitor and one of the biometric face templates of the master user database is found, then the content of the master user database is used for supporting a service at the POS area.
This visitor tracking method detects visitors of the POS area by their faces: a biometric face template is extracted from an image showing the face of the corresponding visitor. This biometric face template is used to access data of a user account of this visitor in an online shopping system. These data comprise online shopping behavior data. At the POS area, services can be provided which are based on such online shopping behavior data.
The online shopping behavior can comprise a browsing behavior and/or a shopping behavior. The browsing behavior comprises a list or table of URLs of websites visited by the user of the online shopping system, which can be combined with time stamps or statistical data, particularly how often and/or how long the respective websites were visited by the user. The shopping behavior comprises data of the purchased goods and/or services. These shopping behavior data can further comprise timestamps of each purchase, information about the value of each purchase and information about the payment method.
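Purely as an illustration of the records just described, a hypothetical Python sketch; all field names are invented for this example and do not appear in the source:

    from dataclasses import dataclass, field

    @dataclass
    class SiteVisit:
        url: str
        timestamp: float   # when the website was visited
        duration_s: float  # how long it was viewed

    @dataclass
    class Purchase:
        item_id: str
        value: float
        timestamp: float
        payment_method: str

    @dataclass
    class OnlineShoppingBehavior:
        browsing: list = field(default_factory=list)   # SiteVisit entries
        purchases: list = field(default_factory=list)  # Purchase entries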
These services can comprise services to the visitor or customer, respectively, services in assisting the sales staff of the POS area and services to the management of a retail shop using this POS area.
An example of the service to a visitor is to automatically send a message to a mobile device of the visitor, containing information about the retail shop using the POS area, special offers or any other product information. The online shopping behavior data are considered in generating this message. The visitor will receive information that can be individually adapted to their preferences and wishes. Furthermore, terminals can be provided in the POS area at which the visitor can look up information about the retail shop using this POS area. The information provided at such a terminal can be personalized to the user on the basis of their online shopping behavior data retrieved from the master user database.
The sales staff of the retail business at the POS area can be automatically informed about the shopping behavior of the visitor of the POS area, who is a potential customer of the POS area. Thus, the sales staff can provide individually adapted services and offers to the visitors of the POS area.
It is also possible to provide statistical analysis of all visitors visiting the POS area. This analysis is also based on the online shopping behavior which is retrieved from the master user database. Each visitor can be classified very precisely, so that the information content of such a statistical analysis is highly significant and useful for the management of the retail business using this POS area. Such statistical analyses are very valuable for assessing the value of a POS area. This provides a strong improvement to the management of the retail shop.
Independent of the receiver of the service, the amount of data received from the master user database can be limited according to the respective law governing data protection and data security. E.g., the management of the shop does not need any personal data of the visitors of the POS area for the statistical analysis. Anonymous data which do not contain a name, detailed address or telephone number are mostly sufficient.
However, the broad basis of data comprising both online shopping behavior data and offline shopping behavior data, which can be retrieved by tracking the visitor in the POS area, significantly improves the services to the visitors or customers, respectively, to the sales staff at the POS area, and to the management of the retail shop using this POS area.
If no match is found by comparing the biometric face template with biometric face templates of the master user database then visitor data extracted from images captured in the POS area can be stored in a POS database which is independent from the master user database. The POS database preferably is a local database connected to a visitor tracking system for tracking the visitors in the POS area.
This POS database describes the offline shopping behavior. This offline shopping behavior can also be used for supporting the above-mentioned services, as long as the POS database contains a sufficient basis of data.
One of the services provided to the visitors is generating and transmitting a message to a mobile device of the visitor. The contact data of the mobile device can be retrieved from the master user database. Preferably, it is checked at the POS area whether the mobile device is present in the POS area. This can be carried out by checking communication data exchanged between a transceiver in the POS area and mobile devices located in the POS area.
A visitor of the POS area is tracked by capturing one or more images of the visitor by one or more cameras, and offline shopping behavior data are preferably extracted from those images. Offline shopping behavior data are, e.g., based on the time a visitor spends in front of certain products, advertisements and displays. Furthermore, offline shopping behavior data can be based on data received from a cash register at the POS area when the visitor is purchasing certain goods or services. These data can comprise an identification and an amount of goods and services purchased by the visitor.
The POS area preferably comprises several types of stationary cameras. Such a camera can be a facial detection camera that is arranged so that its field of view is directed towards the face of a visitor in an upright position. Tracking cameras can be provided, which track the visitors of the POS area with a field of view from above, so that a plurality of visitors can be tracked simultaneously by one camera.
According to a further aspect, the present invention provides a method wherein a master user database is run on a shopping server, which is part of an online shopping system. A user may log in to the shopping system by means of a client computer comprising a client camera. An image of the user is captured by the client camera and a biometric face template is extracted from this image, wherein the user’s image and/or the user’s biometric face template is stored in the master user database.
This method provides a master user database containing biometric face templates of each user. The biometric face template can be used as an identification for the users so that the content of the master user database, particularly the content of accounts of specific users, can be easily connected or related to other data on another system, which are related to the same person by means of the biometric face template.
This method is particularly advantageous in combination with the above-described first aspect of the invention, because all data retrieved from the tracking at the POS area of a certain visitor can be easily linked to the account data in the master user database of the same person by means of the biometric face templates.
According to a further aspect, offline shopping behavior data retrieved by tracking a visitor in a POS area can be added or supplemented to a master user database if a biometric face template of the visitor matches one of the biometric face templates of the master user database, which means that the visitor is the same person as the user of the account of the master user database comprising the matching biometric face template. By adding or supplementing the offline shopping behavior data to the master user database, the information content or information value of the master user database is increased significantly, because data of the real world are automatically detected and added to the master user database. Therefore, the master user database comprises a very broad basis of data which is highly valuable for statistical analysis.
As mentioned above, each time the user logs into the online shopping system an image of the user is captured and a biometric face template can be extracted and stored in the master user database. This storing of the biometric face template does not necessarily mean that an already present face template is completely replaced by a new biometric face template; an already existing biometric face template can also be updated with the data of the newly delivered biometric face template. Thus, the quality of the biometric face template is improved each time the user logs into the online shopping system. The biometric face template that is generated during a log-in procedure can be used as a login criterion for the online shopping system. Preferably, each time the user logs into the online shopping system at least one image of the user is captured.
Such a master user database comprising combined online and offline shopping behavior data can be used for preparing shopping user profiles, data for supporting advertisements or for generating further marketing characteristics.
According to a further aspect the invention refers to a system for visitor tracking at a POS area which is embodied for carrying out one or more of the methods described above.
Such a system preferably comprises at least one facial detection camera, one or more tracking cameras, and a central control unit connected to the cameras. The tracking system preferably comprises a storage means hosting a local database, which is called the POS database. The central control unit is preferably connected to an online shopping system comprising a remote storage means hosting a master user database.
Neighboring cameras of the tracking system are preferably arranged so that the fields of view are overlapping to such an extent that at least one person can be simultaneously detected by both neighboring cameras.
The tracking system is preferably connected by means of a data line to a cash register of a POS area, so that data describing payment transactions are available to the tracking system. The sales transactions can be synchronized to the tracking system by using timestamp synchronization.
An embodiment of the present invention will be explained in the following by means of the enclosed drawings, which show:
Figure 1 schematically a tracking system at a POS area,
Figure 2 a tracking camera with its field of view,
Figures 3 and 4 two tracking cameras with overlapping fields of view in a side view and in a view from above,
Figures 5 to 7 flowcharts of certain modules of the tracking system,
Figures 8 and 9 each schematically a further embodiment of a tracking system and its relationship to an online shopping system, and
Figures 10 and 11 each the data structure of a data set of a POS database and a master user database.
Figure 1 shows an embodiment of a tracking system 1 for visitor tracking at a POS area 2. The POS area 2 comprises a point of sale (POS) 3 at which a visitor or customer, respectively, of the POS area 2 can finalize a purchase process. In the present embodiment the point of sale 3 comprises a cash register 4. A point of sale is not limited to such a cash register but is any physical point at which any kind of purchase, buying, hiring or renting process between a party which offers a certain good and/or service and a customer can be finalized. The finalizing of a purchase process can also be carried out by means of a mobile device instead of a fixed cash register. In the broadest sense the POS area is any physical area in which goods and/or services are presented to initiate a purchase process.
The POS area 2 comprises one or more displays or advertisements for displaying the offered goods and/or services. A visitor 5 of the POS area 2 can walk along the displays and make up their mind whether they want to buy or rent the offered products or services. The POS area 2 is usually part of a physical real world shop. The POS area 2 can also be embodied as a commercial or non-commercial fair for offering goods and services such as a business fair or flea market. Any area comprising displays or advertisements belongs to the POS area, such as external areas of a shop building comprising advertisement windows or other displays.
The tracking system 1 comprises several tracking cameras 6 which are arranged along the walking paths in the POS area 2. In the present embodiment the tracking cameras 6 are embodied as overhead cameras having its direction of view directed downward so that these tracking cameras 6 are imaging the visitors 5 with a viewing direction from above. Each tracking camera 6 is a digital camera for generating electronically readable image files. Each image file comprises at least one image taken of a certain field of view 7 which is defined by a lens of the tracking camera 6. The image file can also comprise a video stream which is a continuous sequence of images. Preferably, the tracking cameras 6 are arranged so closely to each other that the fields of view 7 of neighboring cameras are overlapping.
The tracking system 1 comprises at least one facial detection camera 8 which is a digital camera arranged with its direction of view in the POS area 2, so that an image of a face of a visitor 5 having their head in an upright position can be taken.
In the present embodiment a facial detection camera 8 is located in an entrance area 9 of the POS area 2 and a second facial detection camera 8 is located at the point of sale 3. The facial detection camera 8 at the entrance area is directed with its direction of view from the inside of the POS area to the entrance area, so that the faces of the users entering the POS area 2 are captured. It can also be useful to have a facial detection camera 8 directed with its direction of view from the outside of the POS area to the entrance area, so that the faces of the users leaving the POS area 2 are captured. With this camera 8 all visitors are captured, independently of whether they bought an article or just passed through the POS area without buying anything.
In the present embodiment a path starting with the entrance area 9 and ending at the point of sale 3 is continuously monitored by the cameras 6, 8. It is also possible to provide cameras for monitoring areas in front of an external advertisement window or any other external display.
Each facial detection camera 8 defines a field of view 10. The first facial detection camera 8/1 and a first tracking camera 6/1 located in the entrance area 9 are arranged so that the corresponding fields of view 7, 10 are overlapping. A visitor 5 can thus be simultaneously detected by the first tracking camera 6/1 and the first facial detection camera 8/1.
Each camera 6, 8 is connected to a data line 11 of a local area network.
The system also comprises a central control unit 12 and a storage means 13 for hosting a local database 14, which is in the following called “POS database” 14.
The central control unit 12 is connected to a wide area network (WAN) such as the internet 15 or an intranet. The central control unit 12 has a data connection via the WAN 15 to a remote storage means 16 hosting a master user database 17.
The master user database 17 is part of an online shopping system (not shown in the figures). The online shopping system comprises a web page which can be displayed on a client computer for offering the user of the client computer certain goods and/or services.
The client computer can be a desktop computer or any other mobile or stationary device having an interface to be connected to the WAN and display and input means for displaying the web page of the online shopping system and entering corresponding orders. Such client computers are typically mobile phones, tablets, handheld computers, etc.
The online shopping system is embodied in such a way that a user has to register themselves for using the online shopping system. Therefore, registration data of the user are stored in the master user database 17. The registration data are allocated in corresponding user accounts and comprise the full name, email address, mailing address and phone number of the user. In a preferred embodiment of the online shopping system the registration data also comprise a biometric face template.
The signing up can be carried out by means of a user name and a password. Such a signing up procedure is provided for the case that either the user is using a client computer without camera or someone else being authorized by the user is using the user’s online shopping account.
The biometric face template is extracted from an image of the face of the user provided to the online shopping system. Such face images of users are often available in social media such as Facebook, Twitter, etc. However, the profile images are mostly not useful for extracting the biometric face template. A social media account can also be used for signing up to the online shopping system, which makes the images of the user's face available to the online shopping system. An image of a user's face can also be provided in a different way than by means of a social media account. The online shopping system, for example, can require an image of the user's face as a necessary part of the registration data which has to be supplied by the user at the first registration for the online shopping system.
Preferably, the client computer comprises a camera having a direction of view directed at the user of the client computer and a software application for capturing an image of the user each time the user is signing up to the online shopping system. This image can then be used for the signing up to the online shopping system. From this image a biometric face template is extracted by the software application of the client computer, and this client biometric face template is compared to the biometric face templates of the master user database in order to authorize the user of the client computer for signing up to the online shopping system. Only if there is a match between the client's biometric face template and the corresponding face template of the master user database is the user allowed to virtually enter the online shopping system. This login procedure can be combined with an additional security means such as a password.
Using the biometric face template as a key for authorizing a user to virtually enter the shopping system on the one hand provides high security for the online shopping system, since only a registered person can use the corresponding account, and on the other hand provides easy access to the online shopping system for the user. The biometric face template can also be combined with further data for signing up to the online shopping system, such as a password. A further advantage of signing up to the online shopping system by means of a biometric face template is that each time the user signs up, a current image of the user is captured, a biometric face template is generated, and at least either the image or the biometric face template is forwarded to the online shopping system. Therefore, the biometric face template included in the registration data of the master user database can be updated each time the user signs up to the online shopping system. This can help improve the accuracy over an extended period of time in which there are physical (biological) changes in the person's appearance, such as natural aging, make-up and changing hairstyles.
Using images of a social media account sometimes provides further valuable information about the user. E.g., often the places or geo-locations where the images were taken are linked to the corresponding images. In this way the travel behavior of the users can be derived and used, e.g., for offering certain travel services.
In the embodiment described above, a biometric face template is extracted from the user's image on the client computer and forwarded to the master user database. Alternatively, it is also possible that the client computer forwards the user's image to the online shopping system, where the biometric face template is extracted from the image and compared to the biometric face templates included in the master user database. In such a case the image as well as the biometric face template of the user can be stored in the corresponding registration data of the master user database.
In the following different sections and aspects of a method for visitor tracking with the tracking system 1 described above are explained.
The tracking system 1 comprises a face detection module 19 which is designed for detecting one or more faces in an image captured by one of the facial detection cameras 8.
The face detection module 19 receives one or more images, or frames respectively, from the facial detection camera 8 (step S1, figure 5).
Each frame is analyzed by means of the Haar cascade Viola-Jones method to determine whether it contains one or more faces (step S2).
For each face a rectangular section of the image is cropped as a face thumbnail 18 (step S3).
From each face thumbnail, face features are extracted (step S4). Firstly, the eyes are detected in each face thumbnail by means of the Haar cascade Viola-Jones method. When the eyes are identified, the face thumbnail is normalized. From the normalized face thumbnail the face features are extracted. The extraction of the face features can be carried out by histograms of oriented gradients (HOG), local binary patterns (LBP), speeded up robust features (SURF), scale-invariant feature transform (SIFT) and/or by Eigen-features. These methods for extracting face features are known and can be used separately or in combination.
If face features are extracted from several thumbnails, then face features relating to the same face are identified and allocated to each other. The face features relating to the same face are concatenated into a vector (step S5). This vector forms the biometric face template.
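A condensed OpenCV sketch of steps S1 to S5 under simplifying assumptions: the pretrained Haar cascades shipped with OpenCV stand in for the face and eye detectors, HOG features stand in for the combination of feature types named above, and the normalization is reduced to resizing and histogram equalization:

    import cv2

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")
    # HOG over a 64x64 window: 16x16 blocks, 8x8 stride and cells, 9 bins.
    hog = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)

    def face_templates(frame):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        templates = []
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 5):  # S2
            thumb = gray[y:y + h, x:x + w]                                # S3
            if len(eye_cascade.detectMultiScale(thumb)) < 2:
                continue  # eyes not found, skip normalization
            thumb = cv2.equalizeHist(cv2.resize(thumb, (64, 64)))
            templates.append(hog.compute(thumb).ravel())                 # S4/S5
        return templates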
This detection module is used for generating the biometric face templates of visitors captured with one of the facial detection cameras 8.
The face detection module can also be embodied for generating the biometric face templates of visitors captured with one of the facial detection cameras 8 by means of a method as described in the Singapore Patent Application No. 10201504080W (date of filing: 25 May 2015). This Singapore Patent Application No. 10201504080W is incorporated by reference. According to this method an image is captured which contains a face and which is further processed by analyzing the image for non-facial attributes of the person, extracting face features from the image, sorting and/or filtering face templates stored in a database by said non-facial attributes, and searching the sorted and/or filtered database for a face template matching the face of the image.
By considering non-facial attributes it is easier to find a match with a face-template in a database, particularly in the POS database 14 and/or in the master user database 17.
Non-facial attributes are particularly used for tracking users at the same day and/or in the same vicinity and/or the same POS area.
If the face detection module 19 of the tracking system 1 detects a face a corresponding visitor data set is generated and stored in the POS database 14. The visitor data set comprises, at least for the time being, all information which is generated and collected by the tracking system 1 of the corresponding visitor, such as one or more images of this visitor, the biometric face template etc.
Such a face detection module can also be provided in the online shopping system described above or in the corresponding software application at the client computer for extracting the biometric face templates of images taken of the user by means of the camera of one of the client computers.
The tracking system 1 comprises a human detection module 20 for detecting humans in the images captured by one of the tracking cameras 6. The detection of the humans is carried out by histogram of gradients (HOG) and/or by means of Haar features. The human detection module is trained in a training phase before using it to track people in the POS area 2.
All cameras 6, 8 are calibrated with respect to each other. Such a calibration can be carried out by means of a chess board grid which is placed during the calibration phase in the POS area 2, so that the chess board grid is visible to at least two neighboring cameras simultaneously. There are known software modules such as cv2.findChessboardCorners which yield camera matrices and distortion coefficients.
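A minimal calibration sketch around that routine; the 9x6 pattern size and the image paths are placeholders:

    import cv2
    import numpy as np

    pattern = (9, 6)  # inner corners of the chess board grid
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    frames = [cv2.imread(p) for p in ["calib_01.jpg", "calib_02.jpg"]]
    obj_points, img_points = [], []
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # calibrateCamera yields the camera matrix and distortion coefficients
    # that are later fed into solvePnP.
    ret, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)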
The tracking system 1 comprises a first tracking module 21 which is designed for passing person information from one of the facial detection cameras 8 to the neighboring tracking camera 6.
The first tracking module 21 receives from the face detection module 19 person information of a person whose face was detected by the face detection module 19 (step S6, figure 6). This person information can comprise the biometric face template and/or an image captured of the person by one of the facial detection cameras 8. This person information can also comprise pre-processed coordinate information relating to the person shown in the image captured by the facial detection camera 8.
The first tracking module 21 transforms the person information generated by the facial detection camera 8 into three-dimensional coordinates of the person in a world coordinate system. Specific points of a person are determined in a coordinate system of the facial detection camera 8 (step S7). These person coordinates are then transformed to a world coordinate system by rotating and translating the set of person coordinates by means of solvePnP.
The first tracking module 21 also receives an image captured by one of the tracking cameras 6 that is located next to the facial detection camera 8 and has an overlapping field of view (step S8). The image captured by the tracking camera 6 is taken at the same time as the one captured by the facial detection camera 8.
The image of the tracking camera 6 is analyzed by means of the human detection module 20 to extract further person information, which comprises two-dimensional coordinates. The two-dimensional coordinates are transformed into three-dimensional coordinates (step S9).
The three-dimensional coordinates of the person captured with the tracking camera 6 are then transformed into world coordinates by rotating and translating the set of coordinates (step S10), e.g. by means of solvePnP.
In step S11 the world coordinates of the person captured by the facial detection camera 8 and of the person captured by the tracking camera 6 are compared. If there is three-dimensional spatial overlap within a certain threshold, it is determined that the coordinates captured by the two cameras 6, 8 relate to the same person.
If it is determined that the coordinates extracted from the two images do not overlap sufficiently and therefore relate to different persons, then the program flow returns to step S9 for extracting coordinates of a further person shown in the image.
If it is determined in step S11 that the coordinates extracted from the images of the two cameras 6, 8 overlap and thus relate to the same person, the data extracted from the images of the two cameras 6, 8 are concatenated by storing them in the same visitor data set (step S12).
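A sketch of this coordinate comparison in steps S7 to S11: solvePnP yields each camera's pose, points are mapped into the shared world coordinate system, and positions closer than a threshold are treated as the same person. The per-camera point correspondences and the 0.5 m threshold are assumptions:

    import cv2
    import numpy as np

    def camera_pose(object_points, image_points, camera_matrix, dist_coeffs):
        # Rotation matrix and translation mapping world points into the
        # camera coordinate system.
        _, rvec, tvec = cv2.solvePnP(object_points, image_points,
                                     camera_matrix, dist_coeffs)
        rmat, _ = cv2.Rodrigues(rvec)
        return rmat, tvec

    def camera_to_world(point_cam, rmat, tvec):
        # Invert the world->camera transform for a 3-D point given in
        # camera coordinates.
        return rmat.T @ (point_cam.reshape(3, 1) - tvec)

    def same_person(p_world_a, p_world_b, threshold=0.5):
        # Step S11: positions from the two cameras closer than the
        # threshold (here 0.5 m) are taken to be the same person.
        return np.linalg.norm(p_world_a - p_world_b) < threshold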
The method described above requires that both cameras 6, 8 have overlapping fields of view 7, 10. The size of the overlap determines the area in which positions of a detected visitor can be compared.
A second tracking module 22 is provided in the tracking system 1 for generating a walking trail by means of images captured by one of the tracking cameras 6.
The second tracking module 22 receives an image from the tracking camera 6 (step S13, figure 7). A rectangular representation 23 (figure 2) of a person shown in the image is detected by histogram of gradients (HOG) and/or Haar-like features (step S14). A Person Model is trained on the basis of the rectangular representation 23. The Person Model represents the specific person shown in the image and can include a combination of shape (HOG, Haar etc.), points of interest (SIFT, SURF etc.) and texture (GLCM: grey level co-occurrence matrix, SFTA: segmentation-based fractal texture analysis) in various color spaces (HSV, RGB etc.). A further image is received from that tracking camera 6 (step S15).
In step S16 a further rectangular representation 24 is extracted in a similar way as in step S14.
An overlap of the rectangular representations 23, 24 is determined (step S17). On the basis of this overlap it can be determined how far the person has moved between the moments when the first image and the subsequent image were captured.
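The overlap test of step S17 can, for instance, be expressed as intersection over union of the two rectangles; the (x, y, width, height) encoding is an assumption:

    def rect_overlap(a, b):
        # Intersection over union of rectangles a and b,
        # each given as (x, y, width, height).
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))  # intersection width
        iy = max(0, min(ay + ah, by + bh) - max(ay, by))  # intersection height
        inter = ix * iy
        union = aw * ah + bw * bh - inter
        return inter / union if union else 0.0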
If in step S16 no further rectangular representation 24 can be determined, then the Person Model is used for detecting the person in the image (step S18). This can be the case if the person has changed their position in such a way, e.g. by bending forward, that no further rectangular representation can be determined. The Person Model is more generic, so that coordinates of the person can be determined even if the position of the person has substantially changed.
In step S19 the coordinates of the person in the different images are added to the visitor data set together with the corresponding time stamps. These data represent a walking trail of the person in the POS area 2.
As in steps S16 to S18 further information about the person is extracted from the images, it is possible to improve the Person Model on the basis of this information.
With the procedure described above, a person can be tracked and a corresponding walking trail can be generated on the basis of different images taken by a certain tracking camera 6. Further algorithms for detecting and tracking objects are known, such as tracking-learning-detection (TLD), visual tracking with online multiple instance learning (online MIL) and mean shift. These known algorithms can be used alternatively or in combination with the above-described process for tracking a visitor and generating the walking trail.
The tracking system 1 comprises a third tracking module 25 for joining walking trails generated by two tracking cameras 6. The walking trails of two tracking cameras 6 can be joined if the fields of view 7 of the two cameras 6 overlap (figure 3, 4). A visitor 5 located in the overlapping range of the fields of view 7 is simultaneously tracked by both cameras 6. This means that both tracking cameras 6 simultaneously capture an image in which the visitor 5 is shown.
The world coordinates of the visitor are determined in the same way as described above with the steps S8 to S10. The coordinates of the visitor are compared, and if they overlap within a certain threshold, it is determined that the same person is shown in the images taken simultaneously by the two different tracking cameras 6. The coordinates and time stamps of the walking trails generated by each tracking camera 6 are then stored in the same visitor data set, so that the walking trails of the different tracking cameras 6 are connected. Thus it is possible to connect walking trails of several pairs of neighboring tracking cameras 6 by means of the third tracking module 25, so that a visitor can be tracked throughout the complete area which is covered by the tracking cameras 6.
The control unit 12 comprises a database administration module 26 which provides access to the POS database 14 as well as to the master user database 17.
Each time when one of the facial detection cameras 8, particularly the facial detection camera 8 located in the entrance area 9, generates a biometric face template, the database administration module 26 extracts a vector to match (VTM), which is an excerpt of the biometric face template and which represents the face template.
The VTM is compared to corresponding vectors of user data sets in the master user database. The data set comprising a vector that is closest to the VTM is determined by means of a match score. The best (typically highest) match score represents the closest match.
It is determined whether the closeness or the match score, respectively, is above a certain threshold, so that the person detected by the facial detection camera 8 in the POS area 2 is regarded as the same person to whom the account in the online shopping system matching the VTM belongs. This means that the user of this account is actually present in the POS area 2.
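As one plausible realization of this match score, cosine similarity between the VTM and the stored vectors; the similarity measure and the 0.8 threshold are assumptions, not taken from the patent:

    import numpy as np

    def match_score(vtm, candidate):
        # Cosine similarity: 1.0 for identical directions, lower otherwise.
        return float(np.dot(vtm, candidate) /
                     (np.linalg.norm(vtm) * np.linalg.norm(candidate)))

    def best_match(vtm, user_vectors, threshold=0.8):
        # Return the user id with the highest (i.e. closest) match score,
        # or None if even the best score stays below the threshold.
        uid, score = max(((uid, match_score(vtm, vec))
                          for uid, vec in user_vectors.items()),
                         key=lambda kv: kv[1])
        return uid if score >= threshold else None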
The visitor data set generated by the system 1 and describing the shopping behavior of a visitor 5 of the POS area 2 is transmitted to the master user database 17 and stored in the account of the corresponding user. Thus, the data of the account of the online shopping system is supplemented by offline shopping behavior data based on the physical activities of the visitor 5 in the real world, namely the POS area 2. As soon as a match between the visitor and the user of an account of the master user database 17 is achieved, all information generated in the tracking system 1 and relating to a certain visitor 5 is stored in the account of the online shopping system of the visitor instead of being stored in the local POS database 14.
If the closeness between the VTM and the vectors of the accounts in the master user database 17 is not sufficient to be regarded as representing the same person, the data set generated in the system 1 for describing the behavior of the corresponding visitor 5 in the POS area 2 is stored in the local POS database 14. The POS database 14 is also called the "anonymous database", because it describes visitors 5 of the POS area 2 without a name, address or any contact data of the visitors. However, each data set of the POS database 14 comprises the biometric face template of the visitor and/or one or more images of the visitor, so that the biometric face template of a newly detected visitor 5 can also be compared with the data sets of the POS database 14 if no matching account is available in the master user database 17.
The provision of a biometric face template and/or one or more images showing the face of the visitor makes it possible to check, each time a new user registers for a new account in the master user database 17, whether the POS database 14 comprises a data set describing the real-life shopping behavior of the new user of the online shopping system. If this is the case, the corresponding data set can be transferred to the master user database 17 and stored together with the account information of the new user.
The system 1 comprises a service module 27 which provides several service functions to the visitors 5 of the POS area 2, to sales staff of the POS area 2 and to the management of the POS area 2.
The service module 27 automatically creates messages to a mobile device used by a visitor 5 who is present in the POS area 2. The visitor 5 is identified by the face detection module 19, and if a corresponding data set in the master user database 17 is available, contact data comprised in this data set are used by the service module 27 to send a message to the visitor's mobile device. The message can contain, e.g., information about special offers. The content of the message can be generated automatically by considering the visitor's buying behavior, which is stored in the corresponding data set or account in the master user database 17 and which is based on the former activities of the visitor in the online shopping system and/or their shopping behavior in a POS area also monitored by the system 1 according to the present invention. Therefore, it is possible to automatically generate messages to the visitor which are specifically adapted to the special interests of this visitor.
Before transmitting the message to the visitor's mobile device, it can be useful to check, by means of the contact data available in the data set of the master user database 17 and communication data determined in the POS area 2, whether the corresponding mobile device is present in the POS area 2. Such communication data can be based on Wi-Fi communication, Bluetooth communication or any other standard local area communication between the mobile device and corresponding transceivers in the POS area 2 which are connected to the system 1.
The online shopping system can also comprise functions for a loyalty program for customers of retail businesses.
The automatic identification of the visitor 5 in the POS area 2 can also be used for such a loyalty program for customers of retail businesses, with which shoppers can collect points for purchases and redeem them for vouchers, goods or money. The user is automatically identified by capturing their face, and all goods and services purchased by the visitor are automatically registered by the cash register 4, which is connected to the central control unit 12 by means of the data line 11. Thus, customer retention can easily be improved without a need for the customer to perform any administrative activities.
The sales staff of the retail business at the POS area 2 can automatically be informed by the service module 27 about the shopping behavior of the visitor present at the POS area 2. Thus, the sales staff can provide individually adapted services and offers to the visitors 5 of the POS area 2. The sales staff preferably uses mobile devices to which information about the shopping behavior of the visitor is transmitted automatically by the service module 27. This shopping behavior can be displayed in a diagram showing values for typical characteristics of the visitor, such as interests in certain kinds of goods, credit rating of their credit cards, etc.
The service module 27 can generate a statistical analysis of all visitors visiting the POS area 2, wherein the online shopping behavior, which is registered in the master user database 17, can also be considered. Each visitor can be classified very precisely, so that the information content of such a statistical analysis is very significant and useful for the management of the retail business using this POS area 2. Such statistical analyses are very valuable for assessing the value of a POS area. With these statistical analyses not only can the footfall of the POS area 2 be detected, but the type of customer can also be classified. It is also possible to analyze the efficiency of advertisements and displays in the POS area 2 and their effect on the customers. With the present invention a significant improvement in managing and controlling a retail business is achieved, because not only data of the real world retail business (offline buying behavior), but also data of the online buying behavior are considered. The basis for the data is enlarged significantly when the data of the real shopping world and the data of the virtual shopping world are automatically connected.
In the embodiment described above, information relating to the image of the face of a user, such as the biometric face template, is used for registering in the online shopping system and is also used for identifying a visitor 5 in the POS area 2. Therefore, it is easy to combine corresponding data generated in the mobile shopping system and in the real world shopping system 1 if the corresponding images or biometric face templates match.
In the embodiment described above, the data sets of the visitors 5 are stored in the POS database 14 only if no matching account in the master user database 17 is available.
Figures 10 and 11 show examples of the data structure of the POS database 14 and the master user database 17.
The POS database 14 comprises data sets 29 for visitors who enter the POS area 2. Each data set of a certain visitor comprises at least a face template generated by means of the face detection module 19. The face template is used as a biometric identifier. The data set 29 can comprise one or more face templates. A single face template can also be based on several images which are captured on subsequent visits to the POS area 2.
The data set 29 can comprise walk-in information. The walk-in information comprises a list of visits, or physical walk-in sessions, of the respective visitor. The walk-in information can also comprise a time stamp of the moment when the visitor entered the POS area 2. If the POS area 2 comprises several entrance areas 9, the walk-in information can include an identification code of the specific entrance area 9. The face template in combination with the walk-in information thus records which visitor has visited which part of a certain POS area 2. The walk-in information can further comprise a facial-based age and/or gender which can be extracted from the images by means of the face detection module 19.
The data set 29 of the POS database can additionally comprise information about the shopping trail. The shopping trail is recorded by means of coordinates of the visitor and corresponding time stamps. The shopping trail can be a complete shopping trail from the entrance area 9 to the point of sale 3. It can also be a partial shopping trail of a visitor who enters the POS area 2 and leaves it without finalizing a purchase process. Such a partial shopping trail can also be recorded in an external area of a shop building, where the visitor is looking at an advertisement window or any other display.
The data set of the visitor can also comprise information about the purchased items and the purchase time. These data are provided by the cash register 4 when a purchase process is finalized. They usually comprise an ID of each purchased article, the price and a time stamp.
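The patent does not prescribe any concrete schema for the data set 29; purely as an illustrative sketch, the contents described above could be modelled as follows:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class WalkIn:
    timestamp: datetime              # moment the visitor entered the POS area
    entrance_id: str | None = None   # which entrance area 9, if several exist
    estimated_age: int | None = None     # facial-based estimate
    estimated_gender: str | None = None  # facial-based estimate

@dataclass
class TrailPoint:
    x: float                         # coordinates within the POS area
    y: float
    timestamp: datetime

@dataclass
class Purchase:
    article_id: str
    price: float
    timestamp: datetime

@dataclass
class PosDataSet:
    """Sketch of a data set 29 in the POS database 14 (schema is hypothetical)."""
    face_templates: list[list[float]] = field(default_factory=list)
    walk_ins: list[WalkIn] = field(default_factory=list)
    shopping_trails: list[list[TrailPoint]] = field(default_factory=list)
    purchases: list[Purchase] = field(default_factory=list)
```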
After a visitor enters the POS area 2, a face template is generated by the face detection module 19. This face template is compared to the existing face templates in the POS database. If a similar face template is found, the visitor present in the POS area 2 is judged to be the same person to whom the existing data set belongs. In such a case, all data generated by monitoring this visitor are entered into the existing data set. Each data set 29 can thus comprise several face templates, several pieces of walk-in information, several shopping trails and, of course, several purchased items with different purchase times.
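A minimal sketch of this matching step, reusing the PosDataSet structure sketched above, might look as follows; the Euclidean distance and the threshold value are assumptions, and a real implementation would use whatever similarity measure fits its template format:

```python
import math

MATCH_THRESHOLD = 0.6  # assumed; in practice tuned to the template format

def template_distance(a: list[float], b: list[float]) -> float:
    """Euclidean distance between two face templates of equal length."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def assign_to_data_set(new_template: list[float],
                       data_sets: list[PosDataSet]) -> PosDataSet:
    """Return the matching data set 29, or create a new one if none matches."""
    for ds in data_sets:
        if any(template_distance(new_template, t) < MATCH_THRESHOLD
               for t in ds.face_templates):
            ds.face_templates.append(new_template)
            return ds
    new_ds = PosDataSet(face_templates=[new_template])
    data_sets.append(new_ds)
    return new_ds
```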
The master user database comprises data sets 30 for each user (Figure 11). Each data set contains at least personal information and preferably a face template. The personal information can comprise an email address, a physical address, a phone number and/or any personal details which are usually acquired when a user signs up to the online shopping system.
The data set comprises one or more face templates which are used as biometric identifiers. The face template can be extracted from one or more images captured when the user signs up to the online shopping system. The face template can also be based on one or more images captured by the tracking system 1 at the POS area 2. Preferably the face template of the data set 30 of the master user database has the same or a similar structure as the face template of the data set 29 of the POS database. This makes it easy to calculate the distance between a face template of the POS database and a face template of the master user database and to automatically judge whether these face templates belong to the same person.
This data set can further comprise a time stamp of when the user first signed up to the online shopping system.
The data set can also comprise a unique communication ID of a client device used by the user, or of an app used on the client device for communicating with the online shopping system. The client device can be a desktop computer or any mobile device such as a mobile phone, tablet etc. The communication ID can be a telephone number, email address, IP address, Skype address, an app ID or a combination thereof.
This data set can also contain online shopping behavior data. The online shopping behavior data comprise browsing behavior data and/or shopping behavior data. The browsing behavior data describe the websites that were browsed by the user. The shopping behavior data describe the goods and/or services purchased with the online shopping system. These data can comprise corresponding time stamps. The shopping behavior data can also comprise information on one or more abandoned online carts. The browsing behavior data and/or shopping behavior data can be recorded as plain data describing each item individually by a data value. As these data can grow to a large volume, it can be appropriate to carry out a statistical analysis and to record statistical values, such as mean values, standard deviations etc., for describing the behavior. With such statistical values certain detailed data can be omitted and the amount of data can be reduced. This makes it easier and faster to browse the data describing the browsing behavior and the shopping behavior.
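One possible form of such a statistical reduction is sketched below; the field names and the choice of statistics are illustrative only, not taken from the embodiment:

```python
from collections import Counter
from statistics import mean, stdev

def summarize_purchases(prices: list[float], categories: list[str]) -> dict:
    """Reduce a detailed purchase log to compact summary statistics."""
    return {
        "purchase_count": len(prices),
        "mean_price": mean(prices) if prices else 0.0,
        "price_stdev": stdev(prices) if len(prices) > 1 else 0.0,
        # keep only the most frequent categories instead of every item
        "top_categories": Counter(categories).most_common(3),
    }

# Example: the detailed list can be discarded once the summary is stored.
log_prices = [12.90, 49.00, 7.50, 23.10]
log_categories = ["grocery", "electronics", "grocery", "clothing"]
print(summarize_purchases(log_prices, log_categories))
```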
If the face template of a visitor of the POS area 2 matches a face template of a data set of the master user database 17, the respective data of the POS database 14 (walk-in and time; shopping trail; offline purchase time and items) are added to the data set of this user. In such a case the data describing the face templates of the POS database and the master user database are preferably combined. This combination can be carried out simply by copying the face template data of the POS database to the data set 30 of the master user database. Alternatively, it can be carried out by calculating a new single face template which is based on both the face template of the data set 29 of the POS database and the face template of the data set 30 of the master user database.
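Both combination variants can be sketched roughly as follows; the master data set 30 is represented here as a plain dict, and the element-wise average merely stands in for whatever template-combination method an implementation would actually choose:

```python
def merge_into_master(master_set: dict, pos_set: PosDataSet,
                      average_templates: bool = False) -> None:
    """Add POS data (walk-ins, trails, purchases) to a master user data set 30.

    The master data set is sketched as a plain dict; its real structure
    is not prescribed by the patent.
    """
    master_set.setdefault("walk_ins", []).extend(pos_set.walk_ins)
    master_set.setdefault("shopping_trails", []).extend(pos_set.shopping_trails)
    master_set.setdefault("offline_purchases", []).extend(pos_set.purchases)

    if (average_templates and master_set.get("face_templates")
            and pos_set.face_templates):
        # Variant 2: calculate a single new template from both sources.
        a = master_set["face_templates"][0]
        b = pos_set.face_templates[0]
        master_set["face_templates"] = [[(x + y) / 2 for x, y in zip(a, b)]]
    else:
        # Variant 1: simply copy the POS templates across.
        master_set.setdefault("face_templates", []).extend(pos_set.face_templates)
```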
The data structure of the data sets of the databases 14, 17 as shown in Figures 10 and 11 can be used in the above-described first embodiment of the tracking system 1.
With this system the data of the online shopping system are supplemented with offline shopping behavior data based on the physical activities of the visitor in the real world. This improves the online shopping system and makes it possible to provide the user of the online shopping system with information selected also on the basis of their offline shopping behavior.
As mentioned above, the online shopping system can also comprise the function of a loyalty program for customers of retail businesses, with which shoppers can collect points for purchases and redeem them for vouchers, goods or money. With this tracking system a visitor of a certain POS area 2 is monitored and data of each visit are recorded in the POS database. If such a visitor becomes a user of the online shopping system, they already have a history describing their offline shopping behavior in the POS area 2. The loyalty program can immediately use this history and credit corresponding points for purchases to the user. Thus, customers of real world shops having such a tracking system 1 can be motivated much more easily to sign up to the online shopping system.
The online shopping system can also be used mainly for such a loyalty program and need not even comprise a purchase function with which the users can carry out online purchases. Such an embodiment of the online shopping system allows a relatively small shop or a small chain of shops to easily establish and run a highly efficient loyalty program, because it detects every visit of the corresponding customers and thus considers every shopping activity of said customers in real time. The information about the shopping activities is very detailed and is generated without the need to enter any data manually.
Figure 8 schematically shows a further embodiment, wherein the same parts are designated with the same reference signs. Therefore a further description of these parts can be omitted.
In this tracking system 1 the POS database 14 is used to store the data sets of each visitor 5 visiting the POS area 2 at least once. If a matching account is available in the master user database 17 of an online shopping system 28, the relevant data of the account are read from the master user database 17 and added to the data set of the corresponding visitor 5 in the POS database 14. In this embodiment, therefore, it is not the master user database 17 of the online shopping system 28 that is supplemented by the real world shopping behavior data generated by the system 1; instead, the local POS database 14 is supplemented by the data of the master user database 17. In this way a POS database 14 is generated and maintained in the tracking system 1 which provides the above-mentioned service functions without the need to download remote data each time.
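A rough sketch of this reverse supplementing, with both data sets represented as plain dicts and all field names hypothetical:

```python
# Master user fields worth caching locally (names are illustrative only).
MASTER_FIELDS = ("personal_information", "signup_timestamp",
                 "communication_id", "online_shopping_behavior")

def supplement_pos_data_set(pos_set: dict, master_set: dict) -> None:
    """Copy selected master user fields into the local POS data set,
    so service functions can run without fetching remote data each time."""
    for field_name in MASTER_FIELDS:
        if field_name in master_set:
            pos_set[field_name] = master_set[field_name]
```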
Figure 9 shows a further embodiment of the tracking system 1, which is similar to the one of Figure 8. However, the tracking system 1 is connected to several online shopping systems 28 which are independent from each other. The POS database 14 is supplemented by the content of the master user databases 17 of the different online shopping systems 28. Thus, the database for the service functions can be enlarged even further.
The embodiments according to Figures 8 and 9 are particularly advantageous for using the data basis of large online shopping systems in real world shops. Such a system allows a real world shop to be run more efficiently, because the detailed data of the online shopping system make it possible to improve the services provided to potential customers significantly, and the management of the real world shop knows its customers much better than it would without the data of the online shopping system.
On the other hand, it is also possible for the online shopping system described in the first embodiment above to be connected to several real world tracking systems 1, so that the content of the master user database 17 is automatically supplemented by the shopping behavior in a plurality of POS areas 2. Thus, the online shopping system contains information on the users which is also based on the real world shopping behavior (offline shopping behavior), which makes the corresponding statistical data for describing the shopping behavior of the user much more reliable for statistical analysis. This data can also be used by the online shopping system. For example, analyzing the time a visitor 5 spends in front of a certain advertisement or product display in the POS area 2 provides detailed information about the personal interests and wishes of the corresponding visitor 5.
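For illustration, such a dwell time could be derived from the recorded shopping trail; the rectangular zone and the TrailPoint structure from the sketch further above are assumptions:

```python
from datetime import timedelta

def dwell_time_in_zone(trail: list[TrailPoint],
                       x_min: float, x_max: float,
                       y_min: float, y_max: float) -> timedelta:
    """Sum the time between consecutive trail points that start inside a
    rectangular zone, e.g. the floor area in front of an advertisement."""
    total = timedelta()
    for p, q in zip(trail, trail[1:]):
        # Attribute the interval between p and q to p's position.
        if x_min <= p.x <= x_max and y_min <= p.y <= y_max:
            total += q.timestamp - p.timestamp
    return total
```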
The data structure of the databases 14, 17 used in the embodiments according to Figures 8 and 9 differs from the one shown in Figures 10 and 11 in that certain data of the master user database (personal information, sign-up time stamp, communication ID and/or online shopping behavior) are added to the corresponding data sets of the POS database 14.
In the above-described embodiments, the modules 19, 20, 21, 22, 25, 26 and 27 are embodied as software modules which are stored on and executable by the central control unit 12.
List of Reference Signs
1 system
2 POS area
3 point of sale
4 cash register
5 visitor
6 tracking camera
7 field of view
8 facial detection camera
9 entrance area
10 field of view
11 data line
12 central control unit
13 storage means
14 POS database
15 WAN
16 remote storage means
17 master user database
18 thumbnail
19 face detection module
20 human detection module
21 first tracking module
22 second tracking module
23 rectangular representation
24 rectangular representation
25 third tracking module
26 database administration module
27 service module
28 online shopping system
29 data set of POS database
30 data set of master user database

Claims (16)

1. Method for visitor tracking at a POS area (2) comprising the steps of - capturing an image of a visitor (5) by a facial detection camera (8) at the POS area (2), - extracting a biometric face template from the image, - comparing the biometric face template with biometric face templates of a master user database, wherein the biometric face templates of the master user database are each assigned to a user account of an online shopping system and the master user database comprises online shopping behavior data for each user, - if a match between the biometric face template of the visitor and one of the biometric face templates of the master user database is found then the content of the master user database is used for supporting a service at the POS area (2).
2. Method according to claim 1, wherein if no match is found between the visitor's biometric face template and the face templates of the master user database then the visitor's data extracted from images captured in the POS area (2) are stored in a POS database (14) which is independent from the master user database.
3. Method according to claim 1 or 2, wherein contact data comprised in a data set of the master user database corresponding to the visitor's account are used to send an automatic message to a mobile device of said visitor.
4. Method according to claim 3, wherein it is checked whether a mobile device listed in the data set is present at the POS area.
5. Method according to claim 3 or 4, wherein the visitor's online shopping behavior data are used for generating the content of the message.
6. Method according to one of the claims 1 to 5, wherein a visitor of the POS area (2) is tracked by capturing one or more images of the visitor by one or more cameras and offline shopping behavior data are extracted from said images.
7. Method according to claim 6, wherein several types of stationary cameras (6, 8) are used for detecting visitors of the POS area.
8. Method according to claim 6 or 7, wherein one or more of the following types of stationary cameras are used: - facial detection camera (8) - tracking camera (6)
9. Method, particularly according to one of the claims 1 to 8, wherein a master user database is run on a shopping server, which is part of an online shopping system, wherein a user may log in to the shopping system by means of a client computer comprising a client camera and an image of the user is captured by the client camera and a biometric face template is extracted from the image and the user's image and/or the user's biometric face template is stored in the master user database.
10. Method according to claim 9, further comprising the steps of - tracking a visitor of a POS area (2) by capturing one or more images of the visitor by one or more cameras (6, 8) and extracting offline shopping behavior data from said images, - analyzing at least one of the one or more images for a biometric face template of the visitor (5), - comparing the biometric face template with biometric face templates of a master user database, wherein the biometric face templates of the master user database are each assigned to a user account of an online shopping system and the master user database comprises online shopping behavior data for each user, - if a match between the biometric face template of the visitor and one of the biometric face templates of the master user database is found then said offline shopping behavior data are added to the master user database.
11. Method according to claim 9 or 10, wherein the storing of the biometric face template comprises an updating of an existing biometric face template of the same user.
12. Method according to one of the claims 9 to 11, wherein the image is captured during a login procedure of the user by the client camera and the image is analyzed for a biometric face template which is used as a login criterion for the online shopping system.
13. Method according to one of the claims 9 to 12, wherein at least one image of the user is captured each time the user logs into the online shopping system.
14. Method according to one of the claims 9 to 13, wherein the master user database comprising combined online and offline shopping behavior data is analyzed for preparing shopping user profiles, data for supporting advertisements or for generating further marketing characteristics.
15. System for visitor tracking at a POS area (2) comprising at least a facial detection camera (8) at the POS area (2) and a central control unit (12) which is embodied for carrying out the method according to one of the claims 1 to 14.
16. System according to claim 15, comprising several cameras (6, 8), wherein the cameras are arranged with their fields of view (7, 10) so that the fields of view (7, 10) of neighboring cameras overlap.
AU2017231602A 2016-03-09 2017-03-08 Method and system for visitor tracking at a POS area Abandoned AU2017231602A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
SG10201601838TA SG10201601838TA (en) 2016-03-09 2016-03-09 Method and system for visitor tracking at a pos area
SG10201601838T 2016-03-09
PCT/SG2017/050111 WO2017155466A1 (en) 2016-03-09 2017-03-08 Method and system for visitor tracking at a pos area

Publications (1)

Publication Number Publication Date
AU2017231602A1 true AU2017231602A1 (en) 2018-09-27

Family

ID=59789842

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2017231602A Abandoned AU2017231602A1 (en) 2016-03-09 2017-03-08 Method and system for visitor tracking at a POS area

Country Status (5)

Country Link
CN (1) CN109074498A (en)
AU (1) AU2017231602A1 (en)
PH (1) PH12018501919A1 (en)
SG (1) SG10201601838TA (en)
WO (1) WO2017155466A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108122012B (en) * 2017-12-28 2020-11-24 百度在线网络技术(北京)有限公司 Method, device and equipment for determining center point of stationary point and storage medium
CN109492626A (en) * 2019-01-11 2019-03-19 敏科信息科技(广州)有限公司 A kind of noninductive face case field management method and its system
CN111565225B (en) * 2020-04-27 2023-08-04 银河水滴科技(宁波)有限公司 Character action track determining method and device
CN112381982B (en) * 2020-10-19 2022-02-22 北京科技大学 Unmanned supermarket system constructed based on deep learning
CN113706767B (en) * 2021-10-27 2022-02-08 深圳市恒裕惠丰贸易有限公司 Register storage system capable of automatically identifying face value of coin

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101618735B1 (en) * 2008-04-02 2016-05-09 구글 인코포레이티드 Method and apparatus to incorporate automatic face recognition in digital image collections
US20110257985A1 (en) * 2010-04-14 2011-10-20 Boris Goldstein Method and System for Facial Recognition Applications including Avatar Support
US8521703B2 (en) * 2010-11-05 2013-08-27 International Business Machines Corporation Multiple node/virtual input/output (I/O) server (VIOS) failure recovery in clustered partition mobility
CN102592116A (en) * 2011-12-27 2012-07-18 Tcl集团股份有限公司 Cloud computing application method, system and terminal equipment, and cloud computing platform
CN103839150B (en) * 2012-11-20 2018-03-02 金蝶软件(中国)有限公司 ERP business email processing method and device
US20140244678A1 (en) * 2013-02-28 2014-08-28 Kamal Zamer Customized user experiences
CN104346883A (en) * 2013-08-04 2015-02-11 郁晓东 Point of sale (POS) device capable of detecting customer
CN103606093A (en) * 2013-10-28 2014-02-26 燕山大学 Intelligent chain VIP customer service system based on human characteristics
CN104778389A (en) * 2014-01-09 2015-07-15 腾讯科技(深圳)有限公司 Numerical value transferring method, terminal, server and system
CN104573619A (en) * 2014-07-25 2015-04-29 北京智膜科技有限公司 Method and system for analyzing big data of intelligent advertisements based on face identification
CN104268725A (en) * 2014-10-30 2015-01-07 胡新元 Client management system and method on basis of face identification technology

Also Published As

Publication number Publication date
WO2017155466A1 (en) 2017-09-14
PH12018501919A1 (en) 2019-06-24
SG10201601838TA (en) 2017-10-30
CN109074498A (en) 2018-12-21


Legal Events

Date Code Title Description
MK1 Application lapsed section 142(2)(a) - no request for examination in relevant period